This blog is mostly driven by the ACFR software team. We plan to post on the new features that we continuously roll out in comma, snark, and other ACFR open source repositories (https://github.com/acfr), and occasionally on more general software topics.
How-to articles
A finite-state machine can be implemented in a few minutes on the command line or in a bash script using csv-join. Assume we have a state machine with the following events and states:

events: close, open, sensor closed, sensor opened
states: opened, closing, closed, opening

The state transition table can be expressed in a csv file state-transition.csv:

# event,state,next_state
$ cat state-transition.csv
close,opened,closing
close,opening,closing
open,closing,opening
open,closed,…
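For illustration, here is a minimal plain-bash sketch of the same table-driven idea; it looks up the next state in state-transition.csv with grep instead of csv-join (the csv-join version is in the full post), and the event loop and field layout are assumed as above:

#!/bin/bash
# minimal table-driven state machine: read events on stdin, print transitions
# assumes state-transition.csv has fields: event,state,next_state
state="opened"
while read -r event ; do
    next=$( grep "^$event,$state," state-transition.csv | cut -d, -f3 )
    if [[ -n "$next" ]] ; then
        echo "$event: $state -> $next"
        state="$next"
    else
        echo "$event: no transition from state '$state'" >&2
    fi
done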
view-points --pass-through
view-points has gained a new option --pass-through (or --pass for short) that allows it to become part of a processing pipeline. The basic usage is:

$ cat data.csv | some-operation | view-points --pass | some-other-operation | view-points --pass > output.csv

or alternatively:

$ cat data.csv | some-operation | view-points "-;pass" "otherdata.csv" | some-other-operation | view-points "-;pass" > output.csv

When multiple data sources are viewed, only one can be given the pass option.…
Exposing classes and functions defined in C++ libraries to Python is now possible in comma by creating C++/Python bindings with Boost.Python (http://www.boost.org/doc/libs/1_61_0/libs/python/doc/html/index.html).

Example

To illustrate this new capability, bindings for the C++ class format and its member function size() declared in csv/format.h have been defined in python/comma/cpp_bindings/csv.cpp:

// python/comma/cpp_bindings/csv.…
In your bash scripts, when you are inside a loop, do not declare local array variables. Try to run the following script and observe the time per iteration grow linearly:

#!/bin/bash
num=${1:-1000} # How many iterations before reporting elapsed time
A=$(date +%s%N); # Timestamp in nanoseconds
iteration=0
function do_something()
{
    while true; do
        (( ++iteration ))
        sleep 0.001 # Pretend to do some work
        local my_array=( 1 ) # create a local array
        # Report elapsed time
        if (( !…
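A sketch of the usual workaround, assuming the loop body above: declare the variable once, before the loop, so that 'local' is not re-executed on every iteration:

function do_something()
{
    local my_array            # declared once, before the loop
    while true; do
        (( ++iteration ))
        sleep 0.001           # pretend to do some work
        my_array=( 1 )        # plain assignment inside the loop; no repeated 'local'
    done
}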
We have just added support for the Velodyne Puck (VLP-16) to the velodyne-to-csv utility. Simply run velodyne-to-csv --puck with all other options specified as usual. For example, suppose your VLP-16 publishes its data over UDP on port 2368. Then you could get the individual points, e.g. as:

> udp-client 2368 --timestamp | velodyne-to-csv --puck | head
20160609T072201.492662,1,1275,0,0,0,7.40179190377,-3.05843657996,0.139793775563,1.96263183445,8.01,1,1
20160609T072201.492664,2,3060,0,0,0,1.02118837253,…
csv-eval can now be used to update csv stream values in place. Simply assign new values to input stream fields. For example:

$ ( echo 1,0.1,cat; echo 2,0.01,dog )
1,0.1,cat
2,0.01,dog

$ ( echo 1,0.1,cat; echo 2,0.01,dog ) | csv-eval --fields=,x --format=,d 'x = x**2'
1,0.01,cat
2,0.0001,dog

$ ( echo 1,0.1,cat; echo 2,0.01,dog ) | csv-to-bin ui,d,s[3] | csv-eval --fields=,x --binary=ui,d,s[3] 'x = x**2' | csv-from-bin ui,d,s[3]
1,0.01,cat
2,0.0001,…
Bash Trap Gotchas
Traps (signal handlers) are useful for cleaning up resources, but have some unexpected quirks in bash.

Multiple EXIT Traps

You would hope that the following would call f1 then f2 on exit:

function f1() { echo "one" >&2; }
function f2() { echo "two" >&2; }
trap f1 EXIT
# nope!
trap f2 EXIT

... but only f2 is called, since a new EXIT trap replaces an existing one. This is a particular problem when two EXIT traps are widely separated; for example,…
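One common workaround, sketched here under the assumption that both cleanup functions are known up front, is to register a single trap that calls them in order:

function f1() { echo "one" >&2; }
function f2() { echo "two" >&2; }
function cleanup() { f1; f2; }   # one EXIT trap that runs both handlers
trap cleanup EXIT

When the second handler has to be added far from the first, the existing trap can be queried with 'trap -p EXIT' and chained rather than replaced; the full post covers that harder case.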
If you find yourself with a csv file with an uneven number of fields in each line, a new option for csv-fields may help. csv-fields make-fixed will make every line have the same number of fields by adding fields to short lines or, with the --force option, stripping fields from long lines. For example:

$ { echo "a,b,c,d"; echo "x,y,z"; } | csv-fields make-fixed --count=6
a,b,c,d,,
x,y,z,,,

$ { echo "a,b,c,d"; echo "x,y,z"; } | csv-fields make-fixed --count=3 --force
a,b,c
x,y,…
Sometimes you would like to make an extensive test suite more structured: rather than having all the checks in one flat expected file, you want to run the test once, but check many well-structured suites of test cases. Now, instead of a file, you can have a directory called expected. For example, suppose,…
The existing points-grep utility has been given a major facelift. Given a shape as a set of planes, e.g. a bounding box or a hull of a moving vehicle, it greps points that belong to that shape from a streamed point cloud. Let's make a dataset: just fill a cube with points:

> for i in $( seq -5 0.5 5 ) ; do for j in $( seq -5 0.5 5 ) ; do for k in $( seq -5 0.5 5 ) ; do echo $i,$j,$k ; done ; done ; done > cube.csv

Let's cut a slice from the cube, where 1,1,1,0.5 is 1,1,…
Default values of variables used in the formulas evaluated by csv-eval can now be specified by the --default-values option. For instance:

$ ( echo a,10 ; echo b,20 ) | csv-eval --fields=,y "a=x+y" --default-values="x=1;y=2"
a,10,11
b,20,21

This assigns default values to x and y. Since y is present in the input stream as specified by --fields, its default is ignored. On the other hand, x is not in the input stream and, therefore, its default value is used in the formula.…
A new functionality has recently been added to bash-related comma utilities that allows specifying default values of command line options where the options are defined. The following code uses the command line option --filename. The script defines a bash variable called filename, whose value is specified by the option on the command line. However, if the option is not given on the command line, the default value will be used.

#!/bin/bash

function options_description
{
    cat <<END
--filename=[<filename>]; default=example.…
An application publishes data, and one or more clients are listening. How do you solve these issues?

- The application publishes infrequently, but a client would like the data more often;
- The application is waiting for more input, but a client (which perhaps has just connected and thus might have missed the previous output) would like to know the last output line published.

Enter csv-repeat. csv-repeat will pass stdin to stdout, repeating the last record after a period of inactivity.…
view-points can now be used to display triangles, which may be useful if you would like to quickly visualize a triangulated surface without converting it into a CAD model.

Draw triangles:
> ( echo 0,0,0,1,1,1,1,0,0,0 ; echo 0,0,0,1,1,1,0,1,0,1 ; echo 0,0,0,1,1,1,0,0,1,2 ) | view-points "-;shape=triangle;fields=corners,id"

Draw filled triangles:
> ( echo 0,0,0,1,1,1,1,0,0,0 ; echo 0,0,0,1,1,1,0,1,0,1 ; echo 0,0,0,1,1,1,0,0,1,2 ) | view-points "-;shape=triangle;fields=corners,…
If you have a single image or an image stream and would like to apply to each image a mask generated from the image itself, you could use the mask filter. The mask filter allows you to run a pipeline of almost any filters available in cv-cat on an image to generate the mask and then apply it to the original image. The benefits of the mask filter are more obvious when it is used on an image stream rather than a single image,…
When processing binary fixed-width data, comma and snark utilities use the byte order of the computer on which they run; e.g. on most desktops (with x86 architectures) the byte order is little endian, while some other architectures and devices use big endian. If you have fixed-width data (e.g. from some external party or a device) whose endianness (byte order) differs from your computer's, there are a number of ways to deal with it at various levels (e.g. using htons()-style functions,…
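As a generic illustration (not comma-specific; the utilities and approach described in the full post may differ), a stream of 16-bit values can have its byte order reversed on the command line with dd conv=swab, which swaps each pair of input bytes; for 32- or 64-bit fields a simple two-byte swap is not enough and a proper per-field conversion is needed:

> printf '\x01\x00\x02\x00' | xxd -p
01000200
> printf '\x01\x00\x02\x00' | dd conv=swab 2>/dev/null | xxd -p
00010002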
Introduction

Just another example of how comma and snark utilities can be combined to cobble together something that works and could easily be polished further, all in a matter of minutes. Assume you have information about the terrain and would like to find a path from A to B. The following shows how a first cut could be done in a few command lines. It uses the graph-search utility with distance as the objective function,…
If you have csv records with multiple keys and would like to assign unique ids to those records, you could use csv-enumerate. (In particular, it would help to overcome the current limitation of csv-calc, which cannot handle multiple id fields.) csv-enumerate appends an id to each input record. For example:

> ( echo 20170101T000000,hello ; echo 20170101T000000,world ; echo 20170101T000001,hello ; echo 20170101T000000,world ) | csv-enumerate --fields ,greeting
20170101T000000,hello,0
20170101T000000,…
This blog entry describes a pretty subtle bug that leads to unexpected behaviour when handling the PIPE signal in C++. First, a brief reminder of how and when the PIPE signal is used. Assume we have a pipeline of commands:

command | head -n 2

The commands generate and process textual output, but we take only the first 2 lines. Once head has received two lines of output, it terminates. On the next write, the standard output of command has no recipient,…
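The effect is easy to observe from bash; in the sketch below, yes keeps writing after head exits, receives SIGPIPE, and its exit status becomes 128 + 13 = 141:

> yes | head -n 2
y
y
> yes | head -n 2 > /dev/null ; echo ${PIPESTATUS[0]}
141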
This post outlines how to run a bash function in parallel using xargs. (Note, you could optionally use "parallel" instead of xargs, but I see no advantages/disadvantages at this stage.) It may not be the best way, or the right way, and it may have unforeseen consequences, so I'd welcome any feedback on better practice.

Rationale

We often run scripts with for or while loops. In the simplest case, if the operation within the loop is self contained, it's very easy to make it parallel. E.g.…
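A minimal sketch of the xargs approach (the function name, workload, and parallelism level are made up for illustration): export the function so that the child bash processes spawned by xargs can see it, then let xargs fan the arguments out:

#!/bin/bash
function do_work() { echo "processing $1 in process $$" ; sleep 1 ; }   # stand-in for real work
export -f do_work                                                        # make the function visible to child shells
seq 1 8 | xargs -n 1 -P 4 bash -c 'do_work "$@"' _                       # 4 jobs in parallel, one argument each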
You can run shell commands directly from matlab using "!" followed by any standard shell command, or by running the function "system", which gives greater control over what you run and how you get the output. Common usage is:

[status,result] = system('command')

and if you want to inject matlab variables into the command, nest an "sprintf" command in there: system(sprintf(...))

This worked fine in earlier versions of matlab for me, but in the latest 2017 version,…
The problem

You are working with a data pipeline, and on a certain record, you want to end processing and exit the pipeline. But to break on some condition in the input, you need an application that parses each input record. Worse, the condition you want could be a combination of multiple fields, or use fields unrelated to the data you want to process.

Introducing csv-eval --exit-if!

Previously csv-eval had a --select option that passed through any records that matched the select condition.…
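A hedged sketch of what that looks like; the data and condition below are made up, the condition syntax is assumed to follow the usual csv-eval expressions, and the exact semantics (e.g. whether the triggering record itself is output) are described in csv-eval --help:

> ( echo 1 ; echo 2 ; echo 11 ; echo 3 ) | csv-eval --fields=x --format=d --exit-if 'x > 10'

Records are passed through until a record with x greater than 10 is seen, at which point the pipeline exits, so the record with value 3 is never processed.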
points-to-ros and points-from-ros are utilities for publishing and receiving PointCloud2 messages on ROS.

Setup

To build them you need to set "snark_build_ros" to ON in snark cmake. We use snark-graphics-test-pattern to generate some sample points in a cube:

snark-graphics-test-pattern cube 100000 0.1 0.01 >cube.csv

Here is the output: cube.csv

To run ROS, you need to set up the environment and run roscore:

source /opt/ros/kinetic/setup.…
csv-calc

csv-calc is an application to calculate statistics (such as mean, median, size, standard deviation...) on multiple fields of an input file. Input records can be grouped by id, block, or both. One drawback of csv-calc is that it only outputs the statistics for each id and block. The input records themselves are not preserved. This means that you cannot use csv-calc as part of a pipeline.

csv-calc --append

The --append option to csv-calc passes through the input stream,…
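A sketch of what that enables; the invocation below (operation name and option layout) is assumed rather than taken from the post, so check csv-calc --help for the precise syntax:

> ( echo 0,1 ; echo 0,3 ; echo 1,5 ) | csv-calc mean --fields=id,x --append

With --append, each input record would be passed through with the statistic computed for its group appended, so the stream can keep flowing to downstream utilities.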
A brief note on the latest additions to cv-cat (and all other camera applications linking in the same filters). As of today, the application provides access to all the morphology operations available in OpenCV: erosion, dilation, opening, closing, morphological gradient, top-hat, and black-hat. See the OpenCV documentation (http://docs.opencv.org/trunk/d9/d61/tutorial_py_morphological_ops.html) for more details. In addition, a skeleton (a.k.a.…
This is a brief introduction to new cv-cat filters.

Filter: Accumulated

This filter calculates the pixel-wise (and channel-wise) average of a sequential series of input images. As it relies on accumulating sequential input images, this filter runs in serial mode in cv-cat. This has implications when used with 'forked' image processing; however, parallel processing is still utilised across image rows. Please download the following file, which contains a total of 8 images: images.…
csv-shape is a new utility for various operations on reshaping csv data. For now, only one operation is implemented: concatenate.

Concatenate by Grouping Input Records

> ( echo 1,a; echo 2,b; echo 3,c; echo 4,d; ) | csv-shape concatenate -n 2
1,a,2,b
3,c,4,d

Note: for ascii text inputs the records do not have to be regular or even have the same number of fields.

Concatenate by Sliding Window

ASCII:
> ( echo 1,a; echo 2,b; echo 3,c; echo 4,d; ) | csv-shape concatenate -n 2 --sliding-window
1,a,2,…
A quick note on new operations in the cv-calc utility. Time does not permit presenting proper examples, but hopefully cv-calc --help would be sufficient to give you an idea.

cv-calc grep

Outputs only those input images that conform to a certain condition. Currently, only a min/max number or ratio of non-zero pixels is supported, but the condition can be any set of filters applied to the input image (see cv-cat --help --verbose for the list of the filters available).…
