Description

This node is designed to perform blob tracking on Point clouds.

Similar to the Blob Tracker node, which performs blob tracking on CV Image streams, this node outputs a CV Image stream displaying the detection and tracking of blobs, as well as an array containing the tracking information for each blob.



Properties

An indicator shows whether the system is Calibrated or not. You can reset the calibration with the Reset All button.

If the status is Uncalibrated, the node cannot track blobs or output a CV Image stream; it must be calibrated first. There are two ways to proceed, detailed below.

There are three tabs in the Editor panel :

General

  • UID : A unique identifier for each Point Cloud to Blob node, generated automatically when you create the node in your graph. This is useful for copying calibration data, as shown below.
  • Tracking interval (ms) : Displays the time taken by the tracking algorithm to process each frame.
    This value should stay below the frame interval of the sensor used for detection, or some precision may be lost.
  • Cycle interval (ms) : Displays the time taken to process input data and send it to the tracking algorithm.
    If this value is too high, data is not being processed and sent to the tracking algorithm fast enough, and some precision may be lost.
  • CV Image : Select the display desired in the CV Image output.
    Default : Detection & Tracking
    • Detection : Shows only detection data. Detected blobs are highlighted with a red circle in the image.
    • Tracking : Shows only tracking data. Each detected blob is highlighted using a different random color, and its bounding box and orientation (shown with an arrow) are displayed.
    • Detection & Tracking : Shows detection data and tracking data side by side on the output.

Calibration

Calibrating this node is mandatory for it to function properly.

Click the Calibrate button to begin calibration. A dialog box will open, where you can choose to calibrate either From stream, using the live feed of your 3D sensor, or From file, using a previous calibration made with the From stream option.

When calibrating from stream using the live feed of the sensor, make sure nothing moves in the tracking field, to prevent errors and false detections.

The calibration algorithm scans the point cloud input stream and automatically detects planes in your scene. These planes are listed in order of size (number of points). In a typical setup, the plane represented by the highest number of points should correspond to the one on which you want blob detection to occur (floor, wall or any other plane of interest).
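
The sketch below illustrates this kind of plane detection using Open3D's RANSAC plane segmentation: candidate planes are extracted one after another and ranked by their number of points. It is only an illustration of the principle; the node's actual algorithm, and the function and parameter names used here, are assumptions.

```python
import numpy as np
import open3d as o3d  # assumed third-party library, not part of the node

def detect_planes(points, max_planes=5, min_ratio=0.04, dist_thresh=0.02):
    """Return candidate planes as (plane_model, point_count), largest first.

    min_ratio mirrors the "Min percentage (%)" calibration property.
    """
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(np.asarray(points, dtype=np.float64))
    total, planes = len(points), []
    for _ in range(max_planes):
        if len(pcd.points) < max(3, min_ratio * total):
            break
        model, inliers = pcd.segment_plane(distance_threshold=dist_thresh,
                                           ransac_n=3, num_iterations=1000)
        if len(inliers) < min_ratio * total:
            break
        planes.append((model, len(inliers)))             # model = (a, b, c, d)
        pcd = pcd.select_by_index(inliers, invert=True)   # remove that plane
    return sorted(planes, key=lambda p: p[1], reverse=True)
```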

Selecting a plane in the list on the left highlights it in red in the point cloud on the right.

Once you choose a plane as your reference, the node computes an orthographic projection of that plane, outputs it as a CV Image stream and performs blob tracking on it.
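
To give a rough idea of what such a projection involves, here is a minimal numpy sketch that builds a 2D basis in the reference plane, projects each point onto it, and rasterizes the result into a grayscale image in which one pixel covers "Grid size" centimeters. The plane is given as (a, b, c, d) with ax + by + cz + d = 0; all names, the image size and the height-to-intensity mapping are assumptions, not the node's actual implementation.

```python
import numpy as np

def project_to_plane_image(points, plane, grid_cm=5.0, size=(480, 640)):
    """Orthographic projection of a point cloud (in meters) onto a plane."""
    a, b, c, d = plane
    n = np.array([a, b, c], dtype=float)
    norm = np.linalg.norm(n)
    n, d = n / norm, d / norm
    # Orthonormal basis (u, v) lying in the plane.
    u = np.cross(n, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-6:          # plane roughly horizontal
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    pts = np.asarray(points, dtype=float)
    height = pts @ n + d                  # signed distance to the plane (m)
    scale = grid_cm / 100.0               # meters covered by one pixel
    px = np.clip((pts @ u / scale).astype(int) + size[1] // 2, 0, size[1] - 1)
    py = np.clip((pts @ v / scale).astype(int) + size[0] // 2, 0, size[0] - 1)
    img = np.zeros(size, dtype=np.uint8)
    img[py, px] = np.clip(height * 255.0, 0, 255).astype(np.uint8)  # 1 m -> 255
    return img
```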

To recall a previous calibration using the From file option, point to the folder containing the calibration data you would like to recall. Calibrations are stored in folders named after the UID of the node, located in your Show's folder, inside the '.log' subfolder.
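
If you want to carry a calibration over to another node (or another Show), copying the corresponding UID folder is one way to do it. The sketch below is purely hypothetical: the paths and UID strings are placeholders, and the folder layout assumed is only what is described above (Show folder / '.log' / node UID).

```python
import shutil
from pathlib import Path

# Hypothetical placeholders -- replace with your Show folder and node UIDs.
show_log = Path("/path/to/MyShow/.log")
src_uid = "uid-of-the-calibrated-node"
dst_uid = "uid-of-the-node-to-calibrate"

# Duplicate the calibration folder so the other node can load it "From file".
shutil.copytree(show_log / src_uid, show_log / dst_uid, dirs_exist_ok=True)
```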

Restricting detection zones:

It is possible to restrict the detection zone to a given volume by clicking the Mask button. This is useful to exclude zones that are in the field of view of your LIDAR but are not part of your setup: moving objects or people inside these zones are then ignored by the algorithm and do not appear in the tracking data array.

The Mask button opens a new window in which simple volumes (cube or cylinder) can be selected in a dropdown menu and created by clicking the button. They can then be translated using the TX, TY and TZ fields, rotated using the RX, RY and RZ fields, and scaled by modifying the shape's properties, visible when its line is selected.

Toggle the Invert checkbox to invert a volume so that it excludes points that are outside the shape instead of points that are inside.

Note that volumes can be nested within each other, making it possible to exclude regions inside an already restricted detection volume.

The preview on the right shows a color-coded live view of the point cloud: red indicates the points that will be considered by the algorithm, blue the points that will be ignored.

Mask mode window to restrict point cloud tracking to a given volume.
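
The sketch below shows one plausible way such mask volumes could be combined: a point is ignored if it falls inside any normal volume or outside any inverted volume, which also reproduces the nesting behaviour described above. The exact combination rule the node uses, and every name in this code, are assumptions; rotation of the shapes is omitted for brevity.

```python
import numpy as np

class Cube:
    """Axis-aligned box defined by its centre (TX, TY, TZ) and size."""
    def __init__(self, center, size):
        self.center = np.asarray(center, dtype=float)
        self.size = np.asarray(size, dtype=float)
    def contains(self, pts):
        return np.all(np.abs(pts - self.center) <= self.size / 2.0, axis=1)

class Cylinder:
    """Vertical cylinder (axis along Z) defined by centre, radius and height."""
    def __init__(self, center, radius, height):
        self.center = np.asarray(center, dtype=float)
        self.radius, self.height = radius, height
    def contains(self, pts):
        d = pts - self.center
        return (d[:, 0] ** 2 + d[:, 1] ** 2 <= self.radius ** 2) & \
               (np.abs(d[:, 2]) <= self.height / 2.0)

def mask_points(points, volumes):
    """volumes: list of (shape, inverted). Returns a boolean keep-mask
    (True = considered by the algorithm, i.e. shown in red in the preview)."""
    pts = np.asarray(points, dtype=float)
    ignore = np.zeros(len(pts), dtype=bool)
    for shape, inverted in volumes:
        inside = shape.contains(pts)
        ignore |= ~inside if inverted else inside
    return ~ignore
```
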
  • Max points : Maximum number of points to consider in the point cloud. If the sensor outputs more points than this value, the data is automatically downsampled to respect it (a minimal subsampling sketch follows this list).
    Default : 100000
  • Min percentage (%) : Minimum percentage of the total number of points required for a surface to be considered a plane during the calibration process.
    Default : 4
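
As an illustration of the Max points cap mentioned above, a cloud exceeding the limit could simply be subsampled uniformly at random. This is only a sketch of one possible downsampling strategy; the node's actual method is not specified here.

```python
import numpy as np

def cap_points(points, max_points=100000, rng=None):
    """Return at most max_points points, picked uniformly at random."""
    pts = np.asarray(points)
    if len(pts) <= max_points:
        return pts
    rng = rng or np.random.default_rng()
    return pts[rng.choice(len(pts), size=max_points, replace=False)]
```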

Tracking

Tracking parameters :

  • BG mode : Choose whether and how the background is detected in the CV Image stream.
    Default : Auto
    • Auto : The background is continuously being detected in the CV Image stream.
    • Learning : The background is created manually, as a CV Image captured from the CV Image stream of your scene without any moving objects in it.
  • Learn BG : (Only if BG Mode is Learning) Click the Start button to open a dialog box and begin learning the background from the CV Image stream computed by the node.
  • Grid size (cm) : The approximate size, in centimeters, that one pixel corresponds to in the tracking field.
    Default : 5
  • Intensity Tracking : Select the size of the window in which to use the intensity of pixels to compute and correct tracking data. Using higher values is more resource intensive without necessarily being more precise, depending on your setup.
    Default : None
    • None : Intensity Tracking is not used.
    • Small : 7-15 pixels window size.
    • Medium : 15-31 pixels window size.
    • Large : 31-63 pixels window size.
    • Extreme : 63-125 pixels window size.
  • Merge : Toggle to activate/deactivate the merging of blobs close to one another.
    Default : ON
  • Merge Distance : (Only if Merge is ON) Set the distance (in cm) between two blobs under which they will be considered the same blob.
    Default : 30
  • Max keep time : Sets how long (in milliseconds) a blob is kept in the output after it is no longer detected.
    Default : 500
  • Max Distance : Sets the maximum distance (in cm) a blob is allowed to move from one frame to the next and still be considered the same blob (see the matching sketch after this list).
    Default : 50
  • Tracking History : Sets the number of frames to keep in the buffer used to compute predictions and measurement corrections. Setting this too high consumes more resources with little gain in accuracy.
    Default : 10
  • Weight of predictions : Sets a threshold for sensor noise suppression for moving objects. Higher values may introduce latency in the tracking.
    Default : 0.5
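
The following sketch shows how the merging and matching parameters above can work together on a single frame: close detections are merged (Merge Distance), each track is matched to the nearest detection within Max Distance, and tracks unseen for longer than Max keep time are dropped. This is a simple greedy matcher written for illustration; it is not the node's algorithm, and all names are assumptions.

```python
import time

def merge_close(detections, merge_dist_cm=30.0):
    """Merge detections closer than Merge Distance into a single blob."""
    merged = []
    for x, y in detections:
        for m in merged:
            if ((x - m[0]) ** 2 + (y - m[1]) ** 2) ** 0.5 < merge_dist_cm:
                m[0], m[1] = (m[0] + x) / 2.0, (m[1] + y) / 2.0
                break
        else:
            merged.append([x, y])
    return merged

def update_tracks(tracks, detections, max_dist_cm=50.0, keep_ms=500.0):
    """tracks: {id: {'pos': (x, y), 'last_seen': ms}}; detections: [(x, y)]."""
    now = time.monotonic() * 1000.0
    unmatched = [tuple(d) for d in detections]
    for track in tracks.values():
        best, best_dist = None, max_dist_cm
        for d in unmatched:
            dist = ((d[0] - track['pos'][0]) ** 2 +
                    (d[1] - track['pos'][1]) ** 2) ** 0.5
            if dist < best_dist:
                best, best_dist = d, dist
        if best is not None:                 # same blob, moved less than Max Distance
            track['pos'], track['last_seen'] = best, now
            unmatched.remove(best)
    for d in unmatched:                      # unmatched detection -> new blob
        tracks[max(tracks, default=0) + 1] = {'pos': d, 'last_seen': now}
    # Drop blobs not seen for longer than Max keep time.
    return {tid: t for tid, t in tracks.items() if now - t['last_seen'] <= keep_ms}
```

A real implementation would also use the Tracking History buffer and the Weight of predictions to predict and smooth positions; that part is deliberately left out of this sketch.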

Detection parameters :

  • Minimum height (m) : Sets the minimum height (in meters) for an object to be considered a blob. This threshold helps eliminate noise in sensor data near the elevation level of the plane used as reference.
    Default : 0.2
  • Filter by Area : Toggle to activate/deactivate filtering by area. Blobs whose area is smaller than Min Area or greater than Max Area are no longer considered blobs (see the sketch after this list).
    Default : ON
  • Min Area : Minimum area (in pixels) for a detected region to be considered a blob.
    Default : 12.56
  • Max Area : Maximum area (in pixels) for a detected region to be considered a blob.
    Default : 314
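
As a small illustration of the area filter, detected regions in a binary CV Image can be kept only when their contour area falls between Min Area and Max Area. OpenCV contours are one way to do this; the node's own detection pipeline is not documented here.

```python
import cv2

def filter_blobs_by_area(binary_img, min_area=12.56, max_area=314.0):
    """Keep only contours whose area lies within [Min Area, Max Area] pixels."""
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if min_area <= cv2.contourArea(c) <= max_area]
```

The default bounds roughly correspond to the areas of circles of radius 2 px (~12.56) and 10 px (~314).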

Inputs

  • Pointcloud (Point Cloud Input) : Point cloud stream to perform blob tracking on.

Outputs

  • CvDebugImage (CV Image Output) : CV Image of the orthographic projection of the chosen reference plane, with Detection and Tracking overlay options.
  • Array (Array Output) : Array containing all the tracking information for each blob detected.

Example




In this example all properties were left to default values.
