
Follow-me with ADBSCAN on Aaeon Robot

This tutorial describes how to run the ADBSCAN-based Follow-me algorithm from the Robotics SDK using Intel® RealSense™ camera input. A custom Aaeon robot was used to validate the algorithm. The Intel® RealSense™ camera publishes to the /camera/depth/color/points topic. The adbscan_follow_me node subscribes to this topic, detects the obstacle array, and publishes to the /cmd_vel topic of type geometry_msgs/msg/Twist. This Twist message contains the updated angular and linear velocities the robot needs to follow the target, and can subsequently be consumed by a robot driver node.
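As a rough illustration of the kind of pre-processing the configurable parameters below control (downsampling and spatial filtering of the incoming point cloud), here is a plain-Python sketch. This is not the SDK's actual implementation; the function name and threshold values are illustrative only.

```python
# Illustrative sketch (NOT the SDK's actual code) of point-cloud
# pre-processing: downsample, then keep only points inside a region
# of interest, mirroring the filter parameters described below.

def preprocess(points, subsample_ratio=15,
               x_filter_back=4.0, y_filter_left=2.0, y_filter_right=-2.0,
               z_based_ground_removal=0.5):
    """points: list of (x, y, z) tuples in the robot frame
    (x forward, y left, z up). Threshold values are placeholders."""
    # Keep every subsample_ratio-th point.
    sampled = points[::subsample_ratio]
    # Discard points behind/beside/above the region of interest:
    # x > x_filter_back, y outside [y_filter_right, y_filter_left],
    # or z > z_based_ground_removal are filtered out.
    return [(x, y, z) for (x, y, z) in sampled
            if x <= x_filter_back
            and y_filter_right <= y <= y_filter_left
            and z <= z_based_ground_removal]

if __name__ == "__main__":
    cloud = [(float(i % 10), float(i % 5) - 2.0, 0.1) for i in range(300)]
    print(len(preprocess(cloud)))
```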

Getting Started

Install Deb package

Install the ros-humble-follow-me-tutorial Deb package from the Intel® Robotics SDK APT repository.

sudo apt update
sudo apt install ros-humble-follow-me-tutorial

Run Demo

Run the following script to launch the Follow Me Application tutorial on the Aaeon robot.

source /opt/ros/humble/setup.bash
/opt/ros/humble/share/tutorial-follow-me/scripts/follow-me.sh

After running the above command, you can observe that the robot detects the target when the target is within the tracking radius (approximately 0.5–0.7 m) and follows the target person as they move.

Note

There are reconfigurable parameters in the /opt/ros/humble/share/follow_me_tutorial_aaeon/followme_adbscan_RS_params.yaml file. You can modify these parameters for your robot, sensor configuration, and environment (if required) before running the tutorial. A brief description of each parameter is given in the following table.
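For orientation, a sketch of what such a parameter file might contain is shown below. The parameter names come from the table that follows; the values are illustrative placeholders, not the defaults shipped with the package (except that RS and 15 are documented defaults).

```yaml
# Illustrative excerpt of followme_adbscan_RS_params.yaml.
# Values are placeholders, not the shipped defaults.
Lidar_type: RS                           # RS = Intel RealSense camera input
Lidar_topic: /camera/depth/color/points  # pointcloud topic to subscribe to
Verbose: False                           # log detected target locations
subsample_ratio: 15                      # keep every 15th point
init_tgt_loc: 1.0                        # initial target distance (m)
max_dist: 2.0                            # stop following beyond this (m)
min_dist: 0.5                            # safe distance to the target (m)
max_linear: 0.3                          # linear velocity cap (m/s)
max_angular: 0.5                         # angular velocity cap (rad/s)
```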

Configurable Parameters

Lidar_type

Type of the pointcloud sensor. For Intel® RealSense™ camera input, the default value is RS.

Lidar_topic

Name of the topic publishing pointcloud data

Verbose

If this flag is set to True, detected target object locations will be printed as screen log output.

subsample_ratio

This is the downsampling rate of the original pointcloud data. The default value is 15 (i.e., every 15th point in the original pointcloud is sampled and passed to the core ADBSCAN algorithm).

x_filter_back

Pointcloud data with x-coordinate > x_filter_back are filtered out (the positive x direction points forward from the robot).

y_filter_left, y_filter_right

Pointcloud data with y-coordinate > y_filter_left or y-coordinate < y_filter_right are filtered out (the positive y direction is to the left of the robot, the negative y direction to its right).

Z_based_ground_removal

Pointcloud data with z-coordinate > Z_based_ground_removal will be filtered out.

base, coeff_1, coeff_2, scale_factor

These coefficients are used to compute the adaptive parameters of the ADBSCAN algorithm. The values are pre-computed, and it is recommended to leave them unchanged.

init_tgt_loc

This value describes the initial target location. The person needs to stand at a distance of init_tgt_loc in front of the robot to initiate the motion.

max_dist

This is the maximum distance up to which the robot follows the target. If the person moves farther away than max_dist, the robot stops following.

min_dist

This value describes the safe distance the robot always maintains from the target. If the person moves closer than min_dist, the robot stops following.

max_linear

Maximum linear velocity of the robot

max_angular

Maximum angular velocity of the robot

max_frame_blocked

The robot will keep following the target for max_frame_blocked number of frames in the event of a temporary occlusion.
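The distance and velocity parameters above interact roughly as follows. This is an illustrative sketch of a proportional follow controller under assumed gains (k_lin, k_ang are hypothetical names), not the SDK's actual control law:

```python
import math

def follow_cmd(target_x, target_y,
               min_dist=0.5, max_dist=2.0,
               max_linear=0.3, max_angular=0.5,
               k_lin=0.5, k_ang=1.0):
    """Illustrative follow controller (NOT the SDK's implementation).

    target_x/target_y: detected target position in the robot frame
    (x forward, y left). Returns (linear, angular) velocities, i.e. the
    values that would populate a geometry_msgs/msg/Twist message.
    k_lin and k_ang are assumed proportional gains."""
    dist = math.hypot(target_x, target_y)
    # Outside the tracking band: stop (target too close or too far away).
    if dist < min_dist or dist > max_dist:
        return 0.0, 0.0
    bearing = math.atan2(target_y, target_x)
    # Proportional control, clamped to the configured velocity limits.
    linear = min(max_linear, k_lin * (dist - min_dist))
    angular = max(-max_angular, min(max_angular, k_ang * bearing))
    return linear, angular
```

Note how max_linear and max_angular act purely as caps here: lowering max_angular, as suggested in Troubleshooting, directly limits how sharply the robot turns at each step.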

Troubleshooting

  • Failed to install Deb package: Please make sure to run sudo apt update before installing the necessary Deb packages.

  • You may stop the demo at any time by pressing Ctrl+C.

  • If the robot rotates more than intended at each step, try reducing the parameter max_angular in the parameter file.

  • For general robot issues, go to: Troubleshooting for Robot Tutorials.

  • If the motor controller board does not start, restart the robot.