Follow-me Algorithm#

This tutorial describes how to run the ADBScan-based Follow-me algorithm from EI for AMR using 2D Slamtec* RPLIDAR or Intel® RealSense™ camera input. We used a custom Pengo robot with a Kobuki driver to validate the algorithm. Find more hardware details in the Troubleshooting section.

The Intel® RealSense™ camera publishes to the /camera/depth/color/points topic, and the RPLidar publishes to the /scan topic. The adbscan_follow_me node subscribes to the corresponding topic, detects the obstacle array, and publishes to the /cmd_vel topic a message of type geometry_msgs/msg/Twist. This twist message carries the updated angular and linear velocity the robot needs to follow the target, and a robot driver can subscribe to it in turn.
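As a rough illustration of what the adbscan_follow_me node computes, the sketch below derives clamped linear and angular velocities from a target position in the robot frame. This is not the actual implementation: the gains, the standoff distance, and the clamp helper are assumptions, while the max_linear and max_angular limits mirror the configurable parameters described later in this tutorial.

```python
# Illustrative sketch of a follow-me velocity update (NOT the actual
# adbscan_follow_me code). Given the tracked target's position in the
# robot frame, produce the linear/angular components of a Twist command.
import math

def clamp(value, limit):
    """Clamp a value to the range [-limit, +limit]."""
    return max(-limit, min(limit, value))

def follow_me_twist(tgt_x, tgt_y, max_linear=0.5, max_angular=1.0,
                    standoff=0.5, k_lin=1.0, k_ang=2.0):
    """Compute (linear_x, angular_z) to steer toward a target at
    (tgt_x, tgt_y) meters in the robot frame, stopping at `standoff`.
    The gains k_lin/k_ang and the standoff value are assumptions."""
    distance = math.hypot(tgt_x, tgt_y)
    heading = math.atan2(tgt_y, tgt_x)  # bearing to the target
    linear = clamp(k_lin * (distance - standoff), max_linear)
    angular = clamp(k_ang * heading, max_angular)
    return linear, angular

# Target 1.5 m straight ahead: drive forward at the linear limit, no turn.
lin, ang = follow_me_twist(1.5, 0.0)
print(lin, ang)  # 0.5 0.0
```

A proportional controller of this shape turns toward the target and slows to a stop at the standoff distance, which matches the 0.5 m starting position suggested in the steps below.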

Run the Follow-me Algorithm with Intel® RealSense™ Camera Input#

  1. Check if your installation has the amr-adbscan and amr-realsense Docker* images.

    docker images | grep amr-adbscan
    # if the image is installed, the output includes:
    amr-adbscan

    docker images | grep amr-realsense
    # if the image is installed, the output includes:
    amr-realsense
    

    Note

    If one or both of the images are not installed, continuing with these steps triggers a build that takes longer than an hour (sometimes, a lot longer depending on the system resources and internet connection).

  2. If one or both of the images are not installed, Intel® recommends installing the Robot Base Kit or the Robot Complete Kit with the Get Started Guide for Robots.

  3. Check that the EI for AMR environment is set:

    echo $AMR_TUTORIALS
    # should output the path to EI for AMR tutorials
    /home/user/edge_insights_for_amr/Edge_Insights_for_Autonomous_Mobile_Robots_2023.1/AMR_containers/01_docker_sdk_env/docker_compose/05_tutorials
    

    If nothing is output, refer to Get Started Guide for Robots Step 5 for information on how to configure the environment.

  4. Check for the Intel® RealSense™ camera availability:

    1. Verify that the realsense-viewer application is installed, and open it:

      realsense-viewer
      
    2. Click the “Add Source” icon in the top left corner to reveal a list of available cameras. If a camera is available and working, you are good to go.

    3. Close realsense-viewer.

  5. Run the automated yml file that starts the Intel® RealSense™ node and the adbscan-follow-me node:

    docker compose -f $AMR_TUTORIALS/adbscan_FollowMe_realsense.tutorial.yml up
    
  6. Initiate robot tracking by standing within 0.5 m in front of the robot. The initial target location is configurable through the init_tgt_loc parameter in the followme_adbscan_RS_params.yaml file. Depending on the system and robot being used, you may also need to source the necessary robot driver files.

  7. Use docker exec in a separate terminal to debug and observe the Intel® RealSense™ and adbscan nodes and the topic list. You can also view topic messages (for example, the robot's published velocity):

    docker ps
    # This command lists the running Docker containers; find the container ID for adbscan
    
    docker exec -it <container_id_for_adbscan> bash
    
    # Inside the container, you can run the following commands:
    source ros_entrypoint.sh
    ros2 node list
    ros2 topic list
    ros2 topic echo /cmd_vel
    exit
    
  8. To close the app, do the following:

    • Press Ctrl-c in the terminal where you ran the up command.

    • Run this command to remove stopped containers:

    docker compose -f $AMR_TUTORIALS/adbscan_FollowMe_realsense.tutorial.yml down
    

Run the Follow-me Algorithm with RPLidar Input#

  1. Check if your installation has the amr-adbscan and amr-rplidar Docker* images.

    docker images | grep amr-adbscan
    # if the image is installed, the output includes:
    amr-adbscan

    docker images | grep amr-rplidar
    # if the image is installed, the output includes:
    amr-rplidar
    

    Note

    If one or both of the images are not installed, continuing with these steps triggers a build that takes longer than an hour (sometimes, a lot longer depending on the system resources and internet connection).

  2. If one or both of the images are not installed, Intel® recommends installing the Robot Base Kit or the Robot Complete Kit with the Get Started Guide for Robots.

  3. Check that the EI for AMR environment is set:

    echo $AMR_TUTORIALS
    # should output the path to EI for AMR tutorials
    /home/user/edge_insights_for_amr/Edge_Insights_for_Autonomous_Mobile_Robots_2023.1/AMR_containers/01_docker_sdk_env/docker_compose/05_tutorials
    

    If nothing is output, refer to Get Started Guide for Robots Step 5 for information on how to configure the environment.

  4. Check for RPLidar availability:

    1. Verify the udev rules that you configured for the RPLIDAR in the 2D LIDAR and ROS 2 Cartographer tutorial. Get the Slamtec* RPLIDAR serial port:

      dmesg | grep cp210x
      
    2. Check for similar logs:

    usb 1-3: SerialNumber: 0001
    cp210x 1-3:1.0: cp210x converter detected
    usb 1-3: cp210x converter now attached to ttyUSB0
    
    3. Export the port:

    export RPLIDAR_SERIAL_PORT=/dev/ttyUSB0
    # this value may differ from system to system, use the value returned in the previous step
    
  5. Run the automated yml file that starts the LIDAR node and the adbscan-follow-me node:

    docker compose -f $AMR_TUTORIALS/adbscan_FollowMe_rplidar.tutorial.yml up
    
  6. Initiate robot tracking by standing within 0.5 m in front of the robot. The initial target location is configurable through the init_tgt_loc parameter in the followme_adbscan_2D_params.yaml file. Depending on the system and robot being used, you may also need to source the necessary robot driver files.

  7. Use docker exec in a separate terminal to debug and observe the lidar and adbscan nodes and the topic list. You can also view topic messages (for example, the robot's published velocity):

    docker ps
    # This command lists the running Docker containers; find the container ID for adbscan
    docker exec -it <container_id_for_adbscan_from_prev_step> bash
    
    # Inside the container, run the following commands:
    source ros_entrypoint.sh
    ros2 node list
    ros2 topic list
    ros2 topic echo /cmd_vel
    exit
    
  8. To close the app, do the following:

    • Press Ctrl-c in the terminal where you ran the up command.

    • Run this command to remove stopped containers:

    docker compose -f $AMR_TUTORIALS/adbscan_FollowMe_rplidar.tutorial.yml down
    

How to Configure Input Parameters#

Parameters can be configured in the followme_adbscan_RS_params.yaml and followme_adbscan_2D_params.yaml files located in the ${CONTAINER_BASE_PATH}/01_docker_sdk_env/artifacts/01_amr/amr_generic/param/ folder. Some of the configurable parameters include:

  • Initial target location: init_tgt_loc

  • Maximum linear velocity of the robot: max_linear

  • Maximum angular velocity of the robot: max_angular
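As an illustration only, a parameter file with these entries might look like the sketch below; the parameter names come from the list above, but the node name, units, and values are assumptions, so always check the actual files shipped in the param folder:

```yaml
# Illustrative sketch -- verify against the shipped
# followme_adbscan_2D_params.yaml / followme_adbscan_RS_params.yaml.
adbscan_follow_me:            # node name is an assumption
  ros__parameters:
    init_tgt_loc: 0.5         # assumed: initial target distance in meters
    max_linear: 0.5           # assumed: max linear velocity in m/s
    max_angular: 1.0          # assumed: max angular velocity in rad/s
```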

Troubleshooting#

  1. ROS_DOMAIN_ID or CONTAINER_BASE_PATH not found error:

    Remember to export these environment variables before starting the Docker* container. To set ROS_DOMAIN_ID, you can use any number between 0 and 101 (inclusive), as long as it is not already used by a different ROS system. Run the following commands before launching the docker-compose yml file:

    source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
    export CONTAINER_BASE_PATH=`pwd`
    export ROS_DOMAIN_ID=12
    
  2. Hardware recommendations:

    We used a custom Pengo robot with a Kobuki driver to validate the follow-me algorithm. It has an RPLidar on top and four Intel® RealSense™ cameras installed on the sides. Select the front Intel® RealSense™ camera for accurate detection of the target frame. We used an Intel® Tiger Lake processor (8 cores, 15 GB memory) as the ROS development platform of the Pengo robot.

To watch demos of the follow-me algorithm on our Pengo robot, check out the videos:

Demo of ADBScan follow me algorithm with realsense camera

Demo 1 of ADBScan follow me algorithm with LIDAR

Demo 2 of ADBScan follow me algorithm with LIDAR

Demo of ADBScan follow me algorithm with realsense for a moving box