Objectives

Part I

Lab 2 Part 1 will ask you to program your robot to perform a few maneuvers using open loop control. The instructions at the end of this assignment are one way to do this but not the only way. You will be graded on whether your code, demo, video, and other submissions meet the specifications below, not on following the instructions exactly.

This lab requires that you create:

  • A new repo for your lab code. Follow the Setting Up instructions to create a new GitHub repo and share it with your lab partner.
  • A new package to hold your open loop control code
  • A node or nodes that perform the following maneuver:
    • Drive straight for 1m then stop for 5 seconds,
    • Turn 90 degrees, drive 1m then stop for 5 seconds,
    • Repeat twice more to make a square.
  • A launch file to start this maneuver
  • Your code must use the Duckietown FSM node to allow joystick control to take over. Unless you want to experiment with the config file for the FSM node (optional), your code should:
    • Publish to the same topic that lane_controller_node uses to send commands to car_cmd_switch_node. (Run the lane following demo to find this topic.)
    • Listen for the mode switch from the FSM node to begin your pattern. It is difficult to run the entire pattern from within the FSM callback, and be wary of receiving duplicate messages when switching modes.
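One way to structure the node is to replay an open-loop command schedule once the mode switch arrives. Below is a minimal sketch in plain Python (no ROS imports; the speed constants are placeholder assumptions you would tune on your own robot, and the real node would publish each (v, omega) pair to the command topic for the stated duration):

```python
import math

# Placeholder speeds -- tune these experimentally on your robot.
DRIVE_SPEED = 0.2          # assumed linear speed, m/s
TURN_SPEED = math.pi / 4   # assumed angular speed, rad/s

def square_schedule(side_m=1.0, pause_s=5.0):
    """Return the open-loop command list as (v, omega, duration) tuples."""
    drive = (DRIVE_SPEED, 0.0, side_m / DRIVE_SPEED)      # straight leg
    pause = (0.0, 0.0, pause_s)                           # full stop
    turn = (0.0, TURN_SPEED, (math.pi / 2) / TURN_SPEED)  # 90-degree turn
    schedule = []
    for _ in range(4):                                    # four sides
        schedule += [drive, pause, turn]
    return schedule
```

A node replaying this schedule should begin only after the FSM mode-switch message arrives, and should guard against a duplicate mode message restarting the pattern mid-run.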

Part II (EECE.5560 only)

The goal of this part of the lab is for you to develop a ROS node that estimates robot odometry on the Duckiebot. This odometry will not be exact because the robot will not do exactly what we tell it to do, but let’s try to get close. In this lab, you should:

  • Create one new node in your odometry package (from Homework 6) to calculate the current position of the robot based on its motion. This node should subscribe to either the wheel encoder data (if you have a DB21/DB19) or the wheel velocities (required for DB18, optional for all newer robots) and publish a Pose2D message with the current calculated estimate of the robot’s (x, y, theta) pose. See below for technical details.
  • Create a new launch file that starts your odometry node along with the pattern made in Part I.
  • Test the pattern on your Duckietown mats at least three times and report the final position that your odometry node calculates for each test. Record the position at each corner of your square by measuring the actual position of the robot and noting the calculated position of the robot.
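The pose update itself is standard differential-drive dead reckoning. A minimal sketch of the math in plain Python (independent of the message types; the baseline is an assumed parameter you must measure on your robot):

```python
import math

def update_pose(x, y, theta, d_left, d_right, baseline):
    """Advance a Pose2D estimate (x, y, theta) given the distance each
    wheel traveled since the last update, in meters.

    baseline is the wheel-to-wheel distance -- measure your own robot.
    """
    d_center = (d_left + d_right) / 2.0       # distance of robot center
    d_theta = (d_right - d_left) / baseline   # change in heading
    # Use the midpoint heading for a slightly better arc approximation.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

Your node would call something like this each time new wheel data arrives and publish the result as a Pose2D message.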

Submission Instructions

Part I

Each student should submit a PDF (can be identical for both lab partners) to Blackboard with the following information:

  1. Provide a link to your lab repo and the tag that corresponds to this code (2 points)
  2. What node selects which command (joystick or lane following) to run on your Duckiebot? Justify your answer. (2 points)
  3. What do you think car_cmd_switch_node does? (2 points)
  4. What do the inputs to car_cmd_switch_node look like? You can focus on the joystick input here. Move the robot around and watch it change. Describe the components of the message (data types) and their purpose. (2 points)
  5. At some point, the command is converted from linear/angular velocity to commands for the left and right wheels. Which node does this? (2 points)
  6. During Lab 1, you calibrated the wheels. Which node do you think accepts this calibration value and adjusts commands as needed? Justify your answer. (2 points)
  7. What difficulties did you have in tuning your robot to make a pattern? (4 points)
  8. Do you think you could make your robot reliably drive around a circular/oval track like in the lab without any feedback from the camera? Why or why not? (4 points)

Demonstrate the pattern in class. Optionally submit a video to Blackboard.

Part II (EECE.5560 only)

Submit a PDF to Blackboard with the following information.

  1. Provide a link to your repo and the tag that corresponds to this code
  2. Report measured vs. calculated (by your odometry node) positions for each leg of three trials of your square (so 12 total measurements).
  3. A description of any problems you had completing this lab.

Demonstrate odometry of each pattern in class.

Rubric

Part I

5 points: Package creation

25 points: Algorithm

10 points: Accuracy at first corner

10 points: Accuracy at second corner

20 points: Completing the square

10 points: Demo

20 points: Answers to questions

Part II (EECE.5560 only)

10 points: Algorithm

10 points: Implementation

20 points: Accuracy (measured versus actual position, different from Part I)

10 points: Demo

Detailed Instructions

Part I

  1. Experiment with your robot to answer the questions above. Start it up, start the lane following demo, and move it around with keyboard_control. Refer to Lab 1 if you have any problems here.
  2. Start your container and create a new package for this lab in the “packages” folder of your repo. Note that you will need to add duckietown_msgs as a dependency. This can be added to the end of the catkin_create_pkg command that you used before.
  3. Create a new node in that package that publishes a command in the right format for car_cmd_switch_node to accept. Do NOT include the vehicle name in the topic; use a namespace in your launch file as shown below instead. This will allow your code to be run on any vehicle. Select values that make your robot drive ~1m in a straight line. You will have to guess in this first trial; we will test in the next few steps. Make sure that you make this node executable:
    $ chmod +x <node_name>
  4. Create a launch file for your node such that it looks something like this:
    <launch>
    
      <group ns="$(env VEHICLE_NAME)"> 
    
        START YOUR NODE HERE
    
      </group>
    
      <include file="$(find fsm)/launch/fsm_node.launch">
    
            <arg name="veh" value="$(env VEHICLE_NAME)"/>
    
            <arg name="param_file_name" value="lane_following"/>
    
        </include>
    
    </launch>

    This will start your node inside the namespace for your robot. Keep <group ns="$(env VEHICLE_NAME)"> EXACTLY as shown above; it will find your vehicle name.

  5. Commit these files. At this point, you can exit your container, as we will do the rest on the robot.
  6. Start up your robot. We need to build the container with this new package in it, but we want to build it on the robot this time instead of on your laptop. Run:
    $ dts devel build -H <duckiebot_name>.local --pull

    The first time this command runs, it may take a while. To get some additional feedback, you may want to run one of the following first to download the core Docker images.

    For DB18/19:
    $ docker -H <duckiebot_name>.local pull duckietown/dt-core:daffy-arm32v7

    For DB21:
    $ docker -H <duckiebot_name>.local pull duckietown/dt-core:daffy-arm64v8

  7. Now we need to run the code. Start the container on your robot with:
    $ dts devel run -H <duckiebot_name>.local -s -M --cmd bash

    The -s flag copies your repo to the /code folder on the robot so that it can be mounted by the -M flag and you do not need to rebuild for every small change.

  8. Run your code within the container you just started on your robot. You should be able to see that the robot moves with joystick control but your code moves the robot when it is in autonomous mode. If not, debug! You can use
    $ dts start_gui_tools <duckiebot_name>

    to figure out what is going on using standard ROS tools.

  9. Revise your code until it can reliably drive 1m in a straight line then stop. The slides provide additional information for syncing code with your robot.
  10. Experiment to find the right command/timing for the robot to make a 90 degree turn in place.
  11. Follow the above instructions to complete the square pattern, and tune your commands until the full pattern performs acceptably.

Part II (EECE.5560 only)

There are two ways to complete this lab, depending on whether you have a DB18 or a DB19/DB21:

DB18 (and later models if desired): Assume that the vehicle is performing the commands exactly as ordered and integrate the wheel velocities provided by the topic wheels_driver_node/wheels_cmd. This approach can also be used on the DB19/DB21; it is your choice. You will need to perform some experiments to determine the correspondence between the numbers given here and the actual speed of the wheels. It is highly unlikely that 1 unit of velocity input to the robot is exactly 1 m/s in the real world!
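Under this approach, each command sample can be integrated over the time since the previous sample. A sketch, where `gain` is the experimentally determined scale from command units to m/s (an assumed calibration parameter, per the note above):

```python
def integrate_wheel_cmd(v_left_cmd, v_right_cmd, dt, gain=1.0):
    """Convert one wheel-velocity command sample into per-wheel distances
    (meters) traveled over dt seconds, assuming the command was followed
    exactly.

    gain converts command units to m/s; calibrate it experimentally.
    """
    return v_left_cmd * gain * dt, v_right_cmd * gain * dt
```

The resulting per-wheel distances can then feed the same pose-update math your odometry node uses.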

DB19/DB21: Use the wheel encoder data provided by left_wheel_encoder_node/tick and right_wheel_encoder_node/tick. You will need to find the correspondence between ticks and real-world wheel movement. You will then need to record each tick as it comes in and, likely using a timer, periodically calculate odometry from both wheels.
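For the encoder approach, the key conversion is from tick counts to wheel travel. A sketch with assumed calibration values (the ticks-per-revolution and wheel radius below are placeholders; measure both on your robot, e.g. by rolling it a known distance and counting ticks):

```python
import math

def ticks_to_distance(delta_ticks, ticks_per_rev=135, wheel_radius=0.0318):
    """Convert a change in encoder ticks to wheel travel in meters.

    ticks_per_rev and wheel_radius are assumed values -- calibrate them
    against real-world measurements.
    """
    return 2.0 * math.pi * wheel_radius * delta_ticks / ticks_per_rev
```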

Find the correspondence between the data from the Duckiebot (using either method) and real-world measurements. Drive your robot around and record how the values change. You may find that the size of your wheels and the baseline distance between the wheels are important. You can use keyboard control or directly publish to the command topic to make your vehicle drive at a constant speed.

When you complete your node, test it thoroughly. You may find that the robot has a consistent bias in one direction or another. You may need to add a scaling factor to one or more components of the input or output. You are free to do this, as long as the same scaling factors are present in all trials.