Robotics/DuckieTown - Intersection Navigation and Vehicle Avoidance/Following

This post is the fourth part of the Robotics/DuckieTown series.

All code in this lab is heavily adapted from the DuckieTown demos found in dt-core [1].

See my github repository for the code in this document.

Introduction

Our next step in implementing autonomous driving in DuckieTown is to get our DuckieBots to successfully navigate through intersections, avoid rear-end collisions, and follow other DuckieBots around town. When navigating through intersections, we will use apriltags to denote which ways the DuckieBots can turn. We’ll need to use signal lights to ensure that other DuckieBots know which direction we’re turning as well. For collision avoidance and DuckieBot following, we’re going to detect DuckieBots driving ahead of us using the dotted pattern on the back of the robots.

We’re going to be adding these three features to our existing codebase which we implemented in the previous post. If you haven’t already read that post, I highly recommend reading it to get a sense of where we are.

Before continuing though, I want to briefly describe how we are going to be implementing DuckieBot following. We’re going to be using three different elements to mimic DuckieBot following: lane following, intersection navigation, and collision avoidance.

We are going to assume that we are only ever following a DuckieBot in DuckieTown and that the bot we are following will always follow the rules of the road. Our approach is as follows:

  • Perform regular lane following
  • If we approach a DuckieBot, stop behind it at a safe distance without crashing into it
  • Stop at intersections and turn:
    • If there is no other DuckieBot in sight, take a random turn
    • If there is a DuckieBot in sight, turn the same way it turns

If we follow these few rules, then we can in effect follow another DuckieBot (given our assumptions). Earlier, I said that we are actually mimicking DuckieBot following; this is because of the two major limitations of this approach:

  1. This implementation will not work outside of DuckieTown.
  2. This implementation will not work if the DuckieBot we are following is not following the rules of the road.

Intersection Navigation

First, we're going to discuss how to implement intersection navigation in DuckieTown. We're going to base our implementation on the dt-core [1] implementation, but before we discuss it, we first need to be able to use signal lights.

Signal Light Aliases

The implementation of signal lights is fairly easy! You may have used the following command to change the LED lights on the DuckieBot before:

$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: COLOUR}}"

where ROBOT_NAME is the name of your robot and COLOUR is one of RED, GREEN, BLUE, WHITE, or LIGHT_OFF. Where do these names come from, and how can we add new colours which can be used with the previous service call?

These aliases are defined in a configuration file, config/led_emitter_node/LED_protocol.yaml, in the led_emitter package. For example, here's the definition of the BLUE alias in this file:

LED_protocol:

  colors: # All triplets should be ordered [R,G,B]
    switchedoff: [0,0,0]
    white: [1,1,1]
    green: [0,1,0]
    red: [1,0,0]
    blue: [0,0,1]
    yellow: [1,0.8,0]
    purple: [1,0,1]
    cyan: [0,1,1]
    pink: [1,0,0.5]

    #...

  signals:
    BLUE:
      color_mask: []
      color_list: "blue"
      frequency_mask: []
      frequency: 0

First, this file defines a colors map. Each key in this map is a colour name, and the associated value is the RGB values (as an array) for this named colour. For example, we have colors["blue"] = [0, 0, 1].

Second, the file defines a list of signals (or aliases). These signals are set LED patterns which can be called with the /ROBOT_NAME/led_emitter_node/set_pattern service. We can see that the alias BLUE is defined by a few things:

  1. The color_list field defines what colour to set the LED lights on the DuckieBot. If you give a single value (string) as done here, then all lights are set to that colour. You can also give a list of colour names, one for each LED on the DuckieBot (5 in our case). If you do so, then each LED is set to the specified colour.
  2. The frequency field outlines how fast the LEDs blink in Hz.
  3. The frequency_mask is a list of Boolean (0/1) values, one for each LED on the DuckieBot (5 in our case). If an LED's entry in this list is 1, that LED blinks at the rate given by the frequency field; if it is 0, that LED does not blink.

Let’s add aliases for normal driving, for signaling left, for signaling right, and for signaling that the robot is continuing straight through the intersection. To do so, add the following signals to the signals list in the LED_protocol.yaml file:

CAR_DRIVING:
  color_mask: []
  color_list: ["white","white","white","red","red"]
  frequency_mask: [0,0,0,0,0]
  frequency:  0

CAR_SIGNAL_RIGHT:
  color_mask: []
  color_list: ["white","white","yellow","yellow","red"]
  frequency_mask: [0,0,1,1,0]
  frequency: *f1

CAR_SIGNAL_LEFT:
  color_mask: []
  color_list: ["yellow","white","white","red","yellow"]
  frequency_mask: [1,0,0,0,1]
  frequency: *f1

CAR_SIGNAL_STRAIGHT:
  color_mask: []
  color_list: ["green","green","green","green","green"]
  frequency_mask: [0,0,0,0,0]
  frequency: *f1

Now, we can use CAR_DRIVING, CAR_SIGNAL_RIGHT, CAR_SIGNAL_LEFT, and CAR_SIGNAL_STRAIGHT as aliases in calls to the /ROBOT_NAME/led_emitter_node/set_pattern service!

When the car is driving normally, we’ll use the CAR_DRIVING LED pattern, where both front LEDs are white and both back LEDs are red. We can signal right/left with the CAR_SIGNAL_{RIGHT/LEFT} LED patterns. If turning right, this LED pattern will set both LEDs on the right of the DuckieBot to yellow and blink them, and similarly if turning left. We’ll use CAR_SIGNAL_STRAIGHT to signal that the DuckieBot is travelling straight through the intersection.

To test that this code is indeed working, build and run this node on your DuckieBot. Then, try calling:

$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: CAR_DRIVING}}"
$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: CAR_SIGNAL_RIGHT}}"
$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: CAR_SIGNAL_LEFT}}"
$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: CAR_SIGNAL_STRAIGHT}}"

and check that the LEDs change to the appropriate colours and blink as expected.

A Note on the Performance of the LED Emitter Node

The code for the LED Emitter Node in the led_emitter package is taken directly from dt-core [1]. Nevertheless, it doesn't run quite as smoothly as one might hope.

When calling the service, I noticed that sometimes, the LEDs would be set to seemingly random colours and blink speeds. Often, most LEDs would be set to the correct colour, but some LEDs would be turned off or not blink when they were supposed to. Probably the strangest behaviour happened once when I called

$ rosservice call /ROBOT_NAME/led_emitter_node/set_pattern "{pattern_name: {data: CAR_SIGNAL_LEFT}}"

and the LEDs were set to pink, green, orange, and blue, with the back left and front right LEDs blinking.

The LED service is also incredibly slow sometimes! You might have to wait multiple seconds before the LEDs change successfully. This can actually slow down the whole program running on your robot: when a node calls a service, it needs to wait for a response before continuing its execution. Because of this, you should be careful about the order of operations you use in your nodes! If you need to react quickly to sensor readings, you shouldn't call the LED-changing service in between detecting the sensor reading and issuing the response to that detection. I found that my DuckieBot ran much more smoothly without changing the LEDs.
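
If you do keep the LEDs enabled, one way to keep the slow set_pattern call off the control path is to fire it from a background thread. Here's a minimal sketch, assuming you already have a rospy.ServiceProxy for the service (we set one up later in this post); treat it as an illustration rather than part of the final codebase:

import threading

import rospy
from std_msgs.msg import String
from duckietown_msgs.srv import ChangePatternRequest


def set_leds_async(led_svc, pattern_name):
    """Call the (potentially slow) LED service without blocking the caller."""
    def _worker():
        try:
            led_svc(ChangePatternRequest(String(pattern_name)))
        except rospy.ServiceException as e:
            rospy.logwarn(f"could not set LEDs: {e}")

    # Daemon thread: a hung service call won't keep the node from shutting down
    threading.Thread(target=_worker, daemon=True).start()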

A Brief Overview of Intersection Navigation in dt-core

Before we talk about our implementation of intersection navigation, I want to discuss how intersection navigation is implemented in dt-core [1]. The implementation in dt-core is both complicated and elegant. I tried to get the code from dt-core working, but unfortunately I wasn’t successful. This was especially strange because our lane following code was taken directly from dt-core and should integrate seamlessly with the intersection navigation code (the indefinite_navigation DuckieTown demo uses both lane following and intersection navigation).

I think the reason I couldn't get the intersection navigation code to work with the lane following code from dt-core is that recent changes to dts, dt-duckiebot-interface, and dt-core have introduced a lot of issues with the DuckieBots (in general, the DuckieTown people are more than happy to push changes which don't pass all their unit tests and which break the DuckieBots; this has happened a number of times since I started writing these blog posts).

As previously stated, the indefinite_navigation DuckieTown demo uses both lane following and intersection navigation. How does this work? Basically, the DuckieBot uses the lane following code we implemented in the previous post. Upon approaching a stop line, a node from the stop_line_filter package broadcasts along a topic that a stop line is approaching, providing the distance to the stop line. The DuckieBot’s velocity is slowly decreased (by the LaneControllerNode in the lane_control package) until it reaches the stop line, at which point it stops. The DuckieBot will then decide which direction to turn based on an apriltag which denotes the type of intersection.

Turning is then implemented in the following way:

  1. The LaneControllerNode continues “regular” lane following but adapts the angular velocity of the DuckieBot by adding to it the value of a ROS parameter called /ROBOT_NAME/lane_controller_node/omega_ff.
  2. Based on the direction that the DuckieBot is turning, a node from the unicorn_intersection package sets the turning speed (using the ROS parameter /ROBOT_NAME/lane_controller_node/omega_ff) and then sleeps for a set amount of time while the turn completes. After the node is done sleeping, it resets /ROBOT_NAME/lane_controller_node/omega_ff to 0.
  3. The colours of the stop lines are changed from red to white. This makes the turn (especially left turns) easier since the DuckieBot can more easily hang along the right side of its lane. Unfortunately, I can no longer remember which file this happens in.

This is how the dt-core implementation is supposed to work [1]. Unfortunately, as I said before, a number of bugs have been introduced by recent commits to dts. For example, the unicorn_intersection package sets a ROS parameter called /ROBOT_NAME/lane_controller_node/omega_ff which defines the turning speed (initialized to 0) based on the turning direction. For example, when the DuckieBot should not turn, the unicorn_intersection package ensures this parameter is 0; when the DuckieBot should turn, the unicorn_intersection package ensures this parameter is non-zero.

The lane following code in the lane_control package is supposed to turn the DuckieBot using the turning speed defined by the /ROBOT_NAME/lane_controller_node/omega_ff parameter. Unfortunately, the implementation of the LaneControllerNode actually only reads this ROS parameter once, when the node is initialized and the parameter has a value of 0 (no turning). The LaneControllerNode should actually dynamically adapt to this ROS parameter using a DTParam object or by re-reading the parameter when turning.
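
As a rough sketch of what such a fix might look like (this is my guess at the shape of the change, not the actual dt-core code; the exact place where the command's angular velocity is assembled differs in the real LaneControllerNode):

# In LaneControllerNode.__init__: register omega_ff as a DTParam so that changes
# made by the unicorn_intersection node at runtime are actually picked up.
self.params["~omega_ff"] = DTParam(
    "~omega_ff",
    param_type=ParamType.FLOAT,
    default=0.0,
)

# Wherever the car command is assembled, read the *current* value instead of a
# value cached once at construction time (car_control_msg is illustrative):
car_control_msg.omega = omega + self.params["~omega_ff"].value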

Intersection Navigation Implementation

The dt-core implementation is too complicated for our purposes (we were only given a week to complete this lab, which already didn’t seem like enough time), and so we are going to be implementing a simpler method of intersection navigation. The basic idea is as follows:

  • Perform lane following using our implementation from the previous post
  • Stop at stop lines. When we stop at a stop line, figure out the legal directions to turn using an apriltag and select a direction to turn.
  • To actually perform the turn, set the velocity and angular velocity to apply to the vehicle, $v_i$ and $\omega_i$ respectively. Apply these to the vehicle and sleep for some time $t_i$ while these commands are applied. Here $i \in \{\text{left}, \text{right}, \text{straight}\}$, and the parameters $v_i$, $\omega_i$, and $t_i$ will need to be tuned to your DuckieBot.

Let’s get on to the implementation.

Stopping at Stop Lines

First, we’re going to use the stop_line_filter package from dt-core to detect stop lines. Go ahead and grab that package and add it to the packages directory in your codebase. We’re also going to have to be sure that the node from the stop_line_fitler package gets launched properly, which we can do with the master.launch launch file. As with the previous blog posts, I’m not going to go into too much detail on how to write these launch files. To see how to properly launch this node, see the packages/lab4/launch/master.launch file in my Github repo for this post.

After you’ve downloaded this package and added it to the launch file, your DuckieBot should just slow down and stop at stop lines. That’s all there is to it! Our previous codebase included all of the controller logic for stopping at stop lines. It just needed the StopLineFilter node from the stop_line_filter package to broadcast stop lines on the appropriate topic – /ROBOT_NAME/stop_line_filter_node/stop_line_reading.

Adapting the Controller

The next thing we are going to do is change how the control actions for the DuckieBot are computed. Right now, we use the callback function getControlAction to compute the control action. We want to change this since we are going to be calling rospy.sleep to sleep during a turn, and sleeping in callback functions presents quite a few challenges. To do this, first create a new method for saving the current lane pose of the DuckieBot:

def savePose(self, pose_msg):
    self.pose_msg = pose_msg

then, in the cbAllPoses callback method, change the call of getControlAction to savePose (line 194 here) and initialize self.pose_msg = LanePose() in the node constructor. This will cause the node to track the DuckieBot’s pose in the lane over time.
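
Concretely, the change is small. A minimal sketch of the two edits (the real cbAllPoses has a bit more going on around these lines, so take this as an outline rather than the exact diff):

from duckietown_msgs.msg import LanePose

# In the constructor:
self.pose_msg = LanePose()

# In cbAllPoses (keeping the rest of the callback as it is), the line that used
# to compute the control action now just caches the incoming pose message:
self.savePose(pose_msg)  # was: self.getControlAction(pose_msg)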

Next, we’ll change def getControlAction(self, pose_msg) to def drive(self) and replace any calls to pose_msg with self.pose_msg in the function body. Our new drive method now has all the logic for computing actions to control our DuckieBot, but unfortunately it’s not being called anywhere. We can fix this by adding the following lines to the end of the file (replacing any existing code in the `if name == “main” block):

if __name__ == "__main__":
    rospy.sleep(5)  # Wait for other nodes to start, not super important

    r = 10  # You may need to adapt this for your DuckieBot
    node = LaneControllerNode(node_name="lane_controller_node")
    rate = rospy.Rate(r)
    while not rospy.is_shutdown():
        node.drive()
        rate.sleep()

This ensures our drive method is called periodically to control our DuckieBot, while limiting the number of calls per second to at most r.

Detecting Intersection Types using Apriltags

Next, we’re going to need to figure out what legal directions we can turn at intersections. We’re going to do this with apriltags, so be sure to place one at each intersection in your DuckieTown.

To detect these apriltags, dt-core provides us with the apriltags package, which we've become familiar with throughout the course of this series. Go grab that package from dt-core and place it in your packages directory. Once again, you'll need to launch these nodes, which you can do with the master.launch file. Check out this file on my Github repo to see how to properly launch this package. Alternatively, you can reference your implementation from Part 1 of the previous post.

After we have this apriltags package up and running, detecting intersection types in our LaneControllerNode is quite easy. We’ll add the following lines to the LaneControllerNode constructor:

self.possible_turns = [0, 1, 2]  # turn IDs: left, straight, right

# This dictionary is a dictionary of apriltag IDs to legal turns, where turn 0
# is left, turn 1 is straight, and turn 2 is right
self.tags_to_turns = {
        63: [0, 1],
        59: [1, 2],
        67: [0, 2],
}

self.current_tag_id = None  # The ID of the current apriltag

# Subscriber to the apriltag detection topic
self.sub_apriltag = rospy.Subscriber(
    "/csc22939/apriltag_detector_node/detections",
    AprilTagDetectionArray,
    self.cbAprilTag,
)

and of course, we’ll also need to implement the cbAprilTag callback function:

def cbAprilTag(self, msg):
    if len(msg.detections) == 0:
        # No apriltag detected, reset
        self.current_tag_id = None
        return

    for detection in msg.detections:
        if detection.tag_id in self.tags_to_turns.keys():
            self.current_tag_id = detection.tag_id
            return

This callback function will cache the ID of the intersection apriltag seen by the camera. If more than one apriltag is detected, then the first one whose ID appears in the self.tags_to_turns dictionary is used. If an apriltag is seen but its ID is not in the self.tags_to_turns dictionary, then the apriltag is disregarded. Once no apriltag is in view, the self.current_tag_id field is reset.

Next, we’ll add two convenience functions for selecting a random turn given a detected apriltag:

def _available_turns(self):
    avail_turns = self.tags_to_turns.get(self.current_tag_id, [1])
    return avail_turns

def selectTurn(self):
    # Take a random turn from the available turns (based on apriltag
    # detection)
    turn = self._available_turns()
    return np.random.choice(turn)

Calling selectTurn will return a random turn ID given the current apriltag detection. If no apriltag is detected, then the function will return turn ID 1 (straight).

Intersection Navigation

Now, we’re finally going to implement the logic for intersection navigation. All of this logic will be placed in the drive function we just talked about, and any parameters which affect the intersection navigation will be declared in the constructor of the LaneControllerNode.

To begin, let’s go through the parameters we need to add to the constructor.

Because of time constraints, I've occasionally hard-coded namespaces (i.e. I used csc22939 in place of ROBOT_NAME). If you use my code, be sure to adapt it to suit your robot.

Signaling

First, we need to be able to signal which direction we are turning. We have three different signals: one for turning left, one for turning right, and one for going straight (I know, this isn't a turn, but implementing intersection navigation this way helps prevent the lane following code from unexpectedly turning when trying to go straight through an intersection). We've implemented these LED signals as aliases to the ~set_pattern service, so let's cache these aliases in a list:

self._led_signals = [
    String("CAR_SIGNAL_LEFT"),
    String("CAR_SIGNAL_STRAIGHT"),
    String("CAR_SIGNAL_RIGHT"),
]

Remember how I said that the code works better without using any LED signals? Let’s add a parameter to the constructor which controls whether or not LEDs are in use:

self.params["~use_LEDs"] = DTParam(
    "~use_LEDs",
    param_type=ParamType.BOOL,
    default=False,
)

You can adapt this parameter at runtime by calling:

$ rosparam set /ROBOT_NAME/lane_controller_node/use_LEDs FLAG

where FLAG is true or false. Next, we need to connect to the ~set_pattern service from the led_emitter_node so that we can turn on our signal lights. Also, we should set the LEDs to start in the CAR_DRIVING state:

# Connect to the LED service
rospy.wait_for_service("/csc22939/led_emitter_node/set_pattern")
self.led_svc = rospy.ServiceProxy(
    "/csc22939/led_emitter_node/set_pattern", ChangePattern,
)

if self.params["~use_LEDs"].value:
    msg = ChangePatternRequest(String("CAR_DRIVING"))
    try:
        resp = self.led_svc(msg)
    except rospy.ServiceException as e:
        rospy.logwarn(f"could not set LEDs: {e}")

Alright, that’s all there is to set up the LED service. Now, we can easily change the LEDs of the DuckieBot by calling self.led_svc(ChangePatternRequest(self._led_signals[i])) for i ∈ [0, 1, 2].

Turning

The next thing we need to do is set up the parameters needed for turning. Remember how we implement turning: for turn $i$, we set the velocity of the bot to $v_i$ and the angular velocity to $\omega_i$. We apply these to the DuckieBot and sleep for $t_i$ seconds, after which the turn should be completed.

So, we’re going to need nine different parameters: three velocities, three angular velocities, and three sleep times. You will have to tune each of these parameters yourself, but I’ll use the parameters which I’ve tuned for my robot in the following code blocks.

To set each of these nine turning parameters, we’ll use ROS parameters and DTParam objects which will allow us to change the parameters during runtime, making tuning a bit easier (we’ll be able to skip the build process during tuning, yay!).

# Velocity to apply during left turns
self.l_turn_v = DTParam(
    "~l_turn_v", default = 0.3,
)

# Angular velocity to apply during left turns
self.l_turn_omega = DTParam(
    "~l_turn_omega", default = 1.3,
)

# Time to sleep during left turns
self.l_turn_secs = DTParam(
    "~l_turn_secs", default = 1.75,
)

# Velocity to apply during right turns
self.r_turn_v = DTParam(
    "~r_turn_v", default = 0.3,
)

# Angular velocity to apply during right turns
self.r_turn_omega = DTParam(
    "~r_turn_omega", default = -5.0,
)

# Time to sleep during right turns
self.r_turn_secs = DTParam(
    "~r_turn_secs", default = 0.6,
)

# Velocity to apply during straight turns
self.s_turn_v = DTParam(
    "~s_turn_v", default = 0.3,
)

# Angular velocity to apply during straight turns
self.s_turn_omega = DTParam(
    "~s_turn_omega", default = 0.0
)

# Time to sleep during straight turns
self.s_turn_secs = DTParam(
    "~s_turn_secs", default = 2.5,
)

# Store the turning parameters in a list, where the index of the list
# corresponds to the parameters to use for the turn with the corresponding turn
# ID. For example, left turns have turn ID 0, so the parameters for left turns
# are at index 0.
self.turn_params = [
    (self.l_turn_v, self.l_turn_omega, self.l_turn_secs),
    (self.s_turn_v, self.s_turn_omega, self.s_turn_secs),
    (self.r_turn_v, self.r_turn_omega, self.r_turn_secs),
]

Now, in order to turn, we just grab the corresponding v and omega parameters and apply those to our DuckieBot for the turn duration turn_secs:

from duckietown_msgs.msg import Twist2DStamped
import rospy

# Construct the node
node = LaneControllerNode()

# Select which turn to take and get the corresponding turn parameters
turn = node.selectTurn()
v, omega, t = node.turn_params[turn]

# Create the turning command
cmd = Twist2DStamped()
cmd.header.stamp = rospy.Time.now()
cmd.v = v
cmd.omega = omega

# Start the turn
node.pub_car_cmd.publish(cmd)

# Sleep for the duration of the turn
rospy.sleep(t)

# Stop the turn
stop_cmd = Twist2DStamped()
stop_cmd.header.stamp = rospy.Time.now()
node.pub_car_cmd.publish(stop_cmd)

Tuning the Turning Parameters

Our implementation makes tuning the turning parameters easy. To tune the turn parameter with name TURN_PARAM, simply try out a few different values VALUE for this parameter by setting them during runtime with:

rosparam set /ROBOT_NAME/lane_controller_node/TURN_PARAM VALUE

Once you run this command, the DuckieBot will automatically use the value of VALUE for TURN_PARAM next time it turns at an intersection.
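
For example, when tuning left turns on my robot (csc22939), I might try a slightly larger angular velocity and a shorter turn time (the values here are purely illustrative):

$ rosparam set /csc22939/lane_controller_node/l_turn_omega 1.5
$ rosparam set /csc22939/lane_controller_node/l_turn_secs 1.6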

Putting it Together

To put all of this together, we simply need to adapt our drive function. Unfortunately, the actual implementation of the drive function is quite long, and so I am not going to include it here (you can find it here). Instead, I’ll include the pseudo-code which will give a sufficient picture of how the function works:

def drive(self):
    # Housekeeping code here

    if self.at_stop_line or self.at_obstacle_stop_line:
        # (1) Stop

        if self.at_stop_line and not self.at_obstacle_stop_line:
            # (2) Use intersection navigation with signal lights
            # (2.1) Stop at the stop line
            # (2.2) Turn on signal lights
            # (2.3) Wait for 3 seconds at the stop line
            # (2.4) Choose turn
            # (2.5) Get the turning parameters
            # (2.6) Apply the velocity and angular velocity
            # (2.7) Sleep for the turn duration
            # (2.8) Stop turning

    if not self.at_obstacle_stop_line:
        # (3) Use lane following

    # Housekeeping code here

We haven’t yet discussed self.at_obstacle_stop_line. For now, you can think of this as a flag indicating whether or not we are close behind another bot (we’ll be using a virtual stop line to stop behind vehicles in the next section). From the pseudo-code, we can see the basic idea. As long as we are not close behind a bot, we always perform lane following. If we approach a stop line though, we’ll first stop at the stop line and then turn (turning on and off the signal lights as needed) before continuing with lane following. Only if we approach another bot do we ever stop lane following completely.

Vehicle Avoidance

The next step of our DuckieBot following implementation is to avoid rear-ending other DuckieBots. Similarly to our implementation of stopping at stop lines, the implementation of vehicle avoidance here is quite simple, because we already have most of the logic in our codebase (taken from dt-core in the previous post). The overall idea of collision avoidance is simple: we'll have a node detect vehicles in front of us, and if they get too close, we publish a virtual stop line to stop our bot from colliding with them.

Once again, head on over to dt-core and grab their vehicle_detection package. This package has two nodes:

  • VehicleDetectionNode detects vehicles using the dotted pattern on the back of DuckieBots. The node publishes the coordinates of each of these circles in the camera optical frame (see the previous post) along the ~centers topic.
  • VehicleFilterNode estimates the relative pose to the back pattern of a robot (the pose is not published along a topic) and publishes a virtual stop line behind the robot in front of us.

We’ll be using both of these nodes in order to avoid collisions. You can reference this file to see how to launch these two nodes with our codebase.

Once you’ve added this package to your codebase and adjusted your launch files, the code should just work. Your DuckieBot should stop before crashing into other robots in front of it.

Following Vehicles through Intersections

In order to follow a vehicle through an intersection, we need to do two things. First, we need to detect which way the vehicle is turning. Second, we need to tell our DuckieBot to turn the same way.

Detecting another Vehicle’s Turn

To detect which way another vehicle in front of us is turning, what we’re going to do is to detect how their back pattern is changing. If their back pattern drifts to the left, then the vehicle is turning left. If their back pattern drifts to the right, then the vehicle is turning right. If their back pattern does not drift much, then the vehicle is turning straight.

We were only given one week to complete this lab, which felt like not enough time at all. Because of this, I’m detecting vehicle drift in the camera optical frame, rather than in the world frame. This has significant limitations, discussed below.

To detect the drift of a bot, we're going to use the /ROBOT_NAME/vehicle_detection_node/centers topic from the vehicle_detection package we just added. The VehicleDetectionNode sends duckietown_msgs.msg.VehicleCorners messages along this topic which give the coordinates (in the camera optical frame, where the z-axis points out of the camera) of each of the dots on the back pattern of a robot.

To predict the turn that the bot we are following took, we’re going to get the x-value of each of these dots in the camera optical frame and average them. Then, we’ll use the change in the average x-value to determine which way the bot we are following is turning using the following algorithm:

Algorithm 1: Predicting Turns
Parameters:
$\quad$ Threshold $\mathscr{T} > 0$
$\quad \Delta x_{\text{avg}} = x_\text{avg}^t - x_\text{avg}^{t-1}$, the change in average x-value between two consecutive time steps

Algorithm:
$\quad$ If $|\Delta x_{\text{avg}}| < \mathscr{T}$, then
$\qquad \text{turn} \gets 1$
$\quad$ Else
$\qquad \text{turn} \gets \text{sign}(\Delta x_{\text{avg}}) + 1$
$\quad$ return $\text{turn}$

To begin, add the following lines to the constructor of the LaneControllerNode:

self.prev_veh_avg_x = None     # The previous average x-value
self.followed_veh_turn = None  # The predicted turn that the followed bot took
self.sub_centres = rospy.Subscriber(
    "/ROBOT_NAME/vehicle_detection_node/centers",
    VehicleCorners,
    self._cb_process_veh,
    queue_size=1,
)

# Threshold, if the absolute change in average x-value is less than this
# we assume the followed bot proceeds straight forward rather than turning
self.params["~abs_turning_threshold_camera_frame"] = DTParam(
    "~abs_turning_threshold_camera_frame",
    param_type=ParamType.FLOAT,
    min_value=0.0,
    max_value=640.0,
    default=10.0,
)

Here, we subscribe to the /ROBOT_NAME/vehicle_detection_node/centers topic, as stated earlier. Algorithm 1 above will be implemented in the callback for this subscriber:

def _cb_process_veh(self, msg):
    sum_x = 0
    total = len(msg.corners)

    # Compute the average x coordinate over all dots in the dotted pattern
    # on the back of the duckiebot, this will be our estimated x position
    # of the duckiebot in the **camera optical frame**.
    if total > 0:
        for corner in msg.corners:
            sum_x += corner.x

        avg_x = sum_x / total

        if self.prev_veh_avg_x is None:
            self.prev_veh_avg_x = avg_x
        else:
            # Compute the change in x position, which tells us which
            # direction the followed duckiebot is moving in the camera
            # frame
            delta = avg_x - self.prev_veh_avg_x

            self.prev_veh_avg_x = avg_x

            # Predict the turn that the vehicle we are following took
            thresh = self.params["~abs_turning_threshold_camera_frame"].value
            if np.abs(delta) > thresh:
                self.followed_veh_turn = np.sign(delta) + 1
            else:
                self.followed_veh_turn = 1

            rospy.loginfo(f"=== Saw turn {self.followed_veh_turn}")

At this point, we should discuss the /ROBOT_NAME/lane_controller_node/abs_turning_threshold_camera_frame parameter. This parameter controls the threshold $\mathscr{T}$ in Algorithm 1. You may need to tune this value to work best with your robot.
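
As with the turning parameters, you can adjust it at runtime; for example (the value here is just illustrative):

$ rosparam set /csc22939/lane_controller_node/abs_turning_threshold_camera_frame 15.0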

One significant limitation of this approach is that it estimates the turn taken by the DuckieBot we are following in the camera optical frame. This means that if our DuckieBot moves, it will affect our prediction. For example, imagine that we are behind a stationary robot and our robot turns left. In this case the back pattern drifts to the right in the image, so $\Delta x_\text{avg} > 0$, and it will seem as if the robot in front of us has turned right, when in reality we have turned left.

You can address this by converting the x coordinates to the world frame first, before running Algorithm 1, using the tf and tf2 libraries [2]. Unfortunately, I ran out of time when completing this lab and could not implement this functionality, but I expect that this would significantly improve the accuracy of this implementation. As it is, this implementation of turn predictions is fairly inaccurate and works best if the DuckieBot has stopped.
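
I didn't get to implement this, but a rough sketch of the idea might look like the following, using tf2 to transform each detected dot into a fixed frame before averaging. The frame names (and the assumption that the detections really are metric points in the camera optical frame) are placeholders that depend on how your TF tree is set up:

import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PointStamped with tf2
from geometry_msgs.msg import PointStamped

tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

def to_fixed_frame(corner, stamp, source_frame="camera_optical_frame",
                   target_frame="world"):
    # Wrap one detected dot as a stamped point in the camera optical frame...
    p = PointStamped()
    p.header.frame_id = source_frame
    p.header.stamp = stamp
    p.point.x, p.point.y, p.point.z = corner.x, corner.y, corner.z

    # ...and transform it into the fixed frame. This raises if the transform
    # is unavailable, so wrap it in try/except in practice.
    return tf_buffer.transform(p, target_frame, timeout=rospy.Duration(0.1))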

If we wanted to stay in the camera optical frame, another implementation would be to check if the followed DuckieBot is on the left, right, or centre of the image and predict a turn based on that. This could be more accurate than my implementation, but I didn’t have enough time to test this.

Following another Vehicle’s Turn

Now, all that’s left to do in order to follow the vehicle’s turn is to update our selectTurn method to take into account whether or not we are tailing another bot:

def selectTurn(self):
    if self.followed_veh_turn is not None:
        # If following a vehicle, then follow its turn
        return int(self.followed_veh_turn)
    else:
        # Take a random turn from the available turns (based on apriltag
        # detection)
        turn = self._available_turns()
        return int(np.random.choice(turn))

Running the Code

So, what does this all look like when you run it together? I wish I knew!

My lane following code worked flawlessly, which was not super surprising since I got it from dt-core. Unfortunately, right before I could fully test my implementation, I started my DuckieBot and found its RAM usage was nearly 100%. I had to reflash the SD card and update the DuckieBot. Once I did so, my lane following code didn't work very well at all. The DuckieBot does not stay in its lane, and usually drives along the yellow line in the centre of the lanes. The DuckieBot also has an incredibly difficult time around the corners of DuckieTown, where the lanes curve. The bot usually gets to one of the corners of DuckieTown and then proceeds to turn around in circles forever. I'm not sure exactly what part of the update broke my code, but I'll blame it on the fact that people are more than happy to push code to the DuckieTown repos that doesn't pass all the unit tests or, worse, breaks things completely.

I tried quite hard to fix the lane following implementation so that I could record a great video for this post, but unfortunately I wasn’t able to fix this portion of the code in time for the lab’s deadline.

Furthermore, I only have one DuckieBot and am working from home. I can’t actually record a video of my DuckieBot following another one when I don’t have an additional bot.

Because of these issues, I recorded two different videos.

The first video is a montage of the (un)successful attempts at:

  1. My DuckieBot turning left and right at intersections.
  2. My DuckieBot stopping behind another DuckieBot, where I simulated the second DuckieBot by (1) detaching the dotted back pattern from my DuckieBot and (2) holding that pattern in front of my DuckieBot while it drove.
  3. My DuckieBot following another (simulated) DuckieBot’s turn at an intersection.

Here is the video (maybe I’ll get some extra style points):

You’ll notice that in some of the clips I’ve turned off the LEDs. This is because the LEDEmitterNode is so buggy that it caused unexpected behaviour to happen very often. The code just worked better without the LEDEmitterNode running, and I got fed up trying to get a video when the LEDEmitterNode kept messing stuff up.

The second video is the actual video requested in the lab instructions: the DuckieBot tailing another DuckieBot and following it through intersections. I've simulated the followed DuckieBot by removing my DuckieBot's back pattern and manually moving this back pattern around to mimic a real-life DuckieBot. You can see in the video my DuckieBot stopping behind this back pattern I'm holding and following it (un)successfully through intersections.

Unfortunately, as I stated earlier, the lane following is no longer that accurate, and so my DuckieBot crosses the middle yellow line quite often. Furthermore, as stated earlier, the implementation to predict the turn of the followed DuckieBot is not super accurate (you can see this in the video: my DuckieBot does not always follow the simulated DuckieBot that it is supposed to follow). Finally, another issue is that my unfinished basement has terrible lighting, which I'm sure affects the performance of the DuckieBot significantly.

Here is the video:

Also, I’ve turned off the LEDEmitterNode since it is quite buggy, but you can see the signal lights working in the previous video.

A Few Limitations of our Implementation

I’d like to conclude this post with a note on the limitations of my implementation above:

  • All logic with respect to computing and choosing turn directions at intersections is done after we stop at the stop line. If we’re following a DuckieBot and it turns, leaving the camera’s view before we stop at the stop line, we will no longer be able to follow that robot.
  • The apriltag must be visible at the stop line; as soon as the apriltag is no longer visible to the camera, we will not turn at the intersection.
  • Detecting another bot’s turn at an intersection is implemented using coordinates in the camera optical frame because I ran out of time. This is not an accurate way to detect how the bot is turning, and using the world frame would be much more accurate. See here for a more thorough discussion of this limitation.
  • My implementation only works when following bots through DuckieTown, and only when those bots follow the rules of the road. Outside of DuckieTown, or if the other bots break the rules of the road, my implementation won't work.

Deliverables

In this lab, our goal is to follow another DuckieBot while avoiding colliding into it. We should also stop at stop lines and signal which way we are going to turn.

In order to drive a safe distance behind a DuckieBot and avoid collisions, I do a few things. I assume that the DuckieBot I am following is following the rules of the road. My DuckieBot then just performs lane following (as implemented in Lab 3). Using the vehicle_detection package from dt-core, I can detect the back pattern of the DuckieBot I am following. Furthermore, the nodes in this package work together to estimate the distance between my bot and the one I am following. If the two bots get too close, a message is published along a topic (~virtual_stop_line) indicating so (in reality, the message is a StopLineReading message indicating whether or not we are at a stop line, but maybe this is a bit too technical for this brief discussion). If I get a message which indicates the bots are too close, I just stop my bot from moving.

Please see Running the Code for the video(s) deliverable as well as a discussion about the video(s) and issues I had.

The overall performance of my implementation is okay, I would say. As stated in Running the Code, updating my DuckieBot broke my lane following code for some reason (there were updates pushed to dt-duckiebot-interface on Dockerhub, so I'm thinking this is the culprit, but I don't know for certain). Because of this, it's hard to completely gauge the accuracy of my implementation. Also, the lighting in my unfinished basement is terrible, which I'm sure affects the overall performance of this code.

I will say for certain that the code I’ve implemented for predicting how a bot we are following turns at an intersection is quite unreliable since I work in the camera optical frame (it would be better to work in the world frame, but I ran out of time). Furthermore, this code performs poorly if the followed DuckieBot leaves the camera’s view before our DuckieBot stops at a stop line.

The code I’ve implemented for turning at intersections is just okay too. It works most of the time, but required a significant amount of tuning of the parameters, and it’s just a simple open-loop controller that doesn’t really account for errors such as driving into the wrong lane when turning.

Another issue I experienced is that my implementation sometimes will stop at the stop line on the left side of the road (i.e. the stop line for oncoming traffic and not for our DuckieBot). This happens especially often when turning. If the DuckieBot turns too sharply it will cross over the yellow lane delimiter and see the red stop line in the opposite lane. I’ve addressed this by adding timers into the code which determine how often a stop line can be read, but this isn’t a foolproof solution and introduces other issues as well.

Finally, a further issue I noticed is that my DuckieBot seemed incredibly laggy during this lab. For example, I would log to the console when the DuckieBot saw the back pattern of another bot. I would run my bot standing up (so it wasn't moving) and put the pattern in front of the camera. Nothing would be logged to the console and the bot would continue moving. A few seconds after I had removed the pattern from the camera's view, the bot would stop and text would be logged to the console saying that it saw the back pattern. I don't think this was an issue with my code: the code that produced this lagging behaviour was taken straight from dt-core (which had previously worked for me flawlessly), and I only added code to log text to the console. This happened quite often (even without the LED emitter node running), and other strange lagging behaviour happened too (e.g. similar behaviour to that described above, but with stop lines instead of DuckieBot back patterns).

References

[1] dt-core. Link.

[2] Coordinate Frames, Transforms, and TF. Link.

This post is licensed under CC BY 4.0 by the author.