This repository is for developing agents using the ROS 2 CLIPS Executive for the RoboCup Logistics League.
The ros2-clips-executive can be found here: https://github.com/fawkesrobotics/ros2-clips-executive/tree/tviehmann/major-cleanup
We will use ROS 2 Jazzy.
Key aspects when working with ROS 2:
- projects utilize different packages
- packages are organized in workspaces
- colcon is used to build packages in a workspace
- vcstool and repos files may be used to fetch packages from multiple sources to easily set up workspaces (see the example below)
- ROS is tightly scoped; workspaces need to be sourced for them to be available in your current environment (here, environment typically refers to your current terminal)
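For example, a workspace can be populated from a repos file like this (a minimal sketch; the deps.repos file name is just a placeholder):
cd <workspace>                # the workspace root that contains a src/ directory
vcs import src < deps.repos   # clones every repository listed in the file into src/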
ROS 2 offers several meta packages that can provide you with the core features needed in every ROS environment.
On Ubuntu machines the base installation is located under /opt/ros/jazzy; on our Fedora-based lab machines it is located at /usr/lib64/ros2-jazzy, as this matches the packaging guidelines for Fedora. The ROS packages for Fedora come from here.
To source your base installation, simply run:
source /usr/lib64/ros2-jazzy/setup.bash
This will give you access in your current terminal to basic ros2 features, such as the ros2 command line interface ros2cli.
ros2 --help # check out what the cli offers
ros2 pkg list # example to list all packages known in your environment
ament_index packages # another useful tool to query the ament_index directly, which enables all this scoping magic
Get familiar with the basics of ROS 2 by doing the basic CLI tutorials
Notes:
- You do not have to (and in fact, cannot) install anything on the system. This means you should ignore all commands asking you to install packages via apt, as those packages should already be available for you and are installed via dnf by the system administrator.
- Remember, sourcing the base installation is different compared to the description in the tutorials, as described above!
Now that you are familiar with the basics of ROS, it is time to set up our infrastructure with CLIPS:
We will set up our project using 3 different workspaces as follows:
ros2/
deps_clips_executive_ws # for dependencies that we do not need to update
clips_executive_ws
labcegor_ws
The idea is to keep dependencies that we do not need to touch separate from the code we actively work on.
First, create a directory structure for the ros2 workspaces:
mkdir -p ~/ros2/{clips_executive_ws,deps_clips_executive_ws,labcegor_ws}/src
Then, get the ros2-clips-executive by following the build steps.
Note: Make sure to always be on the correct branch, which for now is tviehmann/major-cleanup.
Get Familiar with the CLIPS-Executive by reading through the following readmes:
- Main repository
- CLIPS Environment Manager
- File Load Plugin
- Executive Plugin
- Ros Msgs Plugin
- cx_bringup
Utilizing the RosMsgs, FileLoad, and Executive plugins, try to control the turtlesim turtle by publishing to the topic /turtle1/cmd_vel and subscribing to the topic /turtle1/pose.
The turtle should move in a loop between bottom-left (1.0, 1.0), top-left (1.0, 9.0), top-right (9.0, 9.0), and bottom-right (9.0, 1.0).
You can use this repository to start the task and you should perform a few basic steps that will help you on your mission:
- Create a new workspace (e.g., ~/ros2/labcegor_ws)
- Inside the src directory of the workspace, clone this repository and create a new package, e.g., via:
ros2 pkg create --build-type ament_cmake --license Apache-2.0 labcegor_bringup
- Inside the package, you need params, launch, and clips directories, which also need to be installed in the respective CMakeLists.txt (a minimal CMakeLists.txt sketch is shown after this list) via:
install(DIRECTORY launch params DESTINATION share/${PROJECT_NAME})
install(DIRECTORY clips/ DESTINATION share/${PROJECT_NAME}/clips/${PROJECT_NAME}/)
- A simple launch file adapted from cx_bringup will probably suffice for you:
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, SetEnvironmentVariable
from launch.actions import OpaqueFunction
from launch.logging import get_logger
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def launch_with_context(context, *args, **kwargs):
    labcegor_dir = get_package_share_directory('labcegor_bringup')
    manager_config = LaunchConfiguration("manager_config")
    log_level = LaunchConfiguration('log_level')
    manager_config_file = os.path.join(labcegor_dir, "params", manager_config.perform(context))
    # re-issue warning as it is not colored otherwise ...
    if not os.path.isfile(manager_config_file):
        logger = get_logger("cx_bringup_launch")
        logger.warning(f"Parameter file path is not a file: {manager_config_file}")
    cx_node = Node(
        package='cx_bringup',
        executable='cx_node',
        output='screen',
        emulate_tty=True,
        parameters=[
            manager_config_file,
        ],
        arguments=['--ros-args', '--log-level', log_level]
    )
    return [cx_node]


def generate_launch_description():
    declare_log_level_ = DeclareLaunchArgument(
        "log_level",
        default_value='info',
        description="Logging level for cx_node executable",
    )
    declare_manager_config = DeclareLaunchArgument(
        "manager_config",
        default_value="clips_env_manager.yaml",
        description="Name of the CLIPS environment manager configuration",
    )
    # The launch description to populate with the defined actions
    ld = LaunchDescription()
    ld.add_action(declare_log_level_)
    ld.add_action(declare_manager_config)
    ld.add_action(OpaqueFunction(function=launch_with_context))
    return ld
- Write a simple CLIPS manager config (you can orient yourself on the config for the ros msgs plugin example from the cx_bringup package).
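For orientation, the CMakeLists.txt of such a bringup package can stay minimal if it only installs files and contains no compiled targets; a rough sketch (package name taken from the step above) could look like this:
cmake_minimum_required(VERSION 3.8)
project(labcegor_bringup)

find_package(ament_cmake REQUIRED)

# install launch files, parameter files and CLIPS code into the package share directory
install(DIRECTORY launch params DESTINATION share/${PROJECT_NAME})
install(DIRECTORY clips/ DESTINATION share/${PROJECT_NAME}/clips/${PROJECT_NAME}/)

ament_package()
After building and sourcing the workspace, the node can be started via your launch file (the launch file name below is just a placeholder for whatever you named yours):
cd ~/ros2/labcegor_ws
colcon build
source install/setup.bash
ros2 launch labcegor_bringup labcegor.launch.py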
We use the standard turtlesim simulation launched via:
ros2 run turtlesim turtlesim_node
Inspect the topics /turtle1/cmd_vel and /turtle1/pose using the ros2 CLI tool to find out what message types are used, and get familiar with how to steer the turtle from the command line.
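For example (the velocity values in the last command are arbitrary):
ros2 topic list                              # list all topics currently advertised
ros2 topic info /turtle1/cmd_vel             # show publisher/subscriber counts and the message type
ros2 topic echo /turtle1/pose                # print incoming messages on the pose topic
ros2 interface show geometry_msgs/msg/Twist  # inspect the fields of a message type
ros2 topic pub --once /turtle1/cmd_vel geometry_msgs/msg/Twist "{linear: {x: 2.0}, angular: {z: 1.8}}"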
Encode the task at hand in CLIPS with the help of the CLIPS basic programming guide (bpg) and the documentation for the RosMsgsPlugin (you can of course also look at the provided usage example from the cx_bringup package).
- Use deftemplate constructs to encode the task at hand. (Chapter 3 in bpg)
- Use the deffacts construct to initialize your knowledge about the task. (Chapter 4 in bpg)
- Write rules to interface with the ROS topics and steer the turtle. (Chapter 5.4 up to 5.4.9 in bpg)
- If needed, use deffunctions to write some functions. (Chapter 7 in bpg)
In addition, Chapter 12 (in particular up to 12.14) of the bpg serves as a reference for CLIPS functions available in every environment.
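As a minimal sketch of how these constructs could fit together for the waypoint loop (all template, slot, and rule names below are made up; the actual interfacing with the topics via the RosMsgsPlugin is only hinted at in comments and should follow the plugin documentation):
; corner waypoints of the loop and the currently targeted waypoint
(deftemplate waypoint
  (slot id (type INTEGER))
  (slot x (type FLOAT))
  (slot y (type FLOAT)))

(deftemplate goal
  (slot waypoint-id (type INTEGER)))

; asserted by a rule (not shown) that compares the pose from /turtle1/pose to the goal
(deftemplate waypoint-reached
  (slot waypoint-id (type INTEGER)))

(deffacts task-knowledge
  "Initial knowledge: the four corners and the first goal."
  (waypoint (id 1) (x 1.0) (y 1.0))
  (waypoint (id 2) (x 1.0) (y 9.0))
  (waypoint (id 3) (x 9.0) (y 9.0))
  (waypoint (id 4) (x 9.0) (y 1.0))
  (goal (waypoint-id 1)))

; further rules (not shown) would publish velocity commands via the RosMsgsPlugin
(defrule goal-reached-advance-to-next-waypoint
  "Once the current waypoint is reached, target the next corner (wrapping around)."
  ?g <- (goal (waypoint-id ?id))
  ?r <- (waypoint-reached (waypoint-id ?id))
  =>
  (retract ?g ?r)
  (assert (goal (waypoint-id (+ (mod ?id 4) 1)))))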
Now that we have gotten a first feel for CLIPS and its interfaces to the outside world, we can start with the RoboCup Logistics League domain. The goal of this task is to buffer a cap at a cap station. More precisely:
- drive with the first robot to the Cap Station 1 of team Magenta (M-CS1)
- pick up a cap carrier from the shelf
- place it on the input of the machine
- instruct the machine to buffer a cap (RETRIEVE_CAP)
- drive with the second robot to the output of M-CS1
- pick up the product with the second robot
In order to work on this task, make sure to read through the following subsections that will help you get started.
The full rulebook of the league can be found here (pdf) for reference.
The required software is bundled via containers with setup files in the rcll-get-started repository.
cd ~/
git clone -b tviehmann/lab-setup https://github.com/robocup-logistics/rcll-get-started.git
In order to use containers on our lab machines, we additionally add a local config ~/.config/containers/storage.conf:
# This file is the configuration file for all tools
# that use the containers/storage library.
# See man 5 containers-storage.conf for more information
# The "container storage" table contains all of the server options.
[storage]
# Default Storage Driver
driver = "overlay"
# Primary Read/Write location of container storage
graphroot = "/var/tmp/$USER/container/storage/"
# Storage path for rootless users
#
rootless_storage_path = "/home/$USER/.containers/storage"
Additionally, make sure your user has a subgid and subuid range. If this is the case, the following commands will return non-empty output:
grep $USER /etc/subgid
grep $USER /etc/subuid
In essence, just sourcing the setup.sh file contained in the rcll-get-started repository should provide you with a bunch of new terminal commands, all prefixed by rc_ (press tab twice after typing the prefix to see a full list of commands).
cd ~/rcll-get-started
source setup.sh
We mainly need two commands:
- rc_start, which starts everything
- rc_stop, which stops everything
In order to verify that everything works as expected, you can query the state of containers using podman. The main useful commands are:
podman ps # shows list of containers
podman pod ps # shows list of pods
podman rm <c-id> # removes a container
podman pod rm <p-id> # removes a pod
The workflow for starting a game is the following:
- Run rc_start
- If everything is running correctly, you should be able to open a browser and go to localhost:8080 to see the refbox frontend.
- Pressing ctrl+alt+o lets you connect as referee.
- Press the play button in the top middle to go to the Setup phase. This generates a new game instance.
- You should see that the simulated robots are now connected.
- Switch to the Production phase by clicking on the phase on top.
All communication in the RCLL is done by exchanging messages via broadcast peers that transmit protobuf messages. The cx_protobuf_plugin of the CLIPS-Executive lets you interface with protobuf from within CLIPS.
There is already a useful repository containing message definitions at https://github.com/carologistics/rcll-protobuf which you can clone and build in your workspace to obtain all message definitions that you need. A suitable plugin config is depicted below.
protobuf:
  plugin: "cx::ProtobufPlugin"
  pkg_share_dirs: ["rcll_protobuf_msgs"]
  proto_paths: ["rcll-protobuf-msgs"]
Basic information about the refbox is described in the wiki. Check out the following articles:
- Concepts and Terminologies
- The General section of the Communication Protocol.
- Machine States
We will use protobuf for 3 things:
- Command the robots via messages defined in AgentTask.proto
- Instruct machines via messages in MachineInstructions.proto
- Observe the information sent by the refbox
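The receive side in CLIPS then roughly follows the sketch below. Note that the protobuf-msg fact template, its slots, and the pb-* functions are assumptions based on the protobuf_comm-style CLIPS bindings the plugin builds on; verify the exact names against the cx_protobuf_plugin documentation before relying on them:
; hypothetical sketch: react to a received refbox message and read one of its fields
; (whether the wrapped message must be freed explicitly, e.g. via pb-destroy, depends on the plugin)
(defrule game-state-received
  ?pf <- (protobuf-msg (type "llsf_msgs.GameState") (ptr ?p))
  =>
  (printout t "Refbox reports phase " (pb-field-value ?p "phase") crlf)
  (retract ?pf))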
The ports used to communicate with the refbox are defined in rcll-get-started/config/refbox/comm/default_comm.yaml.
The ports used to communicate with the simulator are defined in rcll-get-started/simulator/config.yaml.
The main branch of this repository contains a code skeleton that you can use as a starting point.
- init.clp uses the YAML Configuration Plugin to load all relevant values directly in the main CLIPS environment. This additionally requires the Ament Index Plugin.
- Based on the provided configuration, all relevant protobuf connections are established.
- refbox-comm-init.clp opens protobuf connections to the refbox. It listens on the public channel and once it receives a team color from the refbox it additionally opens an encrypted private peer.
- sim-comm-init.clp opens the protobuf connections to the robots.
- refbox-recv processes the basic game information for you.
- deftemplates.clp contains all deftemplates used in the skeleton.
In order to avoid multicasting to other lab machines, we will need to set up Cyclone-DDS. Open a new file in the editor of your choice:
gedit ~/cyclone_dds.xml
Paste this in:
<?xml version="1.0" encoding="UTF-8" ?>
<?xml-model href="https://raw.githubusercontent.com/eclipse-cyclonedds/cyclonedds/master/etc/cyclonedds.xsd" schematypens="http://www.w3.org/2001/XMLSchema" ?>
<CycloneDDS xmlns="https://cdds.io/config">
  <Domain Id="any">
    <General>
      <Interfaces>
        <NetworkInterface address="127.0.0.1"/>
      </Interfaces>
      <AllowMulticast>true</AllowMulticast>
      <EnableMulticastLoopback>true</EnableMulticastLoopback>
    </General>
  </Domain>
</CycloneDDS>
Lastly, register Cyclone-DDS as your ROS middleware (replace <YOUR-USER-NAME> by your user name) by putting these lines in your terminal config (.bashrc):
gedit ~/.bashrc
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
export CYCLONEDDS_URI=file:///home/<YOUR-USER-NAME>/cyclone_dds.xml
Notes:
- In order for these changes to take effect in existing terminals (and terminal tabs), they need to reload the .bashrc again:
source ~/.bashrc
- It might be required to stop the ros2 daemon once for the changes to take effect:
ros2 daemon stop
colcon accepts a range of arguments. One particularly handy feature is the ability to build using sym-links:
colcon build --symlink-install
This prevents files from being copied into the installation directory and instead just creates symbolic links. The benefits are that it uses less disk space and that changes to files in the source directory are directly reflected in the installed packages without building the workspace again (this does not apply to files that are processed at build time, such as C++ files that are compiled into binaries). E.g., changes to existing Python launch files, YAML configurations, or CLIPS files are directly available in the installed package.
Limitations:
- Symbolic links are overridden by file copies again if colcon build is called without the --symlink-install argument later.
- Symbolic links cannot override file copies if colcon build --symlink-install is called after a regular colcon build. To achieve the desired result in this scenario, just delete the build and install directories of your workspace and run colcon build --symlink-install again.
- New files cannot magically appear in the installation directory. Hence, if you create a new file, make sure to run colcon build --symlink-install again.
It is also possible to define default arguments for colcon.
Just create the following file in the .colcon directory of your home directory: $HOME/.colcon/defaults.yaml.
An example configuration, also including some useful cmake args, is shown below:
build:
  cmake-args:
    - -DBUILD_TESTING=OFF
    - -DCMAKE_VERBOSE_MAKEFILE=ON
    - -DCMAKE_BUILD_TYPE=Debug
  symlink-install: true
In order for colcon to accept the default configuration, the COLCON_HOME variable needs to point to the .colcon directory location.
export COLCON_HOME=$HOME/.colcon/
(You might want to add this to your .bashrc to automatically set the value in every terminal.)