TR-Autonomy-2

Second Training Module for TR Autonomy Recruits

Autonomy Training Resources

NOTE: IF YOU'RE USING A MAC M1 (ARM) USE THE NOOPENGL BRANCH OF THIS REPO

Task Overview

In this module you will be given a simulated robot which you will have to control to point at a target.

  • You don't have to write a controller; just tell the robot which angle to point at by publishing to /desired_angle.
  • The target is always visible from the 0 radians position, and the camera's FOV is 90 degrees, so the furthest you should ever need to rotate is $\pm\frac{\pi}{4}$ radians.
  • The robot scores a "point" if it keeps the target cube centered on screen for 2 seconds.
  • The target counts as centered when the average position of the red pixels is within 60 pixels, in either direction, of the image's horizontal center (one way to turn this pixel offset into an angle is sketched below).
  • The robot is homed back to the 0 radians position instantly after every point scored.

The goal of this assignment is for the robot to score points without user input. Any approach is allowed, as long as it uses the provided topics.
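
For example, assuming the 90-degree FOV maps roughly linearly onto the image width $W$, the average red-pixel position $\bar{x}$ can be turned into an angle command like this (the sign of the correction depends on the camera and angle conventions, so you may need to flip it):

$$\theta_{desired} \approx \theta_{current} + \frac{W/2 - \bar{x}}{W} \cdot \frac{\pi}{2}$$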

Demo video: test.mp4

Getting Started

Make sure you have pip installed.

# check pip
python3 -m pip --version
# if it's missing, install it
sudo apt install python3-venv python3-pip

Clone the repository to set up your next workspace.

git clone https://github.com/Triton-Robotics-Training/TR-Autonomy-2.git

To build and run the package, follow the same steps as in the last module:

cd TR-Autonomy-2/
source /opt/ros/humble/setup.bash
# pip discourages system-wide installs, but ROS uses the system Python, so we set this ENV
export PIP_BREAK_SYSTEM_PACKAGES=1
rosdep install -i --from-path src --rosdistro humble -y
colcon build
# OPEN A NEW TERMINAL AND NAVIGATE TO YOUR REPOSITORY
source install/setup.bash
ros2 run spinnyrobot spinnyrobot

Architecture

The following topics are available:

$ ros2 topic list
/current_angle
/desired_angle
/parameter_events
/robotcam
/rosout
/scored_point

The spinnyrobot node publishes to /current_angle, /robotcam, and /scored_point.

The /current_angle topic is of type std_msgs/msg/Float32:

$ ros2 topic info /current_angle
Type: std_msgs/msg/Float32
Publisher count: 1
Subscription count: 0

There is also /scored_point, which is of type std_msgs/msg/Empty and is published every time the robot scores a point, and /robotcam, which is a sensor_msgs/msg/Image using bgr8 encoding. If you want to look at it, you can run:

ros2 run rqt_image_view rqt_image_view /robotcam

Your task is to publish a Float32 to /desired_angle that makes the robot point at the target and score points.
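
You can sanity-check the interface before writing any code by publishing a single angle by hand with the standard ros2 CLI (the 0.5 rad value here is arbitrary):

ros2 topic pub --once /desired_angle std_msgs/msg/Float32 "{data: 0.5}"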

An outline of how to get the image into a ROS 2 C++ node for processing with OpenCV is here: https://www.theconstructsim.com/how-to-integrate-opencv-with-a-ros2-c-node/

For extra help, here is a GitHub code example of subscribing to a ROS image publisher (/robotcam) and converting the messages to an OpenCV image type for processing: link.
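
To make this concrete, below is a minimal sketch of what such a node could look like in C++ with rclcpp, cv_bridge, and OpenCV. The topic names and the bgr8 encoding come from this README; the node/class names, the red-pixel thresholds, and the linear pixel-to-angle mapping are assumptions you would need to verify and tune. You would also still need the usual package.xml and CMakeLists.txt entries for rclcpp, std_msgs, sensor_msgs, cv_bridge, and OpenCV.

```cpp
#include <cmath>
#include <memory>

#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/image.hpp>
#include <std_msgs/msg/float32.hpp>
#include <cv_bridge/cv_bridge.h>
#include <opencv2/opencv.hpp>

class YourSolution : public rclcpp::Node
{
public:
  YourSolution() : Node("yoursolution")
  {
    angle_pub_ = create_publisher<std_msgs::msg::Float32>("/desired_angle", 10);

    current_angle_sub_ = create_subscription<std_msgs::msg::Float32>(
      "/current_angle", 10,
      [this](const std_msgs::msg::Float32 & msg) {current_angle_ = msg.data;});

    image_sub_ = create_subscription<sensor_msgs::msg::Image>(
      "/robotcam", 10,
      [this](const sensor_msgs::msg::Image & msg) {on_image(msg);});
  }

private:
  void on_image(const sensor_msgs::msg::Image & msg)
  {
    // The README says /robotcam uses bgr8 encoding.
    cv::Mat img = cv_bridge::toCvCopy(msg, "bgr8")->image;

    // Mask "red enough" pixels; these BGR thresholds are a guess to tune.
    cv::Mat mask;
    cv::inRange(img, cv::Scalar(0, 0, 150), cv::Scalar(100, 100, 255), mask);

    // Average x position of the red pixels (centroid of the mask).
    const cv::Moments m = cv::moments(mask, true);
    if (m.m00 <= 0.0) {
      return;  // no red pixels visible in this frame
    }
    const double avg_x = m.m10 / m.m00;

    // Map the pixel offset from the image center to an angle, assuming the
    // 90-degree FOV maps linearly onto the image width. Flip the sign if the
    // robot turns the wrong way under your camera/angle convention.
    const double half_width = img.cols / 2.0;
    const double offset = (half_width - avg_x) / half_width * (M_PI / 4.0);

    std_msgs::msg::Float32 out;
    out.data = static_cast<float>(current_angle_ + offset);
    angle_pub_->publish(out);
  }

  rclcpp::Publisher<std_msgs::msg::Float32>::SharedPtr angle_pub_;
  rclcpp::Subscription<std_msgs::msg::Float32>::SharedPtr current_angle_sub_;
  rclcpp::Subscription<sensor_msgs::msg::Image>::SharedPtr image_sub_;
  float current_angle_{0.0f};
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<YourSolution>());
  rclcpp::shutdown();
  return 0;
}
```

This particular sketch commands the angle directly from each frame; anything else (for example stepping toward the target a little at a time) is equally valid, as long as whatever you publish to /desired_angle ends up keeping the red pixels centered for 2 seconds.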

This is what the result should look like:

graph TD;
sol("yoursolution")
rob("spinnyrobot")
sol == /desired_angle ==> rob;
rob == /robotcam ==> sol;
rob == /current_angle ==> sol;
rob == /scored_point ==> sol;
