This repository provides a ROS 2 Humble + Gazebo Classic simulation of a simple packing-house layout with moving conveyors, actors (walkers), and mounted perception sensors. It includes a ground-truth safety detector based on Gazebo state, and a pole-mounted sensor example that runs YOLO on RGB/Depth data and classifies people into safety zones.
The simulation is designed to:
- Provide a reproducible packing-house scene with two conveyor belts, moving props (boxes/balls), and actor walkers.
- Offer safety-zone detection baselines:
  - A ground-truth detector using Gazebo model state/actor state services.
  - A camera/LiDAR pole example that runs YOLO detections and assigns zone levels based on world-space intersections.
- Serve as a starting point for experimenting with sensor placement (e.g., moving the pole, changing camera/LiDAR angles) and adjusting detection logic.
This package targets ROS 2 Humble with Gazebo Classic. Ensure you have ROS 2 installed and sourced.
```bash
# Example: install core dependencies (Ubuntu 22.04 + ROS 2 Humble)
sudo apt install \
  ros-humble-rclpy \
  ros-humble-gazebo-ros \
  ros-humble-gazebo-msgs \
  ros-humble-geometry-msgs
```

The pole-based example uses the ultralytics YOLO package:

```bash
pip install ultralytics
```

You will also need PyTorch; `ultralytics` typically pulls it in as a dependency, but you may need to install it manually depending on your environment.
If you want to launch the UR3e example, install these additional ROS packages:
```bash
sudo apt install \
  ros-humble-ur-description \
  ros-humble-ros2-control \
  ros-humble-ros2-controllers \
  ros-humble-gazebo-ros2-control
```

Build and source the workspace:

```bash
colcon build
source install/setup.bash
```

You always run two nodes:
- Zone publisher → publishes `/human_zone_level` (`0`: clear, `1`: Space A, `2`: Space B).
- Arm safety controller → subscribes to `/human_zone_level` and slows/stops the UR3e arm.

The arm safety controller in this repo is `packing_house_sim/ur3e_safety_work_cycle_levels.py`.
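The exact control logic lives in `ur3e_safety_work_cycle_levels.py`; as a hedged illustration (not the repo's actual implementation), a controller might map zone levels to a velocity scale like this:

```python
# Hypothetical mapping from /human_zone_level values to arm speed scaling,
# following the convention above: 0 = clear, 1 = Space A, 2 = Space B.
ZONE_SPEED_SCALE = {
    0: 1.0,  # clear: run the work cycle at full speed
    1: 0.3,  # Space A: person nearby, slow the arm down
    2: 0.0,  # Space B: person close, stop the arm
}

def speed_scale_for_zone(level: int) -> float:
    """Return a velocity scaling factor for the given zone level.

    Unknown levels fall back to the most restrictive case (stop),
    which is the safe default for a safety controller.
    """
    return ZONE_SPEED_SCALE.get(level, 0.0)
```

A subscriber callback on `/human_zone_level` would apply this factor to the commanded joint velocities each cycle; the specific scale values here are illustrative, not the repo's tuned numbers.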
The ground-truth example uses Gazebo model state and actor state services to detect when a person enters a 3D axis-aligned box region. It publishes `/human_zone_level` for the arm controller.
Terminal 1 — start the sim (with UR3e):

```bash
ros2 launch packing_house_sim packhouse_with_ur3e.launch.py
```

Terminal 2 — arm safety controller (zone-level based):

```bash
ros2 run packing_house_sim ur3e_safety_work_cycle_levels
```

Terminal 3 — ground-truth zone classifier:

```bash
ros2 run packing_house_sim human_zone_classifier
```

(Optional) Override Space A/B bounds at runtime:

```bash
ros2 run packing_house_sim human_zone_classifier --ros-args \
  -p space_A_min:='[-1.2,-1.8,-1.6]' -p space_A_max:='[1.6,1.2,1.8]' \
  -p space_B_min:='[-0.8,-1.2,-0.8]' -p space_B_max:='[1.3,0.8,1.4]'
```

Verify it’s working:
```bash
ros2 topic echo /human_zone_level
ros2 topic echo /safety_alert
```

- The classifier reads `/gazebo/model_states` and calls `/gazebo/get_entity_state` to track the walker actors in the scene.
- It checks whether any tracked human/actor is inside the configured AABB (axis-aligned bounding box) and publishes the zone level.
- Box size and actor names are configurable via ROS parameters.
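The containment test itself is a plain axis-aligned bounding-box check. A minimal sketch (the bound names mirror the `space_A_*`/`space_B_*` parameters above; the functions are illustrative, not the node's code, and assume Space B is the inner, more restrictive zone):

```python
def in_aabb(point, box_min, box_max):
    """Return True if a 3D point lies inside the axis-aligned box."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

def zone_level(point, space_a, space_b):
    """Classify a position: 2 if inside Space B, 1 if inside Space A, else 0.

    space_a / space_b are (min_corner, max_corner) pairs. Space B is
    checked first on the assumption that it is nested inside Space A.
    """
    if in_aabb(point, *space_b):
        return 2
    if in_aabb(point, *space_a):
        return 1
    return 0
```

For example, with the default bounds from the override command above, a walker at the origin falls inside both boxes and is reported as level 2.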
Relevant files:
- Zone classifier: `packing_house_sim/human_zone_classifier.py`
- Arm safety controller: `packing_house_sim/ur3e_safety_work_cycle_levels.py`
- World: `worlds/packhouse_control_walkers.sdf`
The pole example uses the simulated RGB-D camera mounted on the pole in the SDF. It runs YOLO, projects detections into world coordinates, intersects them with the ground plane, and classifies them into Zone A / B.
Terminal 1 — sim (with UR3e):
```bash
ros2 launch packing_house_sim packhouse_with_ur3e.launch.py
```

Terminal 2 — arm safety work-cycle (pole version):

```bash
ros2 run packing_house_sim ur3e_safety_work_cycle_levels_pole
```

Terminal 3 — YOLO detector (pole, world coordinates):

```bash
ros2 run packing_house_sim human_yolo_topdown_detector_pole
```

Common useful overrides (matching what we tested):

```bash
ros2 run packing_house_sim human_yolo_topdown_detector_pole --ros-args \
  -p rgb_topic:=/pole/pole_depth/image_raw \
  -p depth_topic:=/pole/pole_depth/depth/image_raw \
  -p depth_info_topic:=/pole/pole_depth/depth/camera_info \
  -p conf:=0.15 \
  -p use_depth_filter:=False
```

Verify it’s working:
```bash
ros2 topic echo /human_zone_level
```

The detector node:

- Subscribes to the pole RGB/Depth topics (from the SDF model).
- Runs YOLO (person class) and converts detections to world-space points using TF and the depth camera intrinsics.
- Intersects detection rays with the ground plane and publishes a zone level: `0`: no person, `1`: Zone A, `2`: Zone B.
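The ray/ground-plane step can be sketched as follows (a simplified illustration, not the node's actual code): back-project a pixel through the camera intrinsics, rotate the resulting ray into the world frame using the camera's pose from TF, and solve for the intersection with the z = 0 plane:

```python
import numpy as np

def pixel_to_ground(u, v, K, R_wc, t_wc, ground_z=0.0):
    """Intersect the viewing ray through pixel (u, v) with the ground plane.

    K    : 3x3 camera intrinsic matrix.
    R_wc : 3x3 rotation of the camera (optical) frame expressed in world.
    t_wc : camera position in the world frame.
    Returns the 3D world point on the plane z = ground_z, or None when the
    ray is (near-)parallel to the plane or the hit is behind the camera.
    """
    # Ray direction in the camera's optical frame (z points forward).
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    d_world = R_wc @ d_cam
    if abs(d_world[2]) < 1e-9:
        return None  # ray parallel to the ground plane
    s = (ground_z - t_wc[2]) / d_world[2]
    if s <= 0:
        return None  # intersection behind the camera
    return t_wc + s * d_world
```

For a camera 3 m up looking straight down, the principal-point pixel maps to the spot on the ground directly below the camera; the resulting world point is then fed into the same zone classification as the ground-truth example.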
Relevant files:
- Pole detector node: `packing_house_sim/human_yolo_topdown_detector_pole.py`
- Arm safety controller: `packing_house_sim/ur3e_safety_work_cycle_levels_pole.py`
- Pole sensors in SDF: `worlds/packhouse_control_walkers.sdf`
If you want to change the sensor layout or pose, you typically need to update two places:
1. Edit the SDF world to move the pole or adjust sensor offsets:
   - File: `worlds/packhouse_control_walkers.sdf`
   - Key sections: `<model name="sensor_pole">` (move the pole by editing its `<pose>`), `<link name="panel_link">` (panel pose/rotation), and the sensor `<pose>` entries for `pole_lidar`, `pole_rgb`, and `pole_depth`.
2. Update the TF static transforms if you rely on the UR3e + TF launch file:
   - File: `launch/packhouse_with_ur3e.launch.py`
   - Adjust the static transform publishers (e.g., `world_to_panel_link`, `panel_link_to_pole_lidar`) so they match your updated SDF poses.

If you only use the bringup world and do not launch the UR3e TF publishers, you can often skip the TF updates. If you use `human_yolo_topdown_detector_pole.py`, however, the TF chain it relies on must match the sensor pose in the SDF.
To launch the world plus a UR3e robot (and TF publishers for the pole), run:
```bash
ros2 launch packing_house_sim packhouse_with_ur3e.launch.py
```

This launches:
- The packing-house world (same as bringup).
- The UR3e robot in Gazebo.
- Static TF publishers for the robot base and sensor pole frames.
Pole sensors (namespace `/pole`):

- RGB image: `/pole/camera/image`
- RGB camera info: `/pole/camera/camera_info`
- Depth image: `/pole/rgbd/depth_image`
- Depth camera info: `/pole/rgbd/camera_info`
- Point cloud: `/pole/rgbd/points`
- LiDAR points: `/pole/lidar/points`
- If you see missing TF errors, ensure the static transforms in `packhouse_with_ur3e.launch.py` match the SDF poses.
- If YOLO doesn’t start, verify that `pip install ultralytics` succeeded and that your Python environment matches the ROS 2 overlay.
- If Gazebo can’t find models, ensure `GAZEBO_MODEL_PATH` is set (the bringup launch file sets this automatically).
This project is licensed under the Apache License 2.0 — see the LICENSE file for details.