*Final project for EECS 467: Autonomous Robotics Design Experience | SimBot, a hazard-avoiding autonomous robot*

# SimBot: Integrated Robotic Control System 🤖


SimBot is a comprehensive final project developed for EECS 467 (or equivalent) that integrates various robotic control and computer vision functionalities on an MBot platform. The system includes AprilTag recognition for object localization, hand gesture teleoperation for remote control, and SLAM (Simultaneous Localization and Mapping) for autonomous navigation, all tied together by a coordinated Python environment.




## Authors

| Name | GitHub | Email |
| --- | --- | --- |
| Laasya Chukka | @lchukka450 | [email protected] |
| Ansh Mehta | @anshm10 | [email protected] |
| Christian Vega | @cpgvega | [email protected] |
| Wendi Zhang | @zwendi123 | [email protected] |

## Project Demo

A brief demonstration of the SimBot's capabilities, including AprilTag recognition and teleoperation via hand gestures: https://youtu.be/FMWMjoAIFdI


## Tech Stack

The project is built upon the following hardware and software components:

| Category | Component | Purpose |
| --- | --- | --- |
| Robotics platform | MBot (with Raspberry Pi & LIDAR) | Core mobile robotics platform. |
| Primary language | Python 3 | Main scripting language for control and computer vision. |
| Vision/ML | MediaPipe, OpenCV, TensorFlow | Hand gesture recognition and AprilTag detection. |
| SLAM | RPLIDAR driver, `slam` binary | Simultaneous Localization and Mapping. |
| Networking | WiFi router, SSH | Local network communication between the MBot and the local machine. |

## Prerequisites/Materials

Ensure you have the following hardware and software ready before attempting to run the project.

### Required Hardware & Networking

- MBot platform
- LIDAR sensor (must be plugged in and turned on for SLAM)
- WiFi router: both your local machine and the MBot must be connected to the same network
- Terminal access

### Core Software

- Python 3 (installed on both the MBot and the local machine)

### AprilTag Dependency

Due to CMake dependencies specific to local machines, the AprilTag repository must be cloned and set up manually:

1. **Clone the AprilTag repo:** you must clone the AprilTag repository locally.
2. **Refer to the documentation:** for build details, please consult the `april_tag/README.md` file.
3. **Copy support files:** add the necessary files from this repository to the cloned AprilTag directory:

   ```shell
   cp -r support_files/templates AprilTag/scripts/
   cp support_files/receive_stream.py AprilTag/scripts/
   ```

### Hand Gesture Model Specifics

The following Python packages are required, specifically for hand gesture recognition:

| Package | Version (minimum) | Notes |
| --- | --- | --- |
| mediapipe | 0.8.1 (0.10.9 for Mac) | Core library for hand tracking. |
| OpenCV | 3.4.2 | Video stream processing. |
| TensorFlow | 2.3.0 | Gesture recognition model. |
| tf-nightly | 2.5.0.dev | Only required if creating a TFLite version of an LSTM model. |
| scikit-learn | 0.23.2 | Optional: needed to display the confusion matrix. |
| matplotlib | 3.3.2 | Optional: needed to display the confusion matrix. |
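A quick runtime check against these minimums can save debugging time. The sketch below is an assumption-laden helper, not part of this repo: `check_versions` is hypothetical, and the PyPI distribution names `opencv-python` and `tensorflow` are guesses at how the table's packages are installed.

```python
from importlib.metadata import version, PackageNotFoundError

def _as_tuple(v: str):
    """Parse a dotted version string into a comparable tuple of ints."""
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def meets_minimum(installed: str, minimum: str) -> bool:
    return _as_tuple(installed) >= _as_tuple(minimum)

# Minimums from the table above; PyPI names are assumptions.
MINIMUMS = {"mediapipe": "0.8.1", "opencv-python": "3.4.2", "tensorflow": "2.3.0"}

def check_versions(minimums=MINIMUMS):
    """Return {package: required_version} for anything missing or too old."""
    problems = {}
    for pkg, req in minimums.items():
        try:
            if not meets_minimum(version(pkg), req):
                problems[pkg] = req
        except PackageNotFoundError:
            problems[pkg] = req
    return problems  # an empty dict means all requirements are satisfied
```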

## System Usage

The system is composed of four main functionalities, each requiring a specific setup across multiple terminals. Note: "Locally" refers to your personal computer, while "MBot" refers to the MBot's Raspberry Pi.

### A. AprilTag Recognition

This setup streams the MBot's camera feed to the local machine for AprilTag processing.

| Terminal | Location | Directory | Command | Notes |
| --- | --- | --- | --- | --- |
| 1 | MBot | `camera_stream/` | `python3 camera_final.py` | Starts the camera stream. |
| 2 | Locally | `april_tag/scripts/` | `python3 receive_stream.py` | Receives the stream and performs AprilTag recognition. |
| 3 | MBot | `teleop_gesture/python/` | `python3 data_delivery.py` | Sends AprilTag data to the MBot's microcontrollers. |
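The steps above stream camera frames from `camera_final.py` to `receive_stream.py` over the local network. The repo's exact wire format isn't documented here, but a typical length-prefixed framing for such a stream can be sketched as follows (an illustration, not the project's actual protocol):

```python
import struct

# Length-prefixed framing: a 4-byte big-endian size header, then the JPEG bytes.
# The receiver reads the header first, so it knows exactly how many bytes to
# expect for the frame body.

def encode_frame(jpeg_bytes: bytes) -> bytes:
    """Prefix a frame with its length for transmission over a TCP socket."""
    return struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes

def decode_frame(packet: bytes) -> bytes:
    """Recover the frame body, raising if the packet was truncated."""
    (size,) = struct.unpack(">I", packet[:4])
    body = packet[4 : 4 + size]
    if len(body) != size:
        raise ValueError("truncated frame")
    return body
```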

### B. Teleop Gesture

This is the core setup for the MBot's low-level control, including real-time communication and gesture processing.

1. **Reflash the MBot:** flash the appropriate `*.uf2` file onto the MBot's microcontroller.
2. **Start services (MBot):** open three separate terminals on the MBot and run the following commands in order:

| Terminal | Directory | Command | Purpose |
| --- | --- | --- | --- |
| 1 | `teleop_gesture/shim_timesync_binaries/` | `chmod +x shim`, then `./shim` | Runs the low-level shim service. |
| 2 | `teleop_gesture/shim_timesync_binaries/` | `./timesync` | Runs the time synchronization service. |
| 3 | `teleop_gesture/python/` | `python3 teleop_gesture_v#.py` | Runs the main teleoperation script (replace `#` with the version number). |

### C. Hand Gesture Model

This runs the hand gesture recognition model locally to generate control commands.

| Terminal | Location | Directory | Command |
| --- | --- | --- | --- |
| 1 | Locally | `hand_gesture_recognition_mediapipe_main/` | `python3 app.py` |
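`app.py` turns each recognized gesture into a control command for the MBot. The actual labels and velocities are defined by the trained model and the teleop script; the sketch below only illustrates the shape of such a mapping (all gesture names and velocity values here are hypothetical, not the repo's):

```python
# Hypothetical mapping from a gesture label to an MBot velocity command
# expressed as (linear m/s, angular rad/s).
GESTURE_TO_TWIST = {
    "open_palm":   (0.0, 0.0),    # stop
    "point_up":    (0.2, 0.0),    # drive forward
    "thumb_left":  (0.0, 0.5),    # rotate left
    "thumb_right": (0.0, -0.5),   # rotate right
}

def gesture_to_command(label: str):
    """Return a velocity command, defaulting to a safe stop for unknown gestures."""
    return GESTURE_TO_TWIST.get(label, (0.0, 0.0))
```

Defaulting to a stop on unrecognized input is a common safety choice for teleoperation, so a misclassified frame never drives the robot.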

### D. SLAM

This sets up Simultaneous Localization and Mapping using the LIDAR sensor. Ensure the LIDAR is plugged in and turned on first.

| Terminal | Location | Directory | Command |
| --- | --- | --- | --- |
| 1 | MBot | `bot_lab/bin/` | `./rplidar_driver` |
| 2 | MBot | `bot_lab/bin/` | `./slam` |
| 3 | MBot | `bot_lab/` | `source setenv.sh` |
|   | MBot | `bot_lab/bin/` | `./botgui` |
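Under the hood, the `slam` binary fuses LIDAR scans into an occupancy grid. The per-cell update common to LIDAR mapping is a log-odds accumulation, sketched here for intuition only (the increment constants are illustrative, not taken from this project's binary):

```python
import math

# Log-odds occupancy mapping: each LIDAR observation adds evidence for
# (laser hit) or against (laser passed through) a cell being occupied.
# Probability is recovered from the accumulated log-odds with a logistic.
L_OCC, L_FREE = 0.9, -0.4   # illustrative per-observation increments

def update_cell(log_odds: float, hit: bool) -> float:
    """Accumulate one observation's evidence into a cell's log-odds."""
    return log_odds + (L_OCC if hit else L_FREE)

def occupancy_probability(log_odds: float) -> float:
    """Convert log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-log_odds))
```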

## Configuration Notes

### IP Address Updates

The IP address of the MBot's `wlan0` interface must be manually updated in the following four Python scripts to ensure proper network communication:

- `camera_final.py` (line 26)
- `app.py` (line 47)
- `teleop_gesture_v#.py` (line 42)
- `receive_stream.py` (line 31)

To find the correct IP address, run `ifconfig` on the MBot.
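As an alternative to reading `ifconfig` output by hand, the machine's outbound IP can be queried from Python with only the standard library. This is a sketch under assumptions: `local_ip` is a hypothetical helper (not in the repo), and the `8.8.8.8` target is used only to select a route; no packet is actually sent by a UDP `connect`.

```python
import socket

def local_ip() -> str:
    """Return the IP address the OS would use for outbound traffic."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # UDP connect only selects a route and binds a source address;
        # it transmits nothing on the wire.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no route available (machine is offline)
    finally:
        s.close()
```

On a multi-interface Raspberry Pi this returns the address of whichever interface routes outward, so it is worth confirming against `ifconfig` that this is indeed `wlan0`.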

## Useful Commands

1. **Enabling UI on VNC Viewer/SSH:** if you are using an SSH connection and need the UI to display (e.g., for `./botgui`), use the `-X` flag and set the display environment variable:

   ```shell
   # Connect with X forwarding
   ssh -X pi@[insert MBot IP address here]
   # Once connected, run:
   export DISPLAY=:0.0
   ```

2. **Linking `receive_stream.py`:** if you want changes to your locally cloned AprilTag's `receive_stream.py` to be reflected in this project's support files, link the two files. Note that `ln` without `-s` (as below) creates a hard link; for a symbolic link, pass `-s` and use an absolute target path. You may need to temporarily move the existing `support_files/receive_stream.py` before linking.

   ```shell
   ln AprilTag/scripts/receive_stream.py support_files/receive_stream.py
   ```

## References

This project utilizes and builds upon the work in the following repositories:


If you find any issues or have suggestions, please open an issue in the repository.
