ROS architecture for simulating a robot in a discrete 2D environment. The robot has 3 behaviors:
- Play
- Sleep
- Normal

The user can interact with the robot via commands and pointed gestures.
The architecture is composed of 4 components:
- Command: simulates the user's vocal commands
- Gesture: simulates the user's pointed gestures
- Motion: simulates the robot's motion
- Command manager: implements the robot behaviors through a FSM
The finite state machine is composed of 3 states (the robot behaviors):
- PLAY: the robot moves from the person position to the pointed gesture and vice versa
- NORMAL (initial state): the robot moves in a random way
- SLEEP: the robot goes to the home position, stays there for a certain time, and then returns to the NORMAL state
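The transition logic described above can be sketched in plain Python. This is a hypothetical simplification for illustration only (class and method names are invented here); the actual package implements the FSM with smach and drives the transitions through ROS topics and timers:

```python
class RobotFSM:
    """Hypothetical sketch of the three-state behavior machine (not the smach code)."""

    def __init__(self):
        # NORMAL is the initial state
        self.state = "NORMAL"

    def on_command(self, command):
        # The only command is "play"; commands received in PLAY or SLEEP are ignored
        if self.state == "NORMAL" and command == "play":
            self.state = "PLAY"

    def on_play_timeout(self):
        # Fires after a delay in [min_transition_play_normal, max_transition_play_normal]
        if self.state == "PLAY":
            self.state = "NORMAL"

    def on_normal_timeout(self):
        # Fires after a delay in [min_transition_normal_sleep, max_transition_normal_sleep]
        if self.state == "NORMAL":
            self.state = "SLEEP"

    def on_sleep_timeout(self):
        # Fires after a delay in [min_sleep_delay, max_sleep_delay] at the home position
        if self.state == "SLEEP":
            self.state = "NORMAL"


fsm = RobotFSM()
fsm.on_command("play")   # NORMAL -> PLAY
print(fsm.state)         # PLAY
fsm.on_play_timeout()    # PLAY -> NORMAL
fsm.on_normal_timeout()  # NORMAL -> SLEEP
fsm.on_sleep_timeout()   # SLEEP -> NORMAL
print(fsm.state)         # NORMAL
```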
The messages are:
- Point: 2D position used for pointed gestures and target positions
- String: used for user commands and for feedback from the "motion" component
The parameters are:
home_pos_x,home_pos_x
: define the home position in the map (double)person_pos_x,person_pos_y
: define the person position in the map (double)map_x,map_y
: define the dimensions of the map (integer)min_delay_command,max_delay_command
: define the min and max delay for sending the command "play" (double)min_delay_gesture,max_delay_gesture
: define the min and max delay for sending the pointed gesture (double)min_delay_robot_motion,max_delay_robot_motion
: define the min and max delay for simulating the robot motion (double)min_transition_play_normal,max_transition_play_normal
: define the min and max delay to trasit between PLAY and NORMAL (integer)min_transition_normal_sleep,max_transition_normal_sleep
: define the min and max delay to trasit between NORMAL and SLEEP (integer)min_sleep_delay,max_sleep_delay
: define the min and max delay for the SLEEP state (double)
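As an example, the parameters above could be set in the launch file with standard roslaunch `<param>` tags. The values below are illustrative only, not defaults from the package:

```xml
<launch>
  <!-- Illustrative values; adjust to your own simulation -->
  <param name="home_pos_x" value="0.0"/>
  <param name="home_pos_y" value="0.0"/>
  <param name="person_pos_x" value="2.0"/>
  <param name="person_pos_y" value="3.0"/>
  <param name="map_x" value="10"/>
  <param name="map_y" value="10"/>
  <param name="min_delay_command" value="5.0"/>
  <param name="max_delay_command" value="15.0"/>
</launch>
```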
There are 3 packages:
- Sensoring: contains the command.py and gesture.py files, used to simulate the user command and the pointed gestures
- Robot control: contains the motion.py file, used to simulate the robot motion
- Command manager: contains the command_manager.py file, which implements the FSM of robot behaviors.
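The gesture simulation can be sketched in plain Python. This is a hypothetical simplification without rospy (the function name and signature are invented here): it draws a random integer cell of the map, and rejects any draw that coincides with the person position, since a pointed gesture equal to the person position is not allowed:

```python
import random


def random_gesture(map_x, map_y, person_pos):
    """Return a random cell (x, y) of a map_x-by-map_y grid, never the person position."""
    while True:
        point = (random.randint(0, map_x - 1), random.randint(0, map_y - 1))
        if point != person_pos:
            return point


# Example: 10x10 map, person at cell (2, 3)
gesture = random_gesture(10, 10, (2, 3))
print(gesture)
```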
In order to run this software, the following prerequisite is needed: all files must be executable. If they are not, you can make them executable with the following commands:
cd <your_workspace>/src/Behavioral-Architecture
chmod +x sensoring/src/*
chmod +x robot_control/src/*
chmod +x manager/src/*
To run the software:
cd <your_workspace>
catkin_make
source devel/setup.bash
cd src/Behavioral-Architecture
roslaunch launch_file.launch
The robot interacts with a human via a command and pointed gestures. It moves inside a discrete 2D environment. Both the robot's target positions and the pointed gestures belong to the map. The robot has 3 behaviors: Play, Normal, Sleep. The robot can receive any command while in the PLAY state, and any command or pointed gesture while in the SLEEP state, but all of them are ignored while executing either of those two states. The initial state is NORMAL. The only command is "play". There are two predefined positions inside the map (the "Person" and "Home" positions) that cannot be changed during execution. While the robot is moving, it cannot respond to other stimuli. The robot can go to the Person or Home position during the NORMAL behavior, and it can go to the Home position during the PLAY state.
- Specify different dimensions of the map
- It is possible to define different delays for the simulation
- Define different positions of the person and the "home" inside the map before starting the simulation
- It is possible to visualize the state transitions in the shell
- The robot notifies when it reaches the target position, and the reached position can be visualized in the shell
- It is not possible to generate a pointed gesture equal to the person position
- There is no graphical interface to view the map and the movement of the robot within it
- Command, pointed gesture, and robot motion are simulated
- There is no check on whether the user sets a position for the person or the "home" outside the map
- Since ROS Noetic is used, the state transition visualization is limited to the shell (it is necessary to fix some files in order to use the smach viewer)
- Add other behaviors to the robot
- Use a graphical interface for viewing the simulation
- Add error handling to prevent user-specified positions outside the map
- Implement a real robot motion control
- Implement a user interface for the command and pointed gestures
- Add more commands
- Prevent the robot from going into "unauthorized" positions during some behaviors