Artif Life Robotics (2011) 16:5–9 DOI 10.1007/s10015-010-0877-5
© ISAROB 2011
ORIGINAL ARTICLE

Sheng-Ven Shiau · Kuo-Lan Su · Chun-Chieh Wang · Jr-Hung Guo
Path planning of a multiple mobile robot system
Received and accepted: December 13, 2010
Abstract We present path-planning techniques for a multiple mobile robot system. Each mobile robot has the shape of a cylinder; its diameter, height, and weight are 8 cm, 15 cm, and 1.5 kg, respectively. The controller of the mobile robot is an MCS-51 chip, which acquires detection signals from the sensors through I/O pins. The robot receives commands from the supervising computer via a wireless RF interface, and transmits its status back to the supervising computer over the same interface. The mobile robot system is module-based, and contains a controller module (including two DC motors and drivers), an obstacle detection module, a voice module, a wireless RF module, an encoder module, and a compass detection module. We propose an evaluation method to arrange the positions of the multiple mobile robot system, and develop a path-planning interface on the supervising computer. In the experiments, the mobile robots were able to receive commands from the supervising computer and move to their next positions according to the proposed method.

Key words Path planning · Mobile robot · Wireless RF · Compass
S.-V. Shiau · J.-H. Guo
Graduate School of Engineering Science and Technology, National Yunlin University of Science and Technology, Yunlin, Taiwan

K.-L. Su (*)
Department of Electrical Engineering, National Yunlin University of Science and Technology, Douliou, Yunlin 640, Taiwan
e-mail: [email protected]

C.-C. Wang
Department of Electronic Engineering, Chienkuo Technology University, Changhua, Taiwan

This work was presented in part at the 15th International Symposium on Artificial Life and Robotics, Oita, Japan, February 4–6, 2010
1 Introduction

With new developments in robotic technology occurring every day, robot systems have been widely employed in many applications, including the field of automation. Recently, more and more research has been carried out into intelligent robots that can help people in their daily lives, such as service robots, office robots, security robots, education robots, and so on. In the future, we believe that robots will play an important role in our daily life.

Many experts have reported their research into intelligent mobile robots in the literature. Some research has addressed the development of target-tracking systems with mobile robots.1–3 Kobayashi and Yanagida4 proposed a method of detecting a human being by an autonomous mobile guard robot. Shimosasa et al.5 developed an autonomous guard robot that integrates security and service functions; it can guide visitors in the daytime and patrol at night. Gilbreath and Ciccimaro6 and Ciccimaro and Everett7 developed the autonomous security robot "ROBART III", which is equipped with a nonlethal response weapon. Some researchers have also studied path planning for the intelligent mobile robot.8

This article considers the problem of multiple robot systems working together. A multiple mobile robot system has more advantages than a single robot system.9 First, multiple mobile robots have the potential to finish some tasks faster than a single robot.10 Furthermore, using several robots introduces redundancy, so a multiple mobile robot system can be expected to be more fault-tolerant than a single robot. Another advantage is the merging of overlapping information, which can help compensate for sensor uncertainty.11

This article is organized as follows. Section 2 describes the system architecture of the multiple mobile robot system, and explains the functions of the robots.
Section 3 explains the path-planning method for the multiple mobile robots on the user interface, and how to execute the formation exchange step by step. Section 4 presents some experimental
results on formation exchange using the multiple mobile robot system. Section 5 presents some brief concluding remarks.
2 System architecture

The system architecture of the multiple mobile robot system is shown in Fig. 1. The system contains a supervising computer, a monitor, a wireless RF interface, a remote supervising computer, a color CCD, and some mobile robots. The supervising computer transmits commands to control the robots, and receives the status of the mobile robots via the wireless RF interface. This information contains the orientation and displacement of the mobile robots in real time. Each robot is given an ID code. The supervising computer transmits the ID code to the mobile robot, together with a command specifying the robot's next orientation and position. The mobile robot moves to the next position according to the command from the supervising computer.

The mobile robot has a cylindrical shape and is equipped with a microchip (MCS-51) as its main controller, two DC motors, several sensor circuits, a voice driver module, an encoder module, a compass module, a switch input, three Li batteries, and a wireless RS-232 interface. On the input side, the encoder module transmits pulse signals to the controller, which uses them to program the movement of the mobile robot. The reflective IR sensors detect the nearest crossing point on the motion platform, from which the robot determines its location. The compass module measures the orientation of the mobile robot. The mobile robot controls its two DC motors and a voice module through I/O pins, and communicates with the supervising computer via the wireless RF interface. The core of the RF module is a microprocessor (AT89C2051), which communicates with the controller via a serial interface (RS-232). The mobile robot has four wheels to provide autonomous mobility. A block diagram of the mobile robot is shown in Fig. 2.

The structure of the mobile robot is shown in Fig. 3. Its hardware circuits are arranged in four levels, each with a circular shape.
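The ID-code-plus-command exchange described above can be sketched as a small packet format. The field layout below (1-byte robot ID, 2-byte heading in degrees, and a grid position) is our illustrative assumption; the article does not specify the actual byte protocol carried over the RS-232/RF link.

```python
import struct

# Hypothetical packet layout: 1-byte robot ID, 2-byte heading in
# degrees, 1-byte grid x, 1-byte grid y (little-endian).
PACKET_FMT = "<BHBB"

def pack_command(robot_id, heading_deg, grid_x, grid_y):
    """Encode a motion command for one robot."""
    return struct.pack(PACKET_FMT, robot_id, heading_deg, grid_x, grid_y)

def unpack_command(payload):
    """Decode a packet back into (id, heading, x, y)."""
    return struct.unpack(PACKET_FMT, payload)

# Example: tell robot 2 to face 90 degrees and move to grid cell (3, 5).
cmd = pack_command(2, 90, 3, 5)
assert unpack_command(cmd) == (2, 90, 3, 5)
```

A fixed-size binary format like this keeps each command at a handful of bytes, which suits a low-bandwidth RF serial link.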
The bottom of the mobile robot is level one, and the top is level
Fig. 1. The architecture of the multiple robot system
four. Level one of the mobile robot is an encoder module. This module calculates the displacement of the mobile robot using reflective IR sensors; we use two reflective IR sensors to count the pulse signals from the two wheels of the mobile robot. The mobile robot is powered by three Li batteries embedded in level two and connected in parallel. This level has three obstacle detection modules using IR sensors, and also contains the two DC motors that drive the robot. Level three of the mobile robot holds the main board. The controller of the mobile robot acquires detection signals from the sensors through I/O pins, and receives commands from the supervising computer via the wireless RF interface. The controller of the mobile robot transmits the detection
Fig. 2. Block diagram of the mobile robot
Fig. 3. The structure of the mobile robot
results to the supervising computer via the wireless RF interface. The switch input turns on the power of the robot, and selects the power input from either the Li batteries or an adapter. Level four contains a wireless RF interface, a compass module, and a voice driver module.

At the bottom of the mobile robot is the encoder module, which calculates the distance that the robot has moved. We plot a white line and a black line on the wheels of the mobile robot, and use two reflective IR sensors to count the pulse signals from the two wheels. Let the pulse number per wheel revolution be P, and the pulse number counted during a movement be B. The displacement D of the mobile robot is then

D = 4.25 × π × (B / P)    (1)

where 4.25 cm is the diameter of the robot's wheels, so that 4.25π is the wheel circumference.
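Equation 1 can be checked numerically. The value P = 72 pulses per revolution is our assumption, chosen because it is consistent with the pulse counts reported in Sect. 4 (about 54 pulses for 10 cm, 162 pulses for 30 cm).

```python
import math

WHEEL_DIAMETER_CM = 4.25   # given in the article
PULSES_PER_REV = 72        # assumed; consistent with Sect. 4 pulse counts

def displacement_cm(pulses):
    """Eq. 1: D = 4.25 * pi * (B / P)."""
    return WHEEL_DIAMETER_CM * math.pi * pulses / PULSES_PER_REV

def pulses_for(distance_cm):
    """Invert Eq. 1 to get the pulse count for a target distance."""
    return round(distance_cm * PULSES_PER_REV / (WHEEL_DIAMETER_CM * math.pi))

print(pulses_for(10))  # -> 54
print(pulses_for(30))  # -> 162
```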
3 User interface

The user interface of the multiple mobile robot system is shown in Fig. 4. This interface has many parts. Position "A" sets the communication protocol between the supervising computer and the mobile robots via the wireless RF interface. Position "B" sets the starting position of the mobile robot. Position "C" sets the goal position on the motion platform. Position "D" is the user's command data region. Position "E" sets the displacement and orientation of the mobile robot. Position "F" displays the experimental results (success or failure) for the mobile robot. Position "G" displays the status of the mobile robot in real time. Position "H" displays the status of the mobile robot, including the starting information and the success or failure of the search. Position "I" displays the position of the robot in real time, as well as its final position.
Fig. 4. The user interface of the multiple mobile robot system
Position "J" displays the orientation of the robot in real time, as well as its final orientation. Position "K" displays the number of successes and failures of the mobile robot in the test.

The motion platform of the multiple mobile robot system is shown in Fig. 5. The platform is a chessboard-style rectangle, arranged as 7 squares in the horizontal direction and 9 squares in the vertical direction. The user can set the starting position and the goal position on the platform. The mobile robot moves on the platform from the starting position to the goal position, and transmits its status (displacement and orientation) to the supervising computer step by step via the wireless RF interface. The mobile robot measures its displacement and orientation using the encoder module and the compass module. If the mobile robot moves past its goal position, we define the test as a "failure"; if it reaches the goal position, we define the test as a "success". The mobile robot decides its next orientation randomly during the test.
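The success/failure test on the 7 × 9 platform can be sketched as a random walk on a grid. The step limit and the handling of moves that would leave the platform are our assumptions; the article only states that the next orientation is chosen randomly.

```python
import random

COLS, ROWS = 7, 9  # platform size from the article

def run_test(start, goal, max_steps=200, seed=None):
    """Randomly walk the grid; 'success' if the robot reaches the goal."""
    rng = random.Random(seed)
    x, y = start
    for _ in range(max_steps):
        if (x, y) == goal:
            return "success"
        # Pick a random orientation: right, left, up, or down.
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        # Stay on the platform: ignore moves that would leave the grid.
        if 0 <= x + dx < COLS and 0 <= y + dy < ROWS:
            x, y = x + dx, y + dy
    return "failure"
```

Running many such trials from a fixed start/goal pair gives the success and failure counts that the interface displays at position "K".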
4 Experimental results

We first tested the accuracy of the displacement and orientation of the mobile robot. The experimental results for the displacement of the mobile robot are shown in Fig. 6. The starting position of the robot is shown in Fig. 6a. We set the displacement of the mobile robot at 10 cm. The supervising computer calculates the pulse number to be about 54, and
Fig. 5. The motion platform of the multiple mobile robot system
Fig. 6. The displacement test of the mobile robot. a Starting position. b Displacement 10 cm. c Move to 30 cm (continue). d Displacement 30 cm
Fig. 7. The orientation test of the mobile robot. a Starting position. b Turn right 45°. c Starting position. d Turn left 90°
transmits the command to the mobile robot. The mobile robot receives the pulse count from the encoder module and moves to the next position after 54 pulses. The experimental result is shown in Fig. 6b. We then set the displacement at 30 cm. Using Eq. 1, we compute the pulse number to be 162. The mobile robot moves forward through 162 pulses from the encoder module. The experimental results are shown in Fig. 6c and d.

The experimental results for the orientation of the mobile robot are shown in Fig. 7. The starting position of the mobile robot is shown in Fig. 7a. We set the orientation of the mobile robot at 45°. The supervising computer calculates the pulse number to be about 21 on the two driving wheels, and transmits the command to the mobile robot. The mobile robot receives the pulse count from the encoder module and turns to the next orientation after 21 pulses: the right-hand driving wheel moves forward, and the left-hand driving wheel moves backward. The experimental result is shown in Fig. 7b. We then set the orientation at 90°. The mobile robot turns 90° to the left according to the command from the supervising computer. Using Eq. 1, we compute the pulse number to be 42: the right wheel moves forward through 42 pulses, and the left wheel moves backward through 42 pulses. The experimental results are shown in Fig. 7c and d.

Fig. 8. Formation change for multiple mobile robots. a Starting position. b First arrangement. c Second arrangement. d Third arrangement. e Fourth arrangement. f Fifth arrangement. g Sixth arrangement. h Final arrangement

We implemented formation change using the multiple mobile robot system. In the experiment, we used four mobile robots arranged on the four sides of a rectangle. The experimental scenario is shown in Fig. 8a. The supervising computer gives commands to the four mobile robots. Robot 1 turns left and moves one grid space. Robot 2 turns through 180° and moves three grid spaces. Robot 3 turns left and moves two grid spaces. Robot 4 turns left and moves two grid spaces. The experimental results are shown in Fig. 8b and c. Next, the four robots turn right and move four grid spaces (Fig. 8d and e). Then the four robots turn left and move one grid space (Fig. 8f and g). Finally, robots 1 and 2 turn right, and robot 3 turns left. The experimental result is shown in Fig. 8h. The motion program is designed to minimize the total displacement of the four mobile robots during the test.
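The reported turn pulse counts (about 21 for 45° and 42 for 90°) can be reproduced from the pulse geometry if we assume a wheel track of about 9.9 cm between the two driving wheels; both the track width and P = 72 pulses per revolution are our assumptions, chosen to match the numbers reported above.

```python
import math

WHEEL_DIAMETER_CM = 4.25  # given in the article
PULSES_PER_REV = 72       # assumed (see the Eq. 1 discussion)
TRACK_CM = 9.9            # assumed distance between the driving wheels

def turn_pulses(angle_deg):
    """Pulses each wheel must count for an in-place turn of angle_deg.

    During a spin turn (wheels driven in opposite directions), each
    wheel travels an arc of (angle/360) * pi * track, which the wheel
    circumference 4.25 * pi converts to encoder pulses.
    """
    arc_cm = angle_deg / 360.0 * math.pi * TRACK_CM
    return round(arc_cm * PULSES_PER_REV / (WHEEL_DIAMETER_CM * math.pi))

print(turn_pulses(45))  # -> 21
print(turn_pulses(90))  # -> 42
```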
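One way to read "minimize the total displacement" is as an assignment problem: choose which goal cell each robot takes so that the total grid distance travelled is smallest. The brute-force sketch below is our illustration, not the authors' implementation; with only four robots, checking all 4! = 24 assignments is cheap.

```python
from itertools import permutations

def manhattan(a, b):
    """Grid distance between two cells on the platform."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def min_displacement_assignment(starts, goals):
    """Try every robot-to-goal assignment; keep the cheapest total."""
    best_cost, best_order = None, None
    for order in permutations(goals):
        cost = sum(manhattan(s, g) for s, g in zip(starts, order))
        if best_cost is None or cost < best_cost:
            best_cost, best_order = cost, order
    return best_order, best_cost

# Illustrative scenario: four robots at the corners of the 7 x 9 grid
# moving to an arbitrary target formation near the center.
starts = [(0, 0), (6, 0), (0, 8), (6, 8)]
goals = [(3, 3), (3, 5), (2, 4), (4, 4)]
order, cost = min_displacement_assignment(starts, goals)
```

For larger teams, the Hungarian algorithm solves the same assignment in polynomial time, but exhaustive search keeps the sketch simple here.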
5 Conclusion

We have developed a path-planning method for a multiple mobile robot system, and have implemented the scenario on a platform. The system contains a supervising computer, a monitor, a wireless RF interface, a remote supervising computer, an experimental platform, and four mobile robots. Each mobile robot has the shape of a cylinder, with a diameter, height, and weight of 8 cm, 15 cm, and 1.5 kg, respectively, and executes the planned path using two interfaces: a wireless RF interface and a voice interface. The supervising computer controls the multiple mobile robots, and receives their status via the wireless RF interface. Users can program the motion trajectories of the robots on the supervising computer. This article has demonstrated formation exchange using four mobile robots. In the future, we want to develop more applications of the multiple mobile robot system.

Acknowledgment This work was supported by the project "The Development of Multiple Module-Based Smart Robots," under the Industrial Development Bureau of the Ministry of Economic Affairs of Taiwan.
References

1. Lee SO, Cho YJ, et al (2000) A stable target-tracking control for unicycle mobile robots. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000), vol 3, Takamatsu, Japan, October 31 – November 5, pp 1822–1827
2. Parker LE, Emmons BA (1997) Cooperative multi-robot observation of multiple moving targets. IEEE International Conference on Robotics and Automation, vol 3, Grenoble, France, September 7–11, pp 2082–2089
3. Lee MJ, Hwang GH (2008) Object tracking for mobile robot based on intelligent method. Artif Life Robotics 13(1):359–363
4. Kobayashi H, Yanagida M (1995) Moving object detection by an autonomous guard robot. 4th IEEE International Workshop on Robot and Human Communication, Tokyo, Japan, July 5–7, pp 323–326
5. Shimosasa Y, Kanemoto J, et al (2000) Some results of the test operation of a security service system with autonomous guard robots. IEEE International Conference on Industrial Electronics, Control and Instrumentation, vol 1, Nagoya, Japan, October 22–28, pp 405–409
6. Gilbreath GA, Ciccimaro DA (2000) An advanced telereflexive tactical response robot. Proceedings of Workshop 7: Vehicle Teleoperation Interfaces, IEEE International Conference on Robotics and Automation (ICRA 2000), San Francisco, CA
7. Ciccimaro DA, Everett HR (1999) A supervised autonomous security response robot. American Nuclear Society 8th International Topical Meeting on Robotics and Remote Systems (ANS'99), Pittsburgh, pp 25–29
8. Fu YY, Wu CJ, et al (2008) A time-scaling method for near time-optimal control of an omni-directional robot along specified paths. Artif Life Robotics 13(1):350–354
9. Cao Y, et al (1997) Cooperative mobile robotics: antecedents and directions. Auton Robots 4(1):7–27
10. Guzzoni D, et al (1997) Many robots make short work. AI Mag 18(1):55–64
11. Burgard W, Moors M, et al (2005) Coordinated multi-robot exploration. IEEE Trans Robotics 21(3):376–386