Auton Robot (2006) 20:231–238 DOI 10.1007/s10514-006-7100-5
Mobile robot navigation using vision and olfaction to search for a gas/odor source Hiroshi Ishida · Hidenao Tanaka · Haruki Taniguchi · Toyosaka Moriizumi
Published online: 8 June 2006 C Springer Science + Business Media, LLC 2006
Abstract  This paper presents a new approach to searching for a gas/odor source using an autonomous mobile robot. The robot is equipped with a CMOS camera, gas sensors, and airflow sensors. When no gas is present, the robot looks for a salient object in the camera image. The robot approaches any object found in the field of view and checks it with the gas sensors to see if the object is releasing gas. On the other hand, if the robot detects the presence of gas while wandering around the area, it turns toward the direction of the wind that carries the gas. The robot then looks for any visible object in that direction. These navigation strategies are implemented in the robot under the framework of the behavior-based subsumption architecture. Experimental results on the search for a leaking bottle in an indoor environment are presented to demonstrate the validity of the navigation strategies.

Keywords  Gas/odor source localization · Vision · Olfaction · Mobile robot · Behavior-based architecture
H. Ishida · H. Tanaka · H. Taniguchi · T. Moriizumi
Department of Physical Electronics, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo 152-8552, Japan

H. Ishida (corresponding author)
Department of Mechanical Systems Engineering, Tokyo University of Agriculture and Technology, 2-24-16 Nakacho, Koganei, Tokyo 184-8588, Japan

1. Introduction

Over the past decade, the application of chemical sensing technologies on mobile robot platforms has become an active research field in robotics. Olfactory-guided search is one of the most popular types of behavior in the animal kingdom (Dusenbery, 1992). Dogs are famous for their ability to track trace scents. Male moths can follow sexual pheromones over long distances to find their mates. These successful examples have encouraged the development of mobile robots that can detect various gases and localize their sources. The potential applications for such robots include searching for gas leaks, hazardous chemicals, and pollutant sources.

The diffusion of gas molecules into air is generally a slow process, and even a slight airflow has a large impact on the dispersal of the released gas. Therefore, gas molecules released from a source form an aerial trail called a plume (Murlis et al., 1992) extending downwind from the source. Indoor and outdoor airflows are almost always turbulent, and a number of eddies stretch and twist the plume. Tracking the resultant patchy, meandering plume down to its source is not a trivial task.

Research on gas/odor source localization using mobile robots started in the early 1990s. Attempts were first made to search for a source by tracking gas concentration gradients (Sandini et al., 1993) and chemical concentration gradients underwater (Consi et al., 1994). In this approach, a robot is generally equipped with a pair of gas sensors and is programmed to turn toward the side with the higher gas concentration (Consi et al., 1994; Kuwana et al., 1995; Sandini et al., 1993). The major problem, however, is that there is no smooth concentration gradient in a patchy, meandering plume. The detected concentration gradients often fluctuate and mislead the robots. To cope with this problem, some animals have evolved an alternative strategy. A representative example is a male moth following a plume of sexual pheromone released from a conspecific female. The flight of a moth is known to consist of
upwind surges when in contact with the plume and casting to search for the plume when contact is lost (Dusenbery, 1992). To mimic this behavior, airflow sensors were incorporated into the second generation of gas-sensing robots (Hayes et al., 2002; Ishida et al., 1994, 1996; Russell et al., 1995). Experimental results suggest that the wind direction is a reliable source of information for robotic searchers (Grasso and Atema, 2002; Hayes et al., 2002; Ishida et al., 1994, 1996; Russell et al., 1995). However, the successful demonstrations of upwind plume tracking have mostly been given under simplified experimental conditions. Ventilation systems and/or electric fans were often used to create unidirectional wind fields. Gas plumes with well-defined structures were established and, therefore, the robots were able to track the plumes easily with airflow sensors. Despite the success in those artificially created environments, upwind plume tracking is not always feasible with current sensor and robotic technologies in general indoor environments. The velocities of air currents are often too small to detect with airflow sensors on the robots. Temperature variations in the room produce three-dimensional convective airflows (Wandel et al., 2003). Wheeled robots cannot track the three-dimensional plumes formed in such flow fields even if detectable air currents exist.

In this paper, we propose behavior-based navigation strategies for mobile robots to search for a gas/odor source. A leap from purely olfactory-guided navigation is achieved by exploiting visual information as an aid to locate a gas source. Extensive research has been done on computer vision, and a variety of vision systems are ready to use in real-life applications, whereas gas-sensing systems for robots are still in the development phase. The aim of the study presented in this paper is two-fold. Firstly, it is shown that a visual aid is extremely effective in the olfactory search task. A vision system has the potential to raise the performance of a gas-source localization robot beyond the limitations imposed by currently available gas sensors and airflow sensors. Secondly, it is also demonstrated that even a simple gas-sensing system can help in decreasing the cost of the visual signal processing required to accomplish the task. To emphasize the second point, an extremely simple vision system was implemented in the robot. It is shown that the robot can successfully localize a gas source even though the vision system alone is not powerful enough to accomplish the task.

Marques et al. proposed a behavior-based architecture for plume-tracking robots (Marques et al., 2002). However, their aim was to coordinate olfactory-guided search behavior with other behaviors including foraging and obstacle avoidance. In contrast, the emphasis of our behavior-based strategies is laid on the close cooperation between vision and olfaction. As far as we know, cooperation between vision and olfaction in the context of gas-source localization has appeared only once in the literature (Martinez and Perrinet, 2002). In that work, the robot was simply programmed to drive toward a salient visual feature while monitoring the gas sensor outputs. The olfactory signal was only used to stop the robot when a sufficiently high concentration of gas was detected. In contrast, our navigation strategies achieve closer cooperation between vision and olfaction by closed-loop repetition of three behaviors, i.e., looking around for a salient object, sniffing the smell of the object, and turning toward the direction from which a puff of smell comes. Experimental results are presented to show the validity of the proposed algorithms.

2. Experimental setup

2.1. Task description

The task for the mobile robot addressed in this paper is to find a leaking bottle laid on the floor in an unventilated room. Figure 1 shows the experimental environment. A 3.4 × 3.4 m square arena was surrounded by a cardboard wall (62 cm in height) and three real walls of the laboratory room. Although no artificial airflow generator was used, the flow-field measurement using an ultrasonic anemometer (Model 81000, Young) suggested the presence of a stable convective airflow going down along the windows and then traveling over the arena to the right. Since the experiments were done in the wintertime, the air was cooled down at the windows on the left wall. The velocity of the airflow at the center of the arena was 3–6 cm/s.

[Fig. 1: Experimental setup. The arena is bounded by a cardboard wall (62 cm in height) and the laboratory walls; the window on the left wall drives a convective wind across the arena. The gas source, odorless objects, obstacles, the pillar, and the two starting positions are marked on x–y coordinates.]

Figure 2 shows a photograph of the experimental arena. A 12.5 cm long bottle with a diameter of 4.5 cm was laid at the center of the arena as a gas source. Tissue paper soaked with 5 ml of ethanol was placed in the bottle to imitate a leaking bottle. Three empty bottles 10–14 cm long were also laid in the arena to confuse the robot. Distinctive colors
[Fig. 2: Photograph of the experimental arena.]

[Fig. 4: Sample image captured using the onboard camera. The original image was taken in 24-bit color.]
were applied to the gas source and the odorless bottles to simplify the visual detection of those objects. The robot was challenged to declare which object was releasing the gas.

2.2. Robots and sensors

The robot prepared for the experiments is shown in Fig. 3. A CMOS digital camera (EyeCam-1, Joker Robotics) was mounted on the front side of the robot. The field of view is 48 degrees horizontally and 36 degrees vertically, which covers the floor up to 1.5 m in front of the robot. Figure 4 shows a sample image consisting of 80 × 60 pixels. To detect any object on the floor, RGB-to-HSV conversion is performed on the obtained images (Bräunl and Graf, 1999). The hue of each pixel is then compared with the hue of the floor, which has been taught to the robot a priori. The minimum size of an object is set to 20 pixels in order to prevent a small blob on the floor from being detected as a salient object. Since no upper limit is set for the size of an object in the image, even a wall is detected as a salient object on the floor by the current vision algorithm. As shown in Fig. 2, different colors were used for the gas source and the odorless objects so that we could
[Fig. 3: Robot equipped with gas sensors, airflow sensors, and a camera.]
easily identify the gas source in the movie clips during the detailed analyses of the experimental results. It should be noted that any object having a color different from the floor is identified as a salient object by the program implemented in the robot. Therefore, the robot responded to the gas source and the odorless objects in exactly the same way.

As shown in Fig. 3, the robot is equipped with four tin-oxide semiconductor gas sensors (TGS2602, Figaro Engineering). The sensor response, R, is defined as Rgas/Rair, where Rgas and Rair are the sensor resistances in gas and in air, respectively. When the gas concentration increases, the semiconductor gas sensor becomes more conductive and, therefore, R decreases. One of the gas sensors sticks out from the front side of the robot. When the robot approaches an object, this front sensor is used to check its smell. Since all the objects are laid on the floor, the front sensor is mounted close to the floor. The other three sensors are placed on top of the robot to detect gas carried by air currents from a distance. The response time of the gas sensors is a few seconds, whereas the recovery of the sensor response to its initial level takes more than 30 s.

Thermistor airflow sensors (F6201-1, Shibaura Electronics) are used to determine the direction of air currents. Four sensors are placed around a square pillar, following the design of their predecessors (Ishida et al., 1994, 1996). The eight sensor output patterns measured for every 45 degrees of the airflow direction, together with the pattern of equal sensor outputs, are stored in the memory of the onboard microprocessor. The airflow direction is determined by selecting the stored pattern closest to the measured sensor outputs. When the pattern with equal sensor outputs is selected, the airflow velocity is considered too small for reliable directional determination. The response time of the airflow sensors is approximately 0.1 s.
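The nearest-pattern lookup described above can be sketched as follows. This is a minimal illustration only: the template values below are hypothetical placeholders, not the patterns actually stored on the robot, and the use of Euclidean distance is our assumption of what "closest" means.

```python
import math

# Hypothetical stored templates: outputs of the four thermistor airflow
# sensors for eight wind directions (every 45 degrees) plus one
# "equal outputs" pattern meaning the flow is too weak to classify.
# The numbers are illustrative placeholders, not values from the paper.
TEMPLATES = {
      0: (1.0, 0.6, 0.2, 0.6),
     45: (0.8, 0.8, 0.4, 0.4),
     90: (0.6, 1.0, 0.6, 0.2),
    135: (0.4, 0.8, 0.8, 0.4),
    180: (0.2, 0.6, 1.0, 0.6),
    225: (0.4, 0.4, 0.8, 0.8),
    270: (0.6, 0.2, 0.6, 1.0),
    315: (0.8, 0.4, 0.4, 0.8),
    None: (0.6, 0.6, 0.6, 0.6),   # equal outputs -> unreliable direction
}

def airflow_direction(readings):
    """Return the wind direction (degrees) whose stored pattern is closest
    to the measured 4-sensor vector, or None when the 'equal outputs'
    pattern matches best (airflow too small for a reliable reading)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda d: dist(readings, TEMPLATES[d]))
```

With nine stored patterns and four sensors, each classification is a handful of arithmetic operations, which is consistent with the modest onboard processor described below.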
Since the detection limit of the sensor is approximately 5 cm/s, airflow was not always detectable in the arena.

An onboard microprocessor (Eyebot controller M4, Joker Robotics) performs all data acquisition, signal processing, and motor control. The robot speed is set to 5 cm/s considering the slow recovery time of the gas sensor response. The relatively slow locomotion speed is also required for the visual navigation described later, since the image processing speed was approximately one frame per second with the onboard Motorola MC68332 processor.

3. Navigation strategies

Although plumes of hazardous gases are generally invisible, their sources, e.g., open bottles of chemicals, are often visible. Once a potential gas source is detected using a visual sensing system, the accurate direction and distance to the source can be obtained. The direction to the gas source can also be estimated by measuring the direction of the airflow carrying the gas. Unlike the propagation of light, however, puffs of gas may not have traveled along a straight path from the source location. Therefore, the accuracy of the source direction estimated with airflow sensors is, in general, significantly lower than that of the visually estimated source direction. If the target gas source has a distinctive visual feature, it is easier to navigate the robot visually than to attempt olfactory-guided navigation.

On the other hand, an olfactory signal is a direct indication of the presence of a certain chemical substance. The camera can find an object that appears to be a bottle of a hazardous chemical. However, highly sophisticated image recognition is required to discriminate between an empty bottle and an open bottle filled with a hazardous chemical. Proper assessment of hazardousness is difficult even with a state-of-the-art vision system when a bottle labeled "poison" is filled with water or when a bottle of mineral water is actually filled with a toxic chemical solution. When the robot finds a bottle laid on the floor, potential risks can be easily and accurately assessed by bringing a gas sensor close to the bottle.

Although some animals may be using visual cues as an aid in their olfactory search behavior, how they achieve the fusion of different sensory information is still an open question. Taking human behavior into consideration, however, it seems that the following behavioral strategies could be used by a robotic searcher to find a gas source.

• When a salient object is found in the field of view, the searcher approaches it and checks its smell.
• When the searcher smells an airborne scent, it turns to the upwind direction and looks for any visible object.

Here we propose two behavior-based navigation algorithms, shown in Figs. 5 and 6, that achieve the above two behaviors. Each algorithm consists of three or four reflexive behaviors that are activated under certain conditions. A fixed priority level is assigned to each reflexive behavior, as in the subsumption architecture (Brooks, 1986). The scheduler in Figs. 5 and 6 picks the behavior with the highest priority among those activated and transmits its commands to the motors.

[Fig. 5: Algorithm 1, cooperation of vision and olfaction. The camera and gas sensors feed a scheduler that arbitrates among the source checking, visual search, and random search behaviors and drives the motors.]

[Fig. 6: Algorithm 2, cooperation of vision, olfaction, and airflow sensing. The gas sensors, airflow sensors, and camera feed a scheduler that arbitrates among the source checking, olfactory search, visual search, and random search behaviors and drives the motors.]
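The fixed-priority arbitration described above can be sketched in a few lines. This is only a schematic of the subsumption-style scheduler; the behavior names follow Figs. 5 and 6, but the tuple representation and the example activation states are our own illustration.

```python
# Minimal sketch of fixed-priority behavior arbitration (subsumption-style).
def schedule(behaviors):
    """behaviors: list of (priority, is_active, command) tuples.
    Returns the command of the highest-priority active behavior."""
    active = [b for b in behaviors if b[1]]
    return max(active, key=lambda b: b[0])[2]

# Example: random search is always active but has the lowest priority,
# so it is suppressed whenever a higher-priority behavior is activated.
behaviors = [
    (0, True,  "random_search"),    # always active
    (1, True,  "visual_search"),    # an object was seen in the camera image
    (3, False, "source_checking"),  # not yet directly in front of an object
]
print(schedule(behaviors))  # -> visual_search
```

Because each behavior is a self-contained reflex with a static priority, adding the olfactory search behavior of Algorithm 2 amounts to inserting one more tuple between visual search and source checking.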
3.1. Algorithm 1: Cooperation of vision and olfaction

A simple form of cooperation between vision and olfaction is implemented in the navigation algorithm shown in Fig. 5. It consists of three reflexive behaviors as follows.
• Random search behavior: The robot takes arched paths with randomly chosen lengths and curvatures. This behavior is always activated, but it emerges only when no visual or olfactory signal is detected. Since its priority is the lowest, the random search behavior is suppressed whenever any other behavior is activated.

• Visual search behavior: When any object is found in the camera image, the visual search behavior is activated. The direction and distance to the object are estimated from the position of the object in the image, assuming that the floor is flat and that the object lies on the floor (Bräunl and Graf, 1999). The robot drives to the estimated location of the object.

• Source checking behavior: This behavior has the highest priority level. It is activated when the robot has arrived directly in front of the object. The robot stops and measures the response of the front gas sensor for 3 s. If gas with a high concentration exists and the response value goes below 0.4, the search is terminated and the robot declares that a source has been found. Otherwise, the robot backs up by 15 cm (approximately the robot length) and turns to a new direction to avoid going to the same object again. The robot then resumes the random or visual search behavior. The wait time of 3 s was set considering the response time of the gas sensor (a few seconds). In order to avoid declaring an odorless object to be a gas source, the threshold value, 0.4, was set to a level never reached by the accumulation of the gas in the room. The new direction for the search is always chosen from the counterclockwise angles (60°–120°) to avoid pointing in similar directions repeatedly.

3.2. Algorithm 2: Cooperation of vision, olfaction, and airflow sensing

When we smell that something is burning, we turn around and look for a fire or smoke. In the case of human behavior, knowledge of the locations of potential fire origins, e.g., heaters and stoves, might be exploited to choose the turning direction. Although such a priori information is not always available to the robot, the onboard airflow sensors can be used to determine the direction from which the smell is carried. This smell-and-turn behavior is implemented in the navigation strategy shown in Fig. 6. In addition to the three behaviors shown in Fig. 5, the following behavior is added.

• Olfactory search behavior: When gas is detected and the response of any gas sensor goes below 0.8, the robot stops and measures the airflow direction. The airflow measurement is performed five times at intervals of 0.5 s. The measured values usually fluctuate due to the turbulence of the airflow. When more than four out of five measured angle values are the same, the robot makes a turn and proceeds in that direction. If not, the measured airflow direction is considered unreliable and no turn is performed. A single measurement of the airflow direction was found to be completely unreliable. It was experimentally confirmed that the airflow directions obtained by the above procedure were within 45° of the actual directions with a probability of 70%. The threshold value for the gas sensor response, 0.8, was determined so that neither electronic noise nor the accumulation of gas in the room would affect the decision making.

Even when the robot is performing the visual search behavior and proceeding toward a certain object, the robot should stop when gas is detected. Since the object seen in the camera image may not be a gas source, the robot should make a turn when the gas is coming from some other direction. Therefore, the priority level of the olfactory search behavior is set higher than that of the visual search. The olfactory search behavior is activated only when the airflow direction is measured with a certain confidence. When the airflow is too small to detect, the robot tries to locate the gas source using either the visual search or the random search.

4. Experimental results

4.1. Algorithm 1: Cooperation of vision and olfaction

To evaluate the validity of the proposed Algorithms 1 and 2, their performances were compared with that of completely random navigation. The search for a gas source was started from the two starting positions shown in Fig. 1 with two different initial orientations. The experiments were repeated two times for each of those four starting conditions; therefore, a total of eight search trials were conducted for each navigation algorithm. For the random navigation, Algorithm 1 was employed but the visual search behavior was kept deactivated. The robot with the random navigation algorithm therefore wanders around the experimental arena, and the source checking behavior is activated only when the robot happens to collide with the gas source.

The success rates for the three algorithms and the average search times for the successful trials are summarized in Table 1. A trial was counted as a success when the robot declared the termination of the search at the correct location of the gas source within ten minutes. In seven out of eight trials, the robot with the random navigation algorithm was not able to find the gas source within the time allowed. A remarkable improvement over the random navigation was attained by the robot with Algorithm 1. Although the average search time in the successful trials was similar, the success rate was raised to 100% for this set of experiments. A typical path of the robot with Algorithm 1 is shown in Fig. 7. The robot was placed at the starting position with its head pointing in the negative x direction. The random search behavior was activated first. However, the robot drove toward the wall

Table 1  Result summary of search trials

                 Success rate   Average time
Random search    1/8            157 s
Algorithm 1      8/8            166 s
Algorithm 2      7/8            197 s
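The five-sample voting rule of the olfactory search behavior described in Section 3.2 can be sketched as follows. We read the paper's "more than four out of five" as requiring at least four agreeing readings; that threshold, and the use of None to mark readings where the flow was too weak to classify, are our assumptions.

```python
from collections import Counter

def vote_airflow_direction(samples):
    """samples: five airflow-direction readings (degrees, or None when the
    flow was too weak to classify), taken at 0.5 s intervals.
    Returns the agreed direction, or None when no reliable majority exists
    (in which case the robot performs no upwind turn)."""
    counts = Counter(s for s in samples if s is not None)
    if not counts:
        return None
    direction, n = counts.most_common(1)[0]
    return direction if n >= 4 else None

print(vote_airflow_direction([90, 90, 90, 90, 135]))   # -> 90
print(vote_airflow_direction([90, 135, 45, 90, None]))  # -> None
```

The rule trades a fixed 2 s pause (five readings at 0.5 s intervals) for robustness against the turbulent fluctuations that make any single reading unreliable.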
[Fig. 7: Example of the path of the robot with Algorithm 1, plotted on x–y coordinates (0–300 cm). A marks the wall checks, B the odorless bottle, and C the gas source.]
[Fig. 8: Gas sensor responses R (right, center, left, and front sensors) observed during the trial shown in Fig. 7, plotted against time (0–120 s).]

immediately after the start of the search. The color of the floor had been taught to the robot but, intentionally, no information on the color of the wall was given. Therefore, the robot mistook the wall for a salient object laid on the floor. After checking the smell of the wall several times (A in Fig. 7), the robot found an empty bottle (B in Fig. 7). Since no gas was detected from the bottle, the search was continued and the bottle leaking gas was found (C in Fig. 7).

Figure 8 shows the response curves observed during the search trial in Fig. 7. It should be noted that no ethanol gas was detected most of the time during the search trial. The gas sensor response of 0.8 corresponds approximately to an ethanol concentration of 1 ppm. The plume of ethanol vapor was extending to the right side of the gas source in Fig. 7, and the gas was detected only in this limited region. From 70 s to 110 s after the start of the search, practically no difference was observed between the left and right sensors except for those arising from the baseline drift of the gas sensors. Therefore, the robot would not have been able to perform concentration gradient tracking or upwind tracking of the plume from the two starting positions tested here. On arrival at the gas-source location, the front gas sensor exhibited a steep drop in its resistance value since the sensor was brought into the immediate vicinity of the source. The search time was 126 s in the case of Fig. 7.

As shown in Table 1, the false positive detection of potential sources did not affect the search success rate since the smells of the visually detected objects were always checked with the front gas sensor. Although the walls and the odorless objects were often captured in the field of view, the robot never declared them to be the gas source. However, there is still room for improving the search efficiency. The robot was sometimes observed to approach and check the same object over and over. A significant improvement in the search efficiency could be accomplished by memorizing the objects that have already been checked. Sophistication of the visual object recognition algorithm to discriminate bottles from other objects such as walls would also lead to a significantly shorter search time.

4.2. Algorithm 2: Cooperation of vision, olfaction, and airflow sensing
As summarized in Table 1, Algorithm 2 scored a success rate and an average time to reach the source similar to those of Algorithm 1. An example of the path of the robot with Algorithm 2 is shown in Fig. 9. Although the initial orientation of the robot was in the negative y direction, a right turn was made in the random search behavior immediately after the start of the search. The robot then found an empty bottle and approached it (D in Fig. 9). After checking the smell of the bottle, the random search behavior brought the robot into the area downwind from the source location. Since the gas sensors started to respond, the robot turned back to the upwind direction (E in Fig. 9). After making upwind progress several times, the robot successfully found the gas source.

Figure 10 shows the gas sensor responses measured during the trial in Fig. 9. Since the gas source was laid on the floor, the gas sensors at different heights showed significantly different response curves. Ethanol vapor is heavier than air, and the plume trails along the floor. Therefore, the front sensor generally showed a larger resistance drop than the upper sensors. As the gas plume extends in the downwind direction, its horizontal and vertical spans both grow. Therefore, the responses of the upper sensors and the front
[Fig. 9: Example of the path of the robot with Algorithm 2, plotted on x–y coordinates (0–300 cm). D marks the odorless bottle and E the upwind turn downwind of the gas source.]

[Fig. 10: Gas sensor responses R (right, center, left, and front sensors) observed during the trial shown in Fig. 9, plotted against time (0–120 s).]
sensor were relatively similar at the downwind locations (25–85 s in Fig. 10). When the robot approached the gas source, the response of the front sensor exhibited a quick descent, indicating the presence of gas at a high concentration. On the other hand, the responses of the upper sensors started to recover since the vertical extent of the plume in the vicinity of the source location did not reach the upper sensors. Such a difference between the bottom and upper sensor behaviors was exploited in the robolobster (Grasso and Atema, 2002) to judge the proximity to a chemical source. A gas-source localization robot can make use of this sensor behavior to judge whether the targeted object is a gas source before actually reaching it.

The small performance difference between Algorithms 1 and 2 can be attributed to the fact that the olfactory search behavior in Algorithm 2 is activated only when gas is detected. The dots in Fig. 11 show the locations at which the olfactory search behavior was activated. As shown in Fig. 1, the wind was traveling toward the right wall. The activation of the olfactory search behavior occurred mostly in the narrow region downwind from the source, although the olfactory search was sometimes activated outside this region due to fluctuations in the wind direction and the accumulation of the gas in the room. Once the robot entered the narrow plume, it successfully took a direct path to the source location using either the airflow direction or the visual cues. However, the robot spent much of its time wandering outside the plume even in the successful trials, which resulted in the similar scores in Table 1 for Algorithms 1 and 2. A slight increase in the average search time for Algorithm 2 is due to the 2 s pause for measuring the airflow direction (five airflow-sensor readings taken at intervals of 0.5 s). In the trial shown in Fig. 12, for example, the robot spent a long time checking the smells of the walls and the odorless objects until it finally found the gas source. Almost no gas was detected during this 379-s-long trial.
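The proximity cue discussed above, a deep drop of the front (bottom) sensor together with recovery of the upper sensors, could be checked with a simple predicate. This is an illustrative sketch only: the paper reports the qualitative sensor behavior, but the threshold values and function below are our hypothetical additions (only the 0.4 source-declaration threshold appears in the paper).

```python
def near_source(front_R, upper_Rs, front_thresh=0.4, upper_thresh=0.8):
    """Heuristic proximity cue: close to a floor-level source, the front
    (bottom) gas sensor shows a deep response drop (R well below 1) while
    the upper sensors recover toward their baseline because the plume's
    vertical extent near the source does not reach them.
    Threshold values are illustrative assumptions."""
    return front_R < front_thresh and all(r > upper_thresh for r in upper_Rs)

print(near_source(0.3, [0.9, 0.95, 0.85]))  # -> True (likely at the source)
print(near_source(0.3, [0.5, 0.6, 0.7]))    # -> False (still inside the plume)
```

Such a predicate would let the robot judge proximity before physically reaching the object, as the text suggests.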
[Fig. 12: Example of the path of the robot with Algorithm 2 extending widely in the arena, plotted on x–y coordinates (0–300 cm) with the starting position, gas source, and odorless objects marked.]
It should also be noted that upwind turning was actually performed only 46 times, although the olfactory search behavior was activated 76 times. In the remaining cases, a reliable value for the airflow direction was not obtained because the airflow velocity was too small or the airflow direction fluctuated too much. It is obvious that the upwind plume-tracking strategies in the previous papers (Ishida et al., 1994, 1996) are unable to accomplish the exploration task with such a low probability of obtaining the airflow direction. On the other hand, the robot with Algorithm 2 can continue the exploration with visual signals. Once the upwind turning was performed, the robot captured the gas source in its field of view with high probability. The robot was observed to continue its progress toward the gas source even when the airflow direction became unavailable on the way to the source. As shown in Fig. 12, the robot could even find the gas source from the upwind side. The robots in the previous papers would have had to somehow return to the downwind side of the arena in order to find the gas source by upwind tracking of the plume. In case the gas source is not visually detected for some reason, the robot with Algorithm 2 will try plume tracking by steering upwind in the olfactory search behavior whenever a reliable value of the airflow direction is obtained. However, a different criterion should then be used to declare the source without visual signals (Hayes et al., 2002).

5. Conclusions
[Fig. 11: Locations at which the olfactory search behavior was activated during the eight trials with Algorithm 2, plotted on x–y coordinates (0–300 cm) around the gas source.]
Mobile robot navigation using vision and olfaction to search for a gas/odor source was presented. Two navigation strategies were proposed. In the first algorithm, the robot looks for a visible object laid on the floor, and drives to it to check the object with the gas sensor. The olfactory signal plays a more important role in the second algorithm. When gas is detected, the robot turns toward the upwind direction and looks for any visible object in that direction.
In the experiments, the robot was challenged to find a bottle containing ethanol in the presence of empty bottles with similar appearances. The proposed strategies proved to be useful. The previous gas-source localization algorithms were evaluated only in much simpler environments with artificially produced steady airflows. Since those algorithms navigate the robot based only on gas concentration gradients and airflow directions, they are unusable when the gas concentration near the floor is too low to detect or the airflow is undetectably small. Even under such conditions, the robot with the proposed navigation strategies can accomplish the source-localization task with the visual aid.

In this paper, the detection of salient objects in the camera image was performed simply using differences in color. Future work will include the implementation of more sophisticated image processing to achieve robust detection of visual saliency without a priori knowledge. In real-life applications, the robot will encounter a number of objects scattered in the field. The current algorithms spend too much time checking the smell of the same object repeatedly. Therefore, future work should also address the implementation of a mapping function for memorizing the objects that have already been checked. The performance of the robotic searcher will then be investigated thoroughly in experiments that simulate real-life situations more precisely.
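The "mapping function" proposed as future work could be as simple as a position-keyed memory of checked objects. The sketch below is purely illustrative of that idea, not an implementation from the paper; the class name, the 30 cm matching radius, and the coordinate convention are all our assumptions.

```python
import math

# Illustrative sketch of a memory of already-checked objects: a visually
# detected object whose estimated position falls near a memorized one
# can be skipped instead of being sniffed again.
class CheckedObjectMemory:
    def __init__(self, radius_cm=30.0):
        self.radius = radius_cm       # matching radius (arbitrary assumption)
        self.positions = []           # (x, y) of checked objects, in cm

    def remember(self, x, y):
        self.positions.append((x, y))

    def already_checked(self, x, y):
        return any(math.hypot(x - px, y - py) < self.radius
                   for px, py in self.positions)

memory = CheckedObjectMemory()
memory.remember(150.0, 150.0)                 # e.g., an odorless bottle
print(memory.already_checked(160.0, 145.0))   # -> True (within 30 cm)
print(memory.already_checked(250.0, 60.0))    # -> False
```

The matching radius would have to absorb the error of the flat-floor distance estimate, since the same object is seen from different positions on successive encounters.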
References

Bräunl, T. and Graf, B. 1999. Autonomous mobile robots with onboard vision and local intelligence. In Proc. 2nd IEEE Workshop on Perception for Mobile Agents, pp. 51–57.
Brooks, R.A. 1986. A robust layered control system for a mobile robot. IEEE J. of Robot. Automat., 2:14–23.
Consi, T.R., Atema, J., Goudey, C.A., Cho, J., and Chryssostomidis, C. 1994. AUV guidance with chemical signals. In Proc. 1994 Symp. on Autonomous Underwater Vehicle Technology, pp. 450–455.
Dusenbery, D.B. 1992. Sensory Ecology. W.H. Freeman and Company: New York, NY.
Grasso, F.W. and Atema, J. 2002. Integration of flow and chemical sensing for guidance of autonomous marine robots in turbulent flows. Environmental Fluid Mechanics, 2:95–114.
Hayes, A.T., Martinoli, A., and Goodman, R.M. 2002. Distributed odor source localization. IEEE Sensors J., 2:260–271.
Ishida, H., Kagawa, Y., Nakamoto, T., and Moriizumi, T. 1996. Odor-source localization in the clean room by an autonomous mobile sensing system. Sens. Actuators B, 33:115–121.
Ishida, H., Suetsugu, K., Nakamoto, T., and Moriizumi, T. 1994. Study of autonomous mobile sensing system for localization of odor source using gas sensors and anemometric sensors. Sens. Actuators A, 45:153–157.
Kuwana, Y., Shimoyama, I., and Miura, H. 1995. Steering control of a mobile robot using insect antennae. In Proc. IEEE/RSJ Int. Conf. on Intell. Robots Syst., pp. 530–535.
Marques, L., Nunes, U., and de Almeida, A.T. 2002. Olfaction-based mobile robot navigation. Thin Solid Films, 418:51–58.
Martinez, D. and Perrinet, L. 2002. Cooperation between vision and olfaction in a koala robot. In Report on the 2002 Workshop on Neuromorphic Engineering, pp. 51–53.
Murlis, J., Elkinton, J.S., and Cardé, R.T. 1992. Odor plumes and how insects use them. Annu. Rev. Entomol., 37:505–532.
Russell, R.A., Thiel, D., Deveza, R., and Mackay-Sim, A. 1995. A robotic system to locate hazardous chemical leaks. In Proc. IEEE Int. Conf. Robot. Automat., pp. 556–561.
Sandini, G., Lucarini, G., and Varoli, M. 1993. Gradient driven self-organizing systems. In Proc. IEEE/RSJ Int. Conf. on Intell. Robots Syst., pp. 429–432.
Wandel, M., Lilienthal, A., Duckett, T., Weimar, U., and Zell, A. 2003. Gas distribution in unventilated indoor environments inspected by a mobile robot. In Proc. Int. Conf. Adv. Robot., pp. 507–512.