Pattern Anal Applic DOI 10.1007/s10044-015-0501-3
SHORT PAPER
Frequency analysis of a bumblebee (Bombus impatiens) wingbeat

Joaquín Santoyo¹ · Willy Azarcoya¹ · Manuel Valencia¹ · Alfonso Torres² · Joaquín Salas¹

Received: 9 September 2014 / Accepted: 25 June 2015
© Springer-Verlag London 2015
Abstract The wingbeat of an insect relates directly to energy consumption, is a strong indicator of its rate of metabolism and physical structure, and inversely relates to the length of its wing and to the mass of its body. It is also a principal component in understanding the aerodynamic properties of its flight. In this paper, we introduce a method based on the use of high-speed cameras and computer vision techniques to analyze a bumblebee (Bombus impatiens) wingbeat. We start by capturing images with a virtual stereo system when a bumblebee crosses two intersecting laser beams. Then, we detect moving objects using background subtraction. Next, we estimate the wingbeat frequency via Fourier analysis of the observed optical flow contraction/expansion, combined with marginalization over prior knowledge. Finally, the information from the two virtual cameras is fused using robust state estimation. Our system is well prepared to handle occlusions; it works with untethered insects; and it does not require the synchronization of a multi-camera system.

Keywords Wingbeat analysis · Bombus impatiens · Insect flapping frequency · Optical flow analysis
✉ Joaquín Salas
[email protected]

¹ Instituto Politécnico Nacional, Cerro Blanco 141, Colinas del Cimatario, 76090 Querétaro, Mexico
² Koppert México, Circuito El Marqués 82 Nte, Parque Industrial El Marqués, El Marqués, Querétaro, Mexico
1 Introduction

Some insects flap their wings at an amazing speed. The highest reported wingbeat corresponds to the observations of high wing-stroke and thoracic vibration frequencies in some midges (Forcipomyia), found by Sotavalta [26] to exceed 1000 beats/s. It is therefore not surprising that some types of flight, such as hovering, represent the highest rate of sustained aerobic energy expenditure in the animal kingdom. Some years ago, Pringle [20] discovered that for some flying insects muscle contraction is determined by impulses from the central nervous system (synchronous muscle) or by the mechanical resonant characteristics of the wing and thorax (asynchronous muscle). At that point, researchers suspected a strong relationship between the wingbeat frequency and the resonance, or natural, frequency of the wings. But recently, San et al. [22] determined that this is not the case. In any event, the capability to beat their wings at high frequencies provides insects with outstanding maneuverability, as wing instabilities need up to 15 times the wingbeat period to manifest [29]. Indeed, Fry et al. [9] found, after analyzing the magnitude of the torque and body motion of Drosophila during rapid turns, that inertia, and not friction, dominates the flight dynamics of insects. In addition, wingbeat count could prove to be an effective way to distinguish between different species [31]. In this paper, we introduce a method based on the use of high-speed cameras and computer vision techniques to automatically characterize a bumblebee wingbeat. First, we capture images with a virtual stereo system when a bumblebee crosses two intersecting laser beams; then, we detect moving objects by performing background subtraction; next, we estimate the wingbeat frequency by Fourier analysis of the observed contraction/expansion of the optical flow, in conjunction with marginalization based on prior
knowledge; and finally, we make use of robust state estimation to combine information from the two virtual sources of video.
2 Related work

Due to its importance as a principal component in the understanding of flying insects, a large number of studies have been oriented towards the measurement of their wingbeat frequency. For example, Koehler et al. [15] summarized the results obtained in estimating wingbeat frequency for the locust, hoverfly, fruitfly, dronefly, and dragonfly; Casey et al. [3] did the same for several species of euglossine bees, including Eulaema, Eufriesea, Euglossa, and Exaerete. With these measurements, some researchers, such as Deakin [5] and Sudo et al. [28], have developed empirical formulas connecting the wingbeat frequency of insects with their masses and wing areas. Practitioners have used several techniques to measure wingbeat frequency. For instance, Vanderplank [32] used a stroboscope and flash photography to measure the wingbeat of Glossina palpalis tsetse flies. He also attempted to trace the wing tip path on a kymograph drum, although this method has the inconvenience of loading the wing. In addition, Sotavalta [25] explored the flight tone recorded with a microphone linked to an oscilloscope. This method has the difficulty that the microphone must be very close to the insect to avoid sound contamination, which makes it more effective with tethered insects. In [3], Casey and May determined wing-stroke frequency by playing back on an oscilloscope a tape recording obtained at the mouth of a jar where the researchers had placed the bees. In a different approach, Sane and Jacobson [23] measured the airflow of tethered hawk moths, which beat at a frequency of about 30 flaps/s, using hot-wire anemometry. More recently, researchers have been using either optical tachometers or high-speed cameras. Unwin and Ellington [30] utilized an optical tachometer, which they reported capable of measuring wingbeat frequency at distances of 1 m for Smittia aterrima, and up to 10 m for Bombus sp.
The same instrument was used by Roberts and Harrison [21] to measure the wingbeat frequency of bees. The work in [6, 13, 17, 22, 27, 29] is representative of a type of approach in which, even recently, a human counts the wingbeat frequency as observed with a high-speed camera. Usually, a certain position, e.g., the wings in full extension, is used as a reference. Then, the wingbeat count is the result of dividing the image grabbing frequency by the number of frames required for the insect
wings to reach the same relative position. In this approach, it is difficult to determine the inter-flap frequency, and it is common to be off by a few frames, which may result in a poor wingbeat count estimate. This is a strong argument supporting research on automated methods. Indeed, the automated methods include the one developed by Sudo et al. [28], who studied insect wingbeat frequency from an aerobiomechanics perspective. They performed flapping analysis with a multi-camera motion analysis system: they observed tethered insects with an optical displacement detector system focused on the exoskeleton, whose signals they analyzed using a fast Fourier transform (FFT). Also using tethered insects, Graetzel et al. [10] implemented a wingbeat measurement system for Drosophila based on image analysis. First, the wings are extracted by background subtraction. Then, the wing hinge positions are extracted automatically by finding the intersection of the strongest lines on each side of the body. More sophisticated computer vision techniques have also been applied. For instance, Fontaine et al. [8] present a system that fits a model of the wings and body of insects to analyze the flight initiation of Drosophila. Their system takes the input of three high-speed cameras. Then, an integrated 3D model of the Drosophila's head, body, and wings is fit to the observations in every frame of the sequence. In the first frame, the user provides initial locations for the head, tail, and joint/tip locations of both wings in two out of the three camera views. In successive images, the model is registered with respect to the observations. A method that also makes use of multiple cameras to count the wingbeat of bats with marked wings was recently introduced by Bergou et al. [2]. The FFT has also been used in the study of the wingbeat frequency of other animals, mainly bats [1] and birds [4, 18]. Atanbori et al.
[1] used the width and height of bounding boxes as input to the Fourier analysis. For bees, Casey et al. [3] recorded the sound of flapping wings with a directional microphone and their wingbeat with an oscilloscope. The flapping count ranged from as low as 82 flaps/s to as high as 265 flaps/s. Using laser-based techniques, Zeng et al. [33] developed a method to project and then scan lines on the insects. They used an acousto-optic deflector (a laser beam whose direction is modulated with sound waves) to project lines at different angles. The projected lines are recognized, and from them the wingbeat is measured. In their experiments, they tethered bumblebees to a rigid needle. More recently, van Roy et al. [31] introduced the use of a large-area (100 mm²) photovoltaic cell. They applied their method to classify between the bumblebees Bombus terrestris (N = 17) and Bombus ignitus (N = 25). First, their system produces a voltage signal, which is then
evaluated using power spectral density analysis. For our part, we use dense optical flow to measure displacements, conditional marginalization to introduce priors, and robust state estimation to fuse independent visual measurements. Our method is applied to unmarked, free-flying bumblebees as they cross the field of view of an automatically operated high-speed camera.
3 Measuring wingbeat

The method we have developed requires preprocessing and characterizing the image elements, performing frequency analysis, and fusing data from different sources. Figure 2 illustrates our approach to the measurement of bumblebee wingbeat frequency.

3.1 Image description

For the incoming image stream \{I_i(x)\}_{i=1}^{k}, we construct a model of the background B_k(x) using a mixture of Gaussian distributions with the method developed by Zivkovic [35] and Zivkovic and van der Heijden [36, 37], where the number of Gaussian distributions is a free variable for each particular pixel of the image. The moving objects F_k(x), for frame k, are detected by considering the weights of the Gaussians and the portion of the data that is apparently part of the foreground, which does not influence the background model. Holes in the estimated foreground F_k(x) are filled with a morphological dilation

F_k(x) = F_k(x) \oplus S,   (1)

where S is a suitable structuring element [24]. Then, the dense optical flow O_k(x) = \{O_k^x(x), O_k^y(x)\} between successive frames k-1 and k is computed using the algorithm proposed by Farnebäck [7]. This method approximates the flow for pixel neighborhoods with quadratic polynomials and is solved pointwise, using prior knowledge of the displacement field to obtain an iterative solution via multiscale analysis. During a bumblebee's wing stroke, the corresponding optical flow field follows the wing's displacement. When the stroke reaches its extreme positions, the wings remain still for a brief moment and the dispersion of the flow field reduces perceptibly. We model the distribution of displacements resulting from the flow field, observing that its dispersion expands and collapses as the stroke progresses. The direction and magnitude of the field depend on factors such as the particular orientation of the bumblebee, foreshortening, and the distance from the camera to the object. There are many ways to determine statistical dispersion [16]. We find it suitable to model the state of the flow field using the Frobenius norm x(k) of the covariance matrix for each frame. As the bumblebee flaps its wings, this value increases and decreases.

3.2 Frequency analysis

For frequency analysis, we use the short-time Fourier transform

X(m, \omega) = \sum_{n=-\infty}^{\infty} x(n) w(n - m) e^{-j\omega n},   (2)

where we choose w(n - m) to be a Gaussian window, as in [12]. For the rest of the analysis, we use the energy spectral density \|X(m, \omega)\|^2. Once we obtain the frequencies for which the spectrum gives high responses, we search for the one corresponding to the number of flaps per second. Marginalization provides us a clean framework to incorporate prior knowledge. In our work, we are interested in the flapping frequency f = \omega / 2\pi for which the evidence z is most plausible. Thus

p(\omega | z) = p(z | \omega) p(\omega) / p(z).   (3)

In this work, we define the likelihood as

p(z | \omega) = \|X(m, \omega)\|^2 / \sum_{\omega} \|X(m, \omega)\|^2,   (4)

and the prior information as the Gaussian

p(z) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{(z - \mu)^2}{2\sigma^2} \right).   (5)
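The pipeline of Sects. 3.1 and 3.2 can be sketched on synthetic data. The following NumPy snippet is an illustration, not the authors' implementation: the series x(n) stands in for the Frobenius-norm dispersion of the optical flow, and the sequence length, window width, and noise level are our own illustrative choices; only the prior parameters (μ = 180, σ = 6 flaps/s) and the 5000 frames/s capture rate come from the paper.

```python
import numpy as np

fps = 5000           # camera frame rate used in the paper (frames/s)
true_flaps = 180.0   # synthetic wingbeat frequency (flaps/s)
n = np.arange(2048)

# x(n): stand-in for the Frobenius norm of the flow covariance per frame,
# oscillating at the wingbeat frequency plus noise.
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * true_flaps * n / fps) + 0.3 * rng.standard_normal(n.size)

# Gaussian analysis window w(n - m), centered mid-sequence (Eq. 2).
m = n.size // 2
w = np.exp(-0.5 * ((n - m) / 200.0) ** 2)

# Energy spectral density ||X(m, w)||^2 on a one-sided frequency grid.
X = np.fft.rfft(x * w)
freqs = np.fft.rfftfreq(n.size, d=1.0 / fps)   # grid in flaps/s
esd = np.abs(X) ** 2

# Likelihood p(z|w): normalized spectrum (Eq. 4); Gaussian prior with
# mu = 180 and sigma = 6 flaps/s, the values reported in Sect. 4 (Eq. 5).
mu, sigma = 180.0, 6.0
likelihood = esd / esd.sum()
prior = np.exp(-0.5 * ((freqs - mu) / sigma) ** 2)
posterior = likelihood * prior                 # proportional to p(w|z) (Eq. 3)

estimate = freqs[np.argmax(posterior)]
print(round(estimate, 1))
```

Because the prior is evaluated on the same frequency grid as the energy spectral density, spectral peaks far from the expected flapping range are suppressed before the argmax is taken.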
3.3 Data fusion

Our camera setup gives us the advantage of making two measurements x to obtain an estimate of the real flapping speed w. In our case, the measurements come from the frequencies obtained using prior knowledge. A standard way to fuse two sources of information is provided by the Kalman filter [14]. The Kalman filter is an iterative formulation where predictions w^+ about the state w_k are followed by updates based on measurements x_k. In the standard Kalman formulation, there is an uncertainty associated with the transition of the state, expressed by the covariance \Sigma_p, and an uncertainty associated with the observation of measurements under a certain state, expressed by the covariance \Sigma_m. The prediction stage is given by Prince [19] as

w^+ = \mu_p + W w_{k-1}, \quad \Sigma^+ = \Sigma_p + W \Sigma_{k-1} W^T,   (6)

while the update equation is given by

w_k = w^+ + K (x_k - \mu_m - U w^+), \quad \Sigma_k = (I - K U) \Sigma^+.   (7)

Here, the Kalman filter gain is computed as

K = \Sigma^+ U^T (\Sigma_m + U \Sigma^+ U^T)^{-1},   (8)

where W describes the state transition, U the observability of measurements under a certain state, and \mu_p and \mu_m the biases in the expression of the former and the latter, respectively.
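A minimal sketch of the fusion step (Eqs. 6–8) for a scalar flapping-speed state observed by the two virtual cameras. The covariances follow the settings later reported in Sect. 4 (Σ_p = 2, Σ_m = 100 I, W = 1, U = [1, 1]^T, zero biases); the measurement values themselves are hypothetical.

```python
import numpy as np

# One Kalman cycle per frame pair, fusing the two virtual-camera
# frequency estimates (Eqs. 6-8 with zero biases mu_p = mu_m = 0).
W = np.array([[1.0]])            # state transition (Sect. 4)
U = np.array([[1.0], [1.0]])     # both cameras observe the same frequency
Sp = np.array([[2.0]])           # state transition covariance Sigma_p
Sm = 100.0 * np.eye(2)           # measurement covariance Sigma_m

def kalman_step(w_prev, S_prev, x_meas):
    # Prediction (Eq. 6).
    w_plus = W @ w_prev
    S_plus = Sp + W @ S_prev @ W.T
    # Gain (Eq. 8) and update (Eq. 7).
    K = S_plus @ U.T @ np.linalg.inv(Sm + U @ S_plus @ U.T)
    w_k = w_plus + K @ (x_meas - U @ w_plus)
    S_k = (np.eye(1) - K @ U) @ S_plus
    return w_k, S_k

# Hypothetical per-frame frequency measurements from the two views.
w, S = np.array([[180.0]]), np.array([[10.0]])
for x1, x2 in [(176.0, 183.0), (181.0, 179.0), (184.0, 178.0)]:
    w, S = kalman_step(w, S, np.array([[x1], [x2]]))
print(w.item())   # fused estimate stays near the measured frequencies
```

With Σ_m much larger than Σ_p, the gain is small and the fused estimate changes slowly, which smooths the per-frame frequency measurements rather than tracking each one exactly.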
Fig. 1 Elements of the image capturing system. a A high-speed camera system v grabs a sequence of images using first-surface mirrors N_i to simulate a stereo system. We constructed an external electronic trigger system with a pair of intersecting lasers and associated photoreceptors. We released 20 bumblebees inside a fish bowl. In b, we show the external trigger. An electronic circuit triggers the high-speed camera when a bumblebee crosses two intersecting laser beams. The high-speed camera is focused on the intersection of these beams. When a bumblebee crosses the intersection of both lasers, it activates the camera, which starts recording at 5000 frames/s. In c, we show a mechanical support devised to observe bumblebees from two different viewpoints, reducing occlusions. First-surface mirrors are held on bases 1–4, while the camera lens is placed on plate A. The mirrors can move along the beams. In d, we show a typical image resulting from our system

3.4 High-speed camera trigger

An external input signal using a TTL pulse (see Fig. 1b) triggers the operation of the high-speed camera. The trigger system consists of two VLM-650-02-LPA 650 nm, 3 mW laser modules, two SFH314 phototransistors, and a NAND gate. Each laser beam points towards a phototransistor, such that the beams intersect at the camera lens focus. The transistors operate in switch mode, so that when light strikes the sensor, a voltage signal passes between the collector and the emitter. The circuit sends these negated output signals (TTL) to a NAND logic gate. A falling edge activates the external trigger. In this case, the NAND gate provides the desired logic signal to activate the camera, i.e., the camera is activated when both light beams are interrupted by the flight of a bumblebee. A cylindrical cover shields the phototransistors from direct light other than that of the lasers.

3.5 Mechanical support

The double virtual camera system proposed by Zhang et al. [34] inspired our image capturing system. In their design, Zhang et al. project the scene onto two planar reflectors, then through a prism, and finally onto a high-speed camera lens. This resolves the problems of occlusion with a single camera and of synchronization with a multi-camera system. Their vision system measures several wing motion distortions in honey bees, including the angles of wing flapping, lag, attack, and torsion, as well as camber deformation. In our case, we are interested in studying the wingbeat frequency of bumblebees. Figure 1c illustrates the mechanical device that we constructed. Assuming that the reference coordinate system is at the center of projection of the camera, the mechanical platform is attached to the frontal face of the camera through plate A. We attach the mirrors with screws to bases 1–4. The central mirrors have one degree of freedom, along the Z-beam axis. The lateral mirrors have three degrees of freedom: along the Z-beam and X-beam axes, and around their own axis (the vertical symmetry axis of the reflective face of the mirror). The Z-beam can be attached to plate A at three positions along the Y-beam axis (not shown), at heights 10 mm apart, yielding a vertical distance between the center of the camera lens and the base of the central mirrors of 32, 42, or 52 mm. We constructed the device with aluminum 1100 alloy.
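The trigger logic of Sect. 3.4 reduces to a small truth table, sketched below under the assumption of ideal switch-mode phototransistors (the function and variable names are ours):

```python
# Truth-table sketch of the external trigger: each phototransistor
# output is negated and fed to a NAND gate, so the gate output falls
# only when BOTH laser beams are interrupted.
def trigger_output(beam1_lit: bool, beam2_lit: bool) -> bool:
    a = not beam1_lit        # negated phototransistor signal, beam 1
    b = not beam2_lit        # negated phototransistor signal, beam 2
    return not (a and b)     # NAND; a falling edge (True -> False) fires the camera

assert trigger_output(True, True) is True      # idle: both beams lit
assert trigger_output(False, True) is True     # one beam blocked: no trigger
assert trigger_output(False, False) is False   # both blocked: output falls, camera fires
```

Requiring both beams to be interrupted rejects partial crossings, so the camera fires only when the bumblebee is at the beam intersection, where the lens is focused.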
4 Results

To test our system, we set up an experimental platform around a Memrecam GX-1 high-speed camera, with 2 GB of internal memory, grabbing images at 5000 frames/s (see Fig. 1). The camera points to a fish tank of 50 cm × 30 cm × 26 cm (width, depth, and height, respectively). The electronic trigger consists of two intersecting 650 nm red lasers pointing to corresponding light detectors. We focus the camera on the point where the two laser beams intersect. The electronic circuitry relays the signal produced by these sensors to the external trigger system of the camera. When operating in external trigger mode, the camera records images in a loop using the available memory. Once a bumblebee passes through the intersection of the laser beams, the system generates the trigger signal. Upon reception of this signal, the camera is programmed to retain the last 0.4075 s of recorded images. The camera then continues recording until the available memory is exhausted.

For our experiment, we released 200 bumblebees (Bombus impatiens) inside the fish bowl. We eventually recorded 50 videos of flying bumblebees (Fig. 1). After the algorithm detects moving foreground objects, holes are covered with a dilation whose input is a 4 × 4 square structuring element filled with ones, 1_{4×4}. From prior knowledge about the bumblebee wingbeat frequency p(z | \mu, \sigma), the initial values for \mu and \sigma are set to 180 and 6 flaps/s, respectively. For robust state estimation and fusion of the two virtual sources of video, the state transition covariance \Sigma_p is set to 2, the uncertainty in the measurements \Sigma_m is set to 100 I_{2×2}, the state transition W is set to 1, and the observability of the measurements U is set to [1, 1]^T.

To quantify the precision of our method, three of the coauthors of this paper visually counted the number of wingbeats per second by retrieving the frame f_0 when a wing reached a certain position, i.e., the position of its largest span. Then, we registered every frame number f_i of the sequence when the wing reached that position again. From these data, we computed the beats per second by dividing the high-speed camera capture frequency (5000 frames/s for these experiments) by the difference between consecutive landmark observations (f_i - f_{i-1}). Figure 2g shows this comparison qualitatively. We then computed the RMS error between the predicted beat speed and the ground truth for 50 flying-bumblebee video sequences, obtaining an error of 13.6326 flaps/s. To compare this with related work, we note that van Roy et al. [31] provide ranges of uncertainty. Although they use their system with Bombus terrestris and Bombus ignitus, the respective uncertainties in their measurements are 11.215 and 15.062 flaps/s. If we consider that, at 5000 frames/s and one wingbeat every 25 frames, an uncertainty of 8 flaps/s represents about 1 frame of difference with respect to the true value, we may conclude that a principal source of error is the image capture frequency. To foster research in this area and to establish a possible comparison baseline, we are making the videos available, along with the ground truth annotations, on our website (http://imagenes.cicataqro.ipn.mx:8080/imagenes/local/bumblebees).
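The manual ground-truth procedure described above amounts to dividing the capture rate by the gaps between landmark frames. A short sketch with hypothetical landmark frames and predicted values (the real annotations are those published on the website):

```python
import numpy as np

fps = 5000  # capture rate used in the experiments (frames/s)

# Hypothetical frame numbers at which a wing returns to its largest span.
landmarks = np.array([12, 40, 68, 95, 123])

# Beats per second between consecutive landmarks: fps / (f_i - f_{i-1}).
gt = fps / np.diff(landmarks)
print(gt)   # e.g. a 28-frame gap gives about 178.6 flaps/s

# RMS error between (hypothetical) predicted beat speeds and ground truth.
predicted = np.array([176.0, 183.0, 180.0, 181.0])
rms = np.sqrt(np.mean((predicted - gt) ** 2))
print(rms)
```

At 5000 frames/s, a one-frame error in a 28-frame gap shifts the estimate by roughly 6 flaps/s, consistent with the frame-quantization argument above.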
5 Conclusion In this manuscript, we introduce the first method to automatically analyze the wingbeat frequency of free-flying (untethered) bumblebees using computer vision techniques. The method, based on the observation of bumblebees from
Fig. 2 Method to analyze a bumblebee wingbeat frequency. In a, the algorithm extracts moving objects using background subtraction, as shown in b. Then, we compute dense optical flow, as in c, using consecutive frames. Next, we estimate the statistical dispersion of the optical flow displacement using the Frobenius norm, as in d (average displacement in pixels versus frame number). Following this, we use the short-time Fourier transform, as in e, to obtain the candidate wingbeat frequencies (power versus frequency in flaps/s). Next, as in f, we implement a Bayesian scheme to incorporate prior knowledge. Finally, we fuse both sources of information, as in g, using a Kalman filter: the continuous (black) line corresponds to the combination of both sources of information, and the dotted (blue) line corresponds to the manually computed ground truth (color figure online)
two viewpoints, offers some robustness to visual interference, and is primarily based on the analysis of dense optical flow. In addition, the virtual stereo system is constructed with an array of mirrors, which makes this system less expensive than a multi-camera system. In our
approach, we avoid the problem of synchronization of different sources of video, although we need to add a mechanical support and first-surface mirrors to the system. In our approach, there is no need to tether or mark the insect. Since the method uses dense optical flow, it has the
potential to be of general use. Also, the virtual stereo system makes it possible to use robust techniques to fuse diverse sources of information. In the future, we expect to combine this system with other modules to analyze the wing structure, and to incorporate particle image velocimetry to characterize airflow. All these variables will be useful in increasing our understanding of a particular insect's general well-being. The system could also serve as a tool to distinguish between different bumblebee species. Furthermore, it may be interesting to analyze other insects using the same methodology. In addition, it could be possible to use a bumblebee wingbeat model to eliminate the need for a high-speed camera: the computer vision system could generate multiple hypotheses and refine its predictions based on a much sparser set of observations, as in a particle filter [11] approach.

Acknowledgments This work was partially funded by SIP-IPN under contract 20140325, and Fomix CONACYT-GDF under Grant 189085.
References
1. Atanbori J, Cowling P, Murray J, Colston B, Eady P, Hughes D, Nixon I, Dickinson P (2013) Analysis of bat wing beat frequency using Fourier transform. In: Wilson R, Hancock E, Bors A, Smith W (eds) Computer analysis of images and patterns. Springer, Berlin, pp 370–377
2. Bergou A, Swartz S, Breuer K, Taubin G (2011) 3D reconstruction of bat flight kinematics from sparse multiple views. In: IEEE international conference on computer vision workshops, pp 1618–1625
3. Casey T, May M, Morgan K (1985) Flight energetics of euglossine bees in relation to morphology and wing stroke frequency. J Exp Biol 116(1):271–289
4. Cochran W, Bowlin M, Wikelski M (2008) Wingbeat frequency and flap-pause ratio during natural migratory flight in thrushes. Integr Comp Biol 48(1):134–151
5. Deakin M (2010) Formulae for insect wingbeat frequency. J Insect Sci 10(1):1–9. doi:10.1673/031.010.9601
6. Dudley R, Ellington C (1990) Mechanics of forward flight in bumblebees: I. Kinematics and morphology. J Exp Biol 148(1):19–52
7. Farnebäck G (2003) Two-frame motion estimation based on polynomial expansion. In: Bigun J, Gustavsson T (eds) Image analysis. Springer, Berlin, pp 363–370
8. Fontaine E, Zabala F, Dickinson M, Burdick J (2009) Wing and body motion during flight initiation in Drosophila revealed by automated visual tracking. J Exp Biol 212(9):1307–1323
9. Fry S, Sayaman R, Dickinson M (2003) The aerodynamics of free-flight maneuvers in Drosophila. Science 300(5618):495–498
10. Graetzel F, Fry S, Nelson B (2006) A 6000 Hz computer vision system for real-time wing beat analysis of Drosophila. In: International conference on biomedical robotics and biomechatronics, pp 278–283
11. Isard M, Blake A (1998) Condensation: conditional density propagation for visual tracking. Int J Comput Vis 29(1):5–28
12. Jacobsen E, Lyons R (2003) The sliding DFT. IEEE Signal Process Mag 20(2):74–80
13. Jantzen B, Eisner T (2008) Hindwings are unnecessary for flight but essential for execution of normal evasive flight in Lepidoptera. Proc Natl Acad Sci 105(43):16636–16640
14. Kalman R (1960) A new approach to linear filtering and prediction problems. J Basic Eng 82(1):35–45
15. Koehler C, Liang Z, Gaston Z, Wan H, Dong H (2012) 3D reconstruction and analysis of wing deformation in free-flying dragonflies. J Exp Biol 215(17):3018–3027
16. McQuarrie D (2000) Statistical mechanics. University Science Books, Sausalito
17. Mountcastle A, Combes S (2013) Wing flexibility enhances load-lifting capacity in bumblebees. Proc R Soc B Biol Sci 280(1759):1–8. doi:10.1098/rspb.2013.0531
18. Pennycuick C (2001) Speeds and wingbeat frequencies of migrating birds compared with calculated benchmarks. J Exp Biol 204(19):3283–3294
19. Prince S (2012) Computer vision: models, learning, and inference. Cambridge University Press, Cambridge
20. Pringle J (1949) The excitation and contraction of the flight muscles of insects. J Physiol 108(2):226–232
21. Roberts S, Harrison J (1998) Mechanisms of thermoregulation in flying bees. Am Zool 38(3):492–502
22. San N, Truong Q, Goo N, Park H (2013) Relationship between wingbeat frequency and resonant frequency of the wing in insects. Bioinspir Biomim 8(4):046008
23. Sane S, Jacobson N (2006) Induced airflow in flying insects II. Measurement of induced flow. J Exp Biol 209(1):43–56
24. Serra J (1986) Introduction to mathematical morphology. Comput Vis Graph Image Process 35(3):283–305
25. Sotavalta O (1952) Flight-tone and wing-stroke frequency of insects and the dynamics of insect flight. Nature 170(4338):1057–1058
26. Sotavalta O (1953) Recordings of high wing-stroke and thoracic vibration frequency in some midges. Biol Bull 104(3):439–444
27. Steen R (2014) The use of a low cost high speed camera to monitor wingbeat frequency in hummingbirds (Trochilidae). Ardeola 61(1):111–120
28. Sudo S, Tsuyuki K, Kanno K (2005) Wing characteristics and flapping behavior of flying insects. Exp Mech 45(6):550–555
29. Sun M, Xiong Y (2005) Dynamic flight stability of a hovering bumblebee. J Exp Biol 208(3):447–459
30. Unwin D, Ellington C (1979) An optical tachometer for measurement of the wing-beat frequency of free-flying insects. J Exp Biol 82(1):377–378
31. van Roy J, De Baerdemaeker J, Saeys W, De Ketelaere B (2014) Optical identification of bumblebee species: effect of morphology on wingbeat frequency. Comput Electron Agric 109:94–100
32. Vanderplank FL (1950) Air-speed/wing-tip speed ratios of insect flight. Nature 165:806–807. doi:10.1038/165806a0
33. Zeng L, Hao Q, Kawachi K (2000) A scanning projected line method for measuring a beating bumblebee wing. Opt Commun 183(1):37–43
34. Zhang G, Sun J, Chen D, Wang Y (2008) Flapping motion measurement of honeybee bilateral wings using four virtual structured-light sensors. Sens Actuators A Phys 148(1):19–27
35. Zivkovic Z (2004) Improved adaptive Gaussian mixture model for background subtraction. In: International conference on pattern recognition, vol 2, pp 28–31
36. Zivkovic Z, van der Heijden F (2004) Recursive unsupervised learning of finite mixture models. IEEE Trans Pattern Anal Mach Intell 26(5):651–656
37. Zivkovic Z, van der Heijden F (2006) Efficient adaptive density estimation per image pixel for the task of background subtraction. Pattern Recognit Lett 27(7):773–780