CHINESE JOURNAL OF MECHANICAL ENGINEERING Vol. 25, No. 1, 2012
DOI: 10.3901/CJME.2012.01.173, available online at www.springerlink.com; www.cjmenet.com; www.cjmenet.com.cn
Verticality Detection Algorithm Based on Local Image Sharpness Criterion

ZHANG Jin, WANG Zhong*, YE Shenghua, YANG Chun, and LI Lin

State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China

Received December 20, 2010; revised May 12, 2011; accepted May 18, 2011
Abstract: In a high-precision image measurement system, a verticality error between the optical axis of the shooting system and the measured object introduces error into the measurement result. Because micro-part imaging measurement demands high precision and single-shot full-field imaging, the measurement system places a high demand on verticality, and no existing optical-axis verticality detection method can fully meet it. To achieve high-precision adjustment of the system optical axis, a verticality detection algorithm based on local image sharpness is proposed. First, the standard image of the object is divided into fixed regions. Then the object plane is moved from below the focal plane to above it, the sharpness-function value of each standard image region is recorded at every step, and the sharpest position of each region is obtained by fitting. Finally, from the distances between the regions and the height differences between their sharpest positions, the small angle between the optical axis and the measured surface is calculated. Experiments on a given image of a lithography template, with 10 μm as the movement unit, show that the method effectively reduces the small angle between the system optical axis and the measured body in a high-precision image measurement system; the evaluation accuracy is better than 0.1°, meeting the requirements of high-precision measurement. The proposed verticality detection method based on local image sharpness can evaluate the verticality error between the axis of the shooting system and the measured object accurately, effectively and conveniently.

Key words: verticality detection, image definition, optical axis adjustment
1 Introduction
Due to the angle deviation between the axis of the shooting system and the measured object, errors between the measured and the real values enter the two-dimensional image measurement process. These errors can be neglected in a low-accuracy system but significantly reduce the accuracy of a high-accuracy one[1–4]. Because micro-part imaging measurement demands high precision and single-shot full-field imaging, the measurement system places a high demand on verticality. If the axis of the shooting system and the measured object are not perpendicular, two adverse effects follow[5]. One is distortion of the image in the image plane, which degrades image quality; the other is a loss of depth of field in the vertical direction. The first is easier to understand and receives the most attention in the literature. The second does not arise in a zoom-lens system, but in a micro-measurement system it causes measurement error and a serious loss of depth of field[6]. A high-precision measurement system with depth-of-field requirements therefore calls for high verticality. However, the small angle between the axis and the object plane is difficult to detect accurately.

Classic verticality detection methods fall roughly into two categories. The first is alignment measurement, namely clearance measurement, including: (1) using an autocollimator; (2) using a CMM; (3) using diffraction fringes generated by a laser to obtain the spacing. The second is planar image processing, including: (1) shooting a standard circle for inter-comparison; (2) shooting parallel straight lines, then fitting the lines and computing the bias angle. A system developed by LIU, et al[7], achieved high accuracy, but it cannot measure the verticality of the axis. A verticality detection method based on image processing was proposed by GONG, et al[8]; it is convenient, simple and real-time, but it is only suited to ring body surfaces, and its accuracy falls short of the requirement of micro-part object-plane measurement. The alignment method is good for measuring small angles, but it only performs well in distance measurement and in calibrating optical instruments; the accuracy of the image method is not high enough for high-precision measurement of micro-parts. In view of the shortcomings of the methods discussed above, this paper proposes a method that detects verticality by regional sharpness in the course of measuring the escapement heel, a micro-part of the watch.

* Corresponding author. E-mail: [email protected]
This project is supported by Major Science and Technology Funded Project of National High-grade CNC of China (Grant No. 2009ZX04014-092), and Tianjin Municipal Key Natural Science Foundation of China (Grant No. 09JCZDJC26700)
© Chinese Mechanical Engineering Society and Springer-Verlag Berlin Heidelberg 2012
2 Verticality Detection Method Based on Local Image Criteria Differences and Its Mathematical Model

2.1 Influence of an axis that is not vertical to the object plane
Consider, for example, a micro-measurement system required to measure with a relative accuracy of 1/2 048 in single-shot full-field imaging. A verticality error causes measurement error and a serious loss of depth of field: by the imaging principle, the higher the system resolution, the smaller the depth of field. As shown in Fig. 1, AB is the theoretical object surface perpendicular to the optical axis, while AB′ is tilted. BB′ is the Z-direction error caused by the angle deviation, and BT is the change in diameter due to the projection angle bias. The deviation angle θ between the measured body surface and the plane perpendicular to the optical axis of the shooting system causes a Z-axis direction error Δz and a radial error Δx:

Δz = AB sin θ,    (1)

Δx = AB − AB′ cos θ = AB(1 − cos θ).    (2)

Fig. 1. Axis and object plane out of the vertical

Eqs. (1) and (2) show that when the field of the system is 6.9 mm × 6.9 mm and the measuring resolution is 0.003 mm, the allowed angle deviation satisfies 6.9 × (1 − cos θ) < 0.003, i.e., θ < 1.689 6°, and at this angle the loss of depth of field along the Z-axis is 6.9 × sin 1.689 6° = 6.9 × 0.029 485 ≈ 0.203 mm. Such a tiny angle deviation harms the micro-part imaging measurement system, for the actual depth of field is only 0.1 mm while the designed depth is 0.2 mm. The influence is acceptable in an identification system, but not in a micro-part imaging measurement system.

2.2 Theoretical basis of evaluating verticality by sharpness
Traditional verticality detection focuses on analyzing the planar image and overlooks another important feature of verticality: the sharpness differences between image regions. The use of local image sharpness differences to detect verticality is still rare, or unreported, in the literature. When the object moves along the optical axis, each point of the object plane reaches its sharpest position at a different time. In other words, if there is a vertical deviation, the sharpness of different regions is not the same when the image of the measured object is at a fixed position. Therefore, the clarity of the image can be used to evaluate the current focusing, which is the core idea of the method proposed in this paper. Whether the method can detect verticality effectively thus hinges on a precise indicator, or criterion, of sharpness evaluation. With the focal length held constant, the consistency of sharpness across regions can then serve as evidence for evaluating verticality in a precision measurement system. Fig. 2 shows two conditions in which the axis of the shooting system and the measured object are not perpendicular. In these conditions the image sharpness is affected according to the imaging principle, as analyzed with Fig. 3.

Fig. 2. Two bad verticality conditions

Fig. 3. Relationship between object and image of a telecentric lens

The key issue of the proposed method is to choose a criterion for judging image sharpness. From the Newton imaging equation xx′ = ff′, when the object focal length f and the image focal length f′ of an optical system are fixed, the object distance x and the image distance x′ obey a hyperbolic relation and correspond one to one; this is called the conjugate relation of object and image. It is easy to find the image position of object points on the conjugate surface. In Fig. 3 a conjugate relation exists between object point A1 and its image A1′; b and e are the conjugate surfaces of object and image under real imaging. The sensitive surface of the CCD lies on surface e. B2 lies on surface c and images on surface f; the images of objects on surfaces d and f are dispersion circles, so the image of C1 is a dispersion circle[9]. The farther the image is from surface e, the worse the image sharpness. A conclusion can therefore be drawn: only under correct focusing do the image points show the strongest gray-scale contrast; the greater the focusing deviation, the weaker the gray-scale contrast. By this principle, when all points of the plane image sharpest on surface e at the same time, the verticality is currently at its best.

2.3 Calculation of the verticality deviation
An object-side telecentric lens is chosen in this measurement system. If the object is located at b, which conjugates with e as shown in Fig. 3, the image lies on surface e. Suppose the object plane is not perpendicular to the axis, as shown in Fig. 4, and D1D2 is the initial location of the object plane. Move the object from D1D2 to B1B2 and on to C1C2. To ease the explanation, suppose the intersections of the object plane with surface b are B1 and C2, and that A1A2 stands for the position at which the object plane is perpendicular to the axis. When B1 is on surface b, its image is sharp. Then continue moving the object surface toward A1A2. When the object is at the position C1C2, C2 is on surface b and it, too, has a sharp image.
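The bounds of Eqs. (1) and (2) are easy to check numerically. The following sketch (plain Python; the field and resolution values are those quoted in Sec. 2.1) recovers the allowed tilt angle and the corresponding depth-of-field loss:

```python
import math

# Field of view and measuring resolution from Sec. 2.1.
FIELD_MM = 6.9         # field of view, 6.9 mm x 6.9 mm
RESOLUTION_MM = 0.003  # required measuring resolution, mm

# Eq. (2): the radial error AB*(1 - cos(theta)) must stay below the
# resolution, which bounds the allowed tilt angle theta.
theta_max = math.acos(1.0 - RESOLUTION_MM / FIELD_MM)  # radians
print(f"allowed tilt: {math.degrees(theta_max):.4f} deg")  # ~1.6896 deg

# Eq. (1): the same tilt consumes depth of field along Z across the field.
dof_loss_mm = FIELD_MM * math.sin(theta_max)
print(f"depth-of-field loss: {dof_loss_mm:.4f} mm")  # ~0.2034 mm
```

The loss of roughly 0.2 mm exceeds the system's actual 0.1 mm depth of field, which is exactly why the paper deems the tilt harmful.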
Fig. 4. Object and object surface location map

Fig. 4 clearly reflects the geometric relationship as the object moves from position O1 to O2 and then to O3. When the object moves from O2 to O3, in the triangle C2B1C1 the object moving distance is B1C1, which can be read out from the mechanical devices. L1 is the distance between the symmetric selected image regions; here D2D1 stands for the distance between region 2 and region 3. The object tilt angle θ can then be obtained:

D2D1 = (2 048 − 500) ST,    (3)

θ = arcsin(B1C1 / D2D1),    (4)

where 2 048 stands for the camera resolution in pixels, 500 is the width of the selected region shown in Fig. 5 (so L1 corresponds to (2 048 − 500) pixels), ST stands for the pixel equivalent, whose value is 3.419 765 μm, B1C1 is the object movement distance, and θ is the object tilt angle.

2.4 Measurement method
Adjust the camera and grab the image near the focus position. Partition the image into regions and select four symmetric ones; the more edge information the four regions contain, the more exact the result. The selection is shown in Fig. 5; each region is a rectangle of 500 pixels × 700 pixels. Read the gray values of the selected regions from the image cache and choose a sharpness function to calculate the sharpness value.

Fig. 5. Coordinate diagram of the selected local image regions
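The conversion of Eqs. (3) and (4) from stage travel to tilt angle can be sketched as follows (a small illustration using the constants given in the text; the function name is ours):

```python
import math

# Constants from Secs. 2.3-2.4.
SENSOR_PIXELS = 2048       # camera resolution across the field, pixels
REGION_WIDTH = 500         # width of a selected region, pixels
PIXEL_EQUIV_UM = 3.419765  # pixel equivalent S_T, micrometres

def tilt_angle_deg(move_distance_um: float) -> float:
    """Eqs. (3)-(4): tilt angle from the stage travel between the
    best-focus positions of two symmetric regions."""
    d2d1_um = (SENSOR_PIXELS - REGION_WIDTH) * PIXEL_EQUIV_UM  # Eq. (3)
    return math.degrees(math.asin(move_distance_um / d2d1_um)) # Eq. (4)

# One 10 um step of the fine-tuning stage corresponds to roughly 0.108 deg,
# close to the ~0.1 deg evaluation-error bound quoted later in Sec. 3.1.
print(f"{tilt_angle_deg(10):.4f} deg")
```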
Record the values of the sharpness function of the standard image while raising the measured object above the object plane with the fine-tuning device; then fit the values to obtain the best-sharpness position of each region. From the height differences of the best-sharpness positions of the four regions, the micro angle between the optical axis and the measured surface is calculated with Eq. (4). This angle is the basis for adjusting the fine-tuning device. The best verticality is obtained when, in repeated measurements, the four sharpness-function values increase and decrease synchronously. The specific regulation process is shown in Fig. 6.

2.5 Selection of the sharpness function
Sharpness describes the clarity of image borders and of subtle image detail. The commonly used focusing algorithms are the Brenner function, the absolute gradient function, the square gradient function, the Roberts gradient operator function, the Sobel gradient operator function, the Tenengrad function, the effective-pixel gray-value sum function, the effective-pixel count function, the gray entropy function and the maximum gray difference function[10]. An evaluation function that is unbiased and single-peaked, sensitive to defocus, of sufficient signal-to-noise ratio and low computational cost, with a visible trend near the focal plane and high sensitivity, can meet the needs of vertical adjustment. In this paper, we chose the gradient-class
functions, which are mainly used for extracting edge gradient information in image processing. In such functions as the Brenner function, the Tenengrad function and the gradient square function, the computed sharpness grows as the gradient of the edges increases[11]. Experiments were designed for the Brenner function, the Tenengrad function, the gradient square function, and four kinds of amended gradient square functions; early experiments show that amended gradient square function 3 is the most appropriate for this micro-part system[12–15]. The equation of amended gradient square function 3 is

f(I) = Σx Σy {[I(x + 4, y) − I(x, y)]² + [I(x, y + 4) − I(x, y)]²}.    (5)

Fig. 6. Adjustment process flow chart

3 Adjustment Precision Analysis

Two measurement errors, the evaluation error and the mechanical adjustment error, are analyzed below.

3.1 Evaluation error
The evaluation error is caused by the limited distinguishing ability of the sharpness evaluation function. The lens depth of field of the system is 0.2 mm, but because of the error of the mechanical devices and the camera processing error, the actual depth of field is about 0.1 mm. Since the scale of the adjusting system is 10 μm, there may be a distance of less than 10 μm between the micro-mobile platform and the position at which the sharpness function peaks. The maximum deviation caused by this distance is

θ = arcsin(10 μm / D2D1) = 0.107 5° = 6.45′.    (6)

3.2 Mechanical adjustment error
The mechanical adjustment error is caused by the precision error of the micro-mobile platform and the angle error of the system. The scale of the fine-tuning device in this paper is 10 μm/div. From Eq. (6), the maximum deviation is 6.45′.

4 Adjustment Steps and the Experimental Data

4.1 Selecting an appropriate sharpness evaluation function
In this paper we verify the sharpness-evaluating capacity of the gradient functions and adjust the shooting system in the vertical direction with a fine-tuning device whose scale is 10 μm. We chose the Adimec4000m camera and designed the lens to meet the accuracy demand of the escape-wheel measurement process. The shooting object is a series of circles uniformly distributed on the lithography module, illuminated by an OSE red light source. Before shooting begins, we move the micro-mobile platform 0.4 mm away from the sharp position. Then we move it toward the sharp position in 10 μm steps, take a picture of the object at each step and calculate the sharpness evaluation function value. After 40 steps, we select and analyze the peaks of the function for each region. The analysis of the experiments on the seven gradient-class functions shows that the peak of each function generally appears at step 20 or 21, and the peak positions of the functions are close to one another. We can therefore conclude that the gradient-class functions are feasible for accurate focusing. Since amended gradient square function 3 performed best in the experiment, it is chosen for the verticality adjustment.
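As a sketch, amended gradient square function 3 of Eq. (5) might be implemented as follows (NumPy is assumed; the paper gives only the formula, so the array-axis convention here is our own choice):

```python
import numpy as np

def amended_gradient_square_3(img: np.ndarray) -> float:
    """Eq. (5): sum over the region of squared gray-level differences
    taken 4 pixels apart along x and along y."""
    img = img.astype(np.float64)
    dx = img[4:, :] - img[:-4, :]   # I(x + 4, y) - I(x, y)
    dy = img[:, 4:] - img[:, :-4]   # I(x, y + 4) - I(x, y)
    return float((dx ** 2).sum() + (dy ** 2).sum())

# A sharp step edge scores higher than a smooth ramp of the same range,
# matching the gradient-class behaviour described above.
step = np.zeros((20, 20)); step[:, 10:] = 255.0
ramp = np.tile(np.linspace(0.0, 255.0, 20), (20, 1))
assert amended_gradient_square_3(step) > amended_gradient_square_3(ramp)
```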
4.2 Selecting appropriate image regions
The analysis above shows that image edge features are the evidence for judging sharpness: the sharper the edges, the clearer the image. The fundamental principles for selecting an area are therefore that it should contain as much edge information as possible, and that we can distinguish along which direction the sharpness of the area changes. As shown in Fig. 7, the symmetric lithography template graphics are used in the measurement. Fig. 8 shows the four image regions (500 pixels × 700 pixels) selected in the image; each contains the edges of 4 to 5 small circles in theory.
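A possible region selection following these principles might look like the sketch below (the exact corner offsets are an assumption; the paper specifies only the 500 × 700 pixel size and the symmetry of the four regions):

```python
import numpy as np

# Region geometry following Sec. 4.2: four symmetric 500 x 700 pixel
# rectangles, placed here at the frame corners (an assumption).
REGION_W, REGION_H = 500, 700
FRAME = 2048

def four_symmetric_regions(img: np.ndarray):
    """Return four symmetric corner regions of the frame as views."""
    h, w = img.shape
    return [
        img[:REGION_H, :REGION_W],          # top-left     (region 1)
        img[:REGION_H, w - REGION_W:],      # top-right    (region 2)
        img[h - REGION_H:, :REGION_W],      # bottom-left  (region 3)
        img[h - REGION_H:, w - REGION_W:],  # bottom-right (region 4)
    ]

frame = np.zeros((FRAME, FRAME))
for region in four_symmetric_regions(frame):
    assert region.shape == (REGION_H, REGION_W)
```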
Fig. 7. Standard template map

Fig. 8. Calibration area

4.3 Adjustment process
The camera capture function is compiled with VC++ 6.0, and the sharpness-function values of the four regions are displayed in a dialog box. A timer function is set so that the camera captures an image every 1 s, and the image data are stored in a data buffer. The gray values of the pixels of the four regions are read from the buffer and stored in a dynamic array; the sharpness values are then calculated from the array and displayed on the screen. Finally the values in the array are erased to prepare for the next calculation. The step of the micro-mobile platform is 10 μm, and the image becomes sharp and then fuzzy during the adjustment. The data are stored as a set and analyzed to identify the direction of the bias. The mechanical structure is adjusted and the measurement repeated until the sharpness-function values of the four regions reach their peaks at the same time. The verticality is then at its best, and image collection can stop.

4.4 Comparative analysis of the experimental data
The verticality adjustments in the X and Y directions are independent because the camera's area is symmetric. We use the fine-tuning device to change the focal distance while the image approaches sharpness. Table 1 shows the data with a moving step of 20 μm and Table 2 with 10 μm. In Table 1, the image in region 3 is sharp at the 10th position, region 2 at the 26th, region 1 at the 20th and region 4 at the 17th. The vertical deviation is (20 − 17) × 10 μm, the horizontal deviation is (26 − 10) × 10 μm, and the deviation angle is 1.72° by Eq. (4). The system is adjusted and then re-measured. As shown in Table 2, the image in region 2 is sharp at the 8th position; the deviation of the two regions in the vertical direction is (20 − 8) × 10 μm, and the deviation angle is 1.29° by Eq. (4).

Table 1. Experimental data with a moving step of 20 μm (gray value; * marks a peak)

Position  Region 1  Region 2  Region 3  Region 4
   9       7 574     6 487     5 839     6 852
  10       7 777     6 719     5 878*    6 998
  15       8 566     7 912     5 698     7 488
  16       8 653     8 159     5 620     7 529
  17       8 740     8 397     5 524     7 549*
  18       8 784     8 604     5 435     7 546
  19       8 798     8 787     5 327     7 522
  20       8 791     8 978     5 223     7 476
  21       8 751     9 139     5 098     7 406
  25       8 355     9 515     4 607     6 980
  26       8 217     9 532     4 485     6 846
  27       8 050     9 511     4 357     6 084

Table 2. Experimental data with a moving step of 10 μm (gray value)

Position  Region 1  Region 2  Region 3  Region 4
   0       5 061     6 077     3 328     4 462
   2       5 361     6 345     3 527     4 079
   4       5 645     6 542     3 726     4 929
   6       5 917     6 693     3 933     5 139
   8       6 156     6 763     4 157     5 318
  10       6 332     6 745     4 352     5 443
  12       6 448     6 654     4 536     5 518
  14       6 497     6 491     4 699     5 541
  16       6 466     6 269     4 811     5 501
  18       6 355     5 977     4 887     5 390
  20       6 183     5 679     4 892     5 237
  22       5 951     5 359     4 840     5 046

The sharpness-function values of the images in regions 1 and 4 reach their peaks at the 14th position, indicating good verticality in the vertical direction. So only the level knob needs adjusting before re-measuring, until the sharpness-function values of the four regions peak at the same time; the best verticality is then obtained and the regulation is finished.
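The peak-finding and angle computation described above can be sketched with a few rows excerpted from Table 1 (the helper names are ours; position indices are counted in 10 μm units, following the arithmetic in the text):

```python
import math

PIXEL_EQUIV_UM = 3.419765
D2D1_UM = (2048 - 500) * PIXEL_EQUIV_UM  # Eq. (3), ~5 293.8 um

def peak_position(positions, values):
    """Position index at which a region's sharpness function peaks."""
    return max(zip(values, positions))[1]

def deviation_deg(pos_a, pos_b, unit_um=10.0):
    """Eq. (4): angle from the focus-position difference of two regions,
    with positions counted in 10 um units as in Tables 1 and 2."""
    return math.degrees(math.asin(abs(pos_a - pos_b) * unit_um / D2D1_UM))

# Excerpted Table 1 rows for the horizontal pair, regions 2 and 3.
region2 = {9: 6487, 10: 6719, 25: 9515, 26: 9532, 27: 9511}
region3 = {9: 5839, 10: 5878, 25: 4607, 26: 4485, 27: 4357}
p2 = peak_position(list(region2), list(region2.values()))  # position 26
p3 = peak_position(list(region3), list(region3.values()))  # position 10
# ~1.73 deg here versus the paper's reported 1.72 deg (rounding of S_T).
print(p2, p3, f"{deviation_deg(p2, p3):.2f} deg")
```

The same helper reproduces the Table 2 result: `deviation_deg(20, 8)` gives about 1.30°, against the paper's 1.29°.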
5 Conclusions
(1) A method based on image sharpness criteria to detect and correct optical-axis verticality for a high-precision measurement system is proposed.

(2) Experiments show that the method can effectively regulate the small angle between the system optical axis and the measured surface in a high-precision video measuring system; the evaluation error of the method is better than 0.1°, meeting the high-precision measurement requirements.

(3) The proposed method is suitable for detecting the verticality between various object surfaces and the axis, with real-time measurement, easy operation and high precision.

References
[1] MOU Li, YANG Min. Research on algorithm of verticality inspection[J]. Modern Manufacturing Engineering, 2007(3): 101–103. (in Chinese)
[2] ZHAO Jingjing, BAI Ruilin, LI Du. A new method of verticality adjusting between optical axis and object surface of embedded machine vision controller[J]. Optoelectronic Engineering, 2010, 37(5): 63–69. (in Chinese)
[3] GONG Hao, LU Naiguang, LOU Xiaoping. A method of verticality adjusting between optical axis and carrier in two-dimensional vision measurement[J]. Journal of Beijing Institute of Machinery, 2006, 21(3): 35–38. (in Chinese)
[4] LUO Xinxin, LIU Bingqi, SUN Dongping. Improvement of optical axis parallelism correction for laser range finder[J]. Journal of Applied Optics, 2009, 30(3): 519–522. (in Chinese)
[5] ZHANG Junjie. Complete imaging measuring instrument technology research[D]. Tianjin: Tianjin University, 2007. (in Chinese)
[6] YU Taoyin, TAN Hengying. Engineering optics[M]. Beijing: China Machine Press, 2002. (in Chinese)
[7] LIU Hongxing, LIN Yuchi, ZHOU Jingjing. Development of the portable laser interference alignment system[J]. Electronic Measurement Technology, 2007, 8(30): 187–190. (in Chinese)
[8] GONG Hao, LU Naiguang, LOU Xiaoping, et al. A method of verticality adjusting between optical axis and carrier in two-dimensional vision measurement[J]. Journal of Beijing Institute of Machinery, 2006, 21(3): 35–38. (in Chinese)
[9] CHEN Guojin, ZHU Miaofen, WANG Yaka, et al. Study on definition evaluation function based on image contrast variation[C]// Proceedings of the 2007 WSEAS International Conference on Computer Engineering and Applications, Gold Coast, Australia, January 17–19, 2007: 254.
[10] WANG Yong, TAN Yihua, TIAN Jinwen. A new image clarity evaluation function[J]. Journal of Wuhan University of Technology, 2007, 3(29): 124–126. (in Chinese)
[11] YANG Zaihua, LI Yu, LI Qingxiang. Feature extraction based on edge image clarity evaluation function[J]. Computer Engineering and Applications, 2007, 3(20): 33–35.
[12] ZHANG Likun. Construction and adjusting of inspecting and measuring semiautomatic system based on machine vision for escapement[D]. Tianjin: Tianjin University, 2010. (in Chinese)
[13] ZHANG Yujin. Image engineering[M]. Beijing: Tsinghua University Press, 2007.
[14] ZHANG Guangjun. Vision measurement[M]. Beijing: Science Press, 2008.
[15] LYVERS E P, MITCHELL O R. Subpixel measurements using a moment based edge operator[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence (S0162-8828), 1989, 11(12): 1293–1309.
Biographical notes
ZHANG Jin, male, born in 1985, gained his master degree from Tianjin University, China, in 2007. He is now a doctoral candidate at the College of Precision Instrument and Opto-electronics Engineering, Tianjin University, China. His major research directions are data traffic technology, precision measurement and instrument intelligence. E-mail:
[email protected] WANG Zhong, male, born in 1953, gained his master degree from Tianjin University, China, in 1991. Now he is a professor at Tianjin University, China. His major directions of research are precision measurement, instrument intelligence, micro-accessory measurement technology based on full imaging and on-time detection and classification technology in industry. Tel: +86-22-87401582; E-mail:
[email protected]

YE Shenghua, male, born in 1934, is a professor at Tianjin University, China, and an academician of the Chinese Academy of Engineering. He is mainly engaged in machine vision inspection and related research.