Int J Adv Manuf Technol (2009) 45:81–90 DOI 10.1007/s00170-009-1949-3
ORIGINAL ARTICLE
Automated vision inspection in network-based production environment
Yongjin James Kwon & Richard Chiou
Received: 28 July 2008 / Accepted: 26 January 2009 / Published online: 20 February 2009
© Springer-Verlag London Limited 2009
Abstract This paper presents a new, holistic, Internet-based quality control approach. In recent years, a rapidly changing business environment driven by fierce international and domestic competition has pushed companies to focus more on quality issues. A new quality control strategy better suited to information-integrated production environments is a natural byproduct. The new strategy is called EQM, short for e-quality for manufacture, which allows designers located away from the production facilities to monitor, control, and program the quality inspection processes as the product design evolves. EQM also allows the quality data to be integrated within the company's information network for automated quality monitoring and control. Such integration relieves human operators of the laborious, error-prone, and tedious tasks of quality monitoring and sampling, which helps reduce production costs and lowers the chance of shipping defective products to customers.

Keywords Remote quality control · EQM · Network-integrated production environment
Y. J. Kwon (*)
Industrial and Information Systems Engineering, College of Engineering, Ajou University, Suwon 443-749, South Korea
e-mail: [email protected]

R. Chiou
Applied Engineering Technology, Drexel University, 3001 Market St., Suite 100, Philadelphia, PA 19104, USA
1 Introduction

The vision for tomorrow's factory is to integrate design, manufacturing, quality, and business functions within the company's information and knowledge networks through the use of the Internet. Web-based gauging, measurement, inspection, diagnostics, and quality control have become critical components within the framework of e-manufacturing systems and management [1]. Quality itself, more generally, constitutes a competitive strategy to avoid domestic job losses. In 2003, Ford Motor Company claimed to have saved $1 billion through waste elimination after inaugurating a quality control effort in 2000. In 2008, LG Display Co. reported savings of $900 million through manufacturing capacity improvements and the reduction of defects in the production line [2–4]. However, improving product quality requires constant vigilance, state-of-the-art technologies, various forms of automation, and other quality-enhancing techniques [5–10]. Ever since its inception a few decades ago, the Internet has become a ubiquitous tool in every imaginable area of human activity. Different kinds of equipment are now being connected to the Internet for a variety of remote access applications. One such area (i.e., e-manufacturing) integrates every level of the enterprise system, from the factory floor to the suppliers and customers [11]. A prime impetus behind the penetration of the Internet into the manufacturing sector is to improve efficiency and productivity, and subsequently to reduce the time to deploy products to the market. Activities ranging from product design, prototyping, task scheduling, and testing to preventive maintenance, troubleshooting, support, machining, and assembly are seeing a sweeping change in the way they are conducted with the introduction of e-manufacturing techniques. Industrial robots are not an exception to this
phenomenon. As a relative newcomer, the field of Internet-controllable robotics has opened up an immense number of potential applications, limited only by the imagination [12, 13]. Machine vision technologies have kept pace with this development. High-performance machine vision systems have enabled industries to perform visual inspections on products with impeccable accuracy at high repetition rates [14]. A recent advent that encompasses this ongoing trend is e-quality for manufacture (EQM): techniques to remotely monitor and control quality functions through the Internet. Quality control in manufacturing involves the development of control strategies and variation reduction methodologies to ensure that manufactured products meet or exceed certain requirements or specifications [15]. Dimensions that characterize quality include performance, reliability, durability, serviceability, aesthetics, features, and conformance to standards and specifications [16]. EQM allows these quality dimensions to be tested remotely and automatically. With current technology, 100% real-time inspection is possible, which prevents defective components from propagating to the next stage of production, thus leading to a significant cost reduction [17–20]. This is especially necessary when the various functional entities within a company are geographically dispersed. For many global companies, it is not uncommon for design, production, marketing, and other business functions to be located in different nations to capitalize on the unique merits (e.g., low wages, the latest market trends) that each country provides. Under such circumstances, it is crucial to establish real-time communications between the business functions and the decision-making procedures associated with production quality. In EQM, remotely located designers can monitor and update quality control routines and perform corrective actions. It eliminates the need for a human operator to be present on the shop floor and allows for collaborative work among a group of people. It provides
increased flexibility to pull up relevant historical data and compare performance, as well as storage of the current quality inspection data. In tune with this trend, this study investigates a remote quality control system that monitors part quality status using an Internet-based machine vision and robotics system. The system integrates monitoring, diagnosis, communication, and control functions related to quality control data. It provides remotely located users with a live analysis of the quality of products moving on a conveyor belt in a production line. If any design changes arise, quality functions can be instantly updated and applied. As production progresses, quality-related production data are automatically deposited into a centralized company database, from which the necessary business functions are evaluated according to the quality data. The Internet-based machine vision system makes non-contact measurements of key dimensions, and the data are reported back to an application server, which in turn distributes this information to remote users. Based on the measurements and position information, the robot is directed to pick up and stack the objects according to their quality status. Most activities can be conducted off-site, and quality engineers do not have to travel to the distant production facility. EQM takes all quality-related issues into account: the selection of sensors and quality parameters; analog data collection and digitization; data processing and filtering; updates of the production database; automated decision making; notification of operators via e-mail, mobile phone, or PDA in case of anomalies; generation of quality reports; and connectivity to other computerized business functions, such as materials resource planning, supply chain management, and business forecasting. The EQM-related activities are manifested (both visible and controllable) over the company network, presenting an ideal platform for implementing quality control in information-integrated production environments (see Fig. 1).
Fig. 1 E-quality for manufacture conceptualized within the framework of networked production environment
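As a minimal sketch of the automated monitoring loop described above, the following Python fragment checks one vision measurement against a tolerance, logs it to a database, and raises a notification on an anomaly. The table schema, the 30 +/- 0.5 mm tolerance (taken from the test scenario in Section 3), and the notification stub are illustrative assumptions, not the authors' implementation.

import sqlite3

TARGET, TOL = 30.0, 0.5  # nominal dimension and tolerance (mm), per the test scenario

def notify_operator(measurement_mm):
    # Stand-in for the e-mail/mobile-phone notification described in the text.
    print(f"ALERT: dimension {measurement_mm:.2f} mm outside {TARGET}+/-{TOL} mm")

def record_and_check(measurement_mm, conn):
    """Store one inspection result and flag out-of-tolerance parts."""
    in_spec = abs(measurement_mm - TARGET) <= TOL
    conn.execute("INSERT INTO quality_log (dim_mm, in_spec) VALUES (?, ?)",
                 (measurement_mm, int(in_spec)))
    conn.commit()
    if not in_spec:
        notify_operator(measurement_mm)
    return in_spec

conn = sqlite3.connect(":memory:")  # a real deployment would use the company database
conn.execute("CREATE TABLE quality_log (dim_mm REAL, in_spec INTEGER)")
record_and_check(30.72, conn)  # 0.72 mm deviation -> logged and alerted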
Fig. 2 Setup of Internet-controllable equipment as a test bed for EQM
2 System development

Currently, at Drexel University, both Visual Basic and Java-based application programming interfaces (APIs) have been developed for EQM. The constituents include web-controllable robots, machine vision systems, and other devices and sensors, all of which can be accessed and controlled through individual IP addresses. The experimental setup includes a Yamaha YK-250X SCARA robot, an RCX40 robot controller with an on-board Ethernet card, Omron optical sensors, a conveyor belt, LED lights, micro on/off switches, a Festo SmartCube vacuum control valve, network cameras, and a Cognex DVT 540 machine vision system (see Fig. 2). A D-Link DCS-5300 web camera is used for viewing the robot movement. The Yamaha YK-250X SCARA (selective compliance assembly robot arm) robot is specifically configured to have high rigidity along the vertical axis while remaining compliant along horizontal directions in the form of swing-arm motions [21]. This renders the robot particularly suitable for pick-and-place or assembly operations requiring a high degree of accuracy and speed. For instance, the robot's repeatability in the horizontal plane is +/-0.01 mm (+/-0.0004 in.). For part handling, a variable-speed Dorner 6100 conveyor system is connected to the robot's I/O ports in order to synchronize the conveyor with the motion of the robot. The RCX40 controller's on-board Ethernet card connects the controller to the Internet. The communications protocol is TCP/IP (Transmission Control Protocol/Internet Protocol), the standard Internet protocol. The unit conforms to the 10BASE-T specification, and either UTP (unshielded twisted-pair) or STP (shielded twisted-pair) cables can be used. PCs with Internet access can exchange data with the robot controller using Telnet. Once the connection is established, programming
and controlling of the robot can begin remotely. One drawback to this approach is the lack of auditory/visual communication between the robot and remotely situated operators. To properly perceive and understand the workings of remotely located equipment, an adequate form of visual and auditory feedback is required. To counter this problem, the Telnet procedure has been embedded in the Visual Basic code to develop an API, including windows for robot control, machine vision, the web camera, and the online part tracking/inspection interface. The connection between the API and the system was established using Winsock components and various ActiveX controls that communicate through IP addresses. The API not only improves the visualization of robot operations in the form of an intuitive interface, but also provides enhanced controllability to the operators (see Fig. 3).

Fig. 3 The API displays real-time information from the web camera, robot controller, and machine vision system, as well as the inspection results

The camera's built-in pan (left/right) and tilt (up/down) mechanism enables users to change the direction of the camera lens. The camera has an optical zoom of up to five times, so users can zoom in on any specific area of the setup. The camera also offers scan, preset, auto-patrol, and various other monitoring features. Users can listen to sound from the equipment during the experiment, transferred from the camera's built-in microphone. The visual and auditory feedback from the remote experimental setup provides enhanced realism to the users.
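The Telnet exchange described above can be approximated with a few lines of Python; this is a generic sketch, not the authors' Visual Basic/Winsock code. The controller address and the command string are placeholders, and the actual RCX40 command syntax must be taken from the Yamaha documentation.

import socket

ROBOT_IP, TELNET_PORT = "192.168.0.10", 23  # placeholder address; standard Telnet port

def send_command(cmd: str) -> str:
    """Open a Telnet-style TCP session to the controller and send one command."""
    with socket.create_connection((ROBOT_IP, TELNET_PORT), timeout=5) as sock:
        sock.sendall((cmd + "\r\n").encode("ascii"))
        return sock.recv(1024).decode("ascii", errors="replace")

print(send_command("@STATUS"))  # hypothetical command for illustration only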
The robotic arm is composed of two links that are coupled to one another at one end. The resulting movement is in a horizontal plane (X, Y). The end-effector attached to the second link moves along the vertical Z-axis, which also provides the rotational R-axis. Thus, the robot's movement can be mapped by a Cartesian coordinate system. The image plane generated by the camera and the horizontal plane generated by the swing-arm motion of the robot directly over the conveyor are considered non-overlapping (i.e., parallel). Any arbitrary point on the image plane (denoted as $a_i$ and $b_i$ in Fig. 4) needs to be transformed into robot coordinates, where $\mu$ represents a transformation operator:

$$f: P(x_i, y_i) \mapsto \mu P(a_i, b_i) \qquad (1)$$

The transformation effectively converts the image coordinates into the robot's coordinates, so the robot can be guided to the desired locations. By operating on the individual values of $a_i$ and $b_i$ with the scale factors $S_x$ and $S_y$, the functional relationship is defined as:

$$P(x_i, y_i) = \begin{bmatrix} S_x & 0 \\ 0 & S_y \end{bmatrix} \begin{bmatrix} a_i - a_0 \\ b_i - b_0 \end{bmatrix} + \begin{bmatrix} x_r \\ y_r \end{bmatrix} + \varepsilon \qquad (2)$$

where $i = 1, 2, \ldots, n$; $\varepsilon$ = any errors associated with the coordinate mapping; and $x_r, y_r$ = the robot coordinates at the origin of the image plane. Considering the work area as a 2D surface, the scale factors for each axis can be represented as:

$$S_x = \left[ \frac{(x_1 - x_r)^2 + (y_1 - y_r)^2}{(a_1 - a_0)^2 + (b_1 - b_0)^2} \right]^{1/2}, \qquad S_y = \left[ \frac{(x_2 - x_r)^2 + (y_2 - y_r)^2}{(a_2 - a_0)^2 + (b_2 - b_0)^2} \right]^{1/2} \qquad (3)$$
Fig. 4 Coordinate system for machine vision and robot
The error due to lens distortion can be minimized by dividing the region captured by the camera into a number of small regions (an m by n array) and applying separate scaling factors for better accuracy. The robot Cartesian coordinates at every intersection of the grid lines and the coordinate axes are stored, and scale factors for each grid are calculated. Therefore, any point detected within the image plane is scaled with respect to the increment in the grid from the origin. If $P(a_i, b_i)$ is assumed to be any detected point, then $P(x_i, y_i)$ can be calculated as:

$$x_i = x_r + \sum_{n=1}^{p-1} S_{x,n} \lvert a_n - a_{n-1} \rvert + S_{x,p} \lvert a_i - a_{p-1} \rvert + \varepsilon_x,$$
$$y_i = y_r + \sum_{m=1}^{q-1} S_{y,m} \lvert b_m - b_{m-1} \rvert + S_{y,q} \lvert b_i - b_{q-1} \rvert + \varepsilon_y \qquad (4)$$

where $n$ = the number of columns, $m$ = the number of rows, $p$ and $q$ = the number of grids from the origin in which $P(a_i, b_i)$ is located, and $\varepsilon_x$ and $\varepsilon_y$ = the imprecision involved in the scaling. Since the detected object's position can be mapped to robot coordinates, it can subsequently be used to guide the robot to the correct pickup position.
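A one-axis sketch of the grid-wise mapping in Eqs. (2)-(4) follows; the grid boundaries, per-cell scale factors, and robot origin are hypothetical calibration values, and the y-axis is handled analogously.

import bisect

# Hypothetical calibration: column boundaries a_0..a_n (pixels) and the scale
# factor S_x,n (mm/pixel) that applies inside each grid cell, as in Eq. (4).
GRID_A = [0, 160, 320, 480, 640]
SCALE_X = [0.2341, 0.2347, 0.2352, 0.2349]
X_R = 100.0  # robot x coordinate at the image-plane origin (mm)

def pixel_to_robot_x(a_i: float) -> float:
    """Map an image x coordinate to robot x by accumulating per-grid scaling."""
    p = min(max(bisect.bisect_right(GRID_A, a_i) - 1, 0), len(SCALE_X) - 1)
    x = X_R
    for n in range(p):                    # full cells crossed before cell p
        x += SCALE_X[n] * (GRID_A[n + 1] - GRID_A[n])
    return x + SCALE_X[p] * (a_i - GRID_A[p])  # partial distance inside cell p

print(pixel_to_robot_x(400.0))  # robot x for pixel column 400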
3 Applications of EQM

To test the notion of EQM, the following scenario was devised. A pseudo-company, ATech, produces new, customer-ordered parts at a remote production site located in another country. The parts assume two distinctive shapes, a Circle and a Square. For the Circle, its radius of 30 mm is measured; for the Square, two key dimensions (length and width, both 30 mm) are measured. The design specification requires the objects to be machined to a tolerance of +/-0.5 mm, and any object outside this range should be rejected. A quality engineer situated remotely from the production site needs to develop the necessary inspection algorithms and ensure that no defective parts are shipped to the customers. The production rate is high, so manual inspection is not suitable. The machined parts are carried into the vision-based inspection station and, depending on their dimensions, good and bad parts are sorted and stacked accordingly. Inspection and robotic part sorting are performed automatically by the API. Part inspection data are generated by the vision system, transferred to the API, and then stored on the server. Based on the inspection rules, transformed image coordinates are fed back to the robot controller for the sorting operations. In this scenario, the EQM procedure represents the overall functionality of quality assurance, which ties in with other production-related data and parameters for the manufacture of parts. Assuming that the parts are manufactured by a stamping press (i.e., a blanking operation), any gradual increase or decrease in part dimensions indicates a systematic process change (e.g., die wear, irregular stamping force, or machine misalignment). The EQM procedure not only provides 100% automated part inspection, but also analyzes the overall performance and condition of the production environment through sensor-generated data. When part dimensions are projected to reach the tolerance limits, warning algorithms in the EQM procedure automatically notify the operator and other related parties of the impending process halt and dictate the inspection of the necessary hardware. The EQM procedure also encompasses the automatic generation of business messages to the tool and die suppliers for the possible repair or reorder of die sets, inquiring about the soonest service or delivery date, cost, etc. All these functions are designed to maximize production efficiency while ensuring zero-defect manufacture of customer parts.

For the experiment, about 19 pieces of each shape were made, with a few purposely machined out of the tolerance limits. All objects of each type were mixed and fed through the conveyor belt twice. The object thickness was limited to 1 mm, since a greater thickness would cause problems due to the shadowing effect. Prior to the inspection, the vision system was calibrated with a Mitutoyo high-precision gage block set, and the measurement accuracy was verified at a specific focal length. At the current focal length, the image frame occupies the space defined by the boundary of 150.25 mm (h) by 113.64 mm (v), and the corresponding pixel width and height were found to be 0.2347 mm (h) by 0.2367 mm (v). The boundary was measured by positioning the robot end-effector at the four corner points of the image frame and taking readings of the robot coordinates displayed on the robot teach pendant. This procedure is very similar to the standard robot calibration steps using the four corner points of a square, as prescribed by the manufacturer. For the inspection, the vision system is set with the following image parameters: CCD exposure time, 4 ms; digitizing time, 26 ms; image processing and inspection time, 13 ms; no internal trigger; hence, the time from image capture to inspection finish is 43 ms. The image processing and analysis were performed by the algorithms built into the API, so no triggering events associated with the camera hardware were needed. When a moving object is detected, the API automatically tracks and processes the image frames. The intensity of the ambient lighting is sufficient for the camera CCD sensor (with the use of integrated white LEDs), so the exposure time of 4 ms was found to be adequate. The part is carried by the conveyor at a speed of 125 mm/s in accordance with the pre-defined production rate. The part then takes 580 ms to pass through the inspection area set by the current camera field of view. Since the vision system only sends out a very small string of text data, there were no delay problems associated with large data transfers over the Internet.
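The per-part accept/reject rule and the tolerance-trend warning described in this section can be sketched as follows. The 80% warning threshold and the simple linear projection are illustrative heuristics, not the authors' algorithm (requires Python 3.10+ for statistics.covariance).

import statistics

TARGET, TOL = 30.0, 0.5   # design dimension and tolerance (mm)
WARN_FRACTION = 0.8       # warn when the projected drift reaches 80% of TOL

def accept(dim_mm: float) -> bool:
    """Per-part inspection rule: reject anything outside TARGET +/- TOL."""
    return abs(dim_mm - TARGET) <= TOL

def projected_deviation(history_mm: list[float], horizon: int = 10) -> float:
    """Project the mean deviation 'horizon' parts ahead via a linear fit."""
    xs = range(len(history_mm))
    slope = statistics.covariance(list(xs), history_mm) / statistics.variance(xs)
    return statistics.fmean(history_mm) + slope * horizon - TARGET

history = [30.02, 30.07, 30.11, 30.18, 30.22, 30.27]  # gradual die-wear-like drift
if abs(projected_deviation(history)) > WARN_FRACTION * TOL:
    print("WARNING: dimensions projected to approach the tolerance limit")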
Image processing and analysis software packages for the Cognex DVT 540 vision system are configured for the inspection. SoftSensors are pre-defined sets of image analysis algorithms inside the camera. Each type of SoftSensor serves a specific purpose, and the combination of SoftSensors represents the overall result of the inspection. SoftSensors range from pre-defined, very specific inspections to user-defined, programmable tasks. They include: Measurement, Math Tools, Readers, Blob Tools, Template Match, Object Find, Pixel Counting, Segmentation, SmartLink, Script, and Spectrograph. The Script is a programmable tool that can access other SoftSensors for data gathering, data communication, etc. The camera is initially trained to learn the part profile and make measurements on the object. The dark objects provide a near-perfect condition for the camera, in terms of object separation from the background, for the subsequent image processing. Once the objects are isolated and the boundaries defined, other image processing algorithms count the number of pixels within the boundaries and calculate the center location, dimensions, etc. The area of a moving object, delineated by the camera-detected boundaries, is defined as [22]:
$$A_d\,(\mathrm{mm}^2) = \chi \psi \sum_a \sum_b I(a, b) \qquad (5)$$

where $\chi$ = the calibrated pixel size (mm) along the vision X-axis, $\psi$ = the calibrated pixel size (mm) along the vision Y-axis, and

$$I(a, b) = \begin{cases} 1 & \text{if the intensity of the pixel at } (a, b) \geq \text{threshold} \\ 0 & \text{otherwise} \end{cases}$$

For the center of each object:

$$Ctr_x = K^{-1} \sum_{k=1}^{K} \frac{Xe_k + Xs_k}{2}, \qquad Ctr_y = G^{-1} \sum_{g=1}^{G} \frac{Ye_g + Ys_g}{2} \qquad (6)$$
where $K$ and $G$ = the total numbers of pixel rows and columns in the object, respectively; $Xe_k$ = the x coordinate of the leftmost pixel in row $k$; $Xs_k$ = the x coordinate of the rightmost pixel in row $k$; $Ye_g$ = the y coordinate of the bottom pixel in column $g$; and $Ys_g$ = the y coordinate of the top pixel in column $g$. Depending on the moving speed of the conveyor, the API automatically calculates the center location of the moving object and sends the future location of the part center so that the robot suction cup coincides with the moving
part. The moments of the moving object can be used for shape analysis; in this case, the shapes are limited to two types. If the separated object is binary, the central moments of the image can be obtained. The moment values can be regarded as extracted features, and typical moment orders range from two to three. Assume that $f(a, b)$ is an image; then the moment of order $(\alpha + \beta)$, $M_{\alpha\beta}$, is of the form [23–25]:
$$M_{\alpha\beta} = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} a^{\alpha} b^{\beta} f(a, b)\, da\, db \qquad (7)$$
The central moment, $\mu_{\alpha\beta}$, of the image is of the form [23–25]:

$$\mu_{\alpha\beta} = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} (a - \bar{a})^{\alpha} (b - \bar{b})^{\beta} f(a, b)\, da\, db \qquad (8)$$
where $\bar{a} = M_{10}/M_{00}$ and $\bar{b} = M_{01}/M_{00}$. Since $f(a, b)$ represents digitized pixel values of the image, the above equation is equivalent to [23–25]:

$$\mu_{\alpha\beta} = \sum_a \sum_b (a - \bar{a})^{\alpha} (b - \bar{b})^{\beta} f(a, b) \qquad (9)$$
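Equations (5), (6), and (9) amount to simple sums over the binarized image, as in the NumPy sketch below. The threshold is an assumption; the pixel sizes are the calibrated values quoted earlier; and the centroid is computed here from first-order moments, which for a solid object agrees with the row-midpoint formulation of Eq. (6).

import numpy as np

CHI, PSI = 0.2347, 0.2367  # calibrated pixel width/height (mm), from the text

def analyze(gray: np.ndarray, threshold: int = 128):
    """Area (Eq. 5), centroid, and second-order central moments (Eq. 9)."""
    binary = (gray >= threshold).astype(float)    # I(a, b): 1 above threshold
    b_idx, a_idx = np.indices(binary.shape)       # row (b) and column (a) grids
    m00 = binary.sum()
    area_mm2 = CHI * PSI * m00                    # Eq. (5)
    a_bar = (a_idx * binary).sum() / m00          # a-bar = M10 / M00
    b_bar = (b_idx * binary).sum() / m00          # b-bar = M01 / M00
    mu20 = ((a_idx - a_bar) ** 2 * binary).sum()  # central moments, Eq. (9)
    mu02 = ((b_idx - b_bar) ** 2 * binary).sum()
    return area_mm2, (a_bar, b_bar), (mu20, mu02)

img = np.zeros((40, 40), dtype=np.uint8)  # synthetic 40x40 frame
img[10:30, 10:30] = 255                   # bright 20x20 square object
print(analyze(img))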
The camera has a DVT DataLink (DL) component to make a connection with the Visual Basic API. The DVT DL control is used for passing data acquired from the SoftSensors to the API. DataLink is a built-in tool used to send data out of the system and even to receive a limited number of commands from other devices. This tool is product-specific; that is, every product has its own DataLink that can be configured depending on the inspection and the SoftSensors being used. DataLink consists of a number of ASCII strings that are created based on information from the SoftSensors:
.Connect2(strIPAddress as String, iPort as Integer) is the syntax used to connect to the DVT DL control. The DVTSID (DVT Sampled Image Display) ActiveX control provides a link between the DVT camera and an application running on a PC. It is designed to show sampled images from the camera for monitoring of the inspection process. The control can also be used to send a limited set of commands to a DVT camera. The images and commands are transferred over an Ethernet TCP/IP connection. The control takes care of the low-level communications, parsing of data, and displaying of images. Several display options are offered for various image sizes, orientations, and mirroring. In addition, the DVTSID control provides the ability to save images onto the PC. In its connection syntax, remoteHost is the IP address of the DVT system to which the connection is made, and remotePort defaults to 3246.
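A generic receive-and-parse loop for such ASCII DataLink strings might look as follows. The comma-separated label=value format is an assumption for illustration; the actual string layout is whatever the configured SoftSensors emit, and only the default port 3246 is taken from the text.

import socket

DVT_IP, DVT_PORT = "192.168.0.20", 3246  # placeholder camera IP; default port

def read_inspection(sock: socket.socket) -> dict:
    """Read one newline-terminated ASCII string and parse label=value pairs."""
    line = b""
    while not line.endswith(b"\n"):
        chunk = sock.recv(64)
        if not chunk:
            raise ConnectionError("DVT connection closed")
        line += chunk
    # Assumed format, e.g. "radius=29.91,ctr_x=301.2,ctr_y=118.7"
    return {key: float(val) for key, val in
            (field.split("=") for field in line.decode().strip().split(","))}

with socket.create_connection((DVT_IP, DVT_PORT), timeout=5) as s:
    print(read_inspection(s))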
The DVT Manager ActiveX control provides the ability to browse the network to find any DVT systems connected to it. It also includes functions to set DVT systems to diagnostics mode, download firmware, restore system files from the PC to a camera, and back up system files stored in the camera memory. The application starts by connecting to the robot and then to the web camera. After the connection is made with the camera, the remote inspection starts. The conveyor waits until the optical sensors detect the presence of a part. When an object is detected, the conveyor is automatically activated, and the object is carried into the camera field of view. The camera sees the object, and the measurements of the object are recorded in the string within the DL control of the camera. From there, the text data are passed on to the API. Depending on the inspection results, the robot sorts the object into "C_Good" or "C_Bad" for the Circle and "S_Good" or "S_Bad" for the Square. The API determines whether an object is compliant by checking its dimensions against the tolerance specifications. If it is compliant, it is stacked onto the other good objects; otherwise, it is placed in the bad stack. This process continues until the last part, or until a user stops the process. Figures 5, 6, and 7 show the comparison between the vision measurement data and the manually gauged data. The accuracy of the vision-based measurement has a direct impact on whether good parts are separated from the bad, which affects the performance, cost, and efficiency of the production line. To ascertain the accuracy, the parts were post-inspected with a vernier caliper and compared with the vision-generated measurement data. The comparison shows that the average errors are 3.1%, 3.3%, and 2.5% for the radius, length, and width, respectively. The detailed data are provided in Table 1. The errors are small and, other than a few data points, the two measurements coincide well. However, it is necessary to ascertain that the discrepancy between the two measurements has no systematic correlation.
Fig. 6 Comparison data for length
In other words, the discrepancy should be centered around a mean of zero with no signs of systematic fluctuation (i.e., no autocorrelation). The discrepancy model between the two measurements is represented as [26]:

$$d_t = f(t, \nu) + \varepsilon_t \qquad (10)$$
where $\nu$ is the vector of discrepancies and $\varepsilon_t$ represents the uncorrelated errors between successive discrepancies. Assuming that the discrepancy model is nearly constant, Eq. (10) can be expressed as:

$$d_t = \nu_0 + \varepsilon_t \qquad (11)$$
where $\nu_0$, the mean discrepancy, is considered to be approximately zero. Here, the error terms are considered as [26]:

$$\varepsilon_t = \phi \varepsilon_{t-1} + \psi_t \qquad (12)$$
where $\varepsilon_t$ is the error term in the discrepancy model at time period $t$, $\psi_t$ is an $NID(0, \sigma_\psi^2)$ random variable, and $\phi$ is a parameter that defines the relationship between successive values of the error terms, $\varepsilon_t$ and $\varepsilon_{t-1}$. It is required that $|\phi| < 1$; hence, the discrepancy in time period $t$ is a fraction of the discrepancy that appeared in the immediately preceding period plus a normally and independently distributed random disturbance that is unique to the current period [26].
Fig. 5 Comparison data for radius
Fig. 7 Comparison data for width
Table 1 Comparison of data and errors

      Vision                  Manual                  Error (%)
No.   Radius Length Width     Radius Length Width     Radius Length Width
1     29.76  30.08  30.47     29.84  30.00  30.00     0.3    0.3    1.6
2     29.91  30.27  30.46     29.85  29.97  29.98     0.2    1.0    1.6
3     29.92  28.93  29.00     29.82  30.00  29.97     0.3    3.6    3.2
4     27.55  30.24  30.01     27.83  29.99  29.91     1.0    0.8    0.3
5     27.74  28.31  28.52     27.84  30.02  30.04     0.3    5.7    5.1
6     32.18  29.99  30.29     31.83  29.98  29.90     1.1    0.0    1.3
7     29.94  30.06  30.00     31.83  30.18  28.99     5.9    0.4    3.5
8     29.89  30.34  30.41     27.89  29.89  29.97     7.2    1.5    1.5
9     27.86  30.19  30.24     27.84  30.00  30.00     0.1    0.6    0.8
10    29.65  30.72  30.56     27.83  28.99  29.94     6.5    6.0    2.1
11    27.00  30.09  29.91     29.82  31.95  32.01     9.4    5.8    6.6
12    25.89  30.24  30.31     27.84  31.96  31.94     7.0    5.4    5.1
13    29.37  29.44  29.72     29.85  31.92  32.00     1.6    7.8    7.1
14    31.73  31.69  31.55     29.83  32.03  31.97     6.4    1.1    1.3
15    26.68  32.53  32.73     31.83  31.98  31.93     16.2   1.7    2.5
16    27.31  32.28  32.07     31.81  37.91  31.80     14.1   14.9   0.8
17    29.65  32.05  31.89     27.87  30.06  30.04     6.4    6.6    6.2
18    28.33  30.76  30.42     29.83  27.90  27.77     5.0    10.3   9.5
19    31.69  28.48  28.31     31.82  27.99  28.04     0.4    1.8    1.0
20    30.03  30.09  29.28     29.84  30.00  30.00     0.7    0.3    2.4
21    30.09  30.87  30.06     29.85  29.97  29.98     0.8    3.0    0.3
22    28.80  30.62  29.17     29.82  30.00  29.97     3.4    2.1    2.7
23    27.99  30.84  30.10     27.83  29.99  29.91     0.6    2.8    0.6
24    27.98  30.85  30.53     27.84  30.02  30.04     0.5    2.8    1.6
25    32.07  30.47  30.06     31.83  29.98  29.90     0.8    1.6    0.5
26    31.98  30.06  29.98     31.83  30.18  28.99     0.5    0.4    3.4
27    28.03  30.04  30.42     27.89  29.89  29.97     0.5    0.5    1.5
28    28.04  30.19  30.02     27.84  30.00  30.00     0.7    0.6    0.1
29    27.99  27.72  30.56     27.83  28.99  29.94     0.6    4.4    2.1
30    29.83  32.54  32.70     29.82  31.95  32.01     0.1    1.8    2.2
31    30.02  32.40  32.41     27.84  31.96  31.94     7.8    1.4    1.5
32    29.67  32.82  31.62     29.85  31.92  32.00     0.6    2.8    1.2
33    30.03  32.88  32.43     29.83  32.03  31.97     0.7    2.7    1.4
34    30.35  32.93  31.65     31.83  31.98  31.93     4.6    3.0    0.9
35    31.83  31.93  32.65     31.81  37.91  31.80     0.1    15.8   2.7
36    28.73  30.52  30.07     27.87  30.06  30.04     3.1    1.5    0.1
37    29.53  28.20  27.99     29.83  27.90  27.77     1.0    1.1    0.8
38    32.06  28.50  25.38     31.82  27.99  28.04     0.8    1.8    9.5
Therefore, the discrepancy model can be represented as:

$$d_t = \nu_0 + \varepsilon_t = \nu_0 + \phi \varepsilon_{t-1} + \psi_t = \nu_0 + \sum_{j=0}^{\infty} \phi^j \psi_{t-j} \qquad (13)$$

This means that the discrepancy for period $t$ is just a linear combination of all the current and previous realizations of the $NID(0, \sigma_\psi^2)$ random variable $\psi_t$. Furthermore, it is reasonable to express [26]:

$$E(\varepsilon_t) = 0, \quad \mathrm{Var}(\varepsilon_t) = \sigma^2 = \sigma_\psi^2 (1 - \phi^2)^{-1}, \quad \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-j}) = \phi^j \sigma_\psi^2 (1 - \phi^2)^{-1} \qquad (14)$$
Therefore, the autocorrelation between two discrepancies that are one period apart is represented as [26]:

$$\rho_1 = \frac{\mathrm{Cov}(\varepsilon_t, \varepsilon_{t-1})}{\sqrt{\mathrm{Var}(\varepsilon_t)} \sqrt{\mathrm{Var}(\varepsilon_t)}} = \frac{\phi \sigma_\psi^2 (1 - \phi^2)^{-1}}{\sqrt{\sigma_\psi^2 (1 - \phi^2)^{-1}} \sqrt{\sigma_\psi^2 (1 - \phi^2)^{-1}}} = \phi \qquad (15)$$
Figure 8 shows the autocorrelation graph; all the data points (except one) lie within the 5% significance limits, meaning that there is no autocorrelation and the discrepancy between the vision and manual gauging is only chance-based. For the data point outside the limits, it is easily reasoned that a large discrepancy stems from variations such as the position of the part within the camera field of view, the associated lens distortion effects, lighting variations, signal noise, etc. When a part is placed near the corner of the camera field of view, the lens distortion effect is magnified unless a telecentric lens is employed. The parts are randomly placed on the conveyor; hence, the vision system produces slightly different results compared to the initial testing stage, when parts were positioned in the center of the camera field of view.
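The check in Eqs. (10)-(15) and Fig. 8 can be reproduced with a short script; the +/-1.96/sqrt(N) band is the usual 5% significance limit under a white-noise null, and the discrepancy values below are illustrative, not the paper's data.

import numpy as np

def acf(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Sample autocorrelation r_j of the discrepancy series for lags 1..max_lag."""
    x = x - x.mean()
    denom = (x ** 2).sum()
    return np.array([(x[:-j] * x[j:]).sum() / denom for j in range(1, max_lag + 1)])

# Discrepancies d_t = vision - manual (mm); illustrative values only.
d = np.array([-0.08, 0.06, -0.04, 0.09, -0.02, 0.05, -0.07, 0.03, 0.01, -0.05,
              0.04, -0.03, 0.08, -0.06, 0.02, 0.07, -0.01, -0.09, 0.05, 0.03])
limit = 1.96 / np.sqrt(len(d))  # 5% significance band for white noise
print(np.abs(acf(d, max_lag=10)) > limit)  # True flags a significant lag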
4 Conclusion

Newly developed automated production and measuring instrumentation enables real-time inspection in which critical dimensions are constantly measured and verified while parts are being produced [27, 28]. Industry has recognized the current and future demand for Internet-controllable production facilities, so many newer products are introduced with Internet options for data transfer and remote monitoring functionality. However, one downside is that integrating many different hardware
platforms requires the development of communication, control, and data exchange algorithms, which may entail substantial cost and time for the companies. It should also be noted that the part shapes were kept simple (i.e., a square and a circle) in order to effectively test the system's performance. Overlapping parts or highly complex part features demand more sophisticated algorithms to detect the parts, separate them from the background, find their center locations, and guide the robot to sort them. For instance, part heights greater than 1 mm also demand a new lighting apparatus, in which a ring light floods the part sides to eliminate the shadow effects. Increasing the part moving speed entails faster vision processors, multiple lighting stations, and a dedicated vision workcell, all of which require a higher capital investment than the current setup. However, performance enhancement is not the main theme of the study. Rather, this study focuses on the feasibility of the remote, sensor-based quality monitoring and control scheme that is becoming a crucial element of the contemporary enterprise system. Overall, this work successfully demonstrates the notion of EQM through the implementation of an Internet-based quality control system. Various image processing and analysis algorithms have been integrated with the robot for remote quality inspection. Depending on the quality check results, the robot integrated with the vision system automatically picks up and sorts the parts. This approach confers an immediate cost reduction advantage by terminating the processing of defective parts at any stage of the manufacturing process. More importantly, remote accessibility and the ability to control equipment over the Internet offer novel benefits. Designers at remote locations can carry out inspections and quality checks even as processes associated with both design and manufacturability evolve. During the production stages, operators can remotely adjust inspection routines in the event of process changes, such as the introduction of new features, part size variations, or new quality criteria. This approach eliminates the need for a human operator to be present on the shop floor and allows for collaborative work among a group of people located in various places. In addition to 100% real-time inspection, this approach also provides increased flexibility, such as automatically pulling up relevant historical data and comparing current quality performance. Such a setting is expected to save production costs while improving overall product quality.
Fig. 8 Autocorrelation graph for discrepancy (with 5% significance limits for the autocorrelations)
Acknowledgments This work was supported by the U.S. National Science Foundation (CCLI Phase II DUE-0618665), the U.S. Dept. of Education (Award # P116B060122) and Yamaha Robotics Company. This work was also supported by the 2007 Ajou University Faculty Start-up Funding for Research and Development. The authors wish to express sincere gratitude for their financial support.
References

1. Waurzyniak P (2001) Moving toward the e-factory: manufacturing industry takes first steps toward implementing collaborative e-manufacturing systems. SME Manuf Eng 43–60
2. Soderborg RN (2004) Design for six sigma at Ford. Six Sigma Forum Magazine 15–22
3. Making the economic case for quality (2004) American Society for Quality, white paper
4. Chosun.com Digital Biz, March 28 (2008) URL: http://news.chosun.com/svc/list_in/list.html?catid=1O
5. Manufacturing in America: a comprehensive strategy to address the challenges to U.S. manufacturers (2004) Washington, DC, USA
6. Fabtech 2001 preview: Ethernet simplifies press cell monitoring and diagnostics (2001) SME Forming Fabricating 8:47–51
7. Zayia D (2004) Probing technology moves ahead. SME Manuf Eng 117–119
8. Automation technology: robotic automation can cut costs (2005) SME Manuf Eng 65–72
9. Measurement and inspection: vision sensors for gaging and high-precision machine vision sensor technologies aid in-process inspection (2005) SME Manuf Eng 77–80
10. More automation, less manpower (2005) SME Manuf Eng 85–108
11. Joao J, Ferreira P (2004) E-manufacturing: business paradigms and supporting technologies. Kluwer, Dordrecht, Netherlands
12. Yang Y, Zhang X, Liu F, Xie Q (2005) An Internet-based product customization system for CIM. Robot Comput-Integr Manuf 21(2):109–118. doi:10.1016/j.rcim.2004.06.002
13. Wang L, Xi F, Zhang D (2006) A parallel robotic attachment and its remote manipulation. Robot Comput-Integr Manuf 22(5–6):515–525. doi:10.1016/j.rcim.2005.11.012
14. Mak KL, Peng P (2008) An automated inspection system for textile fabrics based on Gabor filters. Robot Comput-Integr Manuf 24(2):359–369. doi:10.1016/j.rcim.2007.02.019
15. Lee G, Mou J, Shen Y (1997) An analytical assessment of measurement uncertainty in precision inspection and machine calibration. Int J Mach Tools Manuf 37(3):263–276. doi:10.1016/S0890-6955(96)00061-2
16. Montgomery DC (2005) Introduction to statistical quality control, 5th edn. Wiley, New York, NY, USA
17. Ooi TH, Kumar K (1988) Real-time monitoring of test and quality control management. Comput-Aided Eng J 5(3):132–136
18. Lo C-H, Yuan J, Ni J (1995) An application of real-time error compensation on a turning center. Int J Mach Tools Manuf 35(12):1669–1682. doi:10.1016/0890-6955(95)97296-C
19. Ghasempoor A, Jeswiet J, Moore TN (1999) Real time implementation of on-line tool condition monitoring in turning. Int J Mach Tools Manuf 39(12):1883–1902. doi:10.1016/S0890-6955(99)00035-8
20. Lei WT, Hsu YY (2003) Accuracy enhancement of five-axis CNC machines through real-time error compensation. Int J Mach Tools Manuf 43(9):871–877. doi:10.1016/S0890-6955(03)00089-0
21. Groover MP (2001) Automation, production systems and computer integrated manufacturing, 2nd edn. Prentice Hall, NJ, USA
22. Wilson WJ, William Hulls CC, Janabi-Sharifi F (2000) Robust image processing and position-based visual servoing. In: Vincze M, Hager GD (eds) Robust vision for vision-based control of motion, chapter 13. IEEE Press, New York, NY, USA
23. Gonzalez RC, Woods RE (2001) Digital image processing. Prentice Hall, New Jersey, USA
24. Teh C-H, Chin RT (1988) On image analysis by the method of moments. IEEE Trans Pattern Anal Mach Intell 10:496–513
25. Nagarajan R, Yaacob S, Pandian P, Karthigayan M, Amin SHJ, Khalid M (2007) A real time marking inspection scheme for semiconductor industries. Int J Adv Manuf Technol 34(9–10):926–932. doi:10.1007/s00170-006-0669-1
26. Montgomery DC, Jennings CL, Kulahci M (2008) Introduction to time series analysis and forecasting. Wiley, Hoboken, NJ, USA
27. Lei WT, Hsu YY (2003) Accuracy enhancement of five-axis CNC machines through real-time error compensation. Int J Mach Tools Manuf 43(9):871–877. doi:10.1016/S0890-6955(03)00089-0
28. van Wieringen WN, van den Heuvel ER (2005) A comparison of methods for the evaluation of binary measurement systems. Qual Eng 17(4):495–507. doi:10.1080/08982110500225562