2) The camera can be calibrated, and the camera's internal and external parameters can be obtained. Camera calibration is a key and difficult problem in the machine vision method, and the calibration precision will influence the final measurement precision [21], [22].

3) When the camera is located at the focusing positions relative to the measured feature points based on the aforementioned monocular vision method, a single camera is used to perform stereo measurement by using the two images of the two areas that contain the same measured feature point. The two images are shot at the left and right positions, which are reached by the transversal displacement of the CCD camera and are at the focusing positions relative to the same measured feature point. The 2-D image coordinates of the intersection points of the cross-cutting lines in the left and right images can be calculated in the computer image coordinate system, and the calculated 2-D image coordinates should be corrected according to the calibrated distortion coefficients of the camera optical system.

4) In terms of the binocular stereo vision method and the calibrated internal parameters of the camera, the 3-D coordinates of the measured feature points in the camera coordinate system can be calculated.

5) According to the 3-D coordinates of the measured feature points in the camera coordinate system, the calibrated external parameters of the camera, and the moving distances of the camera in the XYZ directions of the large-scale CMM, a coordinate transformation can be performed (a sketch of this step is given at the end of this section). Then, the 3-D coordinates of the measured feature points in the unified large-scale CMM coordinate system can be obtained.

Fig. 2. Basic principle of the measuring system.

The aforementioned noncontact 3-D vision measuring system has the advantages of a large measuring range, high precision and efficiency, a simple calibration process, low manufacturing costs, etc. All of these advantages satisfy the requirements of 3-D coordinate measurement of cross-cutting feature points on the surface of a large-scale workpiece in the actual manufacturing process.
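The coordinate transformation of step 5) is not spelled out in this excerpt. The following Python sketch shows one plausible composition, assuming the calibrated external parameters give a rotation matrix R and translation vector t that map camera coordinates into the CMM frame at a reference camera pose, and that the XYZ displacement d read from the CMM axes is simply added afterwards; the function and variable names are illustrative, not taken from the paper.

    import numpy as np

    def camera_to_cmm(p_cam, R, t, d):
        """Map a 3-D point from the camera frame to the unified CMM frame.

        p_cam -- (3,) point in the camera coordinate system
        R, t  -- assumed calibrated external parameters (rotation, translation)
                 of the camera with respect to the CMM frame at the reference pose
        d     -- (3,) XYZ displacement of the camera read from the CMM axes
        """
        p_cam = np.asarray(p_cam, dtype=float)
        # Rigid transformation into the CMM frame at the reference pose,
        # then a shift by the measured axis displacement (assumed additive).
        return R @ p_cam + np.asarray(t, dtype=float) + np.asarray(d, dtype=float)

    # Illustrative usage with made-up numbers (millimeters)
    R = np.eye(3)
    t = np.array([100.0, 50.0, 200.0])
    d = np.array([350.0, 0.0, -20.0])
    print(camera_to_cmm([1.2, -0.4, 300.0], R, t, d))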
III. MONOCULAR VISION ALGORITHM FOR CAMERA LOCATING

In the proposed vision measuring system, the camera is located by a combination of the position-from-defocus and position-from-focus monocular vision methods. The key technique contains two aspects.

1) Design of the New Image Focus Measure Function: The current image focus measure functions cannot satisfy the measuring requirements of cross-cutting feature points [23], [24]; thus, a new kind of image focus measure function based on energy-spectrum entropy is proposed. The experimental results indicate that it is more accurate, reliable, and stable than the other image focus measure functions.

The 2-D discrete Fourier transform of the kth image in the image sequence can be described as follows:

\mathrm{FFT}_2(u, v) = \sum_{x_I=1}^{N_X} \sum_{y_I=1}^{N_Y} f_k(x_I, y_I) \exp\left[ -j\,2\pi \left( \frac{x_I}{N_X} u + \frac{y_I}{N_Y} v \right) \right].    (1)

Its normalized energy spectrum corresponding to the 2-D harmonic component (u, v) can be described as follows:

E(u, v) = \left[ \mathrm{Re}\left( \frac{\mathrm{FFT}_2(u, v)}{\sum_{x_I=1}^{N_X} \sum_{y_I=1}^{N_Y} f_k(x_I, y_I)} \right) \right]^2 + \left[ \mathrm{Im}\left( \frac{\mathrm{FFT}_2(u, v)}{\sum_{x_I=1}^{N_X} \sum_{y_I=1}^{N_Y} f_k(x_I, y_I)} \right) \right]^2.    (2)

To completely analyze the synthetic effects of the various 2-D harmonic components of the image energy spectrum, the entropy of the energy spectrum over all 2-D harmonic components of the kth image in the image sequence can be calculated as follows:

E\_E_k = -\sum_{u=1}^{N_X} \sum_{v=1}^{N_Y} E(u, v) \ln E(u, v).    (3)

Let

E\_E_{optimum} = \max_{1 \le k \le N} \{ E\_E_k \}.    (4)

Then, the position corresponding to E_E_optimum is the camera focusing position.
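The focus measure of (1)-(4) can be prototyped directly with a 2-D FFT. The Python/NumPy sketch below follows the equations above; the choice to drop zero-valued spectrum entries before taking the logarithm is my own assumption, since the paper does not say how such entries are handled.

    import numpy as np

    def energy_spectrum_entropy(image):
        """Energy-spectrum entropy focus measure, Eqs. (1)-(3)."""
        f = np.asarray(image, dtype=float)
        F = np.fft.fft2(f)                         # 2-D DFT, Eq. (1)
        s = f.sum()                                # normalization term of Eq. (2)
        E = (F.real / s) ** 2 + (F.imag / s) ** 2  # normalized energy spectrum, Eq. (2)
        E = E[E > 0]                               # assumption: skip zero entries so ln() is defined
        return -np.sum(E * np.log(E))              # entropy, Eq. (3)

    def focusing_frame(image_sequence):
        """Eq. (4): index of the frame whose entropy E_E_k is maximal."""
        scores = [energy_spectrum_entropy(img) for img in image_sequence]
        return int(np.argmax(scores)), scores

Feeding focusing_frame the N frames shot along the depth direction returns the index of the frame, and hence the axis position, at which the entropy peaks.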
2) New Camera Focusing Position Locating Algorithm: First, in the position-from-defocus process, the image energy-spectrum entropy at position z_i in the depth direction is denoted by E_E(z_i), and the function curve of E_E(z_i) should coincide with a Gaussian distribution near its peak. Three points near the peak are selected and fitted to the Gaussian model, and then the peak position of the E_E(z_i) curve can be calculated and taken as the approximate camera focusing position.

Second, a more accurate camera focusing position can be calculated by the position-from-focus algorithm near the approximate camera focusing position. Because the approximate camera focusing position has already been calculated by the position-from-defocus algorithm, the searching range of the position-from-focus algorithm is reduced, and the searching speed and locating precision can be increased. This is the reason why the position-from-defocus and position-from-focus methods are combined in the proposed camera focusing position locating algorithm.

The depth of focus of the camera optical system is the main error factor of the position-from-focus method. Let Δl denote the depth of focus. N frames are shot in the range of ±3Δl around the approximate camera focusing position calculated by the position-from-defocus algorithm, the image energy-spectrum entropy E_E(z_i) of each frame is calculated, and all the values of E_E(z_i) are fitted by an m-order least-squares polynomial.

Let δ_sum(m) denote the sum of the squared differences between the m-order least-squares fitting polynomial values and the corresponding image energy-spectrum entropy values. The fitting order M corresponding to the minimum value of δ_sum(m) is selected, and the camera position corresponding to the maximum of the M-order least-squares fitting polynomial is taken as the final accurate camera focusing position.
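Both stages can be prototyped with ordinary least squares. In the Python/NumPy sketch below, the Gaussian fit of the three (z, E_E(z)) samples is solved by fitting a parabola to ln E_E(z), since the logarithm of a Gaussian is quadratic in z; the refinement stage fits polynomials of several candidate orders, keeps the order with the smallest δ_sum(m), and evaluates the chosen polynomial on a fine grid to locate its maximum. The candidate orders, the grid search, and the closed-form Gaussian solution are assumptions for the example, not choices stated in the paper.

    import numpy as np

    def defocus_peak(z, ee):
        """Stage 1: Gaussian fit of three samples of E_E(z) near the peak.

        ln of a Gaussian is quadratic in z, so a 2nd-order least-squares fit
        of ln(ee) gives the peak position as the vertex of the parabola.
        """
        c2, c1, _ = np.polyfit(z, np.log(ee), 2)
        return -c1 / (2.0 * c2)

    def focus_refine(z, ee, orders=range(2, 7)):
        """Stage 2: m-order least-squares fits of E_E(z); keep the order M
        with the smallest residual delta_sum(m) and return the position of
        the maximum of that polynomial."""
        z = np.asarray(z, dtype=float)
        ee = np.asarray(ee, dtype=float)
        best = None
        for m in orders:
            coeffs = np.polyfit(z, ee, m)
            delta_sum = np.sum((np.polyval(coeffs, z) - ee) ** 2)
            if best is None or delta_sum < best[0]:
                best = (delta_sum, coeffs)
        zs = np.linspace(z.min(), z.max(), 2001)   # fine grid over the sampled range
        return zs[np.argmax(np.polyval(best[1], zs))]

    # Usage: z0 = defocus_peak(z3, ee3) with the three coarse samples, then
    # shoot N frames within z0 +/- 3*delta_l and call focus_refine(zN, eeN).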
IV. CALIBRATION ALGORITHM OF THE CAMERA

The ideal 2-D image coordinates (\hat{X}_P, \hat{Y}_P) and the actual 2-D image coordinates with distortion errors (\hat{x}_P, \hat{y}_P) are relative to the top left corner of the CCD image plane. The corresponding 2-D image coordinates in the computer image coordinate system are (\hat{X}_I, \hat{Y}_I) and (\hat{x}_I, \hat{y}_I), respectively, which are relative to the top left corner of the 2-D image data array. Correspondingly, (X_P, Y_P) is relative to (X_{P0}, Y_{P0}), whereas (x_P, y_P) is relative to the actual principal point (x_{P0}, y_{P0}) of the CCD image plane. (X_I, Y_I) is relative to (X_{I0}, Y_{I0}), whereas (x_I, y_I) is relative to the pixel position (x_{I0}, y_{I0}) of (x_{P0}, y_{P0}).

The geometrical distortions of the optical imaging system mainly contain three types [25], [26].

1) Radial distortion: This causes an inward or outward displacement of a given image point from its ideal location. The radial distortion of a perfectly centered lens is given by

\delta_r = k_1 \rho^3 + k_2 \rho^5 + k_3 \rho^7 + \cdots    (5)

where ρ is the radial distance from the center of the CCD image plane, and k_1, k_2, and k_3 are the coefficients of radial distortion. Since each image point can be represented by the polar coordinates (ρ, φ), the polar coordinates of each image point can be transformed to the Cartesian coordinates (X_P, Y_P) with

X_P = \rho \cos\varphi, \qquad Y_P = \rho \sin\varphi.    (6)

Then, the radial distortion in the Cartesian coordinates can be described as follows:

\delta_{X_P r} = k_1 X_P \left(X_P^2 + Y_P^2\right) + O\left[(X_P, Y_P)^5\right],
\delta_{Y_P r} = k_1 Y_P \left(X_P^2 + Y_P^2\right) + O\left[(X_P, Y_P)^5\right].    (7)

2) Decentering distortion: This has both radial and tangential components, which can be described by the following equations:

\delta_{rd} = 3\left(j_1 \rho^2 + j_2 \rho^4 + \cdots\right) \sin(\varphi - \varphi_0),
\delta_{td} = \left(j_1 \rho^2 + j_2 \rho^4 + \cdots\right) \cos(\varphi - \varphi_0)    (8)

where φ_0 is the angle between the positive …