CN113359461B - Kinematics calibration method suitable for bionic eye system - Google Patents
Kinematics calibration method suitable for bionic eye system
- Publication number
- CN113359461B · CN202110711297.2A · CN202110711297A
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- error
- bionic eye
- kinematic
- bionic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/04—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
- G05B13/042—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators in which a parameter or coefficient is automatically adjusted to optimise the performance
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention provides a kinematic calibration method suitable for a bionic eye system, belonging to the technical field of bionic eyes. The method establishes a three-dimensional positioning measurement error model of the bionic eye system from the three-dimensional positioning measurement of the bionic eye system after kinematic model errors are introduced and the three-dimensional positioning measurement of a laser rangefinder. The kinematic model errors are grouped, a Taylor expansion is used to solve an approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors, and finally the kinematic error parameters in the introduced kinematic model errors are identified and compensated, so as to improve the accuracy of the three-dimensional positioning measurement of the bionic eye system.
Description
Technical Field
The invention belongs to the technical field of bionic eyes, and in particular relates to a kinematic calibration method suitable for a bionic eye system.
Background Art
Kinematic model errors in a bionic eye system reduce the accuracy of its three-dimensional positioning measurement. The kinematic model errors of a bionic eye system mainly originate from machining errors, assembly errors and joint zero-position errors. Obtaining the kinematic error parameters contained in the kinematic model errors of the bionic eye system through kinematic calibration and compensating for these errors is of great significance for improving the accuracy of the three-dimensional positioning measurement of the bionic eye system.
Three-dimensional positioning measurement refers to the process of obtaining information about a target through measuring equipment and analyzing that information so as to obtain the three-dimensional coordinates of the target. Three-dimensional positioning measurement methods are usually divided into contact and non-contact methods. Contact methods use a specific instrument to measure the target quickly and directly and offer high measurement accuracy, but they can only be applied where the instrument can physically reach the target and may damage the target during measurement. Non-contact methods measure the three-dimensional position of the target without touching it. Their measurement accuracy is lower than that of contact methods, but because of the above limitations of contact methods, non-contact methods have a wider range of application. Non-contact methods mainly include active and passive three-dimensional positioning measurement methods.
Active three-dimensional positioning measurement methods actively emit a controllable signal toward the target and obtain its three-dimensional position by analyzing the emitted and returned signals. They require dedicated signal generation and control devices, the measurement system is relatively complex, and the measurement cost is relatively high. Active three-dimensional measurement methods mainly include the structured light method, the laser scanning method and the time-of-flight (TOF) method. Passive three-dimensional positioning measurement methods rely directly on natural light and obtain the three-dimensional position of the target by analyzing the information in images acquired by cameras. Compared with active methods, passive methods are relatively simple to operate, relatively low in cost and applicable in various complex environments. According to the number of cameras, passive three-dimensional positioning measurement methods can be divided into monocular, binocular and multi-camera vision methods.
Robot kinematic calibration is of great significance for improving the positioning accuracy of robots. In most robot kinematic calibration studies carried out by scholars at home and abroad, the positioning accuracy of a robot refers to the pose accuracy of its end effector. Because of machining errors, assembly errors, joint zero-position deviations and other factors, there is a difference between the ideal and the actual end-effector pose, which reduces the positioning accuracy of the robot. Robot kinematic calibration refers to the process of improving the positioning accuracy of a robot by correcting its kinematic control model without changing its hardware configuration. Research shows that geometric parameter errors are the main factor affecting robot positioning accuracy, so many scholars have focused their kinematic calibration research on geometric parameters. Robot kinematic calibration is mainly divided into four steps: kinematic modeling, measurement, parameter identification and error compensation.
(1) Kinematic modeling
In 1955, Denavit et al. proposed the Denavit-Hartenberg (D-H) model, which is currently the most widely used robot kinematic model.
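For illustration, the following Python/NumPy sketch builds the standard D-H homogeneous transformation between adjacent link frames from the four parameters (θ, d, a, α). It is a generic sketch of the D-H convention, not code from the patent, and the parameter values in the usage lines are placeholders.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard D-H homogeneous transform from link frame i to frame i-1."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Example: chain two links by multiplying their transforms (placeholder parameters).
T_0_1 = dh_transform(theta=np.deg2rad(10), d=0.05, a=0.0, alpha=np.pi / 2)
T_1_2 = dh_transform(theta=np.deg2rad(-5), d=0.0, a=0.03, alpha=0.0)
T_0_2 = T_0_1 @ T_1_2   # pose of frame 2 expressed in frame 0
```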
(2) Measurement
In robot kinematic calibration, external measuring equipment is used to measure the pose of the robot end effector, the measured end-effector pose is transformed into the robot base coordinate system, and parameter identification is then performed. The measurement accuracy directly determines the accuracy of the parameter identification. Commonly used external measuring equipment includes laser trackers, theodolites, coordinate measuring machines, ballbars and visual measurement devices.
(3) Parameter identification
Parameter identification usually refers to the process of establishing the mapping relationship between robot kinematic parameter errors and end-effector pose errors and identifying the kinematic parameter errors with an optimization algorithm; it is the core problem in robot kinematic calibration. The most commonly used optimization algorithm for parameter identification is the least-squares method.
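As a minimal illustration of least-squares identification (a generic sketch, not the patent's algorithm), the following assumes an already linearized error model e ≈ J·δp between parameter errors δp and measured pose errors e, with a synthetic Jacobian J and synthetic measurements, and recovers δp by linear least squares.

```python
import numpy as np

# Synthetic linearized model: pose errors e are approximately J @ dp_true plus noise.
rng = np.random.default_rng(0)
J = rng.normal(size=(60, 6))          # identification Jacobian: 60 measurements, 6 parameters
dp_true = np.array([1e-3, -2e-3, 0.5e-3, 2e-4, -1e-4, 3e-4])
e = J @ dp_true + 1e-5 * rng.normal(size=60)

# Least-squares estimate of the parameter errors: dp = (J^T J)^-1 J^T e.
dp_est, *_ = np.linalg.lstsq(J, e, rcond=None)
print(dp_est)   # close to dp_true; the estimate would then be fed back into the kinematic model
```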
(4) Error compensation
Error compensation refers to compensating the identified kinematic parameter errors into the robot kinematic parameters so that the mapping relationship between the kinematic parameters and the end-effector pose becomes more accurate, thereby improving the positioning accuracy of the robot. Commonly used error compensation methods include differential error compensation, joint-space compensation and real-time error compensation based on neural networks.
In an active robot binocular vision system in which only the neck can move, the two cameras are fixed, so the visual information that can be obtained is limited; in particular, objects very close to the binocular vision system cannot be perceived. An active robot binocular vision system in which both cameras and the neck can move is closer to the human visual system and can obtain more visual information. However, most existing systems of this kind cannot meet the requirements of light weight and miniaturization.
Most robot binocular vision systems require the relative pose between the two cameras to remain unchanged when performing three-dimensional measurement. When the relative pose of the two cameras changes (for example, when performing three-dimensional positioning measurement of a moving target), the stereo extrinsic parameters must be re-calibrated. In existing three-dimensional positioning measurement methods for robot binocular vision systems, stereo extrinsic parameter errors and the disparity error between matched imaging points in the two images strongly affect the measurement results.
Machining errors, assembly errors, joint zero-position errors and other factors in a bionic eye system reduce the accuracy of its three-dimensional positioning measurement. Most existing robot kinematic calibration methods are intended to improve the positioning accuracy of the end of an industrial robot. Therefore, a new kinematic calibration method suitable for bionic eye systems is needed to identify and compensate the kinematic error parameters in the kinematic model errors of the bionic eye system, so as to improve the accuracy of its three-dimensional positioning measurement.
Summary of the Invention
In view of the deficiencies in the prior art, the present invention provides a kinematic calibration method suitable for a bionic eye system, which improves the accuracy of the three-dimensional positioning measurement of the bionic eye system.
The present invention achieves the above technical objective through the following technical means.
A kinematic calibration method suitable for a bionic eye system: a three-dimensional positioning measurement error model of the bionic eye system is established from the three-dimensional positioning measurement of the bionic eye system after kinematic model errors are introduced and the three-dimensional positioning measurement of a laser rangefinder; the kinematic model errors are then grouped, and a Taylor expansion is used to solve an approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors; finally, a nonlinear optimization algorithm is used to identify the kinematic error parameters in the introduced kinematic model errors, and the identified error parameters are compensated into the approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors.
Further, the three-dimensional positioning measurement error model of the bionic eye system is:
wherein $({}^{N}T_W)^{-1}$ is the inverse of the transformation matrix from the world coordinate system to the bionic eye $O_N\text{-}X_N Y_N Z_N$ coordinate system, $P'_N$ is the coordinate of a space point in the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$, and $P_W$ is the coordinate of the space point in the world coordinate system measured with the laser rangefinder and the reflective ball.
Further, $P'_N$ is calculated according to the following formula:
$P'_N = {}^{N}T'_{Cl}\, P'_{Cl}$  (1)
wherein $P'_{Cl}$ is the coordinate of the target space point P in the bionic eye left-eye camera coordinate system after the errors are introduced, and ${}^{N}T'_{Cl}$ is the homogeneous transformation matrix, with the errors introduced, of the left-eye camera coordinate system relative to the base coordinate system $O_N\text{-}X_N Y_N Z_N$; ${}^{N}T_2$ denotes the homogeneous transformation matrix of coordinate system {2} relative to the base coordinate system $O_N\text{-}X_N Y_N Z_N$; ${}^{2}T_{3l}$ denotes the homogeneous transformation matrix of coordinate system {3l} relative to coordinate system {2}; the remaining factors denote, in turn, the homogeneous transformation matrix of the deviated coordinate system corresponding to {3l} relative to coordinate system {3l}, of coordinate system {4} relative to that deviated coordinate system, of the deviated coordinate system corresponding to {4} relative to coordinate system {4}, of coordinate system {5} relative to that deviated coordinate system, of the deviated coordinate system corresponding to {5} relative to coordinate system {5}, and of the bionic eye left-eye camera coordinate system relative to that deviated coordinate system;
Formula (1) is expressed in functional form:
wherein the inputs of the function are the pixel coordinates of the imaging points of the target space point P in the imaging planes of the left and right cameras of the bionic eye, the joint angles $\theta_i$ of the left and right eyes of the bionic eye, and the kinematic model error δχ, where i = 4, 5, 6, 7.
Still further, δχ is a kinematic model error composed of 25 groups of error parameters, with 40 error parameters in total, specifically: the translation error parameters δx_{3l}, δy_{3l}, δz_{3l} of coordinate system {3l} in the X, Y and Z directions; the rotation error parameters δα_{3l}, δβ_{3l}, δγ_{3l} about the Z, Y and X axes of coordinate system {3l}; the translation error parameters δx_4, δy_4, δz_4 of coordinate system {4} in the X, Y and Z directions; the rotation error parameters δα_4, δβ_4, δγ_4 about the Z, Y and X axes of coordinate system {4}; the translation error parameters δx_5, δy_5, δz_5 of coordinate system {5} in the X, Y and Z directions; the rotation error parameters δα_5, δβ_5, δγ_5 about the Z, Y and X axes of coordinate system {5}; the translation error parameters δx_{3r}, δy_{3r}, δz_{3r} of coordinate system {3r} in the X, Y and Z directions; the rotation error parameters δα_{3r}, δβ_{3r}, δγ_{3r} about the Z, Y and X axes of coordinate system {3r}; the translation error parameters δx_6, δy_6, δz_6 of coordinate system {6} in the X, Y and Z directions; the rotation error parameters δα_6, δβ_6, δγ_6 about the Z, Y and X axes of coordinate system {6}; the translation error parameters δx_7, δy_7, δz_7 of coordinate system {7} in the X, Y and Z directions; the rotation error parameters δα_7, δβ_7, δγ_7 about the Z, Y and X axes of coordinate system {7}; and δθ_i, the zero-position error parameter of the i-th joint angle.
Further, a Taylor expansion is used to solve the approximate analytical expression for the three-dimensional measurement of the bionic eye system after the errors are introduced, specifically:
wherein δχ_k contains the error parameters of the k-th group, and o(·) denotes the higher-order small terms of the Taylor expansion.
Still further, the approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system after the errors are introduced is:
Still further, the nonlinear optimization algorithm is used to identify the kinematic error parameters in the introduced kinematic model errors, specifically: the optimized transformation matrix ${}^{N}T_W$ is obtained through iteration, and the 40 kinematic error parameters of the bionic eye system in the kinematic model error δχ are identified.
Still further, the objective of optimizing the kinematic error model of the bionic eye system is to find the kinematic model error δχ of the left and right eyes of the bionic eye and the transformation matrix ${}^{N}T_W$ from the world coordinate system to the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$ that minimize the error model, expressed as:
The beneficial effects of the present invention are as follows:
(1) Starting from the structure and motion mechanisms of the human eyeball and neck and the visual mechanism of the human eyeball, the present invention designs and develops a binocular follow-up bionic eye system and establishes and verifies its kinematic model. The designed bionic eye system meets the requirements of light weight and miniaturization and has the characteristics of high flexibility and a wide field of view.
(2) Aiming at the problem that kinematic model errors in the bionic eye system reduce the accuracy of three-dimensional positioning measurement, the present invention proposes a kinematic calibration method suitable for the bionic eye system. The three-dimensional measurement error of the bionic eye system with the introduced kinematic model errors is modeled, the kinematic model errors are grouped, and a Taylor expansion is used to solve an approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors, which greatly reduces the cost of the symbolic computation. Finally, the kinematic error parameters in the introduced kinematic model errors are identified and compensated to improve the accuracy of the three-dimensional positioning measurement of the bionic eye system.
Brief Description of the Drawings
Fig. 1 is a flow chart of the kinematic calibration method suitable for the bionic eye system according to the present invention;
Fig. 2 is a photograph of the high-precision absolute laser tracker used in the present invention;
Fig. 3 is a photograph of the reflective ball used in the present invention;
Fig. 4 is a schematic diagram of the coordinate systems of the bionic eye system according to the present invention.
Detailed Description
The present invention is further described below with reference to the accompanying drawings and specific embodiments, but the protection scope of the present invention is not limited thereto.
In the kinematic calibration method suitable for the bionic eye system of the present invention, a three-dimensional positioning measurement error model of the bionic eye system is established from the three-dimensional positioning measurement of the bionic eye system after kinematic model errors are introduced and the three-dimensional positioning measurement of the laser rangefinder; the kinematic model errors are then grouped, and a Taylor expansion is used to solve an approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors; finally, a nonlinear optimization algorithm is used to identify the kinematic error parameters in the introduced kinematic model errors, and the identified error parameters are compensated into the approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system with the introduced kinematic model errors, thereby improving the accuracy of the three-dimensional positioning measurement of the bionic eye system.
As shown in Fig. 1, the kinematic calibration method suitable for the bionic eye system of the present invention specifically includes the following steps:
Step (1): kinematic error modeling of the bionic eye system based on three-dimensional positioning measurement
1) Ideal model for three-dimensional positioning measurement of the bionic eye system
The coordinate systems of the bionic eye system defined based on the standard D-H method are shown in Fig. 2.
The D-H parameters of the bionic eye system are shown in Table 1. The specific meaning of each parameter in the D-H parameter table is explained as follows:
Joint distance $d_i$ (link offset): the distance from $x_{i-1}$ to $x_i$ along the $z_{i-1}$ axis.
Joint angle $\theta_i$: the angle from $x_{i-1}$ to $x_i$ about the $z_{i-1}$ axis.
Link length $a_i$: the distance from $z_{i-1}$ to $z_i$ along the $x_i$ axis.
Link twist angle $\alpha_i$: the angle from $z_{i-1}$ to $z_i$ about the $x_i$ axis.
$q_i$: the joint angle $\theta_i$ of each joint of the bionic eye in the initial pose.
The homogeneous transformation matrices between adjacent joints of the bionic eye system, established using the D-H parameters in Table 1 and formula (1) (where ${}^{i-1}T_i$ is the homogeneous transformation matrix from joint i to joint i-1), are given in formulas (2)-(9).
Table 1  D-H parameters of the bionic eye system
The stereo extrinsic parameters ${}^{Cr}T_{Cl}$ of the bionic eye system can be calculated as follows:
${}^{Cr}T_{Cl} = ({}^{2}T_{3r}\,{}^{3r}T_{6}\,{}^{6}T_{7}\,{}^{7}T_{Cr})^{-1}\,({}^{2}T_{3l}\,{}^{3l}T_{4}\,{}^{4}T_{5}\,{}^{5}T_{Cl})$  (10)
wherein ${}^{5}T_{Cl}$ and ${}^{7}T_{Cr}$ are the head-eye parameters of the left and right cameras of the calibrated bionic eye system, respectively; the calibration results are as follows:
The intrinsic parameter matrices of the left and right cameras of the bionic eye system are expressed as follows:
Distortion correction is applied to the images of the left and right cameras of the bionic eye system; after distortion correction, the pixel coordinates of the imaging points of the target space point P in the left and right camera imaging planes are obtained, respectively.
The world coordinate system $O_W\text{-}X_W Y_W Z_W$ is fixed to the left-eye camera coordinate system of the bionic eye. The coordinates of the target space point P in the world coordinate system are $[x_W\ y_W\ z_W\ 1]^T$. After projection, the real-world target space point P falls on the physical imaging plane (pixel plane); the coordinates of the imaging point $p_C$ in the camera plane are $[x_C\ y_C\ z_C\ 1]^T$ (the coordinates of the imaging point $p_{Cl}$ in the left-eye camera plane are $[x_{Cl}\ y_{Cl}\ z_{Cl}\ 1]^T$, and the coordinates of the imaging point $p_{Cr}$ in the right-eye camera plane are $[x_{Cr}\ y_{Cr}\ z_{Cr}\ 1]^T$). The coordinates of the target space point P in the left-eye and right-eye camera coordinate systems are respectively as follows:
The mapping relationship between the homogeneous coordinates $[x_W\ y_W\ z_W\ 1]^T$ of the target space point P in the world coordinate system and the homogeneous coordinates $[u\ v\ 1]^T$ of the imaging point p in the pixel coordinate system is as follows:
The pixel coordinate system is defined as follows: the origin is located at the upper-left corner of the image, the u axis points to the right and is parallel to the x axis, and the v axis points downward and is parallel to the y axis. The pixel coordinate system differs from the imaging plane by a scaling and a translation of the origin. Suppose the pixel coordinates are scaled by a factor α on the u axis and by a factor β on the v axis, and that the origin is translated by $[u_0\ v_0]^T$; then the relationship between the coordinates of the imaging point $p_C$ and the pixel coordinates $[u\ v]^T$ is $u = \alpha f\, x_C/z_C + u_0$, $v = \beta f\, y_C/z_C + v_0$, where f is the focal length of the camera. Merging $\alpha f$ into $f_u$ and $\beta f$ into $f_v$, multiplying both sides by $z_C$ and writing the result in matrix form gives $z_C [u\ v\ 1]^T = K [x_C\ y_C\ z_C]^T$, where $K = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}$ is the intrinsic parameter matrix of the camera. The pose of the camera is described by its rotation matrix R and translation vector t, also called the extrinsic parameters of the camera; the corresponding homogeneous transformation matrix is $T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$, which gives $z_C [u\ v\ 1]^T = K\,[I_{3\times 3}\ |\ 0]\,T\,[x_W\ y_W\ z_W\ 1]^T$. This expression implies a conversion from homogeneous to non-homogeneous coordinates in K·T; for the formula to hold, multiplication by the conversion matrix $[I_{3\times 3}\ |\ 0]$ is required, where $I_{3\times 3}$ is the three-dimensional identity matrix.
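The projection chain described above can be sketched in a few lines of NumPy; the intrinsic matrix K and the extrinsics R, t below are placeholder values for illustration, not the calibrated parameters of the bionic eye cameras.

```python
import numpy as np

# Placeholder intrinsics and extrinsics (not the calibrated values of the bionic eye).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # camera rotation
t = np.array([[0.0], [0.0], [0.5]])    # camera translation (metres)
T = np.block([[R, t], [np.zeros((1, 3)), np.ones((1, 1))]])   # homogeneous extrinsics

P_W = np.array([0.1, -0.05, 2.0, 1.0])           # world point, homogeneous coordinates
P_C = (T @ P_W)[:3]                              # point expressed in the camera frame
uvw = K @ P_C                                    # apply the intrinsic matrix
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]          # divide by z_C to obtain pixel coordinates
print(u, v)
```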
According to formula (17), the relationship between the pixel coordinates of the imaging point of the target space point P in the left camera imaging plane and the coordinates $[x_W\ y_W\ z_W\ 1]^T$ of point P in the world coordinate system can be obtained as follows:
The relationship between the pixel coordinates of the imaging point of the target space point P in the right camera imaging plane and the coordinates $[x_W\ y_W\ z_W\ 1]^T$ of point P in the world coordinate system is as follows:
Eliminating $Z_{Cl}$ and $Z_{Cr}$ from formula (18) and formula (19), respectively, and then combining the two formulas gives:
Formula (20) is expressed in the form AX = B. The least-squares solution of X can be obtained using $X = (A^T A)^{-1} A^T B$; the resulting expression of X gives the coordinates $P_{Cl}$ of the target space point P in the bionic eye left-eye camera coordinate system.
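A minimal sketch of this linear triangulation step, assuming the two 3×4 projection matrices of the left and right cameras are already known; the least-squares solution is computed with numpy.linalg.lstsq rather than by forming $(A^T A)^{-1} A^T B$ explicitly, and the function and variable names are illustrative.

```python
import numpy as np

def triangulate(M_l, M_r, uv_l, uv_r):
    """Linear two-view triangulation: stack the depth-eliminated equations as A X = B."""
    (ul, vl), (ur, vr) = uv_l, uv_r
    A = np.vstack([
        ul * M_l[2, :3] - M_l[0, :3],
        vl * M_l[2, :3] - M_l[1, :3],
        ur * M_r[2, :3] - M_r[0, :3],
        vr * M_r[2, :3] - M_r[1, :3],
    ])
    B = np.array([
        M_l[0, 3] - ul * M_l[2, 3],
        M_l[1, 3] - vl * M_l[2, 3],
        M_r[0, 3] - ur * M_r[2, 3],
        M_r[1, 3] - vr * M_r[2, 3],
    ])
    X, *_ = np.linalg.lstsq(A, B, rcond=None)   # least-squares 3D point
    return X

# Usage (placeholder projection matrices and pixel measurements):
# X = triangulate(M_l, M_r, (u_l, v_l), (u_r, v_r))
```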
The neck joints of the bionic eye are set to the initial state (joint angles at the zero position) and fixed, and the base coordinate system $O_N\text{-}X_N Y_N Z_N$ is established at the end of the bionic eye neck. The three-dimensional coordinates of the target space point P in the base coordinate system $O_N\text{-}X_N Y_N Z_N$ can be calculated as follows:
$P_N = {}^{N}T_{Cl}\, P_{Cl}$  (21)
wherein ${}^{N}T_{Cl}$ denotes the homogeneous transformation matrix of the left-eye camera coordinate system relative to the base coordinate system $O_N\text{-}X_N Y_N Z_N$, and ${}^{N}T_{Cl}$ can be calculated as follows:
${}^{N}T_{Cl} = {}^{N}T_{2}\,{}^{2}T_{3l}\,{}^{3l}T_{4}\,{}^{4}T_{5}\,{}^{5}T_{Cl}$  (22)
wherein ${}^{N}T_2$ denotes the homogeneous transformation matrix of the coordinate system at the end of the bionic eye neck (coordinate system {2}) relative to the base coordinate system $O_N\text{-}X_N Y_N Z_N$, expressed as follows:
2) Establishment of the three-dimensional positioning measurement model of the bionic eye system after kinematic model errors are introduced
Since the bionic eye system is an active binocular vision system, its three-dimensional positioning measurement accuracy is strongly affected by kinematic model errors. From the analysis of the error sources, the following two types of kinematic model error mainly exist in the bionic eye system:
① Because of machining errors and assembly errors, the coordinate systems {3l}, {4}, {5}, {3r}, {6} and {7} of the bionic eye system shown in Fig. 2 deviate from their ideal positions; the corresponding deviated coordinate systems are defined accordingly. The deviation of each coordinate system of the bionic eye system can be defined as follows:
where k = 3l, 4, 5, 3r, 6, 7; $\mathrm{Trans}([\delta x_k\ \delta y_k\ \delta z_k])$ denotes the translation error matrix, and $\mathrm{Rot}_Z(\delta\alpha_k)$, $\mathrm{Rot}_Y(\delta\beta_k)$ and $\mathrm{Rot}_X(\delta\gamma_k)$ denote the rotation error matrices.
In formula (24):
where $\delta x_k$, $\delta y_k$ and $\delta z_k$ are the translation error parameters of coordinate system {k} in the X, Y and Z directions, respectively.
In the above formulas, $\delta\alpha_k$, $\delta\beta_k$ and $\delta\gamma_k$ are the rotation error parameters about the Z, Y and X axes of coordinate system {k}, respectively.
② Because of joint zero-position errors, the deviation of each joint angle $\theta_i$ (i = 4, 5, 6, 7) of the left and right eyes of the bionic eye system can be defined as follows:
wherein the deviated joint angle incorporates the zero-position error, and $\delta\theta_i$ is the zero-position error parameter of the i-th joint angle.
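A short sketch of how the two error types enter the model: each nominal frame transform is composed with a small translation/rotation deviation built from Trans and Rot_Z, Rot_Y, Rot_X, and each joint angle receives a zero-position offset. The composition order Trans·Rot_Z·Rot_Y·Rot_X is assumed here from the order in which formula (24) lists the factors; the sketch is illustrative, not the patent's implementation.

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1.0]])

def rot_y(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1.0]])

def rot_x(g):
    c, s = np.cos(g), np.sin(g)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1.0]])

def trans(dx, dy, dz):
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

def frame_error(dx, dy, dz, da, db, dg):
    """Small-deviation transform for one frame: translation followed by Z-Y-X rotations (assumed order)."""
    return trans(dx, dy, dz) @ rot_z(da) @ rot_y(db) @ rot_x(dg)

# Perturbed frame k: nominal transform composed with its deviation;
# perturbed joint angle: nominal value plus its zero-position error.
# T_k_perturbed = T_k_nominal @ frame_error(dx_k, dy_k, dz_k, da_k, db_k, dg_k)
# theta_i_perturbed = theta_i + dtheta_i
```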
According to formula (10), formula (24) and formula (29), the stereo extrinsic parameters ${}^{Cr}T'_{Cl}$ of the bionic eye system after the errors are introduced can be obtained as follows:
After the errors are introduced, the relationship between the pixel coordinates of the imaging point of the target space point P in the left camera imaging plane and the coordinates $[x_W\ y_W\ z_W\ 1]^T$ of point P in the world coordinate system is still given by formula (18).
After the errors are introduced, the relationship between the pixel coordinates of the imaging point of the target space point P in the right camera imaging plane and the coordinates $[x_W\ y_W\ z_W\ 1]^T$ of point P in the world coordinate system is as follows:
Eliminating $Z_{Cl}$ and $Z_{Cr}$ from formula (18) and formula (31), respectively, and then combining the two formulas gives:
Formula (32) is expressed in the form A′X = B′. The least-squares solution of X is likewise obtained using $X = (A'^T A')^{-1} A'^T B'$; the resulting expression of X gives the coordinates $P'_{Cl}$ of the target space point P in the bionic eye left-eye camera coordinate system after the errors are introduced.
After the errors are introduced, the three-dimensional coordinates of the target space point P in the base coordinate system $O_N\text{-}X_N Y_N Z_N$ can be calculated as follows:
$P'_N = {}^{N}T'_{Cl}\, P'_{Cl}$  (33)
where ${}^{N}T'_{Cl}$ can be calculated as follows:
where ${}^{N}T_2$ is still given by formula (23).
Formula (33) can be expressed in the following functional form:
In formula (35), the inputs of the function are the pixel coordinates of the imaging points of the target space point P in the imaging planes of the left and right cameras of the bionic eye, the joint angles $\theta_i$ (i = 4, 5, 6, 7) of the left and right eyes of the bionic eye, and the kinematic model error δχ, where δχ is a kinematic model error composed of 25 groups of error parameters, as shown in Table 2. Table 2 shows that the kinematic model error δχ of the left and right eyes of the bionic eye contains a total of 40 error parameters to be optimized.
Table 2  Grouping of the kinematic model error δχ of the left and right eyes of the bionic eye
In summary, the three-dimensional measurement method of the bionic eye system after the errors are introduced (Algorithm 1) can be summarized as follows:
3) Error modeling based on three-dimensional positioning measurement
The three-dimensional coordinates $P'_N$ of the target space point P in the base coordinate system $O_N\text{-}X_N Y_N Z_N$ are related to the pixel coordinates of the imaging points of the target space point P in the imaging planes of the left and right cameras of the bionic eye, the joint angles $\theta_i$ (i = 4, 5, 6, 7) of the left and right eyes of the bionic eye, and the kinematic model error δχ. The pixel coordinates of the imaging points and the joint angles $\theta_i$ (i = 4, 5, 6, 7) are known, while the kinematic model error δχ is unknown. The kinematic model error δχ strongly affects the three-dimensional positioning measurement accuracy of the bionic eye system, so it needs to be obtained in order to compensate for the three-dimensional positioning measurement error.
The high-precision absolute laser tracker Leica AT960 (Fig. 3) is used together with the reflective ball (Fig. 4) to measure the coordinates of a group of M space points in the world coordinate system. The world coordinate system is fixed to the laser tracker coordinate system. The three-dimensional coordinates of this group of space points in the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$ are calculated according to the procedure of Algorithm 1. The transformation matrix from the world coordinate system to the bionic eye $O_N\text{-}X_N Y_N Z_N$ coordinate system is ${}^{N}T_W$. The error model of the three-dimensional positioning measurement of this group of space points by the bionic eye system and the laser tracker is as follows:
Step (2): identification and compensation of the kinematic error parameters of the bionic eye system
1) Approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system after the errors are introduced
As described above, Algorithm 1 can be used to measure the three-dimensional coordinates $P'_N$ of the target space point P in the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$ after the errors are introduced. Since the kinematic model error δχ of the left and right eyes of the bionic eye contains a total of 40 error parameters to be optimized (Table 2), the symbolic computation of $P'_N$ with Algorithm 1 is extremely expensive. Therefore, a Taylor expansion can be used to solve an approximate analytical expression for the three-dimensional measurement of the bionic eye system after the errors are introduced.
According to the Taylor expansion:
In formula (37):
where $\delta\chi_k$ (k = 1, 2, ..., 25) contains the error parameters of the corresponding group (see Table 2), and o(·) denotes the higher-order small terms of the Taylor expansion.
In addition:
Formula (39) can be expressed as:
where the corresponding term is the three-dimensional coordinate $P'_N$ of the target space point P in the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$ obtained when the indicated kinematic model error is taken as the input.
Substituting formula (40) into formula (37) and neglecting the higher-order terms gives:
Formula (41) is the approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system after the errors are introduced. Compared with Algorithm 1, since each term of the approximate analytical expression contains at most 4 error parameters to be optimized, the cost of the symbolic computation of $P'_N$ using formula (41) is greatly reduced. The present invention uses the scientific computing software Mathematica to solve the approximate analytical expression for the three-dimensional positioning measurement of the bionic eye system after the errors are introduced.
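The grouping idea behind formulas (37)-(41) can be illustrated with a generic group-wise first-order approximation: the full function of all error parameters is approximated by the nominal value plus the sum, over the 25 groups, of the change produced by each group alone. The sketch below applies this to a toy function; it is a conceptual illustration under that interpretation of formula (41), not the Mathematica derivation used in the patent.

```python
import numpy as np

def groupwise_first_order(f, groups, delta):
    """Approximate f(delta) by f(0) plus the sum over groups of
    [f(delta restricted to one group) - f(0)] (first-order, group-wise)."""
    delta = np.asarray(delta, dtype=float)
    f0 = np.asarray(f(np.zeros_like(delta)), dtype=float)
    approx = f0.copy()
    for idx in groups:                       # idx: indices of one error-parameter group
        d_k = np.zeros_like(delta)
        d_k[list(idx)] = delta[list(idx)]
        approx = approx + (np.asarray(f(d_k), dtype=float) - f0)
    return approx

# Toy check: a smooth nonlinear function of 6 parameters split into 3 groups.
f = lambda d: np.array([np.sin(d[0] + d[1]) + d[2] * d[3], np.cos(d[4]) * (1.0 + d[5])])
groups = [(0, 1), (2, 3), (4, 5)]
delta = 1e-3 * np.array([1.0, -2.0, 0.5, 3.0, -1.0, 2.0])
print(groupwise_first_order(f, groups, delta))   # close to f(delta) for small errors
print(f(delta))
```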
In summary, the approximate analytical solution algorithm (Algorithm 2) for the three-dimensional positioning measurement of the bionic eye system after the errors are introduced can be summarized as follows:
2) Identification and compensation of the kinematic error parameters
Identification of the kinematic error parameters of the bionic eye system refers to the process of establishing the mapping relationship between the kinematic error parameters of the bionic eye system and the three-dimensional measurement error and identifying the kinematic error parameters of the bionic eye system with a nonlinear optimization algorithm. The objective of optimizing the kinematic error model of the bionic eye system in the present invention is to find the kinematic model error δχ of the left and right eyes of the bionic eye and the transformation matrix ${}^{N}T_W$ from the world coordinate system to the bionic eye base coordinate system $O_N\text{-}X_N Y_N Z_N$ such that the error model of formula (36) is as small as possible, expressed as follows:
where the approximate analytical expression of $P'_N$ can be solved using Algorithm 2.
Google's nonlinear optimization library Ceres is used to solve the nonlinear optimization problem in formula (42), together with the approximate analytical expression solved with Mathematica. In formula (42), ${}^{N}T_W$ and δχ = (δχ_1, δχ_2, ..., δχ_{25})^T are the optimization variables, called parameter blocks in the Ceres library. The cost function constructed from formula (42) is called a residual block in the Ceres library. The optimized transformation matrix ${}^{N}T_W$ can then be obtained through iteration, and the 40 kinematic error parameters of the bionic eye system in δχ can be identified.
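The patent formulates this step with the Ceres C++ library; the sketch below mirrors the same structure (optimization variables ${}^{N}T_W$, parameterized here by a rotation vector and a translation, plus the error parameters δχ, with one residual per measured point) using scipy.optimize.least_squares as a stand-in. The measurement function, the data arrays and the 40-parameter layout are placeholders, not the patent's implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, measure_fn, pixels, joints, P_W):
    """One 3-vector residual per point: the bionic-eye measurement mapped into the
    world frame via the inverse of T_NW, minus the laser-tracker coordinate (placeholder structure)."""
    rvec, tvec, dchi = params[:3], params[3:6], params[6:]
    T_NW = np.eye(4)
    T_NW[:3, :3] = Rotation.from_rotvec(rvec).as_matrix()
    T_NW[:3, 3] = tvec
    T_WN = np.linalg.inv(T_NW)                      # maps base-frame points to the world frame
    res = []
    for px, q, pw in zip(pixels, joints, P_W):
        P_N = measure_fn(px, q, dchi)               # approximate analytical measurement (formula 41)
        P_in_world = T_WN[:3, :3] @ P_N + T_WN[:3, 3]
        res.append(P_in_world - pw)
    return np.concatenate(res)

# x0: identity pose for T_NW and zero kinematic errors (40 error parameters assumed).
# sol = least_squares(residuals, x0=np.zeros(6 + 40),
#                     args=(measure_fn, pixels, joints, P_W))
```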
The 40 kinematic error parameters in the identified kinematic model error δχ of the left and right eyes of the bionic eye are compensated into (substituted into) formula (41). For a set threshold ε, it is then judged whether the condition e < ε is satisfied after the compensated measurement and the optimized transformation matrix ${}^{N}T_W$ are substituted into the expression of the error model e; if not, the iteration is repeated until e < ε is satisfied, thereby reducing the three-dimensional positioning measurement error of the bionic eye system.
The above embodiment is a preferred implementation of the present invention, but the present invention is not limited to the above embodiment. Any obvious improvement, substitution or modification that can be made by those skilled in the art without departing from the essence of the present invention falls within the protection scope of the present invention.
Claims (6)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110711297.2A CN113359461B (en) | 2021-06-25 | 2021-06-25 | Kinematics calibration method suitable for bionic eye system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110711297.2A CN113359461B (en) | 2021-06-25 | 2021-06-25 | Kinematics calibration method suitable for bionic eye system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113359461A CN113359461A (en) | 2021-09-07 |
| CN113359461B true CN113359461B (en) | 2022-12-27 |
Family
ID=77536640
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202110711297.2A Active CN113359461B (en) | 2021-06-25 | 2021-06-25 | Kinematics calibration method suitable for bionic eye system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113359461B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115979231B | 2023-03-20 | 2023-07-18 | Guangdong University of Technology | A Dimensionless Kinematics Calibration Method Based on Virtual Points and Related Devices |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6278906B1 (en) * | 1999-01-29 | 2001-08-21 | Georgia Tech Research Corporation | Uncalibrated dynamic mechanical system controller |
| WO2004033159A1 (en) * | 2002-10-11 | 2004-04-22 | Fujitsu Limited | Robot control algorithm construction device, robot control algorithm construction program, robot control device, robot control program, and robot |
| US8398541B2 (en) * | 2006-06-06 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for robotic minimally invasive surgical systems |
| US8244402B2 (en) * | 2009-09-22 | 2012-08-14 | GM Global Technology Operations LLC | Visual perception system and method for a humanoid robot |
| CN106950841B (en) * | 2017-05-12 | 2019-10-25 | 山东大学 | Model-independent PD-SMC bionic eye motion control method |
| US11602853B2 (en) * | 2019-06-28 | 2023-03-14 | University Of Denver | Therapeutic social robot |
| CN110497407A (en) * | 2019-08-16 | 2019-11-26 | 深圳华数机器人有限公司 | A drive-control integrated intelligent track-following system applied to industrial robots |
| CN111983927B (en) * | 2020-08-31 | 2022-04-12 | 郑州轻工业大学 | An Ellipsoid Set Membership Filtering Method Based on Maximum Coentropy MCC Criterion |
- 2021-06-25: CN application CN202110711297.2A filed, granted as CN113359461B (Active)
Also Published As
| Publication number | Publication date |
|---|---|
| CN113359461A (en) | 2021-09-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109859275B (en) | Monocular vision hand-eye calibration method of rehabilitation mechanical arm based on S-R-S structure | |
| CN110276806B (en) | Online hand-eye calibration and grasping pose calculation method for four-degree-of-freedom parallel robot stereo vision hand-eye system | |
| Lenz et al. | Calibrating a cartesian robot with eye-on-hand configuration independent of eye-to-hand relationship | |
| CN113681559B (en) | Line laser scanning robot hand-eye calibration method based on standard cylinder | |
| Li | Camera calibration of a head-eye system for active vision | |
| CN113658266B (en) | Visual measurement method for rotation angle of moving shaft based on fixed camera and single target | |
| CN115741720B (en) | Zero calibration system and method for robot based on binocular vision technology and LM algorithm | |
| CN117381800B (en) | A hand-eye calibration method and system | |
| CN108827155A (en) | A kind of robot vision measuring system and method | |
| CN102818524A (en) | On-line robot parameter calibration method based on visual measurement | |
| Fan et al. | High-precision external parameter calibration method for camera and LiDAR based on a calibration device | |
| CN108656116A (en) | Serial manipulator kinematic calibration method based on dimensionality reduction MCPC models | |
| CN106323286B (en) | A kind of robot coordinate system and the transform method of three-dimensional measurement coordinate system | |
| CN118322188A (en) | Kinematic calibration and error compensation method for serial-parallel fracture reduction robot | |
| CN119228893A (en) | Adaptive estimation method of spatial pose of manipulator based on fusion of motion information and visual information | |
| CN113359461B (en) | Kinematics calibration method suitable for bionic eye system | |
| CN118744434A (en) | Automatic plugging and unplugging method of charging gun for mobile charging robot based on active visual positioning technology | |
| CN113211433B (en) | Separated visual servo control method based on composite characteristics | |
| CN111975756B (en) | Hand-eye calibration system and method of 3D vision measurement system | |
| Jin et al. | Scara+ system: bin picking system of revolution-symmetry objects | |
| Zhao et al. | Robust geometry self-calibration based on differential kinematics for a redundant robotic inspection system | |
| CN116012461A (en) | Multi-camera system external parameter calibration method, device, computer equipment and medium | |
| CN112894814B (en) | Mechanical arm DH parameter identification method based on least square method | |
| CN115107024A (en) | A kinematic parameter identification method of industrial robot based on laser tracker multi-station technology | |
| CN117340879A (en) | Industrial machine ginseng number identification method and system based on graph optimization model |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |