CN115294217B - Visual experiment platform calibration method, positioning method and related equipment


Info

Publication number
CN115294217B
Authority
CN
China
Prior art keywords: distance, axis, image, platform, module
Prior art date
Legal status
Active
Application number
CN202211237174.0A
Other languages
Chinese (zh)
Other versions
CN115294217A
Inventor
吕小戈
杨旭韵
罗惠元
Current Assignee
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date
Filing date
Publication date
Application filed by Ji Hua Laboratory
Priority to CN202211237174.0A
Publication of CN115294217A
Application granted
Publication of CN115294217B

Classifications

    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01C 25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G06T 7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the field of visual experiment platform calibration, and in particular to a visual experiment platform calibration method, a positioning method and related equipment. The visual experiment platform calibration method comprises the steps of: placing a workpiece on the object stage, with at least 4 mark points provided on the side of the workpiece facing the camera; selecting any one mark point as a first identification point; establishing a platform coordinate system according to the first identification point; taking the mark points other than the first identification point as second identification points and, based on the platform coordinate system, acquiring first platform coordinates of the second identification points in the platform coordinate system; acquiring workpiece coordinates of the second identification points in the workpiece coordinate system; and acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinates and the workpiece coordinates. The invention provides a feasible calibration method for the split-module visual experiment platform, overcoming the technical problem that such a platform cannot be effectively calibrated by the traditional nine-point calibration method.

Description

Visual experiment platform calibration method, positioning method and related equipment
Technical Field
The invention relates to the field of visual experiment platform calibration, in particular to a visual experiment platform calibration method, a positioning method and related equipment.
Background
A commonly used calibration method for a visual experiment platform in the prior art is the nine-point calibration method (also called hand-eye calibration), which is essentially used to obtain conversion matrices between different coordinate systems, thereby enabling accurate, coordinate-based positioning across the image, the platform and the workpiece.
However, the nine-point calibration method can only be applied to a visual experiment platform in which the X-axis module, the Y-axis module and the Z-axis module are connected to one another and can move relative to the object stage; because the three modules move relative to the stage, the platform coordinates of every point on a workpiece placed on the stage can be acquired. For a split-module visual experiment platform, by contrast, the Y-axis module is fixed to the object stage, and a workpiece placed on the stage moves synchronously with the Y-axis module. The Y-axis module can therefore only provide the Y-axis coordinate value of the stage, not the specific Y-axis coordinate value of each position on the workpiece, so the nine-point calibration method cannot be applied to the split-module visual experiment platform.
Accordingly, there is a need for improvement and development in the art.
Disclosure of Invention
The invention aims to provide a visual experiment platform calibration method, a positioning method and related equipment that can effectively complete the calibration of a split-module visual experiment platform.
In a first aspect, the application provides a visual experiment platform calibration method, which is used for calibrating a split-module type visual experiment platform, wherein the visual experiment platform comprises a camera, an objective table, an X-axis module, a Y-axis module and a Z-axis module, the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate along the vertical direction; the Z-axis module is arranged on the X-axis module in a sliding mode, and the X-axis module can drive the Z-axis module to move in a reciprocating mode along the transverse direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to move back and forth along the longitudinal direction; the visual experiment platform calibration method comprises the following steps:
s1, placing a workpiece on an objective table, wherein at least 4 mark points are arranged on one side of the workpiece facing a camera;
s2, selecting any one mark point as a first identification point;
s3, establishing a platform coordinate system according to the first identification point;
s4, taking other mark points except the first identification point as second identification points, and acquiring first platform coordinates of the second identification points in the platform coordinate system based on the platform coordinate system;
s5, acquiring a workpiece coordinate of the second identification point in a workpiece coordinate system;
s6, acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinate and the workpiece coordinate.
Effective calibration of the split-module visual experiment platform is achieved by setting the mark points and establishing a platform coordinate system from them, so as to obtain the conversion matrix between the platform coordinate system and the workpiece coordinate system.
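For illustration, the following is a minimal sketch of how the conversion matrix of step S6 could be computed, assuming the workpiece-to-platform relation is modeled as a 2D affine transform estimated by least squares from matched point pairs; the function name, the point values and the use of Python/NumPy are illustrative assumptions rather than the patent's own implementation.

```python
import numpy as np

def estimate_conversion_matrix(workpiece_pts, platform_pts):
    """Least-squares 2D affine transform mapping workpiece coordinates to platform coordinates.

    workpiece_pts, platform_pts: matched (N, 2) point sets, N >= 3
    (the first identification point plus at least three second identification points).
    Returns a 2x3 matrix M such that platform ~= M @ [x, y, 1].
    """
    W = np.hstack([np.asarray(workpiece_pts, float),
                   np.ones((len(workpiece_pts), 1))])   # (N, 3) homogeneous workpiece coords
    P = np.asarray(platform_pts, float)                  # (N, 2) platform coords
    M, *_ = np.linalg.lstsq(W, P, rcond=None)            # solve W @ M ~= P
    return M.T                                           # (2, 3) conversion matrix

# Hypothetical data: the shared origin plus three second identification points.
workpiece = [(0.0, 0.0), (20.0, 0.0), (0.0, 20.0), (20.0, 20.0)]
platform = [(0.0, 0.0), (19.96, 0.31), (-0.28, 20.02), (19.70, 20.35)]
print(estimate_conversion_matrix(workpiece, platform))
```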
Further, the specific steps in step S3 include:
s31, controlling the X-axis module and the Y-axis module to move, and acquiring a first image when the first identification point appears in the visual field of the camera;
s32, acquiring a first pixel distance between the image center of the first image and the first identification point; the first pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
s33, calculating a first actual distance between the image center of the first image and the first identification point according to the first pixel distance; the first actual distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
s34, controlling the X-axis module and the Y-axis module according to the first actual distance, and acquiring a second image when the image center of the first image is superposed with the first identification point;
and S35, establishing the platform coordinate system by taking the first identification point as an origin based on the second image.
The first pixel distance between the image center of the first image and the first identification point is used for automatic alignment, which is faster and more accurate than manual operation.
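As a sketch of the automatic alignment in steps S32 to S34 (and, identically, S42 to S44), the pixel offset between the image center and the identification point is converted into an actual distance and sent to the motion modules; the `move_x`/`move_y` motion interface and the sign convention are assumptions, since the patent does not fix a particular API.

```python
def center_on_point(point_px, image_center_px, unit_px_dist, move_x, move_y):
    """Drive the X- and Y-axis modules so the image center coincides with the identification point.

    point_px, image_center_px: pixel coordinates of the identification point and of the image center.
    unit_px_dist: (mm_per_pixel_x, mm_per_pixel_y), the unit pixel distance from steps A1-A4.
    move_x, move_y: callables issuing a relative move to the X- and Y-axis modules (hypothetical).
    """
    dx_px = point_px[0] - image_center_px[0]   # first pixel distance, X component
    dy_px = point_px[1] - image_center_px[1]   # first pixel distance, Y component
    move_x(dx_px * unit_px_dist[0])            # first actual distance, X component
    move_y(dy_px * unit_px_dist[1])            # first actual distance, Y component
```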
Further, the specific steps in step S4 include:
sequentially executing the following steps for each second identification point:
s41, controlling the X-axis module and the Y-axis module to move, and acquiring a third image when the second identification point appears in the visual field of the camera;
s42, acquiring a second pixel distance between the image center of the third image and the second identification point; the second pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
s43, calculating a second actual distance between the image center of the third image and the second identification point according to the second pixel distance; the second actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s44, controlling the X-axis module and the Y-axis module according to the second actual distance, and acquiring a fourth image when the image center of the third image is coincident with the second identification point;
s45, acquiring the first platform coordinate of the second identification point in the platform coordinate system based on the fourth image.
The second pixel distance between the image center of the third image and the second identification point is likewise used for automatic alignment, giving higher speed and higher precision.
Further, the specific steps in step S45 include:
acquiring a first numerical value of the X-axis module corresponding to the second image, wherein the first numerical value is an X-axis coordinate value of the objective table corresponding to the second image;
acquiring a second numerical value of the Y-axis module corresponding to the second image, wherein the second numerical value is a Y-axis coordinate value of the objective table corresponding to the second image;
acquiring a third numerical value of the X-axis module corresponding to each fourth image, wherein the third numerical value is an X-axis coordinate value of the objective table corresponding to the fourth image;
acquiring a fourth numerical value of the Y-axis module corresponding to each fourth image, wherein the fourth numerical value is a Y-axis coordinate value of the objective table corresponding to the fourth image;
obtaining an X-axis coordinate value of each second identification point in the platform coordinate system by calculating a difference value between the first numerical value and each third numerical value;
and obtaining the Y-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference value between the second numerical value and each fourth numerical value.
This simple computation occupies few system resources and places relatively low demands on the system hardware.
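A sketch of step S45, under the assumption that the "difference value" is taken as the reading at the first identification point minus the reading at each second identification point; variable names are illustrative.

```python
def second_point_platform_coords(first_value, second_value, third_values, fourth_values):
    """Platform coordinates of each second identification point, origin at the first identification point.

    first_value, second_value: X- and Y-axis module readings for the second image.
    third_values, fourth_values: X- and Y-axis module readings for each fourth image.
    """
    return [(first_value - x_reading, second_value - y_reading)
            for x_reading, y_reading in zip(third_values, fourth_values)]
```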
Further, before step S33 and step S43, the method further includes the steps of:
A1. placing the calibration plate in the visual field of the camera to obtain a calibration image;
A2. acquiring a third actual distance between any two calibration points in the calibration plate; the third actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
A3. acquiring a third pixel distance between the two calibration points according to the calibration image; the third pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
A4. calculating to obtain a unit pixel distance according to the third actual distance and the third pixel distance, wherein the unit pixel distance is an actual distance corresponding to a unit pixel interval; the unit pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
the specific steps in step S33 include:
converting the first pixel distance to the first actual distance based on the unit pixel distance;
the specific steps in step S43 include:
converting the second pixel distance to the second actual distance based on the unit pixel distance.
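Steps A1 to A4 reduce to a per-axis ratio between a known physical spacing on the calibration plate and the corresponding spacing in pixels; a minimal sketch (the detection of the two calibration points in the calibration image is left out):

```python
def unit_pixel_distance(actual_dx, actual_dy, pixel_dx, pixel_dy):
    """Actual distance represented by one pixel, per axis.

    actual_dx, actual_dy: third actual distance between two calibration points (e.g. in mm).
    pixel_dx, pixel_dy: third pixel distance between the same two points in the calibration image.
    """
    return actual_dx / pixel_dx, actual_dy / pixel_dy

# Example: 10 mm spacing imaged as 250 px horizontally and 248 px vertically (hypothetical values).
print(unit_pixel_distance(10.0, 10.0, 250.0, 248.0))
```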
In a second aspect, the present application further provides a positioning method applied to a split-module type visual experiment platform calibrated by the visual experiment platform calibration method, where the positioning method includes the following steps:
B1. controlling the X-axis module and the Y-axis module to move, and when a target point appears in the visual field of the camera, acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module;
B2. acquiring a fourth pixel distance between the image center of the fifth image and the target point; the fourth pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
B3. converting the fourth pixel distance to a fourth actual distance based on the unit pixel distance; the fourth actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
B4. judging the quadrant of the target point in the fifth image;
B5. calculating a second platform coordinate of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image;
the specific steps in step B5 include:
calculating the second platform coordinates according to the following formulas:

Xp = X1 - D1 + Δx

Yp = Y1 - D2 + Δy

wherein Xp is the X-axis coordinate value in the second platform coordinate; Yp is the Y-axis coordinate value in the second platform coordinate; X1 is the first numerical value; D1 is the first movement distance; Δx is the first distance value, i.e. the distance in the X-axis direction of the fourth actual distance, taken as positive or negative according to the quadrant of the target point in the fifth image; Y1 is the second numerical value; D2 is the second movement distance; and Δy is the second distance value, i.e. the distance in the Y-axis direction of the fourth actual distance, likewise taken as positive or negative according to the quadrant of the target point in the fifth image.
The computation involved is simple, which helps the system produce results quickly and reduces processing time, lowering the delay between input and output.
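A minimal sketch of step B5, using the formulas as reconstructed above; how the encoder readings, movement distances and signed offsets are combined, and the function signature, are assumptions, since the patent presents the equations only as drawings.

```python
def target_platform_coords(x1, y1, d1, d2, dx_signed, dy_signed):
    """Second platform coordinate of the target point.

    x1, y1: first and second numerical values (module readings for the second image).
    d1, d2: first and second movement distances of the X- and Y-axis modules.
    dx_signed, dy_signed: fourth actual distance components, already signed according to
    the quadrant of the target point in the fifth image.
    """
    return x1 - d1 + dx_signed, y1 - d2 + dy_signed
```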
In a third aspect, the invention further provides a visual experiment platform calibration device, which is used for calibrating a visual experiment platform in a split module type, wherein the visual experiment platform comprises a camera, an objective table, an X-axis module, a Y-axis module and a Z-axis module, the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate along the vertical direction; the Z-axis module is arranged on the X-axis module in a sliding mode, and the X-axis module can drive the Z-axis module to move back and forth along the transverse direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to reciprocate along the longitudinal direction; the visual experiment platform calibration device comprises:
the preparation module is used for placing a workpiece on the objective table, and at least 4 mark points are arranged on one side of the workpiece facing the camera;
the selection module is used for selecting any one of the mark points as a first identification point;
the construction module is used for establishing a platform coordinate system according to the first identification point;
the first acquisition module is used for taking the mark points except the first identification point as second identification points and acquiring first platform coordinates of the second identification points in the platform coordinate system based on the platform coordinate system;
the second acquisition module is used for acquiring the workpiece coordinates of the second identification point in a workpiece coordinate system;
and the third acquisition module is used for acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinate and the workpiece coordinate.
The split-module visual experiment platform can thus be calibrated effectively, resolving the difficulty that such a platform cannot be calibrated with the traditional calibration method.
In a fourth aspect, the present invention further provides a positioning device, applied to a visual experiment platform in a split module type calibrated by the visual experiment platform calibration method, where the positioning device includes:
the fourth acquisition module is used for controlling the X-axis module and the Y-axis module to move, and acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module when a target point appears in the visual field of the camera;
a fifth obtaining module, configured to obtain a fourth pixel distance between an image center of the fifth image and the target point; the fourth pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
a conversion module for converting the fourth pixel distance to a fourth actual distance based on the unit pixel distance; the fourth actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
the judging module is used for judging the quadrant of the target point in the fifth image;
the calculation module is used for calculating a second platform coordinate of the target point under the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image;
the calculation module executes when calculating a second platform coordinate of the target point in the platform coordinate system according to the first movement distance of the X-axis module, the second movement distance of the Y-axis module, the fourth actual distance, and the quadrant of the target point in the fifth image:
calculating the second platform coordinates according to the following formulas:

Xp = X1 - D1 + Δx

Yp = Y1 - D2 + Δy

wherein Xp is the X-axis coordinate value in the second platform coordinate; Yp is the Y-axis coordinate value in the second platform coordinate; X1 is the first numerical value; D1 is the first movement distance; Δx is the first distance value, i.e. the distance in the X-axis direction of the fourth actual distance, taken as positive or negative according to the quadrant of the target point in the fifth image; Y1 is the second numerical value; D2 is the second movement distance; and Δy is the second distance value, i.e. the distance in the Y-axis direction of the fourth actual distance, likewise taken as positive or negative according to the quadrant of the target point in the fifth image.
The positioning process is simple to perform, facilitates rapid and accurate positioning, and places low demands on the equipment hardware.
In a fifth aspect, the present invention provides an electronic device, including a processor and a memory, where the memory stores computer-readable instructions, and the computer-readable instructions, when executed by the processor, perform the steps in the visual experiment platform calibration method and/or the positioning method described above.
In a sixth aspect, the present invention provides a storage medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, executes the steps of the above-mentioned visual experiment platform calibration method and/or positioning method.
Therefore, the calibration method applicable to the split-module visual experiment platform effectively overcomes the technical gap whereby such a platform could not be calibrated with the traditional nine-point calibration method, and accurate, effective calibration helps ensure that the split-module visual experiment platform can achieve accurate visual positioning.
Drawings
Fig. 1 is a flowchart of a calibration method for a visual experiment platform according to an embodiment of the present disclosure.
Fig. 2 is a flowchart of a positioning method according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a calibration apparatus for a visual experiment platform according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a positioning device according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not construed as indicating or implying relative importance.
It should be noted that, because the camera imaging has a certain distortion, in order to improve positioning accuracy the first image, the second image, the third image, the fourth image, the fifth image and the calibration image described below are all subjected to a perspective transformation to reduce errors caused by image distortion; perspective transformation is prior art, and the transformation process is not described further.
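Such a perspective correction can be carried out, for example, with OpenCV; the sketch below assumes four reference points (e.g. calibration plate corners) are available in every captured image, which the patent does not specify.

```python
import cv2
import numpy as np

def correct_perspective(image, src_corners, dst_corners, out_size):
    """Apply a perspective transformation to reduce errors caused by image distortion.

    src_corners: four reference points detected in the raw image.
    dst_corners: where those points should lie in the corrected image.
    out_size: (width, height) of the corrected image.
    """
    H = cv2.getPerspectiveTransform(np.float32(src_corners), np.float32(dst_corners))
    return cv2.warpPerspective(image, H, out_size)
```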
In the prior art, the X-axis module, the Y-axis module and the Z-axis module of the visual experiment platform are mostly arranged on a gantry to control the camera in the X-axis, Y-axis and Z-axis directions, while the object stage is fixed below the gantry. After a workpiece is placed on the stage, the camera can be driven to any position above the workpiece by controlling the X-axis, Y-axis and Z-axis modules, and the platform coordinates of any position on the workpiece can then be obtained by reading the values of the three modules (every visual experiment platform leaves the factory with a preset platform coordinate system; the values corresponding to its coordinate axes can be read from the encoders of the X-axis, Y-axis and Z-axis modules, and it can be understood as a physical coordinate system established by the three modules). In practical application the workpiece is placed on the stage and the stage is fixed, so the X-axis, Y-axis and Z-axis modules move relative to the workpiece and the platform coordinates of any point on the workpiece can be obtained, which is why this type of platform can be calibrated with the existing methods that relate image coordinates to platform coordinates.
For the split-module visual experiment platform, however, only the X-axis module and the Z-axis module are arranged on the gantry; the Y-axis module is separately and fixedly connected to the object stage and drives the stage along the Y-axis direction. On such a platform, when the camera is moved, only the X-axis and Z-axis coordinate values (in the platform coordinate system) of any position on the side of the workpiece facing the camera can be read from the encoders of the X-axis and Z-axis modules. The value read from the encoder of the Y-axis module is only the Y-axis coordinate value of the object stage in the platform coordinate system, not the Y-axis coordinate of the point on the workpiece corresponding to the current camera position. Only the X-axis and Z-axis coordinates of that point can be obtained, and because of this missing data the nine-point calibration method cannot be used to calibrate the split-module visual experiment platform.
In some embodiments, referring to fig. 1, a method for calibrating a visual experiment platform is used for calibrating a split-module type visual experiment platform, the visual experiment platform includes a camera, an object stage, an X-axis module, a Y-axis module and a Z-axis module, the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate along a vertical direction; the Z-axis module is arranged on the X-axis module in a sliding manner, and the X-axis module can drive the Z-axis module to reciprocate along the transverse direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to reciprocate along the longitudinal direction; the visual experiment platform calibration method comprises the following steps:
s1, placing a workpiece on an objective table, wherein at least 4 mark points are arranged on one side of the workpiece facing a camera;
s2, selecting any one mark point as a first identification point;
s3, establishing a platform coordinate system according to the first identification point;
s4, taking other mark points except the first identification point as second identification points, and acquiring first platform coordinates of the second identification points in a platform coordinate system based on the platform coordinate system;
s5, acquiring a workpiece coordinate of the second identification point in a workpiece coordinate system;
and S6, acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinate and the workpiece coordinate.
In this embodiment, at least 4 mark points need to be set in advance on the side of the workpiece facing the camera during calibration (one mark point is used as a reference point, and calibration can then be completed with only 3 additional sets of data; the data requirement is similar to that of the nine-point calibration method and is not described here again).
It should be noted that the platform coordinate system established from the first identification point is actually a "relative" platform coordinate system reconstructed within the platform coordinate system preset by the visual experiment platform itself; the first identification point is itself a point in that preset coordinate system. Re-establishing a "relative" platform coordinate system with the first identification point as reference point does not affect subsequent coordinate acquisition and positioning control; it only means that "relative" coordinates are acquired and "relative" positioning control is performed with the first identification point as the new reference, instead of the coordinate origin of the platform coordinate system preset by the visual experiment platform itself.
In certain embodiments, the specific steps in step S3 include:
s31, controlling the X-axis module and the Y-axis module to move, and acquiring a first image when a first identification point appears in the visual field of the camera;
s32, acquiring a first pixel distance between the image center of the first image and the first identification point; the first pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
s33, calculating a first actual distance between the image center of the first image and the first identification point according to the first pixel distance; the first actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s34, controlling an X-axis module and a Y-axis module according to the first actual distance, and acquiring a second image when the image center of the first image is coincident with the first identification point;
and S35, establishing a platform coordinate system by taking the first identification point as an origin on the basis of the second image.
In practical application, aligning the image center of the first image with the first identification point during calibration could also be done by manually controlling the X-axis, Y-axis and Z-axis modules; this embodiment instead aligns automatically using the first pixel distance between the image center of the first image and the first identification point, which is faster and more accurate than manual operation.
In certain embodiments, the specific steps in step S4 include:
and sequentially executing the following steps for each second identification point:
s41, controlling the X-axis module and the Y-axis module to move, and acquiring a third image when a second identification point appears in the visual field of the camera;
s42, acquiring a second pixel distance between the image center of the third image and the second identification point; the second pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s43, calculating a second actual distance between the image center of the third image and the second identification point according to the second pixel distance; the second actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s44, controlling the X-axis module and the Y-axis module according to the second actual distance, and acquiring a fourth image when the image center of the third image is coincident with the second identification point;
s45, acquiring first platform coordinates of the second identification point in a platform coordinate system based on the fourth image.
As in the above embodiment, the second pixel distance between the image center of the third image and the second identification point is used for automatic alignment, giving higher speed and higher precision.
In certain embodiments, the specific steps in step S45 include:
acquiring a first numerical value of the X-axis module corresponding to the second image, wherein the first numerical value is an X-axis coordinate value of the objective table corresponding to the second image;
acquiring a second numerical value of the Y-axis module corresponding to the second image, wherein the second numerical value is a Y-axis coordinate value of the objective table corresponding to the second image;
acquiring a third numerical value of the X-axis module corresponding to each fourth image, wherein the third numerical value is an X-axis coordinate value of the objective table corresponding to the fourth image;
acquiring a fourth numerical value of the Y-axis module corresponding to each fourth image, wherein the fourth numerical value is a Y-axis coordinate value of the objective table corresponding to the fourth image;
obtaining the X-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference value between the first numerical value and each third numerical value;
and obtaining the Y-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference value between the second numerical value and each fourth numerical value.
It should be noted that the first numerical value of the X-axis module corresponding to the second image refers to the value read from the X-axis module at the moment the visual experiment platform captures the second image; the second numerical value, the third numerical value and the fourth numerical value are defined in the same way and are not described again here.
The calculated difference between the first numerical value and each third numerical value reflects the distance along the X-axis direction between each second identification point and the first identification point, i.e. the X-axis coordinate value of each second identification point in the platform coordinate system established with the first identification point as reference point; similarly, the calculated difference between the second numerical value and each fourth numerical value reflects the distance along the Y-axis direction between each second identification point and the first identification point, i.e. the Y-axis coordinate value of each second identification point in that platform coordinate system. The coordinates of each second identification point in the platform coordinate system established with the first identification point as reference point are thereby obtained.
In some embodiments, before step S33 and step S43, the method further comprises the steps of:
A1. placing the calibration plate in the visual field of a camera to obtain a calibration image;
A2. acquiring a third actual distance between any two calibration points in the calibration plate; the third actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
A3. acquiring a third pixel distance between the two calibration points according to the calibration image; the third pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
A4. calculating to obtain a unit pixel distance according to the third actual distance and the third pixel distance, wherein the unit pixel distance is an actual distance corresponding to the unit pixel interval; the unit pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
the specific steps in step S33 include:
converting the first pixel distance into a first actual distance based on the unit pixel distance;
the specific steps in step S43 include:
the second pixel distance is converted into a second actual distance based on the unit pixel distance.
In this embodiment, to ensure the accuracy of the automatic alignment process in the above embodiments, the actual distance corresponding to one pixel in the image acquired by the camera, i.e. the unit pixel distance, needs to be obtained in advance. A pixel distance can then be accurately converted into an actual distance, and that actual distance is fed to the X-axis module, the Y-axis module and the Z-axis module so that the movement can be controlled accurately and the image center aligned with the identification point (i.e. the first identification point or a second identification point).
Referring to fig. 2, fig. 2 is a schematic diagram of a positioning method applied to a visual experiment platform in a split module type calibrated by a visual experiment platform calibration method according to an embodiment of the present application, including the steps of:
B1. controlling the X-axis module and the Y-axis module to move, and acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module when a target point (the target point refers to a position point which needs to be positioned by a user) appears in the visual field of the camera;
B2. acquiring a fourth pixel distance between the image center of the fifth image and the target point; the fourth pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
B3. converting the fourth pixel distance to a fourth actual distance based on the unit pixel distance; the fourth actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
B4. judging the quadrant of the target point in the fifth image;
B5. and calculating a second platform coordinate of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image.
In this embodiment, once the visual experiment platform has been calibrated as in the above embodiments and the unit pixel distance has been obtained, it can be used for positioning control. During positioning it is only necessary for the target point to appear in the camera's field of view; the camera does not have to be moved until the image center coincides with the target point. In practical application, for example, a user obtains the workpiece coordinates of a target point from a drawing and inputs them into the visual experiment platform; the platform converts the input workpiece coordinates into the corresponding platform coordinates using the conversion matrix obtained by calibration, and finally controls the X-axis, Y-axis and Z-axis modules according to the converted platform coordinates so that the center of the image acquired by the camera is aligned with the target point, completing positioning on the visual experiment platform. Conversely, the user may determine the platform coordinates of a target point on the visual experiment platform; the platform converts those platform coordinates into the corresponding workpiece coordinates using the conversion matrix obtained by calibration and feeds them back to the user, who can then accurately locate the target point on the corresponding drawing, completing positioning on the workpiece drawing.
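To illustrate the two directions of use described above, the sketch below applies a 2x3 affine conversion matrix (as in the earlier calibration sketch) forward, from workpiece to platform coordinates, and backward, from platform to workpiece coordinates; the matrix form and function names are assumptions.

```python
import numpy as np

def workpiece_to_platform(M, workpiece_xy):
    """Convert a workpiece coordinate to a platform coordinate using the 2x3 conversion matrix M."""
    x, y = workpiece_xy
    return tuple(M @ np.array([x, y, 1.0]))

def platform_to_workpiece(M, platform_xy):
    """Inverse conversion, used when feeding a platform coordinate back to the workpiece drawing."""
    M_full = np.vstack([M, [0.0, 0.0, 1.0]])   # promote the affine matrix to 3x3
    x, y = platform_xy
    return tuple(np.linalg.solve(M_full, np.array([x, y, 1.0]))[:2])
```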
It should be noted that the quadrants in the fifth image are defined with respect to the image center of the fifth image: the image center is taken as the origin, a cross mark is placed at the origin, and the two line segments forming the cross are respectively parallel to the moving direction of the Y-axis module and the moving direction of the X-axis module.
In certain embodiments, the specific steps in step B5 include:
the second platform coordinates are calculated according to the following formulas:

Xp = X1 - D1 + Δx

Yp = Y1 - D2 + Δy

wherein Xp is the X-axis coordinate value in the second platform coordinate; Yp is the Y-axis coordinate value in the second platform coordinate; X1 is the first numerical value; D1 is the first movement distance; Δx is the first distance value, i.e. the distance in the X-axis direction of the fourth actual distance, taken as positive or negative according to the quadrant of the target point in the fifth image; Y1 is the second numerical value; D2 is the second movement distance; and Δy is the second distance value, i.e. the distance in the Y-axis direction of the fourth actual distance, likewise taken as positive or negative according to the quadrant of the target point in the fifth image.
It should be noted that Δx and Δy carry their own signs: the quadrant of the target point in the fifth image, determined in the above embodiment, decides whether Δx and Δy take positive or negative values according to the following rule: if the target point falls in the first quadrant or the fourth quadrant, Δx is positive, otherwise Δx is negative; if the target point falls in the first quadrant or the second quadrant, Δy is positive, otherwise Δy is negative.
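A sketch of how the quadrant of the target point, and hence the signs of Δx and Δy, could be determined from its pixel position relative to the image center; the quadrant layout (first quadrant upper right, counted counter-clockwise) is an assumption consistent with the sign rule above, and pixel y is flipped because it grows downward in image coordinates.

```python
def signed_offsets(dx_mm, dy_mm, point_px, image_center_px):
    """Return (Δx, Δy) with signs set by the quadrant of the target point in the fifth image."""
    right = point_px[0] >= image_center_px[0]   # target to the right of the image center
    up = point_px[1] <= image_center_px[1]      # target above the image center (pixel y grows downward)
    quadrant = 1 if (right and up) else 2 if (not right and up) else 3 if (not right and not up) else 4
    dx = abs(dx_mm) if quadrant in (1, 4) else -abs(dx_mm)
    dy = abs(dy_mm) if quadrant in (1, 2) else -abs(dy_mm)
    return dx, dy
```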
Referring to fig. 3, fig. 3 is a schematic diagram of a visual experiment platform calibration apparatus for calibrating a split-module visual experiment platform according to some embodiments of the present application. The visual experiment platform includes a camera, an object stage, an X-axis module, a Y-axis module and a Z-axis module; the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate along the vertical direction; the Z-axis module is slidably arranged on the X-axis module, and the X-axis module can drive the Z-axis module to reciprocate along the transverse direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to reciprocate along the longitudinal direction. The visual experiment platform calibration apparatus is integrated, in the form of a computer program, in the back-end control equipment of the visual experiment platform, and comprises:
a preparation module 100, configured to place a workpiece on the stage, where a side of the workpiece facing the camera is provided with at least 4 mark points;
a selecting module 200, configured to select any one of the mark points as a first identification point;
a building module 300, configured to build a platform coordinate system according to the first identification point;
a first obtaining module 400, configured to take other mark points except the first identification point as second identification points, and obtain, based on the platform coordinate system, first platform coordinates of the second identification points in the platform coordinate system;
a second obtaining module 500, configured to obtain workpiece coordinates of the second identification point in the workpiece coordinate system;
and a third obtaining module 600, configured to obtain a transformation matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinate and the workpiece coordinate.
In certain embodiments, when establishing the platform coordinate system according to the first identification point, the building module 300 performs:
s31, controlling the X-axis module and the Y-axis module to move, and acquiring a first image when a first identification point appears in the visual field of the camera;
s32, acquiring a first pixel distance between the image center of the first image and the first identification point; the first pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
s33, calculating a first actual distance between the image center of the first image and the first identification point according to the first pixel distance; the first actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s34, controlling the X-axis module and the Y-axis module according to the first actual distance, and acquiring a second image when the image center of the first image is coincident with the first identification point;
and S35, establishing a platform coordinate system by taking the first identification point as an origin on the basis of the second image.
In some embodiments, when taking the mark points other than the first identification point as second identification points and acquiring, based on the platform coordinate system, the first platform coordinates of the second identification points in the platform coordinate system, the first obtaining module 400 performs:
and sequentially executing the following steps for each second identification point:
s41, controlling the X-axis module and the Y-axis module to move, and acquiring a third image when a second identification point appears in the visual field of the camera;
s42, acquiring a second pixel distance between the image center of the third image and the second identification point; the second pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s43, calculating a second actual distance between the image center of the third image and the second identification point according to the second pixel distance; the second actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
s44, controlling the X-axis module and the Y-axis module according to the second actual distance, and acquiring a fourth image when the image center of the third image is coincident with the second identification point;
s45, acquiring first platform coordinates of the second identification point in a platform coordinate system based on the fourth image.
In some embodiments, when acquiring the first platform coordinates of the second identification points in the platform coordinate system based on the fourth images, the first obtaining module 400 performs:
acquiring a first numerical value of the X-axis module corresponding to the second image, wherein the first numerical value is an X-axis coordinate value of the objective table corresponding to the second image;
acquiring a second numerical value of the Y-axis module corresponding to the second image, wherein the second numerical value is a Y-axis coordinate value of the objective table corresponding to the second image;
acquiring a third numerical value of the X-axis module corresponding to each fourth image, wherein the third numerical value is an X-axis coordinate value of the objective table corresponding to the fourth image;
acquiring a fourth numerical value of the Y-axis module corresponding to each fourth image, wherein the fourth numerical value is a Y-axis coordinate value of the objective table corresponding to the fourth image;
obtaining the X-axis coordinate value of each second identification point under the platform coordinate system by calculating the difference value between the first numerical value and each third numerical value;
and obtaining the Y-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference value between the second numerical value and each fourth numerical value.
In some embodiments, before the construction module 300 calculates the first actual distance between the image center of the first image and the first identification point according to the first pixel distance (the first actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction), and before the first obtaining module 400 calculates the second actual distance between the image center of the third image and the second identification point according to the second pixel distance (the second actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction), the following steps are performed:
A1. placing the calibration plate in the visual field of a camera to obtain a calibration image;
A2. acquiring a third actual distance between any two calibration points in the calibration plate; the third actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
A3. acquiring a third pixel distance between the two calibration points according to the calibration image; the third pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
A4. calculating to obtain a unit pixel distance according to the third actual distance and the third pixel distance, wherein the unit pixel distance is an actual distance corresponding to the unit pixel interval; the unit pixel distance comprises a distance in an X-axis direction and a distance in a Y-axis direction;
the construction module 300 is used for calculating a first actual distance between the image center of the first image and the first identification point according to the first pixel distance; when the first actual distance comprises the distance in the X-axis direction and the distance in the Y-axis direction, the following steps are performed:
converting the first pixel distance into a first actual distance based on the unit pixel distance;
the first obtaining module 400 is configured to calculate a second actual distance between the image center of the third image and the second recognition point according to the second pixel distance; performing when the second actual distance includes a distance in the X-axis direction and a distance in the Y-axis direction:
the second pixel distance is converted into a second actual distance based on the unit pixel distance.
Referring to fig. 4, fig. 4 is a positioning apparatus in some embodiments of the present application, applied to a split-module visual experiment platform calibrated by the visual experiment platform calibration method; the positioning apparatus is integrated, in the form of a computer program, in the back-end control device of the visual experiment platform, and includes:
a fourth obtaining module 700, configured to control the X-axis module and the Y-axis module to move, and when a target point appears in the field of view of the camera, obtain a fifth image and calculate a first moving distance of the X-axis module and a second moving distance of the Y-axis module;
a fifth obtaining module 800, configured to obtain a fourth pixel distance between the image center of the fifth image and the target point; the fourth pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
a conversion module 900 for converting the fourth pixel distance into a fourth actual distance based on the unit pixel distance; the fourth actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction;
the judging module 1000 is configured to judge a quadrant of the target point in the fifth image;
the calculating module 1100 is configured to calculate a second platform coordinate of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance, and a quadrant of the target point in the fifth image;
the calculation module 1100 is executed when calculating the second platform coordinate of the target point in the platform coordinate system according to the first movement distance of the X-axis module, the second movement distance of the Y-axis module, the fourth actual distance, and the quadrant of the target point in the fifth image:
the second platform coordinates are calculated according to the following formula:
Figure 733183DEST_PATH_IMAGE002
Figure 426332DEST_PATH_IMAGE004
wherein,
Figure 956671DEST_PATH_IMAGE006
is the X-axis coordinate value in the second platform coordinate;
Figure 709863DEST_PATH_IMAGE008
the coordinate value of the Y axis in the second platform coordinate;
Figure 840499DEST_PATH_IMAGE010
is a first value;
Figure 602919DEST_PATH_IMAGE012
is a first movement distance;
Figure 253343DEST_PATH_IMAGE014
the first distance value in the X-axis direction in the fourth actual distance is positive or negative according to the quadrant of the target point in the fifth image;
Figure 177437DEST_PATH_IMAGE016
is a second value;
Figure 546101DEST_PATH_IMAGE018
is the second movement distance;
Figure 830321DEST_PATH_IMAGE020
is the fourth embodimentAnd a second distance value in the Y-axis direction in the inter-distance, wherein the second distance value is positive or negative according to the quadrant of the target point in the fifth image.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The present disclosure provides an electronic device comprising a processor 1301 and a memory 1302, the processor 1301 and the memory 1302 being interconnected and communicating with each other through a communication bus 1303 and/or another connection mechanism (not shown); the memory 1302 stores a computer program executable by the processor 1301, and when the computing apparatus runs, the processor 1301 executes the computer program to perform the visual experiment platform calibration method in any optional implementation of the embodiments of the first aspect, so as to implement the following functions: placing a workpiece on the object stage, wherein at least 4 mark points are arranged on the side of the workpiece facing the camera; selecting any one mark point as a first identification point; establishing a platform coordinate system according to the first identification point; taking the mark points other than the first identification point as second identification points, and acquiring first platform coordinates of the second identification points in the platform coordinate system based on the platform coordinate system; acquiring the workpiece coordinates of the second identification points in the workpiece coordinate system; and acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinates and the workpiece coordinates.
And/or to perform the positioning method in any of the alternative implementations of the embodiments of the second aspect described above, so as to implement the following functions: controlling the X-axis module and the Y-axis module to move, and, when a target point appears in the field of view of the camera, acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module; acquiring a fourth pixel distance between the image center of the fifth image and the target point, the fourth pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction; converting the fourth pixel distance into a fourth actual distance based on the unit pixel distance, the fourth actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction; judging the quadrant of the target point in the fifth image; and calculating second platform coordinates of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image; the second platform coordinates are calculated according to the following formula:
x_t = x_1 - L_1 + Δx;
y_t = y_1 - L_2 + Δy;
wherein,
x_t is the X-axis coordinate value in the second platform coordinates;
y_t is the Y-axis coordinate value in the second platform coordinates;
x_1 is the first value;
L_1 is the first moving distance;
Δx is the first distance value in the X-axis direction in the fourth actual distance, and the first distance value is taken as positive or negative according to the quadrant of the target point in the fifth image;
y_1 is the second value;
L_2 is the second moving distance;
Δy is the second distance value in the Y-axis direction in the fourth actual distance, and the second distance value is taken as positive or negative according to the quadrant of the target point in the fifth image.
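The unit pixel distance used in this conversion comes from the calibration-plate step described earlier: a known spacing between two calibration points is divided by the corresponding pixel spacing, per axis, and the resulting scale turns any pixel offset into an actual offset. A minimal sketch with made-up numbers (the values are illustrative, not from the patent):

```python
# Minimal sketch of the unit-pixel-distance idea (illustrative values only).

def unit_pixel_distance(actual_dx, actual_dy, pixel_dx, pixel_dy):
    """Actual distance per pixel on each axis, from two calibration-plate points."""
    return actual_dx / pixel_dx, actual_dy / pixel_dy

def pixels_to_actual(dx_px, dy_px, kx, ky):
    """Convert a pixel offset (e.g. image centre to target point) to an actual offset."""
    return dx_px * kx, dy_px * ky

# Example: 10 mm between calibration points spanning 400 px on each axis
kx, ky = unit_pixel_distance(10.0, 10.0, 400.0, 400.0)  # 0.025 mm per pixel per axis
print(pixels_to_actual(-120.0, 80.0, kx, ky))           # about (-3.0, 2.0) mm
```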
An embodiment of the present application provides a storage medium, where a computer program is stored, and when the computer program is executed by a processor, the method for calibrating a visual experiment platform in any optional implementation manner of the embodiment of the first aspect is executed, so as to implement the following functions: placing a workpiece on an objective table, wherein at least 4 mark points are arranged on one side of the workpiece facing a camera; selecting any one mark point as a first identification point; establishing a platform coordinate system according to the first identification point; taking other mark points except the first identification point as second identification points, and acquiring first platform coordinates of the second identification points in a platform coordinate system based on the platform coordinate system; acquiring the workpiece coordinates of the second identification point in a workpiece coordinate system; and acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinate and the workpiece coordinate.
And/or performing the positioning method in any optional implementation manner of the embodiment of the second aspect to implement the following functions: controlling the X-axis module and the Y-axis module to move, and acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module when a target point appears in the visual field of the camera; acquiring a fourth pixel distance between the image center of the fifth image and the target point; the fourth pixel distance comprises a distance in the X-axis direction and a distance in the Y-axis direction; converting the fourth pixel distance to a fourth actual distance based on the unit pixel distance; the fourth actual distance comprises a distance in the X-axis direction and a distance in the Y-axis direction; judging the quadrant of the target point in the fifth image; calculating a second platform coordinate of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image; the second platform coordinates are calculated according to the following formula:
x_t = x_1 - L_1 + Δx;
y_t = y_1 - L_2 + Δy;
wherein,
x_t is the X-axis coordinate value in the second platform coordinates;
y_t is the Y-axis coordinate value in the second platform coordinates;
x_1 is the first value;
L_1 is the first moving distance;
Δx is the first distance value in the X-axis direction in the fourth actual distance, and the first distance value is taken as positive or negative according to the quadrant of the target point in the fifth image;
y_1 is the second value;
L_2 is the second moving distance;
Δy is the second distance value in the Y-axis direction in the fourth actual distance, and the second distance value is taken as positive or negative according to the quadrant of the target point in the fifth image.
The storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may also be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a division by logical function, and other division manners are possible in actual implementation; for another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A visual experiment platform calibration method, used for calibrating a separated modular visual experiment platform, the visual experiment platform comprising a camera, an object stage, an X-axis module, a Y-axis module and a Z-axis module, wherein the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate in the vertical direction; the Z-axis module is slidably arranged on the X-axis module, and the X-axis module can drive the Z-axis module to reciprocate in the lateral direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to reciprocate in the longitudinal direction;
characterized in that the visual experiment platform calibration method comprises the steps of:
S1. placing a workpiece on the object stage, wherein at least 4 mark points are arranged on the side of the workpiece facing the camera;
S2. selecting any one of the mark points as a first identification point;
S3. establishing a platform coordinate system according to the first identification point;
S4. taking the mark points other than the first identification point as second identification points, and acquiring, based on the platform coordinate system, first platform coordinates of the second identification points in the platform coordinate system;
S5. acquiring workpiece coordinates of the second identification points in a workpiece coordinate system;
S6. acquiring a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinates and the workpiece coordinates.

2. The visual experiment platform calibration method according to claim 1, characterized in that step S3 specifically comprises:
S31. controlling the X-axis module and the Y-axis module to move, and acquiring a first image when the first identification point appears in the field of view of the camera;
S32. acquiring a first pixel distance between the image center of the first image and the first identification point, the first pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
S33. calculating a first actual distance between the image center of the first image and the first identification point according to the first pixel distance, the first actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
S34. controlling the X-axis module and the Y-axis module according to the first actual distance, and acquiring a second image when the image center of the first image coincides with the first identification point;
S35. establishing, based on the second image, the platform coordinate system with the first identification point as the origin.

3. The visual experiment platform calibration method according to claim 2, characterized in that step S4 specifically comprises:
performing the following steps for each of the second identification points in turn:
S41. controlling the X-axis module and the Y-axis module to move, and acquiring a third image when the second identification point appears in the field of view of the camera;
S42. acquiring a second pixel distance between the image center of the third image and the second identification point, the second pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
S43. calculating a second actual distance between the image center of the third image and the second identification point according to the second pixel distance, the second actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
S44. controlling the X-axis module and the Y-axis module according to the second actual distance, and acquiring a fourth image when the image center of the third image coincides with the second identification point;
S45. acquiring, based on the fourth image, the first platform coordinates of the second identification point in the platform coordinate system.

4. The visual experiment platform calibration method according to claim 3, characterized in that step S45 specifically comprises:
acquiring a first value of the X-axis module corresponding to the second image, the first value being the X-axis coordinate value of the object stage corresponding to the second image;
acquiring a second value of the Y-axis module corresponding to the second image, the second value being the Y-axis coordinate value of the object stage corresponding to the second image;
acquiring a third value of the X-axis module corresponding to each fourth image, the third value being the X-axis coordinate value of the object stage corresponding to the fourth image;
acquiring a fourth value of the Y-axis module corresponding to each fourth image, the fourth value being the Y-axis coordinate value of the object stage corresponding to the fourth image;
obtaining the X-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference between the first value and each third value;
obtaining the Y-axis coordinate value of each second identification point in the platform coordinate system by calculating the difference between the second value and each fourth value.

5. The visual experiment platform calibration method according to claim 4, characterized by further comprising, before step S33 and step S43, the steps of:
A1. placing a calibration plate in the field of view of the camera to obtain a calibration image;
A2. acquiring a third actual distance between any two calibration points on the calibration plate, the third actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
A3. acquiring a third pixel distance between the two calibration points according to the calibration image, the third pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
A4. calculating a unit pixel distance according to the third actual distance and the third pixel distance, the unit pixel distance being the actual distance corresponding to a unit pixel interval and comprising a distance in the X-axis direction and a distance in the Y-axis direction;
wherein step S33 specifically comprises: converting the first pixel distance into the first actual distance based on the unit pixel distance;
and step S43 specifically comprises: converting the second pixel distance into the second actual distance based on the unit pixel distance.

6. A positioning method, characterized by being applied to a separated modular visual experiment platform calibrated by the visual experiment platform calibration method according to claim 5, comprising the steps of:
B1. controlling the X-axis module and the Y-axis module to move, and, when a target point appears in the field of view of the camera, acquiring a fifth image and calculating a first moving distance of the X-axis module and a second moving distance of the Y-axis module;
B2. acquiring a fourth pixel distance between the image center of the fifth image and the target point, the fourth pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
B3. converting the fourth pixel distance into a fourth actual distance based on the unit pixel distance, the fourth actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
B4. judging the quadrant of the target point in the fifth image;
B5. calculating second platform coordinates of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image;
wherein step B5 specifically comprises calculating the second platform coordinates according to the following formula:
x_t = x_1 - L_1 + Δx;
y_t = y_1 - L_2 + Δy;
wherein,
x_t is the X-axis coordinate value in the second platform coordinates;
y_t is the Y-axis coordinate value in the second platform coordinates;
x_1 is the first value;
L_1 is the first moving distance;
Δx is the first distance value in the X-axis direction in the fourth actual distance, and the first distance value is taken as positive or negative according to the quadrant of the target point in the fifth image;
y_1 is the second value;
L_2 is the second moving distance;
Δy is the second distance value in the Y-axis direction in the fourth actual distance, and the second distance value is taken as positive or negative according to the quadrant of the target point in the fifth image.
7. A visual experiment platform calibration device, used for calibrating a separated modular visual experiment platform, the visual experiment platform comprising a camera, an object stage, an X-axis module, a Y-axis module and a Z-axis module, wherein the camera is mounted on the Z-axis module, and the Z-axis module can drive the camera to reciprocate in the vertical direction; the Z-axis module is slidably arranged on the X-axis module, and the X-axis module can drive the Z-axis module to reciprocate in the lateral direction; the object stage is independently arranged below the camera, the Y-axis module is fixedly connected with the object stage, and the Y-axis module can drive the object stage to reciprocate in the longitudinal direction;
characterized in that the visual experiment platform calibration device comprises:
a preparation module, configured to place a workpiece on the object stage, wherein at least 4 mark points are arranged on the side of the workpiece facing the camera;
a selection module, configured to select any one of the mark points as a first identification point;
a construction module, configured to establish a platform coordinate system according to the first identification point;
a first acquisition module, configured to take the mark points other than the first identification point as second identification points and acquire, based on the platform coordinate system, first platform coordinates of the second identification points in the platform coordinate system;
a second acquisition module, configured to acquire workpiece coordinates of the second identification points in a workpiece coordinate system;
a third acquisition module, configured to acquire a conversion matrix between the workpiece coordinate system and the platform coordinate system according to the first platform coordinates and the workpiece coordinates.

8. A positioning device, characterized by being applied to a separated modular visual experiment platform calibrated by the visual experiment platform calibration method according to claim 5, the positioning device comprising:
a fourth acquisition module, configured to control the X-axis module and the Y-axis module to move, and, when a target point appears in the field of view of the camera, acquire a fifth image and calculate a first moving distance of the X-axis module and a second moving distance of the Y-axis module;
a fifth acquisition module, configured to acquire a fourth pixel distance between the image center of the fifth image and the target point, the fourth pixel distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
a conversion module, configured to convert the fourth pixel distance into a fourth actual distance based on the unit pixel distance, the fourth actual distance comprising a distance in the X-axis direction and a distance in the Y-axis direction;
a judging module, configured to judge the quadrant of the target point in the fifth image;
a calculation module, configured to calculate second platform coordinates of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image;
wherein, when calculating the second platform coordinates of the target point in the platform coordinate system according to the first moving distance of the X-axis module, the second moving distance of the Y-axis module, the fourth actual distance and the quadrant of the target point in the fifth image, the calculation module performs the following:
calculating the second platform coordinates according to the following formula:
x_t = x_1 - L_1 + Δx;
y_t = y_1 - L_2 + Δy;
wherein,
x_t is the X-axis coordinate value in the second platform coordinates;
y_t is the Y-axis coordinate value in the second platform coordinates;
x_1 is the first value;
L_1 is the first moving distance;
Δx is the first distance value in the X-axis direction in the fourth actual distance, and the first distance value is taken as positive or negative according to the quadrant of the target point in the fifth image;
y_1 is the second value;
L_2 is the second moving distance;
Δy is the second distance value in the Y-axis direction in the fourth actual distance, and the second distance value is taken as positive or negative according to the quadrant of the target point in the fifth image.

9. An electronic device, characterized by comprising a processor and a memory, wherein the memory stores computer-readable instructions, and when the computer-readable instructions are executed by the processor, the steps of the visual experiment platform calibration method according to any one of claims 1-5 and/or the steps of the positioning method according to claim 6 are executed.

10. A storage medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the steps of the visual experiment platform calibration method according to any one of claims 1-5 and/or the steps of the positioning method according to claim 6 are executed.
CN202211237174.0A 2022-10-10 2022-10-10 Visual experiment platform calibration method, positioning method and related equipment Active CN115294217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211237174.0A CN115294217B (en) 2022-10-10 2022-10-10 Visual experiment platform calibration method, positioning method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211237174.0A CN115294217B (en) 2022-10-10 2022-10-10 Visual experiment platform calibration method, positioning method and related equipment

Publications (2)

Publication Number Publication Date
CN115294217A CN115294217A (en) 2022-11-04
CN115294217B true CN115294217B (en) 2022-12-09

Family

ID=83819228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211237174.0A Active CN115294217B (en) 2022-10-10 2022-10-10 Visual experiment platform calibration method, positioning method and related equipment

Country Status (1)

Country Link
CN (1) CN115294217B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105869150A (en) * 2016-03-24 2016-08-17 杭州南江机器人股份有限公司 Movable platform calibration device and calibration method based on visual recognition
CN106709955A (en) * 2016-12-28 2017-05-24 天津众阳科技有限公司 Space coordinate system calibrate system and method based on binocular stereo visual sense
CN111260734A (en) * 2020-01-13 2020-06-09 深圳市精昱智能技术有限公司 Calibration method of XY theta platform machine vision alignment system
CN111862220A (en) * 2020-07-31 2020-10-30 广东利元亨智能装备股份有限公司 Correction method, device, correction method and alignment system for UVW platform calibration
CN114663500A (en) * 2022-04-02 2022-06-24 深圳众为兴技术股份有限公司 Vision calibration method, computer device and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9188973B2 (en) * 2011-07-08 2015-11-17 Restoration Robotics, Inc. Calibration and transformation of a camera system's coordinate system

Also Published As

Publication number Publication date
CN115294217A (en) 2022-11-04

Similar Documents

Publication Publication Date Title
EP3998580A2 (en) Camera calibration method and apparatus, electronic device, storage medium, program product, and road side device
US10984554B2 (en) Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
CN114220757B (en) Wafer detection alignment method, device and system and computer medium
CN102800075B (en) Image split-joint method based on line-scan digital camera shooting and device
JPH08210816A (en) Coordinate system connection method for determining relationship between sensor coordinate system and robot tip part in robot-visual sensor system
CN112581544B (en) Camera calibration method without public view field based on parameter optimization
CN114612447A (en) Image processing method and device based on data calibration and image processing equipment
CN113172636B (en) Automatic hand-eye calibration method and device and storage medium
CN110695520B (en) Vision-based full-automatic galvanometer field calibration system and calibration method thereof
CN116233392B (en) Calibration method and device of virtual shooting system, electronic equipment and storage medium
CN112950724A (en) Screen printing visual calibration method and device
CN113643384B (en) Coordinate system calibration method, automatic assembly method and device
CN113664838A (en) Robot positioning placement control method and device, electronic equipment and storage medium
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium
CN113034585B (en) Offset state test method, test equipment and storage medium
CN104469153A (en) Quick focusing method and system
CN104698694B (en) A kind of liquid crystal panel pairing device and method
CN118115573A (en) Method, system, equipment and storage medium for determining object pose
CN106500729A (en) A kind of smart mobile phone self-inspection calibration method without the need for control information
CN106507656A (en) Teaching of assembly positions
CN115294217B (en) Visual experiment platform calibration method, positioning method and related equipment
CN109520416B (en) Method based on visual compensation correction, fitting system and control equipment
CN111189413B (en) Optimization method and terminal equipment of dual-camera linear structured light measurement system
CN119559265B (en) Automatic calibration method, device and storage medium based on inspection camera inspection edge
CN105093480A (en) Method for improving optical lens focusing accuracy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
OL01 Intention to license declared