CN111947649A - Robot positioning method based on data fusion, chip and robot - Google Patents
Robot positioning method based on data fusion, chip and robot
- Publication number
- CN111947649A (application CN202010570130.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- data
- obstacle
- laser
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a robot positioning method based on data fusion, a chip and a robot. The method comprises the following steps: S1: the robot acquires laser data of obstacles in the surrounding environment by emitting and receiving laser light through a lidar; S2: the robot walks, acquiring IMU data during the walking process; S3: the robot fuses the laser data and the IMU data to determine its position relative to the obstacle. The robot acquires the laser data as basic data through an ordinary lidar and positions itself by fusing in the IMU data, which reduces the production cost of the robot and increases its positioning speed.
Description
Technical Field
The invention relates to the technical field of intelligent robots, in particular to a robot positioning method based on data fusion, a chip and a robot.
Background
In the prior art, while a robot is moving it often needs to determine its position relative to obstacles in order to carry out the corresponding operations, and to make this positioning more accurate the robot can use a lidar as its positioning and detection module. However, a commonly used lidar has a frame rate of only about 5 revolutions per second, so its data are transmitted slowly, which is clearly insufficient in situations that require a fast response, such as obstacle avoidance, escaping from being stuck, and edge following. If a lidar with a higher frame rate is adopted instead, the response is faster, but such a lidar is considerably more expensive, which raises the production cost of the robot. A method is therefore needed that lets the robot position itself by fusing other data on top of a low-cost lidar.
Disclosure of Invention
In order to solve the above problems, the present invention provides a robot positioning method based on data fusion, a chip and a robot, which greatly improve the positioning speed of the robot. The specific technical scheme of the invention is as follows:
a robot positioning method based on data fusion comprises the following steps: S1: the robot acquires laser data of obstacles in the surrounding environment by emitting and receiving laser light through a lidar; S2: the robot walks, acquiring IMU data during the walking process; S3: the robot fuses the laser data and the IMU data to determine its position relative to the obstacle. The robot acquires the laser data as basic data through an ordinary lidar and positions itself by fusing in the IMU data, which reduces the production cost of the robot and increases its positioning speed.
In one or more aspects of the present invention, step S1 comprises the following steps: the lidar of the robot rotates one circle, and the environment information around the robot is equally divided, according to the angular resolution of the lidar, into a specific number of parts of laser data. Because the robot divides the environment information equally, the acquired data are comprehensive and convenient for the robot to calculate with.
In one or more aspects of the present invention, the angular resolution of the lidar is 1 degree, and 360 parts of laser data are acquired in one rotation of the lidar. The angular resolution of a low-cost lidar is generally 1 degree; acquiring data with such a lidar reduces the production cost of the robot without affecting its calculation results.
In one or more aspects of the present invention, step S2 comprises the following steps: the IMU comprises at least a gyroscope and an accelerometer, and the robot determines its rotation angle and walking distance within a set time from the acquired gyroscope data and accelerometer data. The IMU is a common module of a robot, so the robot needs no additional functional module, which reduces the production cost.
In one or more aspects of the present invention, step S3 comprises the following steps: the robot establishes a first coordinate system with itself as the origin and calculates the coordinate information of an obstacle on the first coordinate system from the laser data; the robot calculates its relative motion attitude from the IMU data within a set time and obtains its position coordinates on the first coordinate system after moving; the robot establishes a second coordinate system with those position coordinates as the origin, calculates the coordinate information of the obstacle on the second coordinate system from its coordinate information on the first coordinate system, and determines the position of the robot relative to the obstacle from the coordinate information of the obstacle on the second coordinate system. By calculating the robot's position relative to the obstacle after moving from the IMU data through this method of establishing coordinate systems, the positioning speed of the robot is determined by the acquisition and transmission speed of the IMU data, and the laser data acquisition speed does not need to be raised at higher cost.
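Described in standard terms, this amounts to a planar rigid-body transform. As an illustrative sketch only (the disclosure does not write the formula out, so the symbols below are assumptions): if the robot moves to $(rx_1, ry_1)$ on the first coordinate system and its heading rotates by $\Delta\theta$ according to the IMU, an obstacle at $(x_0, y_0)$ on the first coordinate system appears on the second coordinate system at

$$
\begin{pmatrix} x_1 \\ y_1 \end{pmatrix}
=
\begin{pmatrix} \cos\Delta\theta & \sin\Delta\theta \\ -\sin\Delta\theta & \cos\Delta\theta \end{pmatrix}
\begin{pmatrix} x_0 - rx_1 \\ y_0 - ry_1 \end{pmatrix},
$$

and, since the robot sits at the origin of the second coordinate system, its distance to the obstacle is $\sqrt{x_1^2 + y_1^2}$.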
In one or more aspects of the invention, the set time is less than the time interval between two frames of laser data received by the robot. The robot can calculate as soon as it receives IMU data; the calculation is fast, which improves the response speed of the robot.
In one or more aspects of the present invention, the position of the robot relative to the obstacle includes the angular direction of the obstacle relative to the robot and the distance between the robot and the obstacle in the constructed coordinate system. The robot thus knows not only its distance from the obstacle but also where the obstacle lies; the positioning is accurate, which is convenient for the robot's subsequent operations.
A chip has a built-in control program, and the control program is used for controlling a robot to execute the robot positioning method based on data fusion described above. Installed in different robots, the chip enables them to position themselves through the data fusion method, so its applicability is strong.
A robot is equipped with a main control chip, and the main control chip is the chip described above. The robot positions itself with the data-fusion-based robot positioning method, which improves its positioning speed in different environments.
Drawings
FIG. 1 is a flow chart of a robot positioning method based on data fusion in accordance with the present invention;
FIG. 2 is a schematic view of a first coordinate system of the robot of the present invention;
FIG. 3 is a schematic diagram of a second coordinate system of the robot of the present invention.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout.
In the description of the present invention, it should be noted that orientation terms such as "central", "lateral", "longitudinal", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience in describing the present invention and to simplify the description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and they do not limit the scope of protection of the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the invention, "at least" means one or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly specified or limited, the terms "assembled" and "connected" are to be construed broadly: a connection may, for example, be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection; and two elements may be connected directly or through an intermediate medium, or their interiors may communicate with each other. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
In the present invention, unless otherwise specified and limited, a first feature being "above" or "below" a second feature may include the two features being in direct contact, or being in contact through another feature between them rather than directly. Moreover, the first feature being "on", "above" or "over" the second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature; the first feature being "under", "below" or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The technical scheme and beneficial effects of the invention are made clearer by the following further description of specific embodiments with reference to the accompanying drawings. The embodiments described below are exemplary and intended to illustrate the invention, but are not to be construed as limiting it.
Referring to fig. 1, a robot positioning method based on data fusion comprises the following steps: S1: the robot acquires laser data of obstacles in the surrounding environment by emitting and receiving laser light through a lidar; S2: the robot walks, acquiring IMU data during the walking process; S3: the robot fuses the laser data and the IMU data to determine its position relative to the obstacle. The robot acquires the laser data as basic data through an ordinary lidar and positions itself by fusing in the IMU data, which reduces the production cost of the robot and increases its positioning speed.
As one embodiment, the IMU is an inertial measurement unit. The IMU comprises at least a gyroscope and an accelerometer, with which the robot acquires its angular velocity and acceleration within a set time. By collecting IMU data from time t1 to time t2, the relative motion attitude of the robot between t1 and t2 can be estimated, so the robot's coordinates at time t2 can be calculated from its coordinates at time t1. For a robot moving on a plane, the rotation direction is determined from the angular velocity in the IMU data, and the motion distance can be estimated from the robot's acceleration, from which the position reached by the relative motion is calculated. The IMU is a common module of a robot, so the robot needs no additional functional module, which reduces the production cost.
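By way of illustration only, the following is a minimal dead-reckoning sketch of this step, assuming planar motion, a forward-pointing accelerometer axis and simple Euler integration; the function and parameter names are not from the patent:

```python
import math

def integrate_imu(pose, velocity, gyro_z, accel_x, dt):
    """Propagate a planar pose (x, y, theta) across one IMU sample.

    pose     : (x, y, theta) at time t1, theta in radians
    velocity : current forward speed in m/s
    gyro_z   : yaw rate from the gyroscope, rad/s
    accel_x  : forward acceleration from the accelerometer, m/s^2
    dt       : IMU sample period in seconds
    Returns the pose at t1 + dt and the updated forward speed.
    """
    x, y, theta = pose
    theta += gyro_z * dt                  # rotation direction from angular velocity
    velocity += accel_x * dt              # motion speed estimated from acceleration
    x += velocity * math.cos(theta) * dt  # advance along the current heading
    y += velocity * math.sin(theta) * dt
    return (x, y, theta), velocity
```

Calling this at each IMU sample between t1 and t2 accumulates the relative motion attitude used to carry the robot coordinates from t1 to t2.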
As an embodiment, the lidar is used for measuring the distance between the robot and obstacles in the surrounding 360-degree environment: the lidar of the robot rotates one circle, and the environment information around the robot is equally divided, according to the angular resolution of the lidar, into a specific number of parts of laser data. Because the robot divides the environment information equally, the acquired data are comprehensive and convenient to calculate with. The lidar generally adopts triangulation ranging, its frame rate is generally 5 revolutions per second, and its angular resolution is generally 1 degree. The lidar rotates through 360 degrees and obtains one datum every 1 degree, so 360 data are obtained per rotation, each value being the obstacle distance measured at that angle. The angular resolution of a low-cost lidar is generally 1 degree; acquiring data with such a lidar reduces the production cost of the robot without affecting its calculation results. The set time is less than the time interval between two frames of laser data received by the robot: the IMU data are used for calculation between two frames of laser data, which extends the usable range of the laser data received by the robot. If the rotation speed of the lidar is 5 revolutions per second, i.e. the interval between two frames of laser data is 200 ms, the robot can use the IMU data for calculation at any moment within those 200 ms. The robot can calculate as soon as it receives IMU data; the calculation is fast, which improves the response speed of the robot.
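As a hedged sketch of the data layout this produces, the following converts the 360 range readings of one rotation into robot-frame coordinates, using the sign convention for a clockwise-rotating lidar given in the embodiment below (the constants and the helper are illustrative assumptions, not the patent's code):

```python
import math

FRAME_RATE_HZ = 5                      # 5 revolutions per second
FRAME_INTERVAL_S = 1 / FRAME_RATE_HZ   # 200 ms between two laser frames
ANGULAR_RESOLUTION_DEG = 1             # one range reading per degree

def scan_to_points(ranges, clockwise=True):
    """Convert 360 range readings (metres) into robot-frame coordinates.

    Index i of `ranges` is the lidar rotation angle in degrees; the
    0-degree emission direction is taken as the positive x-axis.
    """
    points = []
    for deg, d in enumerate(ranges):
        theta = math.radians(deg * ANGULAR_RESOLUTION_DEG)
        sign = -1.0 if clockwise else 1.0   # (d*cos(theta), -/+ d*sin(theta))
        points.append((d * math.cos(theta), sign * d * math.sin(theta)))
    return points
```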
As an embodiment, as shown in fig. 2 and fig. 3, the robot 1 establishes a first coordinate system with itself as the origin and calculates the coordinate information of an obstacle on the first coordinate system from the laser data. The current coordinates of the robot 1 are set to (rx0, ry0), and the 0-degree laser emission direction of the lidar 2 is taken as the positive direction of the x-axis; when the lidar 2 rotates clockwise, the obstacle coordinates are (d·cosθ, −d·sinθ), and when the lidar 2 rotates counterclockwise, the obstacle coordinates are (d·cosθ, d·sinθ), where d is the distance between the robot 1 and the obstacle measured by the lidar 2 and θ is the ranging rotation angle of the lidar 2. The robot 1 calculates its relative motion attitude from the IMU data within the set time and obtains its position coordinates (rx1, ry1) on the first coordinate system after moving. The robot 1 then establishes a second coordinate system with (rx1, ry1) as the origin and the 0-degree laser emission direction of the lidar 2 as the positive direction of the x-axis, and calculates the coordinate information of the obstacle on the second coordinate system from its coordinate information on the first coordinate system: for example, if an obstacle in the surrounding environment of the robot 1 has coordinates (x0, y0) on the first coordinate system, its coordinates on the second coordinate system are (x1, y1). From the coordinate information of the obstacle on the second coordinate system, the direction (front, back, left, right and so on) in which the obstacle lies relative to the robot 1 is obtained, and the distance between the robot 1 and the obstacle is then obtained from the two-point distance formula √((x₂ − x₁)² + (y₂ − y₁)²). By calculating the position of the robot 1 relative to the obstacle after moving from the IMU data through this method of establishing coordinate systems, the positioning speed of the robot is determined by the acquisition and transmission speed of the IMU data, and the laser data acquisition speed does not need to be raised at higher cost.
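A companion sketch of the frame change and the distance calculation in this embodiment; the heading change `dtheta` is assumed to come from the IMU attitude estimate, and the helper names are illustrative rather than taken from the patent:

```python
import math

def obstacle_in_second_frame(obstacle, new_origin, dtheta):
    """Re-express an obstacle (x0, y0) from the first coordinate system in
    the second one, whose origin is the robot's new position (rx1, ry1) and
    whose x-axis has rotated by dtheta since the last laser frame."""
    x0, y0 = obstacle
    rx1, ry1 = new_origin
    dx, dy = x0 - rx1, y0 - ry1                 # translate to the new origin
    c, s = math.cos(dtheta), math.sin(dtheta)
    return (c * dx + s * dy, -s * dx + c * dy)  # rotate into the new axes

def distance_to_obstacle(x1, y1):
    """The robot sits at the origin of the second coordinate system, so the
    two-point distance formula reduces to the norm of (x1, y1)."""
    return math.hypot(x1, y1)
```

The signs of the obstacle's (x1, y1) coordinates directly give the front/back and left/right direction read off in the description.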
A chip has a built-in control program, and the control program is used for controlling a robot to execute the robot positioning method based on data fusion described above. Installed in different robots, the chip enables them to position themselves through the data fusion method, so its applicability is strong.
A robot is equipped with a main control chip, and the main control chip is the chip described above. The robot positions itself with the data-fusion-based robot positioning method, which improves its positioning speed in different environments.
In the description of this specification, reference to "one embodiment", "preferably", "an example", "a specific example" or "some examples" means that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention; schematic representations of these terms in this specification do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The connections described in this specification have obvious effects and practical effectiveness.
Given the above structure and principle, those skilled in the art should understand that the present invention is not limited to the above embodiments; modifications and substitutions based on known technology in the field fall within the scope of the present invention, which is defined by the claims.
Claims (9)
1. A robot positioning method based on data fusion is characterized by comprising the following steps:
S1: the robot acquires laser data of obstacles in the surrounding environment by emitting and receiving laser light through a lidar;
S2: the robot walks, acquiring IMU data during the walking process;
S3: the robot fuses the laser data and the IMU data to determine its position relative to the obstacle.
2. The data-fusion-based robot positioning method according to claim 1, wherein step S1 specifically comprises: the lidar of the robot rotates one circle, and the environment information around the robot is equally divided, according to the angular resolution of the lidar, into a specific number of parts of laser data.
3. The data-fusion-based robot positioning method according to claim 2, wherein the angular resolution of the lidar is 1 degree, and 360 parts of laser data are acquired in one rotation of the lidar.
4. The data-fusion-based robot positioning method according to claim 1, wherein step S2 specifically comprises: the IMU comprises at least a gyroscope and an accelerometer, and the robot determines its rotation angle and walking distance within a set time from the acquired gyroscope data and accelerometer data.
5. The data-fusion-based robot positioning method according to claim 1, wherein step S3 specifically comprises: the robot establishes a first coordinate system with itself as the origin and calculates coordinate information of an obstacle on the first coordinate system from the laser data; the robot calculates its relative motion attitude from the IMU data within a set time and obtains its position coordinates on the first coordinate system after moving; the robot establishes a second coordinate system with those position coordinates as the origin, calculates coordinate information of the obstacle on the second coordinate system from its coordinate information on the first coordinate system, and determines the position of the robot relative to the obstacle from the coordinate information of the obstacle on the second coordinate system.
6. The data-fusion-based robot positioning method according to claim 5, wherein the set time is less than the time interval between two frames of laser data received by the robot.
7. The data-fusion-based robot positioning method according to claim 5, wherein the position of the robot relative to the obstacle comprises the angular direction of the obstacle relative to the robot and the distance between the robot and the obstacle in the constructed coordinate system.
8. A chip with a built-in control program, wherein the control program is used for controlling a robot to execute the data fusion-based robot positioning method of any one of claims 1 to 7.
9. A robot equipped with a master control chip, characterized in that the master control chip is the chip of claim 8.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010570130.4A | 2020-06-21 | 2020-06-21 | Robot positioning method based on data fusion, chip and robot |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010570130.4A | 2020-06-21 | 2020-06-21 | Robot positioning method based on data fusion, chip and robot |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN111947649A | 2020-11-17 |
Family
ID=73337112
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010570130.4A (CN111947649A, pending) | Robot positioning method based on data fusion, chip and robot | 2020-06-21 | 2020-06-21 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111947649A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112711257A (en) * | 2020-12-25 | 2021-04-27 | 珠海市一微半导体有限公司 | Robot edge method based on single-point TOF, chip and mobile robot |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106291535A (en) * | 2016-07-21 | 2017-01-04 | 触景无限科技(北京)有限公司 | A kind of obstacle detector, robot and obstacle avoidance system |
| CN107562048A (en) * | 2017-08-08 | 2018-01-09 | 浙江工业大学 | Dynamic obstacle avoidance control method based on laser radar |
| CN110488818A (en) * | 2019-08-08 | 2019-11-22 | 深圳市银星智能科技股份有限公司 | A kind of robot localization method, apparatus and robot based on laser radar |
| CN110873883A (en) * | 2019-11-29 | 2020-03-10 | 上海有个机器人有限公司 | Positioning method, medium, terminal and device integrating laser radar and IMU |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | CB02 | Change of applicant information | Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin New Area, Zhuhai, Guangdong. Applicant after: Zhuhai Yiwei Semiconductor Co., Ltd. Address before: 519000 Room 105-514, No. 6, Baohua Road, Hengqin New Area, Zhuhai City, Guangdong Province (centralized office area). Applicant before: AMICRO SEMICONDUCTOR Co., Ltd. |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-11-17 |


