CN109202861B - Medicament allotment robot integration platform - Google Patents
- Publication number: CN109202861B (application CN201710519673.1A)
- Authority
- CN
- China
- Prior art keywords
- robot
- platform
- environment
- model
- robot body
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
- B25J9/0087—Dual arms
- B25J11/00—Manipulators not otherwise provided for
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The embodiment of the invention discloses a medicament dispensing robot platform, which comprises: a robot body, including the robot's mechanical structure, electronic circuits, and controller; a sensor array, arranged on the robot body, for sensing environmental data around the robot body; a sensing system for receiving and transmitting the environmental data sensed by the sensor array; and a cloud platform for adjusting the robot's control strategy and/or modifying the robot's functions based on the environmental data transmitted by the sensing system. Based on a multi-sensor fusion system and a motion-capture database, the embodiment models the robot's working environment and spatially positions the end of the arm through a neural network model, thereby effectively reducing the difficulty of human-computer interaction and avoiding excessive human intervention.
Description
Technical Field
The invention relates to the field of intelligent manufacturing and automatic control, in particular to a robot integration platform for dispensing medicaments, and more particularly to a dual-arm medicament dispensing robot integration platform.
Background
With the wide application of intelligent robot technology in leading-edge fields such as precision manufacturing and aerospace, it is also gradually being applied in civil fields such as industry, education, and medical services. Existing intelligent robots used in civil fields are often limited by factors such as the rigidity of the robot body and complex programming control; they cannot support good human-computer interaction and cannot achieve the goals of industrial upgrading and fully freeing human labor by replacing manual work with robots.
At present, robots in the field of medical services (such as medicament dispensing robots) are mainly based on traditional industrial robot platforms: a robot kinematics model is applied to a vertical multi-joint serial robot to establish a mapping between the robot's joint space and the user's workspace, thereby realizing position control of each joint. The disadvantage of this approach is that a robot control system designed on this framework, while ensuring positioning accuracy, sacrifices the flexibility of robot control and the convenience of human-computer interaction. Moreover, with such a structural design, the control system's perception of the external environment depends entirely on the operator's experience.
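The joint-space-to-workspace mapping that such serial-robot platforms rely on can be sketched minimally with forward kinematics for a planar two-joint arm. The link lengths and the two-joint simplification are assumptions for illustration; the patent does not specify a concrete kinematic model:

```python
import math

def forward_kinematics(theta1, theta2, l1=0.4, l2=0.3):
    """Map joint space (theta1, theta2) to Cartesian space (x, y) for a
    planar two-joint serial arm with link lengths l1 and l2 (metres)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero the arm is fully extended along x: (x, y) ≈ (0.7, 0.0).
print(forward_kinematics(0.0, 0.0))
```

Position control of each joint then amounts to inverting this mapping for a desired Cartesian target, which is exactly where the rigidity criticized above comes from: the mapping is fixed at design time.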
Therefore, there is a need for an intelligent robot platform that combines positioning accuracy, flexible control, and easy human-computer interaction, to achieve precise dispensing of medication without much intervention by a doctor or nurse, thereby freeing experienced doctors and nurses for more valuable work.
Disclosure of Invention
Aiming at the problems of existing medicament dispensing robots, the invention provides a medicament dispensing robot platform that, based on a multi-sensor fusion system and a motion-capture database, models the robot's working environment and spatially positions the end of the arm through a neural network model, thereby effectively reducing the difficulty of human-computer interaction and avoiding excessive human intervention. The scheme of the medicament dispensing robot platform is as follows:
A medicament dispensing robot platform, comprising: a robot body, which includes the robot's mechanical structure, electronic circuits, and controller; a sensor array, arranged on the robot body, for sensing environmental data around the robot body; a sensing system for receiving and transmitting the environmental data sensed by the sensor array; and a cloud platform for adjusting the robot's control strategy and/or modifying the robot's functions based on the environmental data transmitted by the sensing system.
Preferably, the robot forms interactive feedback data based on the environmental data transmitted by the sensing system, and controls the motion trajectory and the start-stop state of the robot body.
Preferably, the perception system periodically transmits the environment data to the cloud platform.
Preferably, the cloud platform is connected with the robot body through a wireless network.
Preferably, the sensor array comprises one or more of a visual sensor, an acoustic sensor, and a laser distance sensor.
Preferably, the robot is provided with a motion decoupling model for mapping the joint space of the robot to a three-dimensional space.
Preferably, the motion decoupling model comprises a native built-in model and an environment learning model; the native built-in model models and measures the robot's working space to realize basic modeling of the robot's motion, and the environment learning model imports cloud data for preset typical working environments to rapidly model the robot's working environment.
Preferably, the environment learning model supports the robot to perform deep learning and training on a specific environment through a neural network technology.
Preferably, the robot body is a two-arm robot.
Preferably, each arm of the two-arm robot has 7 degrees of freedom.
According to the technical scheme, the embodiment of the invention has the following advantages:
the embodiment of the invention is based on the sensor array and the dynamic capture database, and realizes the modeling of the working environment of the robot body and the space positioning of the tail end of the robot arm through the learning and training of the neural network model. Specifically, the database only needs to perform dragging training iteration on the mechanical arm working space motion to form motion capture, the database is compared and corrected with a neural network motion model of the robot, the database is gradually corrected, and motion positioning of all the mechanical arm working spaces is gradually perfected by means of a large amount of training. Therefore, the embodiment of the invention can realize online or offline programming of the robot through dragging teaching, greatly reduce the difficulty of the robot in the field of human-computer cooperation and effectively reduce the medical cost.
Drawings
Fig. 1 is a schematic diagram of a logic architecture of an integrated platform of a drug dispensing robot according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a mechanical structure of a dual-arm drug dispensing robot according to an embodiment of the present invention;
fig. 3 is a schematic view of an environment learning model of an integrated platform of a dual-arm drug dispensing robot according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic diagram of a logic architecture of an integrated platform of a drug dispensing robot according to an embodiment of the present invention. In this embodiment, the integrated pharmaceutical dispensing robot platform includes a robot body 10, a sensor array 40, a sensing system 30, and a cloud platform 20.
The robot body 10 mainly includes the robot's mechanical structure, electronic circuits, and controller. In a preferred embodiment, the robot body 10 is a dual-arm robot with 7 degrees of freedom per arm. Fig. 2 is a schematic diagram of the mechanical structure of the dual-arm drug dispensing robot according to the present embodiment. The robot body 10 includes a base 105, a body 102, a head 101, and two arms (103 and 104) on either side of the body. The two arms perform the medicament dispensing work through accurate positioning and control. The dual-arm robot is controlled by a single master control system; compared with multiple single-arm robots, this reduces hardware expenditure on microprocessors and sensors, effectively lowering costs. The dual-arm design also reduces the communication load of multi-robot cooperation and greatly improves coordination between the two arms during work, thereby effectively improving the robot's safety.
The electromechanical requirements of the dual-arm robot integration platform are as follows. An integrated joint design effectively reduces the occupied space, and the smaller volume and lighter body make the robot easy to deploy and move. In this embodiment, each arm of the dual-arm robot has 7 degrees of freedom; by emulating the human arm, this design enables collision avoidance around obstacles in complex environments, optimal path trajectory planning, minimum-energy trajectory planning, and the like. In long-term use, a 7-degree-of-freedom arm exhibits better dexterity and flexibility than a 6-degree-of-freedom one. The dual-arm robot uses a high-performance master control unit for centralized control, supports cloud scheduling, and makes it convenient for an operator to remotely monitor and control the robot's working state through portable terminals such as smartphones and tablets.
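The benefit of the seventh degree of freedom can be shown numerically: with 6 task coordinates and 7 joints, the task Jacobian has a nontrivial null space, so the joints can move (for example, around an obstacle) while the end effector stays fixed. A minimal sketch, using a randomly generated matrix purely as a stand-in for a real 7-DOF arm's Jacobian:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))  # stand-in task Jacobian: 6 task DOF, 7 joints

# Null-space projector: joint velocities of the form P @ v change the
# joint configuration without moving the end effector at all.
P = np.eye(7) - np.linalg.pinv(J) @ J

v = rng.standard_normal(7)  # arbitrary secondary motion (e.g. obstacle avoidance)
self_motion = P @ v

print(np.linalg.norm(J @ self_motion))  # ≈ 0: the end effector does not move
```

A 6-joint arm has a 6x6 Jacobian that is generically full rank, so its null space is empty and no such self-motion exists; this is the extra flexibility the paragraph above refers to.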
The sensor array 40 is disposed on the robot body 10 and is mainly used for sensing environmental data around the robot body 10. The sensor array 40 includes one or more of a visual sensor, an acoustic sensor, and a laser distance sensor. Preferably, the sensor array 40 further includes mechanical sensors, tactile sensors, and the like.
The sensing system 30 is used to receive and transmit the environmental data sensed by the sensor array 40. The sensing system 30 feeds the environmental data back to the robot body 10 interactively, and the robot body 10 transmits its motion data to the sensing system 30. The motion of the robot body 10 and the operation of the sensor array 40 can proceed in parallel, and the robot body 10 can use the environment-interaction feedback data transmitted by the sensing system 30 to control its own motion trajectory and start-stop state. Because the integrated medicament dispensing robot platform adopts a sensing system based on multi-sensor fusion, the robot body can be rapidly deployed and safely controlled, and an operator can use the robot in a completely relaxed state, a condition that traditional robot control platforms cannot meet.
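One common way to combine readings from several sensors, shown here purely as an illustration of multi-sensor fusion (the patent does not specify a fusion algorithm), is inverse-variance weighting, which trusts each sensor in proportion to its certainty:

```python
def fuse(estimates):
    """Inverse-variance fusion of (value, variance) pairs coming from
    different sensors; returns the fused value and its variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Vision reports 1.00 m (variance 0.04), the laser 0.96 m (variance 0.01):
# the fused estimate sits much closer to the more certain laser reading.
print(fuse([(1.00, 0.04), (0.96, 0.01)]))  # ≈ (0.968, 0.008)
```

The fused variance is always smaller than any single sensor's variance, which is the quantitative sense in which fusing vision, touch, and distance data improves safety margins.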
The cloud platform 20 adjusts the control strategy of the robot body 10 and/or modifies the robot's functions based on the environmental data (i.e., state information) transmitted by the sensing system 30. Through the cloud platform 20, an operator can remotely control the robot; combined with the sensing system, deployment, modification, and decommissioning of the robot's working process can be carried out unattended, truly realizing human-machine collaboration. The cloud platform 20 serves as the master control component: it collects control-system state data and environmental data, modifies control strategies in a targeted manner, and provides a function for directly manipulating the robot through a portable terminal. The robot body 10 connects to the cloud service through a wireless network, which effectively improves the security and privacy of the intelligent terminal and interconnects the robot integration platform. In a preferred embodiment, cloud data transmission is based on secure encrypted communication, effectively ensuring the operator's secure control of the robot body. Based on cloud interconnection, the robot's state can be pushed to the intelligent terminal, effectively supporting intelligent and careful use of the robot.
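The periodic status upload and cloud-side strategy adjustment could be sketched as follows. All field names, the JSON format, and the patch semantics are assumptions for illustration, not part of the patent:

```python
import json
import time

def build_status_report(robot_id, env_data):
    """Periodic status message the sensing system could push to the
    cloud platform (all field names here are assumptions)."""
    return json.dumps({
        "robot_id": robot_id,
        "timestamp": time.time(),
        "environment": env_data,
    })

def apply_strategy_patch(strategy, patch):
    """Cloud-side adjustment: merge a control-strategy patch into the
    current strategy without touching unrelated settings."""
    updated = dict(strategy)
    updated.update(patch)
    return updated

strategy = {"max_speed": 0.5, "collision_stop": True}
print(apply_strategy_patch(strategy, {"max_speed": 0.3}))
# → {'max_speed': 0.3, 'collision_stop': True}
```

Keeping the patch separate from the full strategy is one way to realize the targeted, unattended modification described above: the cloud only transmits the settings that change.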
The sensing system 30 combines sensor data such as vision, touch, and distance with a cloud database through the multi-sensor fusion system, realizing deep sensing and learning of the working environment by the robot body and effectively improving the safety and flexibility of the robot at work. The sensing system 30 can also collect usage data, detect user habits, and provide product usage guidance and cautions through the cloud platform, helping to maximally protect user rights and maximize the usage value of the robot platform. In a preferred embodiment, the environmental data is periodically sent to the cloud platform 20 through the sensing system 30, so that the cloud can modify and update the connected robot bodies.
In this embodiment, the robot body 10 is provided with a motion decoupling model for mapping the robot's joint space to three-dimensional space. The medicament dispensing robot integration platform uses the motion-capture database to perform dual modeling of the robot body and the external environment, divided into two units: a native built-in model and an environment learning model. The native built-in model models and measures the robot's working space: through multi-sensor fusion technology, it calibrates the range of the robot's motion space and the positions of frequently used regions, realizing basic modeling of the robot's motion. The environment learning model rapidly models the robot's working environment by importing cloud data for preset typical working environments, and also supports deep learning of a specific environment through neural network technology. After neural network training, the robot can operate more freely. The basic model of the robot's detected environment space is trained through the neural network, and the complexity of the learning model can be customized according to the expected learning period.
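The environment learning model's use of a neural network can be illustrated with a minimal sketch: a small one-hidden-layer network (plain NumPy) learns the mapping from joint angles to end-effector position from sampled data, standing in for training on drag-teaching captures. The network size, learning rate, and the planar two-link arm generating the data are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "drag-teaching" data: joint angles -> end-effector (x, y)
# of an assumed planar two-link arm (link lengths 0.4 m and 0.3 m).
theta = rng.uniform(-np.pi, np.pi, size=(512, 2))
X = theta
Y = np.stack([0.4 * np.cos(theta[:, 0]) + 0.3 * np.cos(theta.sum(1)),
              0.4 * np.sin(theta[:, 0]) + 0.3 * np.sin(theta.sum(1))], axis=1)

# One-hidden-layer network trained by plain gradient descent on MSE.
W1 = rng.standard_normal((2, 64)) * 0.5
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 2)) * 0.5
b2 = np.zeros(2)
lr = 0.05

def loss_and_grads():
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # predicted end-effector positions
    err = P - Y
    loss = (err ** 2).mean()
    dP = 2 * err / err.size           # gradient of the mean-squared error
    gW2, gb2 = H.T @ dP, dP.sum(0)
    dH = (dP @ W2.T) * (1 - H ** 2)   # back-propagate through tanh
    gW1, gb1 = X.T @ dH, dH.sum(0)
    return loss, gW1, gb1, gW2, gb2

first_loss = None
for _ in range(500):
    loss, gW1, gb1, gW2, gb2 = loss_and_grads()
    if first_loss is None:
        first_loss = loss
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print(first_loss, loss)  # the training loss should drop substantially
```

Adjusting the hidden-layer width is the kind of complexity knob the text refers to: a wider network models the workspace more finely but needs a longer learning period.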
Fig. 3 is a schematic view of the environment learning model of the integrated platform of a dual-arm drug dispensing robot according to an embodiment of the present invention. In actual development, a functional interface is provided for adjusting model complexity according to the expected deployment time, and the robot's deployment conditions can be set both locally and from the cloud.
In the embodiment of the invention, the medicament dispensing robot platform adopts a more convenient drag-based control mode assisted by the sensing system, which simplifies the robot's operation and ensures working safety. Through visual compensation in the sensor array, the robot's workspace can be corrected with high precision, ensuring motion accuracy during drug preparation and further ensuring the safety of the operation process. The robot is equipped with force and vision sensors; by recognizing human bodies and sensing touch, it can protect people, avoiding obstacles and stopping immediately when necessary, which effectively improves safety performance. The robot can also intelligently learn its workspace and find commonly used, reasonable working positions, reducing the probability of directly or indirectly injuring a human body. It is therefore particularly suitable for hospital application scenarios, where safety requirements are high.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (8)
1. A medicament deployment robot platform, comprising:
a robot body including a mechanical structure of a robot, an electronic circuit, and a controller;
the sensor array is arranged on the robot body and used for sensing environmental data of the robot body;
a sensing system to receive and transmit environmental data sensed by the sensor array;
the cloud platform is used for adjusting the control strategy of the robot and/or modifying the function of the robot based on the environmental data transmitted by the sensing system;
the robot is provided with a motion decoupling model for mapping the joint space of the robot to three-dimensional space; the motion decoupling model comprises a native built-in model and an environment learning model, wherein the native built-in model models and measures the working space of the robot to realize basic modeling of the robot's motion, and the environment learning model imports cloud data for preset typical working environments to rapidly model the robot's working environment.
2. The platform of claim 1, wherein the robot forms interactive feedback data based on the environmental data transmitted by the sensing system to control the motion trajectory and start/stop state of the robot body.
3. The robotic platform of claim 1, wherein the sensing system periodically transmits the environmental data to the cloud platform.
4. The robot platform of claim 1, wherein the cloud platform is connected to the robot body via a wireless network.
5. The robotic platform of claim 1, wherein the sensor array comprises one or more of a visual sensor, an acoustic sensor, and a laser distance sensor.
6. The robot platform of claim 1, wherein the environment learning model supports deep learning and training of the robot for a specific environment through neural network technology.
7. The robotic platform for dispensing pharmaceutical agents of claim 1, wherein the robotic body is a two-armed robot.
8. The robotic platform for dispensing pharmaceutical agents of claim 7, wherein each arm of said two-arm robot has 7 degrees of freedom.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710519673.1A CN109202861B (en) | 2017-06-30 | 2017-06-30 | Medicament allotment robot integration platform |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN109202861A CN109202861A (en) | 2019-01-15 |
| CN109202861B true CN109202861B (en) | 2021-11-23 |
Family
ID=64960961
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710519673.1A Active CN109202861B (en) | 2017-06-30 | 2017-06-30 | Medicament allotment robot integration platform |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN109202861B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115890656A (en) * | 2022-10-25 | 2023-04-04 | 中国电信股份有限公司 | Warehousing logistics robot control method, system, electronic device and storage medium |
| CN119347728B (en) * | 2024-12-09 | 2025-10-24 | 佛山显扬科技有限公司 | A fragrance-making dual-arm robot system based on three-dimensional machine vision |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN100498600C (en) * | 2007-09-18 | 2009-06-10 | 湖南大学 | Large condenser underwater operation environment two-joint robot control method |
| US9089353B2 (en) * | 2011-07-11 | 2015-07-28 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
| CN105127997B (en) * | 2015-08-10 | 2017-04-05 | 深圳百思拓威机器人技术有限公司 | Pharmacists' intelligent robot system and its control method |
| CN106695840A (en) * | 2017-03-09 | 2017-05-24 | 哈尔滨理工大学 | Remote monitored robot based on instruction navigation |
- 2017-06-30: application CN201710519673.1A filed; granted as CN109202861B (status: Active)
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Zhou et al. | IoT-enabled dual-arm motion capture and mapping for telerobotics in home care | |
| CN103192390B (en) | Control system of humanoid robot | |
| CN113829343B (en) | Real-time multitasking and multi-man-machine interaction system based on environment perception | |
| Li et al. | A dexterous hand-arm teleoperation system based on hand pose estimation and active vision | |
| CN113183147B (en) | Large-area coverage electronic skin system with remote proximity sense | |
| Dean-Leon et al. | Whole-body active compliance control for humanoid robots with robot skin | |
| CN108127673A (en) | A kind of contactless robot man-machine interactive system based on Multi-sensor Fusion | |
| CN110039547A (en) | A kind of human-computer interaction terminal and method of flexible mechanical arm remote operating | |
| WO2017115385A2 (en) | System and method for operating and controlling a hyper configurable humanoid robot to perform multiple applications in various work environments | |
| CN114800535A (en) | Robot control method, mechanical arm control method, robot and control terminal | |
| CN108406798A (en) | A kind of man-machine interactive system of Service Robots | |
| Garrido et al. | Modular design and control of an upper limb exoskeleton | |
| CN109202861B (en) | Medicament allotment robot integration platform | |
| CN105328701A (en) | Teaching programming method for series mechanical arms | |
| CN105014672A (en) | A wearable robot control system for assisting the disabled | |
| Yang et al. | Sensor fusion-based teleoperation control of anthropomorphic robotic arm | |
| WO2025179628A1 (en) | Multi-modal shared teleoperation system and method for three-arm space robot | |
| CN104002307A (en) | Wearable rescue robot control method and system | |
| CN107703842A (en) | A kind of kitchen automatic mode based on machine vision and mode identification technology | |
| CN118528270A (en) | Motion control method and system for humanoid robot | |
| CN112171672B (en) | System and method for monitoring and controlling movement behaviors of insect robot | |
| Ai et al. | Master-slave control technology of isomeric surgical robot for minimally invasive surgery | |
| CN103213143A (en) | Multi-element touch sense interactive perceiving system with temperature perceiving function | |
| CN112631148A (en) | Exoskeleton robot platform communication protocol and online simulation control system | |
| CN111309152A (en) | Man-machine flexible interaction system and method based on intention recognition and impedance matching |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |