CN106325065A - Robot interactive behavior control method, device and robot - Google Patents
- Publication number
- CN106325065A (application number CN201510363348.1A)
- Authority
- CN
- China
- Prior art keywords
- perception
- robot
- control entries
- control
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention provides a robot interactive behavior control method, a robot interactive behavior control device, and a robot. The control method includes the following steps: information sensed by the robot is acquired; sensing data containing the identifiers and values of at least one predefined sensing unit is generated according to the sensed information; a control entry matching the generated sensing data is searched for, where a control entry is used to control the robot's behavior in response to the sensed information and contains a trigger condition formed by at least one sensing unit together with the behavior triggered by that condition; and if a control entry matching the generated sensing data is found, the robot is made to perform the behavior in the found control entry. In the method, device, and robot of the invention, the sensing unit is defined as the minimum unit for controlling the robot's interactive behavior, and control entries are set in terms of sensing units to control the robot's interactive behavior; the robot's adaptive interactive behavior capability and degree of intelligence can thereby be effectively improved.
Description
Technical field
The present invention relates to the field of robotics, and in particular to a robot interactive behavior control method, a control device, and a robot.
Background art
Most current robots are industrial robots, and the majority of industrial robots have no perception capability. The operating programs of these robots are all established in advance, and the robots repeatedly and accurately complete predetermined tasks according to the preset programs. They lack adaptability and produce consistent results only when the objects they operate on are identical.
Summary of the invention
Embodiments of the present invention provide a robot interactive behavior control method, a control device, and a robot, so as to at least effectively improve a robot's adaptive interactive behavior capability and degree of intelligence.
In some embodiments, a method for controlling the interactive behavior of a robot includes: acquiring information perceived by the robot; generating perception data from the perceived information according to at least one predefined perception unit, where the perception data includes the identifier and value of each perception unit; searching for a control entry matching the generated perception data, where each control entry comprises a trigger condition and the behavior triggered by that condition, and each trigger condition is composed of at least one perception unit; and, if a control entry matching the generated perception data is found, causing the robot to perform the behavior in the found control entry.
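As a rough illustration (not part of the patent text), the steps of this method might be sketched as follows. All names (`ControlEntry`, `control_step`, the dictionary-based trigger form) are hypothetical simplifications; triggers here use exact value matches only.

```python
from dataclasses import dataclass

@dataclass
class ControlEntry:
    """A trigger condition (perception-unit id -> required value) plus a behavior."""
    entry_id: str
    trigger: dict
    behavior: str

def matches(trigger, perception_data):
    """True if every perception unit in the trigger has the required value."""
    return all(perception_data.get(uid) == val for uid, val in trigger.items())

def control_step(perception_data, entries):
    """Return the behavior of the first matching control entry, if any."""
    for entry in entries:
        if matches(entry.trigger, perception_data):
            return entry.behavior
    return None

entries = [ControlEntry("e1", {"so_at_home": "1", "eye": "person"}, "greet")]
print(control_step({"so_at_home": "1", "eye": "person"}, entries))  # -> greet
print(control_step({"so_at_home": "0", "eye": "person"}, entries))  # -> None
```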
In some embodiments, a method for controlling the interactive behavior of a robot includes: perceiving at least one piece of information; generating perception data from the perceived information according to at least one predefined perception unit, where the perception data includes the identifier and value of each perception unit; sending out the generated perception data; receiving a control entry matching the sent perception data, where the control entry comprises a trigger condition and the behavior triggered by that condition, and the trigger condition is composed of at least one perception unit; and performing the interactive behavior in the received control entry.
In some embodiments, a method for controlling the interactive behavior of a robot includes: providing a control entry document that contains multiple control entries, where each control entry comprises a trigger condition and the behavior triggered by that condition, and each trigger condition is composed of at least one predefined perception unit; and matching the perception data of a robot against the control entries to determine whether a control entry matching the robot's perception data exists, where the perception data is generated from the information perceived by the robot according to at least one predefined perception unit.
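The patent does not fix a storage format for the control entry document. One hypothetical rendering is a JSON file in which each entry records its unique identifier, trigger condition, and behavior; every field name below is illustrative only.

```python
import json

# Hypothetical control entry document; field names and values are illustrative.
document = """
[
  {"id": "entry-1",
   "trigger": {"ear": "*rain*", "so_at_home": "1"},
   "behavior": "say: remember to take an umbrella"},
  {"id": "entry-2",
   "trigger": {"environment.temperature": ">30"},
   "behavior": "turn_on_air_conditioner"}
]
"""

entries = json.loads(document)
print(len(entries), entries[0]["id"])  # -> 2 entry-1
```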
In some embodiments, a device for controlling the interactive behavior of a robot includes: an acquisition module for acquiring information perceived by the robot; a generation module for generating perception data from the perceived information according to at least one predefined perception unit, where the perception data includes the identifier and value of each perception unit; a search module for searching for a control entry matching the generated perception data, where each control entry comprises a trigger condition and the behavior triggered by that condition, and each trigger condition is composed of at least one perception unit; and an execution module for causing the robot to perform the behavior in the found control entry when a control entry matching the generated perception data is found.
In some embodiments, a device for controlling the interactive behavior of a robot includes: a sensing module for perceiving at least one piece of information; a generation module for generating perception data from the perceived information according to at least one predefined perception unit, where the perception data includes the identifier and value of each perception unit; a sending module for sending out the generated perception data; a receiving module for receiving information on a control entry matching the perception data, where the control entry comprises a trigger condition and the behavior triggered by that condition, and the trigger condition is composed of at least one perception unit; and an execution module for performing the interactive behavior in the control entry according to the received information.
In some embodiments, a device for controlling the interactive behavior of a robot includes: a receiving module for receiving the perception data of a robot, where the perception data is generated from the information perceived by the robot according to at least one predefined perception unit; a search module for searching for a control entry matching the robot's perception data, where each control entry comprises a trigger condition and the behavior triggered by that condition, and each trigger condition is composed of at least one perception unit; and an execution module for causing the robot to perform the behavior triggered by the trigger condition in the found control entry when a control entry corresponding to the robot's perception data is found.
In some embodiments, a device for controlling the interactive behavior of a robot includes: a control entry document module for providing a control entry document containing multiple control entries, where each control entry comprises a trigger condition and the behavior triggered by that condition, and each trigger condition is composed of at least one predefined perception unit; and a matching module for matching the perception data of a robot against the control entries to determine whether a control entry matching the robot's perception data exists, where the perception data is generated from the information perceived by the robot according to at least one predefined perception unit.
The above methods can be performed by a robot, where the robot has one or more processing units, a memory, and one or more modules, programs, or instruction sets stored in the memory for performing these methods. Instructions for performing the above methods can be included in a computer program product configured to be executed by one or more processors.
In embodiments of the present invention, a robot control method, a control device, and a robot are proposed. Perception units that control the robot's interactive behavior are predefined as the minimum units of interactive behavior control; trigger conditions, and the interactive behaviors they trigger, are set in terms of perception units, yielding control entries that control the robot. This unifies the input and output standard of robot control, so that non-technical personnel can also edit the robot's behavior. The robot's interactive behavior thereby becomes easy to control, and the robot's adaptive interactive behavior capability and degree of intelligence are effectively improved.
Brief description of the drawings
The accompanying drawings described here are provided to give a further understanding of the present invention and constitute a part of this application; they do not limit the invention. In the accompanying drawings:
Fig. 1 is a structural schematic diagram of a robot according to some embodiments of the invention;
Fig. 2 is a flow chart of a method for generating robot control data according to some embodiments of the invention;
Fig. 3 is a first flow chart of a robot interactive behavior control method according to some embodiments of the invention;
Fig. 4 is a second flow chart of a robot interactive behavior control method according to some embodiments of the invention;
Fig. 5 is a third flow chart of a robot interactive behavior control method according to some embodiments of the invention;
Fig. 6 is a structural block diagram of a device for generating robot control data according to some embodiments of the invention;
Fig. 7 is a first structural block diagram of a robot interactive behavior control device according to some embodiments of the invention;
Fig. 8 is a second structural block diagram of a robot interactive behavior control device according to some embodiments of the invention;
Fig. 9 is a third structural block diagram of a robot interactive behavior control device according to some embodiments of the invention; and
Fig. 10 is a fourth structural block diagram of a robot interactive behavior control device according to some embodiments of the invention.
Detailed description of the invention
Reference will now be made in detail to the embodiments illustrated in the accompanying drawings. Numerous specific details are set forth in the following detailed description in order to provide a thorough understanding of the present invention. Those skilled in the art will appreciate that the invention can be practiced without these details. In other instances, well-known methods, processes, components, and circuits are not described in detail so as not to unnecessarily obscure the embodiments.
Fig. 1 is a structural schematic diagram of a robot according to some embodiments of the invention. The robot 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripherals interface 108, radio frequency (RF) circuitry 114, audio circuitry 116, a speaker 118, a microphone 120, a perception subsystem 122, an attitude sensor 132, a camera 134, a touch sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.
It should be appreciated that the robot 100 is only one example of a robot; the robot 100 may have more or fewer components than shown, or a different configuration of components. For example, in some embodiments the robot 100 can include one or more CPUs 106, the memory 102, one or more sensing devices (such as the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 for performing the robot interactive behavior control method. The various components shown in Fig. 1 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
In some embodiments, the robot 100 can be an electromechanical device with a biological appearance (for example, a humanoid), or an intelligent apparatus that does not have a biological appearance but has human characteristics (for example, the ability to converse). Such an apparatus can include machinery, and can also include a virtual apparatus (for example, an online chat robot implemented in software). An online chat robot perceives information through the device on which it resides; that device can be an electronic device such as a hand-held electronic device or a personal computer.
The memory 102 can include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. In some embodiments, the memory 102 can also include memory remote from the one or more CPUs 106, for example network-attached storage accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof. The memory controller 104 can control access to the memory 102 by other components of the robot 100, such as the CPUs 106 and the peripherals interface 108.
The peripherals interface 108 couples the input and output peripherals of the device to the CPUs 106 and the memory 102. The one or more processors 106 run various software programs and/or instruction sets stored in the memory 102 in order to perform the various functions of the robot 100 and to process data.
In some embodiments, the peripherals interface 108, the CPUs 106, and the memory controller 104 can be implemented on a single chip, such as the chip 112. In some other embodiments, they may be implemented on separate chips.
The RF circuitry 114 receives and transmits electromagnetic waves. The RF circuitry 114 converts electrical signals into electromagnetic waves and electromagnetic waves into electrical signals, and communicates with communication networks and other communication devices via the electromagnetic waves. The RF circuitry 114 can include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so on. The RF circuitry 114 can communicate by wireless communication with networks and other devices, such as the Internet (also referred to as the World Wide Web, WWW), intranets, and/or wireless networks such as cellular telephone networks, wireless local area networks (LANs), and/or metropolitan area networks (MANs). The wireless communication can use any of a variety of communication standards, protocols, and technologies, including but not limited to the Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 116, the speaker 118, and the microphone 120 provide an audio interface between a user and the robot 100. The audio circuitry 116 receives audio data from the peripherals interface 108, converts the audio data into an electrical signal, and sends the electrical signal to the speaker 118. The speaker converts the electrical signal into sound waves audible to humans. The audio circuitry 116 also receives electrical signals converted from sound waves by the microphone 120. The audio circuitry 116 converts the electrical signals into audio data and sends the audio data to the peripherals interface 108 for processing. Audio data can be retrieved by the peripherals interface 108 from the memory 102 and/or the RF circuitry 114, and/or sent to the memory 102 and/or the RF circuitry 114.
In some embodiments, multiple microphones 120 can be included, distributed at different positions; based on the microphones 120 at the different positions, the direction from which a sound is emitted can be determined according to a predetermined policy. It should be appreciated that the direction of a sound can also be identified by certain sensors.
In some embodiments, the audio circuitry 116 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 116 and removable audio input/output peripherals, for example output-only headphones, or a headset with both output (headphones for one or both ears) and input (a microphone).
In some embodiments, a speech recognition device (not shown) is also included, for recognizing speech as text and synthesizing speech from text. The speech recognition device can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits. The audio circuitry 116 receives audio data from the peripherals interface 108 and converts the audio data into an electrical signal; the speech recognition device can recognize the audio data and convert it into text data. The speech recognition device can also synthesize audio data from text data; the audio circuitry 116 then converts the audio data into an electrical signal and sends the electrical signal to the speaker 118.
The perception subsystem 122 provides an interface between the perception peripherals of the robot 100 and the peripherals interface 108; the perception peripherals include, for example, the attitude sensor 132, the camera 134, the touch sensor 136, and the other sensing devices 138. The perception subsystem 122 includes an attitude controller 124, a vision controller 126, a haptic controller 128, and one or more other sensing device controllers 130. The one or more other sensing device controllers 130 receive/send electrical signals from/to the other sensing devices 138. The other sensing devices 138 can include temperature sensors, distance sensors, proximity sensors, barometric pressure sensors, air quality detectors, and the like.
In some embodiments, the robot 100 can have multiple attitude controllers 124 to control different limbs of the robot 100; the robot's limbs can include but are not limited to arms, feet, and a head. Correspondingly, the robot 100 can include multiple attitude sensors 132. In some embodiments, the robot 100 may have no attitude controller 124 and no attitude sensor 132; the robot 100 can then be of a fixed form, without mechanically movable parts such as arms and feet. In some embodiments, the articulated parts of the robot 100 need not be mechanical arms, feet, and a head; a deformable structure can also be used.
The robot 100 also includes a power system 142 for supplying power to the various components. The power system 142 can include a power management system, one or more power sources (such as a battery or alternating current (AC)), a charging system, a power failure detection circuit, a power converter or inverter, a power status indicator (such as a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of electrical energy in a portable device. The charging system can be a wired charging system, or can be a wireless charging system.
In some embodiments, the software components include an operating system 144, a communication module (or instruction set) 146, an interactive behavior control device (or instruction set) 148, and one or more other devices (or instruction sets) 150.
The operating system 144 (for example Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (such as memory management, storage device control, and power management) and facilitating communication between the various hardware and software components.
The communication module 146 facilitates communication with other devices through one or more external interfaces 140, and also includes various software components for handling data received by the RF circuitry 114 and/or the external interface 140. The external interface 140 (for example Universal Serial Bus (USB), FIREWIRE, etc.) is suitable for coupling to other devices either directly or indirectly through a network (such as the Internet or a wireless LAN).
In some embodiments, the robot 100 can also include a display device (not shown); the display device can include but is not limited to a touch-sensitive display, a touch pad, and the like. The one or more other devices 150 can include a graphics module (not shown), which includes various known software components for rendering and displaying graphics on the display device. Note that the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations, and the like. The touch-sensitive display or touch pad can also be used for user input.
The robot 100 perceives its external environment and its own condition through peripherals such as the attitude sensor 132, the camera 134, the touch sensor 136, the other sensing devices 138, and the microphone 120. The information perceived by the robot 100 is processed by the controller corresponding to each perception peripheral and then handed to the one or more CPUs 106 for processing. The environment perceived by the robot 100 includes but is not limited to the information detected by its own sensors (such as the attitude sensor 132, the camera 134, the touch sensor 136, and the other sensing devices 138); it can also be information detected by an external apparatus (not shown) connected to the robot 100, in which case a communication connection is established between the robot 100 and the external apparatus, and the robot 100 and the external apparatus transmit data over this connection. External apparatuses include various types of sensors, smart home devices, and the like.
In some embodiments, the information perceived by the robot 100 includes but is not limited to sound, images, environmental parameters, tactile information, time, and space. Environmental parameters include but are not limited to temperature, humidity, gas concentration, and the like. Tactile information includes but is not limited to contact with the robot 100, including contact with the touch-sensitive display and contact with, or proximity to, the touch sensor; touch sensors can be arranged at positions such as the robot's head and arms (not shown). It should be noted that information of other forms is also included. Sound can include speech and other sounds; a sound can be one collected by the microphone 120, or a sound stored in the memory 102. Speech can include but is not limited to human talking or singing. An image can be a single picture or a video; pictures and videos include but are not limited to those captured by the camera 134, and can also be read from the memory 102 or transferred to the robot 100 over a network. The information perceived by the robot 100 includes not only information external to the robot 100 but also information about the robot 100 itself, including but not limited to the robot's battery level, temperature, and so on. For example, when the battery level of the robot 100 is perceived to be below 20%, the robot 100 can be made to move to a charging position and charge automatically.
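The low-battery example above can be expressed as a single control entry over the robot's own state. The dictionary layout and the "<20" threshold notation below are illustrative assumptions, not the patent's notation:

```python
# Hypothetical control entry for the low-battery example in the text:
# it triggers on the robot's own "battery" perception unit.
low_battery_entry = {
    "id": "auto-charge",
    "trigger": {"battery": "<20"},   # battery level below 20%
    "behavior": "move_to_charger",
}

def battery_triggered(entry, battery_level):
    """Evaluate the '<N' threshold form used in this sketch."""
    threshold = float(entry["trigger"]["battery"].lstrip("<"))
    return battery_level < threshold

print(battery_triggered(low_battery_entry, 15))  # -> True
print(battery_triggered(low_battery_entry, 55))  # -> False
```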
It should be appreciated that the robot 100 is not limited to perceiving information in the ways described above; it can also perceive information in other forms, including by perception technologies not yet developed as of the filing date of this document. In addition, the perception apparatus of the robot 100 is not limited to the sensing devices arranged on the robot 100; it can also include sensing devices associated with the robot 100 but not arranged on it, such as various sensors for perceiving information. As an example, the robot 100 can be associated with a temperature sensor, a humidity sensor, etc. (not shown) arranged in a certain area, and perceive the corresponding information through these sensors. The robot 100 can communicate with these sensors via a variety of communication protocols to obtain information from them.
In some embodiments, the information to be perceived by the robot 100 can be set according to preset conditions. These conditions can include but are not limited to which information the robot 100 perceives and when it perceives it. For example, it can be set that, during a spoken dialogue with a user, the robot perceives the user's voice, tracks the user's face, recognizes the user's posture, and so on, while not perceiving other information, reducing the weight of other information when generating perception units, or deferring the processing of other perceived information. Alternatively, during a certain time period (for example, while the user is out and the robot 100 is alone indoors), the robot can perceive environmental parameters, images, and video data, use the environmental parameters to decide whether it needs to interact with devices such as an air conditioner, and use the images and video data to determine whether a stranger has entered the room. It should be appreciated that the conditions for setting the perceived information are not limited to these; the above conditions are illustrative only, and the information the robot 100 needs to perceive can be set according to circumstances.
About perception units
At least one perception unit is defined. A perception unit is the minimum unit (or minimum input unit) for controlling the robot 100; the robot 100 performs interactive behaviors at least according to perception units. An interactive behavior of the robot 100 can be controlled by one or more perception units. For example, when the values of one or more perception units change, the robot 100 can respond to these changes by performing an interactive behavior; or, when the values of one or more perception units lie within a certain range or equal a certain value, the robot 100 can respond to the perception units by performing an interactive behavior. It should be understood that the control of the robot 100's interactive behavior by perception units is not limited to the above situations, which are given only as examples.
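The two triggering modes just described (a value change, or a value falling in a range) can be sketched as follows. The class and function names are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class PerceptionUnit:
    """Minimum control unit: an identifier plus a value (names illustrative)."""
    uid: str
    value: object

def should_react(old, new, target_range=None):
    """React when the value changes, or when it falls within a given range."""
    if target_range is not None:
        lo, hi = target_range
        return lo <= new.value <= hi
    return new.value != old.value

temp_old = PerceptionUnit("environment.temperature", 22)
temp_new = PerceptionUnit("environment.temperature", 31)
print(should_react(temp_old, temp_new))                         # -> True (changed)
print(should_react(temp_old, temp_new, target_range=(28, 40)))  # -> True (in range)
```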
In some embodiments, perception units can have multiple levels, and a high-level perception unit can comprise one or more lower-level perception units. In some embodiments, a high-level perception unit can comprise one or more perception units of the adjacent lower level, and perception units of the same higher level can comprise perception units of different lower levels. In terms of time, the low-level perception units composing a high-level perception unit include but are not limited to the low-level perception units of the same time or time period, as well as low-level perception units from the history before that time or time period. In some embodiments, a high-level perception unit is determined by low-level perception units of different times.
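As a hypothetical illustration of a high-level unit built from low-level units at different times, a "someone arrived" unit could combine the current vision value with its recent history; the names and values below are invented for this sketch:

```python
# Hypothetical high-level perception unit derived from lower-level units
# at different times, as the text describes.
def someone_arrived(vision_now, vision_history):
    """High-level unit: '1' if a person is visible now but was not before."""
    was_empty = all(v == "no_person" for v in vision_history)
    return "1" if vision_now == "person" and was_empty else "0"

print(someone_arrived("person", ["no_person", "no_person"]))  # -> 1
print(someone_arrived("person", ["person", "person"]))        # -> 0
```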
In some embodiments, the value of a perception unit can be one value or a group of values, or one or more ranges of values. The value of a perception unit can be determined from the information the robot 100 perceives; one perception unit can be determined by one or more pieces of perceived information, and the same perception unit can be determined by different pieces of perceived data. The perceived information can include information perceived in real time, or information perceived in the past (for example, information perceived at some past moment or during some past period). In some cases, the value of a perception unit is determined jointly by information perceived in real time and information perceived in the past.
As an example, several perception units can be set: hearing (ear), vision (eye), time (timer), whether someone is at home (so_at_home), and environment (environment). Hearing describes the speech that is heard: when the robot 100 receives a sound, it performs speech recognition on the received sound and obtains the text of the speech in the sound; the value of hearing can be the text of the speech heard. In some embodiments, hearing can also include the direction of the sound, with the robot's face as the reference, for example left, right, front, and rear. Vision describes what the camera monitors: the robot 100 can analyze images or video and judge whether a person is currently present or whether there is movement; the value of vision can include whether a person is present, whether there is movement, and so on. The value of whether someone is at home can be "0" or "1", where "0" means no one is at home and "1" means someone is at home. Time describes temporal information; its value can be a point in time or a time range, for example 14:00 sharp on February 1 of each year. Environment describes environmental conditions, including temperature, humidity, noise, PM2.5, the ppm of combustible gas in the air, the carbon monoxide content of the air, the oxygen content of the air, and so on; its value can be a value or a range for each kind of parameter.
In some embodiments, the values of a perception unit can be predefined. The predefined values of a perception unit can be one or more specific values, or one or more value ranges. The value of a perception unit can be an explicit value, or can be formed jointly by wildcards (or the like) and explicit values, but is not limited to this. For example, when the perception unit is "speech", its value can be "*rain*", denoting any speech information containing "rain"; or its value can be a pattern matching alternatives, denoting any speech information containing "raining" or "rainy".
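Such wildcard values can be checked with ordinary pattern matching. The rendering below uses Python's shell-style `fnmatch` for the "*rain*" form and a regular expression for the alternative form; both renderings are assumptions, not the patent's notation:

```python
import fnmatch
import re

# The "*rain*" pattern from the text, checked with shell-style wildcards.
def matches_value(pattern, heard_text):
    return fnmatch.fnmatch(heard_text, pattern)

print(matches_value("*rain*", "will it rain today"))  # -> True
print(matches_value("*rain*", "sunny all week"))      # -> False

# The alternative form ("raining" or "rainy") written as a regex.
alt = re.compile(r"rain(ing|y)")
print(bool(alt.search("it is raining")))  # -> True
```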
The robot 100 can generate perception data from the perception units and the perceived information. Perception data can involve one or more perception units, and includes the identifier and value of each perception unit; for the values, see the description of perception units above. When generating perception data from the perceived information according to the perception units, the robot 100 can use various analyses to obtain the values of the perception units, for example obtaining the text of speech by speech recognition technology, analyzing whether a perceived image contains a portrait by image recognition technology, or determining the attributes of a portrait by portrait (facial) recognition technology. It should be appreciated that the robot 100 is not limited to obtaining the values of perception units in the above ways; other means can also be used, including processing technologies not yet developed at the filing date of this document.
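A minimal sketch of assembling perception data as identifier/value pairs. The identifier "ear" (audition) is taken from the JSON explanation later in this document; "vision" and the helper itself are illustrative assumptions.

```python
def build_perception_data(voice_text=None, someone_present=None):
    """Assemble perception data as {unit identifier: value} pairs.

    Hypothetical helper: 'ear' appears in this document's JSON example;
    the 'vision' identifier and '0'/'1' encoding follow the description
    of the vision unit above.
    """
    data = {}
    if voice_text is not None:
        data["ear"] = voice_text                         # text recognized from speech
    if someone_present is not None:
        data["vision"] = "1" if someone_present else "0"  # "1": someone is present
    return data

print(build_perception_data(voice_text="it is raining", someone_present=True))
# {'ear': 'it is raining', 'vision': '1'}
```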
About control entries
Based on the predefined perception units and the preset interactive behaviors the robot can perform, a trigger condition and the interactive behaviors it triggers can be set. From the trigger condition and the triggered interactive behaviors, a control entry is generated for controlling the robot's interactive behavior in response to the information the robot perceives. A control entry can have a unique identifier to distinguish it from other control entries.
A trigger condition can be composed of one or more perception units, and logical relations can be configured between the perception units; the logical relations include but are not limited to AND, OR and NOT. In certain embodiments, a trigger condition can include the identifiers and values of the perception units that constitute it; the value of a perception unit can be one value or a group of values, or one value range or a group of value ranges. The value can be an explicit value, or can be formed jointly by wildcards (or the like) and explicit values, but is not limited to this. For example, when a perception unit in the trigger condition is "voice", its value can be "*rain*", representing any voice information that contains "rain"; or its value can be "*[raining|rainy]*", representing any voice information that contains "raining" or "rainy".
A trigger condition can trigger one or more interactive behaviors. In some embodiments, an order can be set among the interactive behaviors, so that multiple interactive behaviors are performed in the set order. An interactive behavior can be configured as one or more action instructions that the robot can parse and execute, and an action instruction can also include one or more parameters. In some embodiments, the execution order of the one or more action instructions can also be configured. This execution order can include, but is not limited to, executing one instruction or a set of instructions at random, so as to execute one or more actions randomly; or executing multiple action instructions in a predetermined order.
The operating system 144 of the robot 100, together with other relevant devices, can parse the action instructions of an interactive behavior so that the robot performs it. For example, to make the robot move forward 5 meters, the action instruction can be "move{"m":5}". The control device 148 for robot interactive behavior parses this action instruction, obtains the task to be performed (moving) and the task parameter (5 meters forward), and sends the task and parameter to the operating system 144; the operating system 144 then makes the mobile device (not shown) perform the movement. The mobile device can be legged, wheeled, tracked, and so on. It should be understood that more specific instructions can also be set, for example the parameters of each motor (or the like) of the mobile device.
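How the control device might split such an instruction into a task name and parameters can be sketched as follows; the parsing rule (a task name followed by a JSON object) is inferred from the single "move{"m":5}" example and is not prescribed by the document.

```python
import json
import re

def parse_action(instruction: str):
    """Split an action instruction like 'move{"m":5}' into the task
    name and its parameter object (hypothetical format)."""
    m = re.fullmatch(r'(\w+)\{(.*)\}', instruction)
    if not m:
        raise ValueError("unrecognized instruction: " + instruction)
    return m.group(1), json.loads("{" + m.group(2) + "}")

task, params = parse_action('move{"m":5}')
print(task, params)  # move {'m': 5}
```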
In certain embodiments, the action instructions of an interactive behavior include: a link to another control entry, set for executing that other control entry; and/or a link to multiple contents and/or multiple parameters, set for choosing a content and/or a parameter from them. Each control entry can have a unique identifier, and an action instruction can reference the identifier of a control entry to link to it. The content an action instruction links to can be a set of actions, and the robot 100 can perform an action in the set according to other factors. For example, attributes such as the personality or gender of the robot 100 can be pre-configured and stored in the memory 102; robots 100 of different genders or personalities can behave differently in the same situation (or scenario), and the robot 100 can select the action to perform from the set according to the configured personality, gender or other attributes. These actions can include, but are not limited to, limb actions of the robot 100. An action instruction can also link to one content or a group of contents, which can include but is not limited to voice-chat content, various internet information, and so on. For example, if the action the robot 100 performs according to a control entry is querying the weather in Beijing, the action instruction can be an address for querying the weather; the robot 100 obtains the Beijing weather from this address, which can be a uniform resource locator (URL), a memory address, a database field, and so on.
The interactive behaviors of the robot 100 include, but are not limited to, outputting voice, adjusting posture, outputting images or video, and interacting with other devices. Outputting voice includes but is not limited to chatting with the user and playing music. Adjusting posture includes but is not limited to moving (for example, imitating human walking), swinging limbs (for example, swinging the arms or head), adjusting expressions, and so on. Outputting images or video includes but is not limited to showing images or video on a display device; an image can be a dynamic emoticon, a captured image, or an image obtained from the network. Interacting with other devices includes but is not limited to controlling other devices (for example, adjusting the operating parameters of an air conditioner), sending data to other devices, and establishing connections with other devices. It should be appreciated that interactive behaviors are not limited to those enumerated above; any reaction of the robot 100 to perceived information can be considered an interactive behavior of the robot 100.
A control entry can be configured in a data interchange format, though other formats can of course also be used. Data interchange formats include but are not limited to XML, JSON and YAML. Take JSON as an example, and suppose the following needs to be realized: when the user says "sing me a song", the robot first moves backward 10 cm at an angle of 0 at medium speed and starts singing a song; after the song it waits 10 seconds, takes a photo and sends it to the user, and then moves forward 5 cm at an angle of 0. A control entry in JSON form can be as follows:
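The JSON listing itself is not reproduced in this text. The following is a hedged reconstruction based only on the fields named in the explanation that follows ("ifs", "ear", "singing", "trigger", "move", "song", "take_pic", "gr", and the URL "http://bpeer.com/i.mp3"); every other field name and value shown here is an assumption.

```json
{
  "ifs": { "ear": "singing" },
  "trigger": {
    "move": [
      { "gr": 0, "m": -10, "angle": 0, "speed": "medium" },
      { "gr": 3, "m": 5, "angle": 0 }
    ],
    "song": { "gr": 1, "url": "http://bpeer.com/i.mp3" },
    "take_pic": { "gr": 2, "delay": 10, "send_to": "user" }
  }
}
```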
In the above control entry, the "ifs" part is the trigger condition set according to perception units: "ear" is the identifier of a perception unit and "singing" is its value. The "trigger" part is the interactive behavior the trigger condition triggers, including three interactive behaviors, "move", "song" and "take_pic" (taking a photo), each with its corresponding action instruction. Here "song" links to "http://bpeer.com/i.mp3", from which the content to sing is obtained, and "gr" is the execution order of the actions.
In certain embodiments, multiple control entries can be stored as a document in a data interchange format, or can be stored in a database. In certain embodiments, the robot 100 can also include a database system for storing control entries. The database system provides an interface for the one or more CPUs 106 to read data from, and write data to, the database.
The control device 148 for robot interactive behavior can control the robot's interactive behavior according to the control entries. The control device 148 obtains the information the robot perceives and, according to the perceived information, generates perception data at least according to the predefined perception units, where the perception data includes the identifiers and values of perception units; it then searches for a control entry matching the generated perception data, and if one is found, makes the robot perform the interactive behavior in the found control entry.
In certain embodiments, the control device 148 can instead send the information the robot 100 perceives to a far-end server (not shown), which generates the perception data from the perceived information and the perception units, searches for a control entry matching the generated perception data, and then sends the found control entry to the control device 148, which makes the robot perform the interactive behavior in the control entry. Optionally, an identifier of the perceived information can be generated, so as to determine whether a received control entry is the one for the perceived information that was sent. Optionally, what is sent to the control device 148 can be the control entry itself, the identifier of the control entry, the interactive behavior data configured in the control entry, or other information that lets the control device 148 determine the interactive behavior the control entry configures.
In certain embodiments, the control device 148 can generate the perception data from the information the robot 100 perceives and the perception units, and send the generated perception data to the far-end server; the far-end server receives the perception data, searches for a control entry matching it, and sends the found control entry to the robot 100, whereupon the control device 148 makes the robot 100 perform the interactive behavior in the control entry.
It should be appreciated that the control device 148 is not limited to controlling the robot's interactive behavior in the ways described above; combinations of these ways, or other ways, are also possible.
About generating control entries
In some embodiments, a trigger condition can be set according to perception units, together with the interactive behavior this trigger condition triggers, to obtain a control entry, and the control entry serves as the data controlling the interactive behavior of the robot 100.
Fig. 2 is a flowchart of a method for generating control data of a robot according to certain embodiments of the invention. As shown in Fig. 2, the method includes:
Step S202: set, according to one or more preset perception units, a trigger condition for controlling the robot's interactive behavior;
Step S204: set, according to one or more preset interactive behaviors arranged for the robot to perform, the interactive behavior the trigger condition triggers;
Step S206: generate, from the set trigger condition and interactive behavior, a control entry for controlling the robot's interactive behavior in response to the information the robot perceives.
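The three steps above can be sketched as a single constructor. The key names "ifs" and "trigger" follow this document's JSON example; the entry contents and the "say" action are invented for illustration.

```python
import json

def make_control_entry(trigger_condition, behaviors):
    """Combine a trigger condition built from perception units (step
    S202) with the behaviors it triggers (step S204) into one control
    entry (step S206). Sketch only; field layout is an assumption."""
    return {"ifs": trigger_condition, "trigger": behaviors}

entry = make_control_entry(
    {"ear": "*rain*"},                                        # perception unit id -> value
    [{"act": "say", "text": "Remember your umbrella.", "gr": 0}],
)
print(json.dumps(entry))
```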
In step S202, at least one perception unit can be chosen from the preset perception units; the attributes of the chosen perception units are set, where the attributes of a perception unit include its value; and the trigger condition for controlling the robot's interactive behavior is set according to the chosen perception units and their attributes. In certain embodiments, relations between multiple perception units are also set; these relations include but are not limited to logical relations such as AND, OR and NOT, and step S202 can set the trigger condition according to the chosen perception units, their attributes, and the relations between them.
In step S204, at least one interactive behavior can be chosen from the preset interactive behaviors arranged for the robot to perform; the attributes of the chosen interactive behaviors are set, where the attributes of an interactive behavior include one or more action instructions that the robot can parse and execute, together with the parameters of those instructions; and the interactive behavior the trigger condition triggers is set according to the chosen interactive behaviors and their attributes. In certain embodiments, the execution order of multiple interactive behaviors can also be set; this order includes but is not limited to executing one or more interactive behaviors at random, or executing multiple interactive behaviors in a predetermined order. Step S204 can then set the triggered interactive behavior according to the chosen interactive behaviors, their attributes, and this execution order.
In certain embodiments, the trigger condition and the interactive behavior it triggers are described according to a predetermined representation. Optionally, the control entry can be generated from the trigger condition and the triggered interactive behavior using a data interchange format; data interchange formats include but are not limited to one or any combination of XML, JSON and YAML. It should be understood that other formats can also be used to express the trigger condition and the triggered interactive behavior, including representations not yet developed at the filing date of this document.
In some embodiments, multiple control entries can be set and stored as a document in a data interchange format, or stored in a database. When multiple control entries are stored as a document in a data interchange format, adjacent control entries can be separated by a predetermined symbol to distinguish them. The document storing the control entries can be kept in the memory 102 of the robot 100, or in the far-end server.
An interactive behavior is configured as one or more action instructions. These action instructions include: a link to another control entry, set for executing that other control entry; and/or a link to multiple contents and/or multiple parameters, set for choosing a content and/or a parameter from them. For example, a "query the weather" action instruction can link to a web page providing weather information, from which the weather information of the city to be queried is obtained. The queried weather information can then be displayed on the display device of the robot 100, or broadcast by voice. In certain embodiments, when linking to a set of parameters, the parameter of the action to perform can be chosen according to other configuration; likewise, when linking to multiple contents (such as multiple chat corpora), the content to present can also be chosen according to other configuration.
The execution order of action instructions can also be set, where the order includes executing one or more action instructions at random, or executing multiple action instructions in a predetermined order. The execution order can be marked with a symbol; if there is no mark, the instructions can follow the order in which the actions are described. Actions of the same type can be treated as a whole, and the order among actions can be marked. For example, for "first move forward 5 meters, nod 5 times, then move back 10 meters", the action instructions can be expressed as [move:{gr:0,m:+5; gr:2,m:-10}; head:{gr:1,head:5}], where "gr" indicates the order of execution and the actions with smaller values are performed first.
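The "gr" ordering rule can be sketched as follows, assuming each action instruction is a dictionary carrying an optional "gr" key; Python's stable sort keeps the written order for unmarked or equal entries, matching the fallback described above.

```python
def order_actions(actions):
    """Sort action instructions by their 'gr' marker: smaller values
    run first; unmarked actions keep their written order at the end."""
    return sorted(actions, key=lambda a: a.get("gr", float("inf")))

acts = [
    {"act": "move", "gr": 0, "m": 5},    # forward 5 m
    {"act": "move", "gr": 2, "m": -10},  # back 10 m
    {"act": "head", "gr": 1, "n": 5},    # nod 5 times
]
print([a["act"] for a in order_actions(acts)])  # ['move', 'head', 'move']
```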
In some embodiments, a graphical user interface (GUI) can be provided for setting trigger conditions and interactive behaviors. The GUI offers the configurable perception units (for example, their names and identifiers), the configurable values of the perception units, and the logical relations between them; the user setting a trigger condition can select the perception units, their values and their logical relations, after which the trigger condition is generated in the corresponding format. The GUI can also offer the configurable interactive behaviors, which can be predefined; after an interactive behavior has been selected, it is generated in the corresponding format.
In certain embodiments, the trigger condition and interactive behavior can also be edited directly: following the data interchange formats above, and using the predefined perception units and the action instruction specification of interactive behaviors, the trigger condition and the interactive behavior it triggers are edited to obtain a control entry.
In some embodiments, content (such as web pages) can be grabbed from the Internet and analyzed to obtain material for setting control entries, and trigger conditions and the interactive behaviors they trigger are set according to this material. For example, when "dial the emergency number when sick" is grabbed from the Internet, a trigger condition can be set according to a perception unit for "sick", and the interactive behavior this trigger condition triggers can be set to "dial the emergency number". If a perception unit "health status" has been predefined, its value can be set directly to "sick", and the resulting trigger condition can be {if("health":"sick")}. The robot 100 can judge the user's health status from the perceived data and determine whether it is "sick", for example by voice-chatting with the user to understand the user's state, and by detecting the user's heart rate, body temperature, and so on. When the health status is "sick", the perception data generated by the robot 100 can include {"health":"sick"}.
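The "health"/"sick" check reduces to comparing the value of each perception unit named in the trigger condition against the perception data. A minimal exact-match sketch (wildcards and logical relations omitted):

```python
def matches_trigger(perception_data, trigger):
    """True when every perception unit named in the trigger condition
    is present in the perception data with an equal value."""
    return all(perception_data.get(unit) == value
               for unit, value in trigger.items())

trigger = {"health": "sick"}
print(matches_trigger({"health": "sick"}, trigger))  # True
print(matches_trigger({"health": "fine"}, trigger))  # False
```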
About using control entries to control the robot's interactive behavior
After control entries have been taken as the data controlling the robot's interactive behavior, the robot's interactive behavior can be controlled according to the control entries.
Fig. 3 is a first flowchart of a method for controlling robot interactive behavior according to certain embodiments of the invention. As shown in Fig. 3, the method includes:
Step S302: obtain the information the robot perceives;
Step S304: according to the perceived information, generate perception data at least according to the predefined perception units, where the perception data includes the identifiers and values of perception units;
Step S306: search the stored control entries for a control entry matching the generated perception data;
Step S308: if a control entry matching the generated perception data is found, make the robot perform the interactive behavior in the found control entry.
In certain embodiments, the robot 100 communicates with a far-end server (not shown) over a network. The robot 100 perceives at least one kind of information, and the far-end server obtains the perceived information from the robot 100, either by requesting the robot 100 to send the information it perceives, or by the robot 100 sending the perceived information to the far-end server after perceiving it. The robot 100 can send the perceived information to the far-end server periodically, or send it when the perceived information changes, so as to reduce the volume of data transmitted between the far-end server and the robot 100.
The control entry document can be stored in the far-end server, which includes one or more processors and one or more modules, programs or instruction sets kept in memory for performing the method shown in Fig. 3. The far-end server can be a single server, or a server cluster composed of multiple servers. It should be appreciated that the above programs or instruction sets are not limited to running on one server; they can also run on distributed computing resources.
In some embodiments, the found control entry can be sent to the robot 100, and the robot 100 reads the interactive behavior from the control entry and performs it. Alternatively, the data of the interactive behavior in the found control entry can be sent to the robot 100. Alternatively, the data of the interactive behavior in the control entry can be parsed into instructions the robot 100 can execute, and the obtained instructions sent to the robot 100, which executes them. It should be appreciated that these ways are given by way of example only.
Fig. 4 is a second flowchart of a method for controlling robot interactive behavior according to certain embodiments of the invention. As shown in Fig. 4, the method includes:
Step S402: receive the perception data of the robot, where the perception data is generated from the information the robot perceives, at least according to the predefined perception units, and includes the identifiers and values of perception units;
Step S404: search the stored control entries for a control entry matching the robot's perception data;
Step S406: if a control entry matching the robot's perception data is found, make the robot perform the interactive behavior in the found control entry.
As shown in Fig. 4, the robot 100 perceives at least one kind of information, generates perception data from the perceived information and the perception units, and sends the perception data out. In certain embodiments, the robot 100 sends the perception data to a far-end server (not shown). The robot 100 can send the perception data once it has been generated, or send it after receiving a request from the far-end server.
In certain embodiments, the far-end server stores the control entry document, for example a document in a data interchange format, or a database. Of course, the control entry document can also be stored distributed over multiple storage spaces. The far-end server can include one or more processors and one or more modules, programs or instruction sets kept in memory for performing the method shown in Fig. 4.
Fig. 5 is a third flowchart of a method for controlling robot interactive behavior according to certain embodiments of the invention. As shown in Fig. 5, the method includes:
Step S502: perceive at least one kind of information;
Step S504: according to the perceived information, generate perception data at least according to the predefined perception units, where the perception data includes the identifiers and values of perception units;
Step S506: send the generated perception data;
Step S508: receive the information of the control entry matching the perception data;
Step S510: perform, according to the information of the control entry, the interactive behavior the control entry configures.
In certain embodiments, the interactive behavior control device 148 of the robot 100 performs the method shown in Fig. 5. The robot 100 perceives at least one kind of information and generates perception data according to the perception units and a perception data generation strategy. After generating the perception data, the robot 100 sends it to the far-end server. The far-end server stores the control entry document and searches the stored control entries for one matching the robot's perception data; if such a control entry is found, it sends the control entry to the robot 100. In some embodiments, the action instructions of the interactive behavior in the control entry can be sent to the robot 100 instead.
In certain embodiments, before the generated perception data is sent, an identifier of the generated perception data can also be determined; after the identifier is determined, the perception data and its identifier are sent together. After finding the control entry matching the generated perception data, the far-end server sends the information of the control entry together with the identifier of the corresponding perception data to the control device 148. The information of the control entry can be the control entry itself, the identifier of the control entry, the behavior the control entry configures, or any combination of these, but is not limited to them. The control device receives the information of the control entry and, according to the perception data identifier carried in it, judges whether the received information is that of the control entry matching the generated perception data.
The control device 148 can determine the corresponding control entry from the control entry's identifier and perform the interactive behavior in it. Alternatively, the control device 148 can read the configured interactive behavior directly from the control entry the far-end server sends, and perform it. Furthermore, if what the far-end server sends is the interactive behavior configured in the control entry, the control device 148 can parse and perform that interactive behavior directly.
In some embodiments, the robot's perception data can be matched against the trigger conditions in the control entries; the matching includes but is not limited to judging whether a certain perception unit is present, and comparing the values of perception units.
In certain embodiments, when multiple trigger conditions matching the robot's perception data items are found, the matching degree between the robot's perception data and each of the matched trigger conditions can be determined, and the control entry matching the generated perception data is selected at least according to the matching degree. As an example, for the speech text in the perception data, the matching degree can be determined using, but not limited to, edit distance: the smaller the edit distance, the more similar the two texts. Speech text can also be matched using regular expressions.
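Edit distance can be computed with the classic dynamic-programming recurrence and the smallest distance taken as the best match; the candidate texts below are invented for illustration.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings: the minimum number of
    single-character insertions, deletions and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]

candidates = ["play a song", "what is the weather"]
heard = "play a sing"
best = min(candidates, key=lambda c: edit_distance(heard, c))
print(best)  # play a song
```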
In certain embodiments, priorities can also be set for control entries and consulted when selecting a control entry. For example, control entries can be categorized into core control entries, user control entries and temporary control entries, with core control entries having the highest priority, followed by user control entries and then temporary control entries. When searching, a control entry matching the perception data is first sought among the core control entries; if none is found there, the user control entries are searched; and if none is found among the user control entries either, the library of temporary control entries is searched for a control entry matching the perception data.
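The three-tier priority search might look like the following sketch, where each tier is simply a list of entries and matching is delegated to a caller-supplied predicate; the entries and the "ifs" key layout are illustrative assumptions.

```python
def find_entry(perception_data, tiers, matches):
    """Search tiers in priority order (core, user, temporary) and
    return the first control entry whose trigger condition matches."""
    for tier in tiers:
        for entry in tier:
            if matches(perception_data, entry["ifs"]):
                return entry
    return None

# Exact-match predicate: every unit named in the condition must be equal.
exact = lambda data, cond: all(data.get(k) == v for k, v in cond.items())

core = [{"id": "c1", "ifs": {"health": "sick"}}]
user = [{"id": "u1", "ifs": {"ear": "hello"}}]
temp = []
print(find_entry({"ear": "hello"}, (core, user, temp), exact)["id"])  # u1
```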
In certain embodiments, the robot 100 can perceive at least one kind of information, generate perception data from the perceived information and the perception units, read control entries (including but not limited to reading them from the memory 102 of the robot 100), and search for a control entry matching the generated perception data; if one is found, the robot 100 performs the interactive behavior in the found control entry.
In certain embodiments, the control entry document can be stored both in the memory 102 of the robot 100 and in the far-end server. The robot 100 perceives at least one kind of information, generates perception data from the perceived information and the perception units, reads control entries from the memory 102, and searches the read control entries for one matching the generated perception data. If a matching control entry is found, the robot 100 performs the interactive behavior in it; if no matching control entry is found among the read control entries, the robot 100 can send the generated perception data to the far-end server, which searches its stored control entries for one matching the received perception data and, if one is found, makes the robot 100 perform the interactive behavior in that control entry. The far-end server can also send the found control entry to the robot 100, which can receive it through an interface (not shown) and store it.
As described above, when a control entry matching the perception data is found, the robot 100 is made to perform the interactive behavior in it. When no matching control entry is found, no interactive behavior need be performed; the robot 100 can continue to perceive at least one kind of information, and which information to perceive can be determined by preset conditions. In some embodiments, when no control entry matching the perception data is found, the robot can give a voice reply or import content from the Internet (for example, display web page information). In that case it can be determined whether the perception data is voice-related (for example, whether a voice instruction from the user was received); if so, the robot can give a voice reply, or search the Internet for relevant content according to the voice content and present it to the user on the display device of the robot 100.
In some embodiments, control entries can be set according to the interaction between the robot and the user. When no control entry matching the perception data of the robot 100 is found, the robot 100 can voice-chat with the user; during the chat, the robot 100 analyzes the user's demands and intentions, obtains the scenario and the robot's interactive behavior in that scenario, and generates a control entry from the scenario and the interactive behavior according to the perception units. For example, suppose the user is sick and says "I am sick" to the robot, but the control entries of the robot 100 contain no interactive behavior for the user being sick. The robot 100 can then interact with the user by voice, for example asking "I don't know what to do; what do you need?"; the user may say "help me dial my private doctor; the telephone number is ....", and the robot 100 can make the call. Moreover, in this case the robot 100 analyzes the exchange and concludes that a doctor needs to be contacted when the user is "sick"; from this conclusion, the robot 100 can generate a control entry, for example with the trigger condition [if(health:sick)] and the triggered interactive behavior [call{number:"//doctor_number.php"}].
The structure of a device for generating control data of a robot according to some embodiments is described below. Since the principle by which the device solves the problem is similar to that of the method for generating control data of a robot, the implementation of the device may refer to the implementation of the method, and repeated parts are not described again. As used below, the terms "unit" and "module" can be a combination of software and/or hardware realizing a predetermined function. Although the devices described in the following embodiments are preferably realized in software, realization in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 6 is a structural block diagram of a device for generating control data of a robot according to certain embodiments of the invention. As shown in Fig. 6, the device includes:
a trigger condition setting module 602, for setting, according to one or more preset perception units, a trigger condition for controlling the robot's interactive behavior, where a perception unit is arranged as the minimum unit for controlling robot interactive behavior;
an interactive behavior setting module 604, connected to the trigger condition setting module 602, for setting, according to one or more preset interactive behaviors arranged for the robot to perform, the interactive behavior the trigger condition triggers;
a generation module 606, connected to the interactive behavior setting module 604, for generating, from the set trigger condition and interactive behavior, a control entry for controlling the robot's interactive behavior in response to the information the robot perceives.
Fig. 7 shows a first structural block diagram of an apparatus for controlling a robot's interactive behavior according to some embodiments of the invention. As shown in Fig. 7, the apparatus includes: an acquisition module 702, configured to acquire information perceived by the robot; a generation module 704, connected to the acquisition module 702 and configured to generate perception data from the perceived information, at least according to predefined perception units; a search module 706, connected to the generation module 704 and configured to search for a control entry matching the generated perception data; and an execution module 708, connected to the search module 706 and configured to, when a control entry matching the generated perception data is found, cause the robot to execute the behavior in the found control entry.
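The four modules of Fig. 7 form an acquire → generate → search → execute pipeline. A minimal sketch under the assumption that perception data and control entries are plain dictionaries; the function names, the sample entries, and the fixed set of perception units are all illustrative, not taken from the patent:

```python
# Illustrative pipeline for the apparatus of Fig. 7; all names are assumptions.
CONTROL_ENTRIES = [
    {"trigger": {"sound": "hello"}, "behaviors": ["say_hello"]},
    {"trigger": {"health": "sick"}, "behaviors": ["call_doctor"]},
]

def generate_perception_data(raw_info: dict) -> dict:
    """Generation module 704: map raw sensed information onto the values of
    predefined perception units (here the raw info is already unit-keyed)."""
    units = {"sound", "health"}  # predefined perception units (assumed)
    return {k: v for k, v in raw_info.items() if k in units}

def search_entry(perception: dict):
    """Search module 706: return the first entry whose trigger condition is
    fully satisfied by the perception data, else None."""
    for entry in CONTROL_ENTRIES:
        if all(perception.get(u) == v for u, v in entry["trigger"].items()):
            return entry
    return None

def control(raw_info: dict) -> list:
    """Execution module 708: 'execute' the behaviors of the found entry
    (returned here instead of being actuated on hardware)."""
    entry = search_entry(generate_perception_data(raw_info))
    return entry["behaviors"] if entry else []

print(control({"health": "sick", "battery": "low"}))  # ['call_doctor']
```

Note that information outside the predefined perception units (here "battery") is dropped during perception data generation and never reaches the matcher.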
Fig. 8 shows a second structural block diagram of an apparatus for controlling a robot's interactive behavior according to some embodiments of the invention. As shown in Fig. 8, the apparatus includes: a sensing module 802, configured to perceive at least one item of information; a generation module 804, connected to the sensing module 802 and configured to generate perception data from the perceived information, at least according to predefined perception units; a sending module 806, connected to the generation module 804 and configured to send the generated perception data; a receiving module 808, connected to the sending module 806 and configured to receive information on the control entry matching the perception data; and an execution module 810, connected to the receiving module 808 and configured to execute, according to the information on the control entry, the interactive behavior in that control entry.
Fig. 9 shows a third structural block diagram of an apparatus for controlling a robot's interactive behavior according to some embodiments of the invention. As shown in Fig. 9, the apparatus includes: a receiving module 902, configured to receive the robot's perception data; a search module 904, connected to the receiving module 902 and configured to search for a control entry matching the robot's perception data; and an execution module 906, connected to the search module 904 and configured to, when a control entry corresponding to the robot's perception data is found, cause the robot to execute the behavior triggered by the trigger condition in the found control entry.
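Taken together, Figs. 8 and 9 split the pipeline across a robot side (sense, generate, send, receive, execute) and a server side (receive, search, reply). A minimal sketch of that exchange; the in-process function call stands in for whatever transport a real deployment would use, and all names and sample entries are assumptions:

```python
# Illustrative client/server split for the apparatuses of Figs. 8 and 9.
# The patent does not prescribe a protocol; the direct call below is a
# placeholder for network or IPC transport.
SERVER_ENTRIES = [
    {"trigger": {"health": "sick"}, "behaviors": ["call_doctor"]},
]

def server_handle(perception: dict):
    """Server side (Fig. 9): receive perception data, search for a matching
    control entry, and reply with its information (or None)."""
    for entry in SERVER_ENTRIES:
        if all(perception.get(u) == v for u, v in entry["trigger"].items()):
            return entry
    return None

def robot_step(perception: dict) -> list:
    """Robot side (Fig. 8): send the generated perception data, receive the
    control entry information, and execute the behaviors it names."""
    entry = server_handle(perception)           # sending + receiving, collapsed
    return entry["behaviors"] if entry else []  # execution module 810
```

When the server finds no matching entry, the robot side receives nothing to execute, which is the situation in which the description above has the robot fall back to a voice interaction.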
Fig. 10 shows a fourth structural block diagram of an apparatus for controlling a robot's interactive behavior according to some embodiments of the invention. As shown in Fig. 10, the apparatus includes: a control entry document 1002, configured to provide a control entry document containing multiple control entries, where each control entry contains a trigger condition and the behavior triggered by that trigger condition, and each trigger condition is composed of at least one predefined perception unit; and
a matching module 1004, connected to the control entry document 1002 and configured to match the robot's perception data against the control entries to determine whether a control entry matching the robot's perception data exists, where the perception data is generated from the information perceived by the robot, at least according to predefined perception units.
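When several trigger conditions match the same perception data, the matching module can rank them by matching degree and keep the best entry (this is the situation handled in claim 8). In the sketch below the matching degree is simply the number of perception units a trigger shares with the data, which favors the most specific trigger; both that scoring rule and the names are assumptions, since the patent does not fix a particular measure:

```python
# Illustrative best-match selection for the matching module 1004.
def matching_degree(trigger: dict, perception: dict) -> int:
    """Assumed measure: count of perception units the trigger condition
    shares with the perception data."""
    return sum(1 for u, v in trigger.items() if perception.get(u) == v)

def best_entry(entries: list, perception: dict):
    """Among entries whose trigger is fully satisfied, pick the one with the
    highest matching degree, i.e. the most specific trigger condition."""
    matched = [e for e in entries
               if all(perception.get(u) == v for u, v in e["trigger"].items())]
    if not matched:
        return None
    return max(matched, key=lambda e: matching_degree(e["trigger"], perception))

entries = [
    {"trigger": {"health": "sick"}, "behaviors": ["comfort"]},
    {"trigger": {"health": "sick", "time": "night"}, "behaviors": ["call_doctor"]},
]
print(best_entry(entries, {"health": "sick", "time": "night"})["behaviors"])
# ['call_doctor']
```

With only {"health": "sick"} in the perception data, the second trigger is not fully satisfied and the first entry wins instead.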
Obviously, those skilled in the art should understand that each of the modules or steps of the above embodiments of the invention may be implemented with a general-purpose computing device. They may be concentrated on a single computing device or distributed over a network formed by multiple computing devices. Optionally, they may be implemented with program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device. In some cases, the steps shown or described may be performed in an order different from the one given herein, or they may each be made into a separate integrated circuit module, or multiple modules or steps among them may be made into a single integrated circuit module. Thus, the embodiments of the invention are not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the invention and are not intended to limit it; for those skilled in the art, the embodiments of the invention may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principle of the invention shall fall within the protection scope of the invention.
Claims (40)
1. A method for controlling interactive behavior of a robot, characterized by comprising:
acquiring information perceived by the robot;
generating, from the perceived information and at least according to predefined perception units, perception data comprising the identifiers and values of the perception units;
searching for a control entry matching the generated perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
if a control entry matching the generated perception data is found, causing the robot to execute the behavior in the found control entry.
2. The method according to claim 1, characterized in that, before the searching for a control entry matching the generated perception data, the method further comprises:
reading control entries from a control entry document, wherein the control entry document records multiple control entries.
3. The method according to claim 2, characterized in that the control entry document is a database or a document in a data interchange format.
4. The method according to claim 1, characterized in that the control entry further comprises an execution order of multiple behaviors, wherein the execution order includes: executing one or more behaviors at random, or executing multiple behaviors according to a predetermined procedure.
5. The method according to claim 1, characterized in that the behavior triggered by the trigger condition of the control entry is configured as one or more action commands and parameters of the action commands.
6. The method according to claim 5, characterized in that the parameters of the action commands include: a link to another control entry, set for executing that control entry; and/or a link to multiple parameters and/or multiple contents, set for choosing a content and/or a parameter from the multiple contents and/or multiple parameters.
7. The method according to claim 1, characterized in that the searching for a control entry matching the generated perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, comprises:
matching the generated perception data against the trigger conditions of the control entries used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry corresponding to a matched trigger condition is taken as the control entry matching the generated perception data.
8. The method according to claim 7, characterized in that the searching for a control entry matching the generated perception data further comprises:
when multiple trigger conditions matching the generated perception data are found, determining the matching degree between the generated perception data and each of the matched trigger conditions; and
selecting the control entry matching the generated perception data at least according to the matching degree.
9. The method according to claim 1, characterized in that the generating, from the perceived information and at least according to predefined perception units, of perception data comprising the identifiers and values of the perception units comprises:
analyzing the perceived information and/or a combination of multiple items of information at least according to the predefined perception units, to obtain the value of each perception unit; and
generating the perception data from the value of each perception unit.
10. The method according to claim 1, characterized in that the perceived information includes: information perceived in real time and/or historically perceived information.
11. The method according to any one of claims 1 to 10, characterized in that the robot is an indoor robot.
12. The method according to any one of claims 1 to 10, characterized in that the perception unit has one or more preset values.
13. A method for controlling interactive behavior of a robot, characterized by comprising:
perceiving at least one item of information;
generating, from the perceived information and at least according to predefined perception units, perception data comprising the identifiers and values of the perception units;
sending the generated perception data;
receiving information on a control entry matching the generated perception data, wherein the control entry is used to respond to information perceived by the robot so as to control the robot's behavior, and the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
executing, according to the information on the control entry, the behavior in the received control entry.
14. The method according to claim 13, characterized in that the generating, from the perceived information and at least according to predefined perception units, of perception data comprising the identifiers and values of the perception units comprises:
analyzing the perceived information and/or a combination of multiple items of information at least according to the predefined perception units, to obtain the value of each perception unit; and
generating the perception data from the value of each perception unit.
15. The method according to claim 13, characterized in that the perceived data includes: information perceived in real time and/or historically perceived information.
16. The method according to any one of claims 13 to 15, characterized in that the perception unit has one or more preset values.
17. The method according to any one of claims 13 to 15, characterized in that, before the generated perception data is sent, the method further comprises: determining an identifier of the generated perception data;
wherein the sending of the generated perception data comprises: sending the generated perception data together with the identifier of the generated perception data; and
wherein the receiving of the information on the control entry matching the generated perception data comprises: receiving information on a control entry, and judging, from the identifier of the perception data carried in the information on the control entry, whether the received information on the control entry is information on the control entry matching the generated perception data.
18. The method according to any one of claims 13 to 15, characterized in that the information on the control entry includes at least one of, or any combination of: the control entry itself, the identifier of the control entry, and the behavior configured in the control entry.
19. The method according to any one of claims 13 to 15, characterized by further comprising: if no control entry matching the generated perception data is received, conducting a voice interaction with the user.
20. The method according to claim 19, characterized by further comprising: generating a control entry according to the voice interaction with the user.
21. A method for controlling interactive behavior of a robot, characterized by comprising:
receiving perception data of a robot comprising the identifiers and values of perception units, wherein the perception data is generated from the information perceived by the robot, at least according to predefined perception units;
searching for a control entry matching the robot's perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
if a control entry matching the robot's perception data is found, causing the robot to execute the behavior in the found control entry.
22. The method according to claim 21, characterized in that the control entry further comprises an execution order of multiple behaviors, wherein the execution order includes: executing one or more behaviors at random, or executing multiple behaviors according to a predetermined procedure.
23. The method according to claim 21, characterized in that the behavior triggered by the trigger condition of the control entry is configured as one or more action commands and parameters of the action commands.
24. The method according to claim 23, characterized in that the parameters of the action commands include: a link to another control entry, set for executing that control entry; and/or a link to multiple parameters and/or multiple contents, set for choosing a content and/or a parameter from the multiple contents and/or multiple parameters.
25. The method according to claim 23, characterized in that the perception unit has one or more preset values.
26. The method according to any one of claims 21 to 25, characterized in that the robot is an indoor robot.
27. A method for controlling interactive behavior of a robot, characterized by comprising:
providing a control entry document comprising multiple control entries used to respond to information perceived by a robot so as to control the robot's behavior, wherein each control entry comprises a trigger condition composed of at least one predefined perception unit and the behavior triggered by that trigger condition; and
matching perception data of the robot comprising the identifiers and values of perception units against the control entries, to determine whether a control entry matching the robot's perception data exists, wherein the robot's perception data is generated from the information perceived by the robot, at least according to predefined perception units.
28. The method according to claim 27, characterized in that the control entry document is a database or a document in a data interchange format.
29. The method according to claim 27, characterized in that the behavior triggered by the trigger condition of the control entry is configured as one or more action commands and parameters of the action commands.
30. The method according to claim 29, characterized in that the parameters of the action commands include: a link to another control entry, set for executing that control entry; and/or a link to multiple parameters and/or multiple contents, set for choosing a content and/or a parameter from the multiple contents and/or multiple parameters.
31. The method according to claim 29, characterized in that the control entry further includes an execution order of the multiple behaviors triggered by the trigger condition, wherein the execution order includes: executing one or more behaviors at random, or executing multiple behaviors according to a predetermined procedure.
32. An apparatus for controlling interactive behavior of a robot, characterized by comprising:
an acquisition module, configured to acquire information perceived by the robot;
a generation module, configured to generate, from the perceived information and at least according to predefined perception units, perception data including the identifiers and values of the perception units;
a search module, configured to search for a control entry matching the generated perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
an execution module, configured to, when a control entry matching the generated perception data is found, cause the robot to execute the behavior in the found control entry.
33. An apparatus for controlling interactive behavior of a robot, characterized by comprising:
a sensing module, configured to perceive at least one item of information;
a generation module, configured to generate, from the perceived information and at least according to predefined perception units, perception data including the identifiers and values of the perception units;
a sending module, configured to send the generated perception data;
a receiving module, configured to receive information on the control entry matching the perception data, wherein the control entry is used to respond to information perceived by the robot so as to control the robot's behavior, and the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
an execution module, configured to execute the behavior in the received control entry.
34. An apparatus for controlling interactive behavior of a robot, characterized by comprising:
a receiving module, configured to receive perception data of a robot including the identifiers and values of perception units, wherein the perception data is generated from the information perceived by the robot, at least according to predefined perception units;
a search module, configured to search for a control entry matching the robot's perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
an execution module, configured to, when a control entry corresponding to the robot's perception data is found, cause the robot to execute the behavior in the found control entry.
35. An apparatus for controlling interactive behavior of a robot, characterized by comprising:
a control entry document, configured to provide a control entry document comprising multiple control entries used to respond to information perceived by a robot so as to control the robot's behavior, wherein each control entry comprises a trigger condition composed of at least one predefined perception unit and the behavior triggered by that trigger condition; and
a matching module, configured to match perception data of the robot including the identifiers and values of perception units against the control entries, to determine whether a control entry matching the robot's perception data exists, wherein the perception data is generated from the information perceived by the robot, at least according to predefined perception units.
36. A robot, characterized by comprising:
one or more sensing devices;
a memory, configured to store control entries used to respond to information perceived by the robot so as to control the robot's behavior, wherein each control entry comprises a trigger condition composed of at least one predefined perception unit and the behavior triggered by the trigger condition; and
one or more processors, configured to generate, from the information perceived by the one or more sensing devices, perception data including the identifiers and values of the perception units, search for a control entry matching the generated perception data, and cause the robot to execute the behavior in the found control entry.
37. A robot, characterized by comprising:
one or more sensing devices;
an interface, configured to receive control entries used to respond to information perceived by the robot so as to control the robot's behavior, wherein each control entry comprises a trigger condition composed of at least one predefined perception unit and the behavior triggered by the trigger condition; and
one or more processors, configured to generate, from the information perceived by the one or more sensing devices, perception data including the identifiers and values of the perception units, search for a control entry matching the generated perception data, and cause the robot to execute the behavior in the found control entry.
38. The robot according to claim 37, characterized by further comprising: a memory, configured to store the received control entries.
39. A robot, characterized by comprising:
one or more sensing devices;
an acquisition device, configured to acquire the information perceived by the one or more sensing devices;
a generation device, configured to generate, from the perceived information and at least according to predefined perception units, perception data including the identifiers and values of the perception units;
a search device, configured to search for a control entry matching the generated perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
an execution device, configured to, when a control entry matching the generated perception data is found, execute the behavior in the found control entry.
40. A robot, characterized by comprising:
one or more sensing devices;
a memory;
one or more processors; and
one or more modules, stored in the memory and configured to be executed by the one or more processors, the one or more modules including instructions for performing the following steps:
generating, from the information perceived by the one or more sensing devices, perception data including the identifiers and values of at least one predefined perception unit;
searching for a control entry matching the generated perception data, the control entry being used to respond to information perceived by the robot so as to control the robot's behavior, wherein the control entry comprises a trigger condition composed of at least one perception unit and the behavior triggered by that trigger condition; and
when a control entry matching the generated perception data is found, causing the robot to execute the behavior in the found control entry.
Priority Applications (7)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510363348.1A CN106325065A (en) | 2015-06-26 | 2015-06-26 | Robot interactive behavior control method, device and robot |
| PCT/CN2016/087260 WO2016206645A1 (en) | 2015-06-26 | 2016-06-27 | Method and apparatus for loading control data into machine device |
| PCT/CN2016/087261 WO2016206646A1 (en) | 2015-06-26 | 2016-06-27 | Method and system for urging machine device to generate action |
| PCT/CN2016/087262 WO2016206647A1 (en) | 2015-06-26 | 2016-06-27 | System for controlling machine apparatus to generate action |
| PCT/CN2016/087257 WO2016206642A1 (en) | 2015-06-26 | 2016-06-27 | Method and apparatus for generating control data of robot |
| PCT/CN2016/087259 WO2016206644A1 (en) | 2015-06-26 | 2016-06-27 | Robot control engine and system |
| PCT/CN2016/087258 WO2016206643A1 (en) | 2015-06-26 | 2016-06-27 | Method and device for controlling interactive behavior of robot and robot thereof |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510363348.1A CN106325065A (en) | 2015-06-26 | 2015-06-26 | Robot interactive behavior control method, device and robot |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN106325065A true CN106325065A (en) | 2017-01-11 |
Family
ID=57721943
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201510363348.1A Pending CN106325065A (en) | 2015-06-26 | 2015-06-26 | Robot interactive behavior control method, device and robot |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN106325065A (en) |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107179683A (en) * | 2017-04-01 | 2017-09-19 | 浙江工业大学 | Interactive robot intelligent motion detection and control method based on neural network |
| CN109313635A (en) * | 2017-08-25 | 2019-02-05 | 深圳市得道健康管理有限公司 | A method for establishing an artificial intelligence behavior control database and its equipment and system |
| WO2019037075A1 (en) * | 2017-08-25 | 2019-02-28 | 深圳市得道健康管理有限公司 | Artificial intelligence terminal and system and behavior control method thereof |
| CN109564635A (en) * | 2017-08-11 | 2019-04-02 | 深圳市得道健康管理有限公司 | Artificial intelligence equipment, system and its behavior control method |
| CN110053046A (en) * | 2019-04-09 | 2019-07-26 | 江门市蚂蚁机器人有限公司 | Robot control method and its system based on customized event |
| CN110871813A (en) * | 2018-08-31 | 2020-03-10 | 比亚迪股份有限公司 | Control method and device of virtual robot, vehicle, equipment and storage medium |
| CN111890369A (en) * | 2020-08-07 | 2020-11-06 | 深圳市海柔创新科技有限公司 | Robot control method, device, system, control device and robot |
| CN112035714A (en) * | 2019-06-03 | 2020-12-04 | 鲨鱼快游网络技术(北京)有限公司 | Man-machine conversation method based on character companions |
| CN112203805A (en) * | 2018-04-17 | 2021-01-08 | Abb瑞士股份有限公司 | Method and apparatus for robot control |
| CN112904747A (en) * | 2021-01-29 | 2021-06-04 | 成都视海芯图微电子有限公司 | Control system and control method based on intelligent sensing |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101615072A (en) * | 2009-06-18 | 2009-12-30 | 东南大学 | Tactile Reproduction Method of Texture Force Based on Shape Restoration from Image Grayscale |
| US20140038489A1 (en) * | 2012-08-06 | 2014-02-06 | BBY Solutions | Interactive plush toy |
| CN104640677A (en) * | 2012-06-21 | 2015-05-20 | 睿信科机器人有限公司 | Train and operate industrial robots |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101615072A (en) * | 2009-06-18 | 2009-12-30 | 东南大学 | Tactile Reproduction Method of Texture Force Based on Shape Restoration from Image Grayscale |
| CN104640677A (en) * | 2012-06-21 | 2015-05-20 | 睿信科机器人有限公司 | Train and operate industrial robots |
| US20140038489A1 (en) * | 2012-08-06 | 2014-02-06 | BBY Solutions | Interactive plush toy |
Non-Patent Citations (1)
| Title |
|---|
| 李田泽 (Li Tianze), 《传感器技术设计与应用》 (Sensor Technology Design and Application), 31 May 2015 * |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107179683B (en) * | 2017-04-01 | 2020-04-24 | 浙江工业大学 | Interactive robot intelligent motion detection and control method based on neural network |
| CN107179683A (en) * | 2017-04-01 | 2017-09-19 | 浙江工业大学 | Interactive robot intelligent motion detection and control method based on neural network |
| US12165071B2 (en) | 2017-08-11 | 2024-12-10 | Shenzhen Tatfook Wisdom Health Technology Co., Ltd. | Artificial intelligence apparatus, system and behavior control method thereof |
| CN109564635A (en) * | 2017-08-11 | 2019-04-02 | 深圳市得道健康管理有限公司 | Artificial intelligence equipment, system and its behavior control method |
| CN109313635B (en) * | 2017-08-25 | 2020-09-08 | 深圳市大富智慧健康科技有限公司 | Method for establishing artificial intelligence behavior control database, and equipment, system and storage medium thereof |
| CN109791396B (en) * | 2017-08-25 | 2022-07-08 | 深圳市大富智慧健康科技有限公司 | Artificial intelligence terminal, system and behavior control method thereof |
| CN109313635A (en) * | 2017-08-25 | 2019-02-05 | 深圳市得道健康管理有限公司 | A method for establishing an artificial intelligence behavior control database and its equipment and system |
| CN109791396A (en) * | 2017-08-25 | 2019-05-21 | 深圳市得道健康管理有限公司 | Artificial intelligence terminal, system and its behaviour control method |
| WO2019037075A1 (en) * | 2017-08-25 | 2019-02-28 | 深圳市得道健康管理有限公司 | Artificial intelligence terminal and system and behavior control method thereof |
| CN112203805A (en) * | 2018-04-17 | 2021-01-08 | Abb瑞士股份有限公司 | Method and apparatus for robot control |
| CN112203805B (en) * | 2018-04-17 | 2023-07-25 | Abb瑞士股份有限公司 | Method and apparatus for robot control |
| US11897137B2 (en) | 2018-04-17 | 2024-02-13 | Abb Schweiz Ag | Method of identifying robot model automatically and safely |
| CN110871813A (en) * | 2018-08-31 | 2020-03-10 | 比亚迪股份有限公司 | Control method and device of virtual robot, vehicle, equipment and storage medium |
| CN110053046B (en) * | 2019-04-09 | 2022-05-03 | 江门市蚂蚁机器人有限公司 | Robot control method and system based on user-defined event |
| CN110053046A (en) * | 2019-04-09 | 2019-07-26 | 江门市蚂蚁机器人有限公司 | Robot control method and its system based on customized event |
| CN112035714A (en) * | 2019-06-03 | 2020-12-04 | 鲨鱼快游网络技术(北京)有限公司 | Man-machine conversation method based on character companions |
| CN111890369A (en) * | 2020-08-07 | 2020-11-06 | 深圳市海柔创新科技有限公司 | Robot control method, device, system, control device and robot |
| CN112904747A (en) * | 2021-01-29 | 2021-06-04 | 成都视海芯图微电子有限公司 | Control system and control method based on intelligent sensing |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106325065A (en) | Robot interactive behavior control method, device and robot | |
| CN106325228A (en) | Method and device for generating control data of robot | |
| CN106873773B (en) | Robot interaction control method, server and robot | |
| EP2146492B1 (en) | Event execution method and system for robot synchronized with mobile terminal | |
| EP3341934B1 (en) | Electronic device | |
| CN106341300B (en) | Method, device and system for task release | |
| CN106653008B (en) | Voice control method, device and system | |
| EP2706418A1 (en) | Method and device for controlling an external apparatus | |
| CN111143683B (en) | Terminal interaction recommendation method, device and readable storage medium | |
| KR102561572B1 (en) | Method for utilizing sensor and electronic device for the same | |
| CN109409235B (en) | Image recognition method and apparatus, electronic device, computer-readable storage medium | |
| CN103404118A (en) | Aware Profile Switching on Mobile Computing Devices | |
| US9543918B1 (en) | Configuring notification intensity level using device sensors | |
| KR20180102870A (en) | Electronic device and method for controlling the same | |
| CN107452383B (en) | Information processing method, server, terminal and information processing system | |
| KR20180096182A (en) | Electronic device and method for controlling the same | |
| CN107193448A (en) | A kind of affairs prompt method, mobile terminal and storage medium | |
| KR102140740B1 (en) | A mobile device, a cradle for mobile device, and a method of managing them | |
| CN109257498B (en) | Sound processing method and mobile terminal | |
| CN106325113B (en) | Robot controls engine and system | |
| US20130159400A1 (en) | User device, server, and operating conditions setting system | |
| KR20190009201A (en) | Mobile terminal and method for controlling the same | |
| WO2016206642A1 (en) | Method and apparatus for generating control data of robot | |
| CN111741116B (en) | Emotion interaction method and device, storage medium and electronic device | |
| CN112788174B (en) | A kind of wireless earphone intelligent retrieval method and related device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20170111 |