CN102722253A - Static man-machine interactive control method and application thereof - Google Patents
- Publication number
- CN102722253A CN102722253A CN2012102004046A CN201210200404A CN102722253A CN 102722253 A CN102722253 A CN 102722253A CN 2012102004046 A CN2012102004046 A CN 2012102004046A CN 201210200404 A CN201210200404 A CN 201210200404A CN 102722253 A CN102722253 A CN 102722253A
- Authority
- CN
- China
- Prior art keywords
- user
- control
- fine motion
- virtual unit
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a static man-machine interactive control method in which a user selects and operates the control keys of one or more virtual devices by performing permitted micro-motions. The method comprises the following steps: 1) creating, in a virtual world, virtual devices N1, N2, ..., Nn corresponding to the devices to be controlled M1, M2, ..., Mn; 2) linking the devices M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn; and 3) linking the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that by performing a micro-motion the user can select, lock and operate the virtual devices N1, N2, ..., Nn, and can thereby start, stop or otherwise control a device Mm by operating the corresponding virtual device Nm. Because the user's body never has to leave its position, the user can operate while lying down or seated, can therefore carry out all manner of controls freely and at ease for long periods, and is never forced to stop by physical exhaustion. The method suits an extremely wide population: anyone with voluntary motor function can perform the corresponding man-machine interaction through it.
Description
Technical field
The present invention relates to a control method, and applications thereof, for human-machine interaction with a bionic virtual world.
Background technology
In everyday life we always hope for more convenience, yet inevitably run into inconvenience: a hospital patient would like to control the various electronic devices the hospital provides without leaving the bed; in deep winter, we would like to control various appliances from under the quilt.
Summary of the invention
The purpose of the invention is to provide a human-machine interactive control method, and applications thereof, that solve the above problems.
To facilitate understanding of the invention, the terms involved are first explained as follows.
User-permitted micro-motion scheme: when the user performs one qualifying micro-motion, or a qualifying group of micro-motions, a control instruction is sent to the computer. A micro-motion here refers specifically to a small-amplitude user action, in which the displacement of any involved joint is less than 20 cm, such as a slight movement of the arm or a slight bend of the foot. The above conditions of the invention expressly include the case in which a qualifying motion issues no instruction.
Virtual permitted-action scheme: the actions, or action schemes, that the virtual world grants to the user's own avatar or to implements in the virtual world; such an action scheme includes continuous action combinations, action force, speed, and the like.
Movable joint: not every joint movement of the user can control the corresponding part of the avatar, especially when the avatar is non-human and lacks some joints the user has on the body. The "movable joints" referred to in the invention are therefore the movable parts that the virtual world grants to the avatar and that correspond to joints on the user's actual body. Conversely, when the avatar has more movable parts than the user has actual movable joints, the additional methods introduced by the invention are adopted. Moreover, a movable joint as used herein is not limited to skeletal junctions; it refers generally to any movable position on the human body, such as any point on the whole upper arm.
Palm: includes the wrist and all joints of the hand, such as the fingers.
Sole: includes the ankle and all joints of the foot, such as the toes.
Indices for estimating motion amplitude: the displacement and direction of a tracked position, the angle of a tracked position between two time points, and the like.
Action amplification: to preserve the user's sense of reality and meet the synchrony requirement during interaction, the following two rules are set:
1. within the limits of human perception, action amplification preferably amplifies only the user's movement range and force;
2. beyond the limits of human perception, action amplification may also amplify the user's speed.
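The two amplification rules can be sketched as a single mapping. The function below is purely illustrative: the patent specifies no gain value, perception threshold, or units, so all of these are assumptions.

```python
def amplify(amplitude, force, speed, gain, perception_limit):
    """Scale a captured micro-motion into a virtual action.

    Rule 1: amplitude and force are always amplified.
    Rule 2: speed is amplified only once the amplified motion exceeds
    the (assumed) human perception limit, so synchrony is preserved
    wherever the user could still notice a mismatch.
    """
    out_amplitude = amplitude * gain
    out_force = force * gain
    # Amplify speed only beyond the perception limit (rule 2).
    out_speed = speed * gain if out_amplitude > perception_limit else speed
    return out_amplitude, out_force, out_speed
```

With a gain of 3 and a perception limit of 10, a small motion keeps its original speed while a large one has all three quantities scaled.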
To achieve the above purpose, the technical scheme of the invention is:
A static human-machine interactive control method, in which the user selects and operates the control keys of one or more virtual devices by performing permitted micro-motions, comprising the following steps:
1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices to be controlled M1, M2, ..., Mn;
2) associating the devices to be controlled M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn;
3) associating the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that when the user performs a micro-motion the virtual devices N1, N2, ..., Nn can be selected, locked and operated, and the user can then start, stop or otherwise control a device to be controlled Mm by operating the virtual device Nm.
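Steps 1) to 3) amount to a registry linking real devices, virtual devices and micro-motions. The sketch below is a hypothetical illustration; the class names, method names and the gesture-to-device mapping are assumptions, not part of the patent text.

```python
class Device:
    """A real device Mm to be controlled (illustrative stand-in)."""
    def __init__(self, name):
        self.name = name
        self.on = False

    def power(self, state):
        self.on = state


class VirtualDeviceLayer:
    """Links virtual devices N1..Nn to real devices M1..Mn and to micro-motions."""
    def __init__(self):
        self.links = {}     # step 2: virtual device id -> real device
        self.bindings = {}  # step 3: micro-motion id -> virtual device id
        self.locked = None

    def create(self, devices):
        # Step 1: create a virtual device Ni for each real device Mi.
        for i, dev in enumerate(devices, 1):
            self.links[f"N{i}"] = dev

    def bind(self, motion, virtual_id):
        # Step 3: associate a permitted micro-motion with a virtual device.
        self.bindings[motion] = virtual_id

    def on_micro_motion(self, motion):
        # Performing the micro-motion selects and locks Nm, exposing Mm.
        vid = self.bindings.get(motion)
        if vid is None:
            return None
        self.locked = vid
        return self.links[vid]
```

A bound micro-motion thus yields the real device, which can then be started or stopped through its virtual counterpart.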
The method by which the user's micro-motion selects a virtual device Nm and its control keys is eye tracking: by capturing which virtual device Nm or control key the eyeball is currently looking at, the selection of that device or key is confirmed.
The micro-motion that locks the virtual device Nm and its control keys is a blink, i.e. a closing action of the eyelids.
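The gaze-select / blink-lock behaviour can be sketched as a small event loop. The event tuples and the hit_test callback below are illustrative assumptions about how a gaze tracker might report its data.

```python
def select_and_lock(events, hit_test):
    """Return the virtual control key locked by the user, or None.

    events:   iterable of ("gaze", x, y) or ("blink",) tuples, in time order
              (an assumed tracker output format).
    hit_test: callable (x, y) -> virtual device / control key under the gaze
              point, or None.
    """
    selected = None
    for ev in events:
        if ev[0] == "gaze":
            # Selection continuously follows the eyeball's gaze point.
            selected = hit_test(ev[1], ev[2])
        elif ev[0] == "blink" and selected is not None:
            # An eyelid blink locks whatever is currently selected.
            return selected
    return None
```

A blink while looking at empty space locks nothing, which matches the requirement that selection precede locking.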
In said user-permitted micro-motion scheme, the user or a prop is assigned a maximum amplitude M for performing a given micro-motion, and the corresponding virtual control key is assigned a maximum actuation amplitude N. Let Mt be the amplitude of the micro-motion the user performs at time t, and Nt the amplitude at which the corresponding virtual control key is actuated; the system then satisfies: Nt = N whenever Mt ≥ M.
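The saturation condition can be sketched as follows. The patent only fixes Nt = N for Mt ≥ M; the linear ramp below M is an illustrative assumption.

```python
def virtual_amplitude(Mt, M, N):
    """Map the user's micro-motion amplitude Mt to the virtual key amplitude Nt.

    Required by the scheme: Nt == N whenever Mt >= M (saturation).
    Assumed here: a proportional ramp for Mt < M.
    """
    if Mt >= M:
        return N
    return N * Mt / M
```

So a 20 cm maximum micro-motion fully actuating a key means any motion of 20 cm or more actuates it fully, while smaller motions actuate it partially.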
Said user-permitted micro-motion scheme is restricted so that, when the user completes any micro-motion at its maximum amplitude M, the change in angle between any two adjacent body segments, excluding the palms and soles, is less than 30 degrees.
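The 30-degree restriction reduces to a validity check over the angle changes of adjacent body segments. The input representation below (a list of per-segment-pair angle deltas in degrees) is an assumption for illustration.

```python
def is_permitted_micro_motion(joint_angle_deltas, limit_deg=30.0):
    """Check the restriction on a candidate micro-motion.

    joint_angle_deltas: for each pair of adjacent body segments (palms and
    soles excluded), the change in angle, in degrees, when the motion is
    performed at its maximum amplitude M. The motion qualifies only if
    every change stays below the 30-degree limit.
    """
    return all(abs(d) < limit_deg for d in joint_angle_deltas)
```

A slight arm movement passes; raising the whole forearm by 35 degrees does not.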
A static human-machine interactive control system, in which the user selects and operates virtual devices by performing micro-motions and thereby controls the corresponding real devices to be controlled, characterized in that it comprises: an imaging device that displays the virtual devices; a recognition and capture device for the user's permitted micro-motion scheme; a synchronization control system that keeps the user's selection and control actions synchronized with the virtual devices; and a conversion system that associates the virtual devices with the devices to be controlled.
Said recognition and capture device is provided with a selection system and a locking system for the virtual devices.
The benefits of the above technical scheme are:
Because the user's body never has to leave its position, the user may lie down or sit throughout operation, and can therefore carry out every control freely and at ease for long periods without being forced to stop for lack of strength. The suitable population is accordingly extremely wide: anyone whose body retains voluntary muscular ability can perform the corresponding human-machine interaction through the invention.
The invention is further described below through a specific embodiment.
Specific embodiment
Embodiment 1: a static human-machine interactive control method
A static human-machine interactive control method, in which the user selects and operates the control keys of one or more virtual devices by performing permitted micro-motions, comprising the following steps:
1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices to be controlled M1, M2, ..., Mn;
2) associating the devices to be controlled M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn;
3) associating the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that when the user performs a micro-motion the virtual devices N1, N2, ..., Nn can be selected, locked and operated, and the user can then start, stop or otherwise control a device to be controlled Mm by operating the virtual device Nm.
The method by which the user's micro-motion selects a virtual device Nm and its control keys is eye tracking: by capturing which virtual device Nm or control key the eyeball is currently looking at, the selection of that device or key is confirmed.
The micro-motion that locks the virtual device Nm and its control keys is a blink, i.e. a closing action of the eyelids.
In said user-permitted micro-motion scheme, the user or a prop is assigned a maximum amplitude M for performing a given micro-motion, and the corresponding virtual control key is assigned a maximum actuation amplitude N. Let Mt be the amplitude of the micro-motion the user performs at time t, and Nt the amplitude at which the corresponding virtual control key is actuated; the system then satisfies: Nt = N whenever Mt ≥ M.
Said user-permitted micro-motion scheme is restricted so that, when the user completes any micro-motion at its maximum amplitude M, the change in angle between any two adjacent body segments, excluding the palms and soles, is less than 30 degrees.
A static human-machine interactive control system, in which the user selects and operates virtual devices by performing micro-motions and thereby controls the corresponding real devices to be controlled, characterized in that it comprises: an imaging device that displays the virtual devices; a recognition and capture device for the user's permitted micro-motion scheme; a synchronization control system that keeps the user's selection and control actions synchronized with the virtual devices; and a conversion system that associates the virtual devices with the devices to be controlled.
Said recognition and capture device is provided with a selection system and a locking system for the virtual devices.
Claims (7)
1. A static human-machine interactive control method, characterized in that the user selects and operates the control keys of one or more virtual devices by performing permitted micro-motions, and that it comprises the following steps:
1) creating, in a virtual world, virtual devices N1, N2, ..., Nn for the devices to be controlled M1, M2, ..., Mn;
2) associating the devices to be controlled M1, M2, ..., Mn with the virtual devices N1, N2, ..., Nn;
3) associating the user's permitted micro-motions with the virtual devices N1, N2, ..., Nn, so that when the user performs a micro-motion the virtual devices N1, N2, ..., Nn can be selected, locked and operated, and the user can then start, stop or otherwise control a device to be controlled Mm by operating the virtual device Nm.
2. The static human-machine interactive control method of claim 1, characterized in that the method by which the user's micro-motion selects a virtual device Nm and its control keys is eye tracking: by capturing which virtual device Nm or control key the eyeball is currently looking at, the selection of that device or key is confirmed.
3. The static human-machine interactive control method of claim 2, characterized in that the micro-motion that locks the virtual device Nm and its control keys is a blink, i.e. a closing action of the eyelids.
4. The static human-machine interactive control method of claim 1, characterized in that in said user-permitted micro-motion scheme the user or a prop is assigned a maximum amplitude M for performing a given micro-motion, and the corresponding virtual control key is assigned a maximum actuation amplitude N; letting Mt be the amplitude of the micro-motion the user performs at time t, and Nt the amplitude at which the corresponding virtual control key is actuated, the system satisfies: Nt = N whenever Mt ≥ M.
5. The static human-machine interactive control method of claim 4, characterized in that said user-permitted micro-motion scheme is restricted so that, when the user completes any micro-motion at its maximum amplitude M, the change in angle between any two adjacent body segments, excluding the palms and soles, is less than 30 degrees.
6. A static human-machine interactive control system, in which the user selects and operates virtual devices by performing micro-motions and thereby controls the corresponding real devices to be controlled, characterized in that it comprises: an imaging device that displays the virtual devices; a recognition and capture device for the user's permitted micro-motion scheme; a synchronization control system that keeps the user's selection and control actions synchronized with the virtual devices; and a conversion system that associates the virtual devices with the devices to be controlled.
7. The static human-machine interactive control system of claim 6, characterized in that said recognition and capture device is provided with a selection system and a locking system for the virtual devices.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2012102004046A CN102722253A (en) | 2011-09-12 | 2012-06-18 | Static man-machine interactive control method and application thereof |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201110269433.3 | 2011-09-12 | ||
| CN201110269433 | 2011-09-12 | ||
| CN2012102004046A CN102722253A (en) | 2011-09-12 | 2012-06-18 | Static man-machine interactive control method and application thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN102722253A (en) | 2012-10-10 |
Family
ID=46948048
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN2012102004046A Pending CN102722253A (en) | 2011-09-12 | 2012-06-18 | Static man-machine interactive control method and application thereof |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN102722253A (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103092349A (en) * | 2013-01-23 | 2013-05-08 | 宁凯 | Panoramic experience method based on Kinect somatosensory equipment |
| CN108776541A (en) * | 2014-04-11 | 2018-11-09 | 黄得锋 | A kind of control method of human-computer interaction |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5913727A (en) * | 1995-06-02 | 1999-06-22 | Ahdoot; Ned | Interactive movement and contact simulation game |
| CN1889016A (en) * | 2006-07-25 | 2007-01-03 | 周辰 | Eye-to-computer cursor automatic positioning controlling method and system |
| US7205979B2 (en) * | 1987-03-17 | 2007-04-17 | Sun Microsystems, Inc. | Computer data entry and manipulation apparatus and method |
| CN101890237A (en) * | 2010-07-16 | 2010-11-24 | 叶尔肯·拜山 | Game controller and control method thereof |
| CN102047201A (en) * | 2008-05-26 | 2011-05-04 | 微软国际控股私有有限公司 | Controlling virtual reality |
| CN102129292A (en) * | 2010-01-15 | 2011-07-20 | 微软公司 | Recognizing user intent in motion capture system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20121010 |