Robotic wheelchair controlled through a vision-based interface

Perez et al., 2012

Document ID
14490645123258481191
Author
Perez E
Soria C
Nasisi O
Bastos T
Mut V
Publication year
2012
Publication venue
Robotica

Snippet

In this work, a vision-based control interface for commanding a robotic wheelchair is presented. The interface estimates the orientation angles of the user's head and translates these parameters into maneuver commands for different devices. The performance of the …
Continue reading at ri.conicet.gov.ar (PDF)
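
The snippet describes an interface that estimates the user's head orientation angles and turns them into maneuver commands. The following Python sketch shows one plausible form of such a mapping, where head pitch drives forward speed and head yaw drives turning; the angle conventions, dead-zone width, gains, and limits are assumptions for illustration, not values taken from the paper.

# Illustrative sketch (not the authors' implementation): mapping estimated head
# orientation angles to wheelchair velocity commands. Angle conventions, dead-zone
# width, gains, and limits below are assumptions chosen for the example.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # m/s, forward/backward
    angular: float  # rad/s, left/right turn

def head_pose_to_command(pitch_deg: float, yaw_deg: float,
                         dead_zone_deg: float = 5.0,
                         linear_gain: float = 0.02,
                         angular_gain: float = 0.03,
                         max_linear: float = 0.5,
                         max_angular: float = 0.8) -> VelocityCommand:
    """Convert head pitch/yaw (degrees) into a bounded velocity command.

    Angles inside the dead zone produce no motion, so small involuntary
    head movements do not drive the chair.
    """
    def apply(angle: float, gain: float, limit: float) -> float:
        if abs(angle) < dead_zone_deg:
            return 0.0
        value = gain * (abs(angle) - dead_zone_deg)
        return min(value, limit) * (1.0 if angle > 0 else -1.0)

    return VelocityCommand(
        linear=apply(pitch_deg, linear_gain, max_linear),
        angular=apply(yaw_deg, angular_gain, max_angular),
    )

# Example: head tilted 15 degrees forward and turned 10 degrees to the left.
print(head_pose_to_command(pitch_deg=15.0, yaw_deg=-10.0))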

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00268 Feature extraction; Face representation
    • G06K9/00281 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00362 Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand

Similar Documents

Publication / Title
Bastos-Filho et al. Towards a new modality-independent interface for a robotic wheelchair
Schröer et al. An autonomous robotic assistant for drinking
Kuno et al. Look where you're going [robotic wheelchair]
Dometios et al. Vision-based online adaptation of motion primitives to dynamic surfaces: application to an interactive robotic wiping task
Bergasa et al. Commands generation by face movements applied to the guidance of a wheelchair for handicapped people
Kyrarini et al. Robot learning of assistive manipulation tasks by demonstration via head gesture-based interface
Petit et al. An integrated framework for humanoid embodiment with a BCI
Maheswari et al. Voice Control and Eyeball Movement Operated Wheelchair
Nakanishi et al. Robotic wheelchair based on observations of both user and environment
Perez et al. Robotic wheelchair controlled through a vision-based interface
CN111652155A (en) A method and system for recognizing human motion intention
Masud et al. Smart wheelchair controlled through a vision-based autonomous system
Ramanathan et al. Visual Environment perception for obstacle detection and crossing of lower-limb exoskeletons
Maciel et al. Shared control methodology based on head positioning and vector fields for people with quadriplegia
Wu et al. The visual footsteps planning system for exoskeleton robots under complex terrain
Wang et al. What you see is what you grasp: User-friendly grasping guided by near-eye-tracking
Erdoğan et al. Intention recognition using leap motion controller and Artificial Neural Networks
Halawani et al. Active vision for controlling an electric wheelchair
Perez et al. Robust human machine interface based on head movements applied to Assistive robotics
Moon et al. Safe and reliable intelligent wheelchair robot with human robot interaction
Bergasa et al. Guidance of a wheelchair for handicapped people by face tracking
Desai et al. Controlling a wheelchair by gesture movements and wearable technology
Yang et al. Head-free, human gaze-driven assistive robotic system for reaching and grasping
Kim et al. A human-robot interface using eye-gaze tracking system for people with motor disabilities
Hassani et al. Implementation of wheelchair controller using mouth and tongue gesture