CN201815058U - Treadmill capable of identifying gestures - Google Patents

Treadmill capable of identifying gestures

Info

Publication number
CN201815058U
Authority
CN
China
Prior art keywords
treadmill
gesture
camera
control system
gestures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010202760637U
Other languages
Chinese (zh)
Inventor
朱晓锋
刘宗钦
王凤辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Haizhen Electronic Technology Co., Ltd.
Original Assignee
HUZHOU HAIZHEN ELECTRONIC SCIENTIFIC AND TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HUZHOU HAIZHEN ELECTRONIC SCIENTIFIC AND TECHNOLOGY Co Ltd filed Critical HUZHOU HAIZHEN ELECTRONIC SCIENTIFIC AND TECHNOLOGY Co Ltd
Priority to CN2010202760637U priority Critical patent/CN201815058U/en
Application granted granted Critical
Publication of CN201815058U publication Critical patent/CN201815058U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The utility model discloses a treadmill capable of identifying gestures, comprising a body, a motor installed on the body, a main control system, and a camera, wherein the camera is connected to the input end of the main control system, and the output end of the main control system is connected to the motor. Based on computer vision, the treadmill uses the camera as the tool for collecting the user's gestures: when the user makes a gesture facing the camera, the same control effect as pressing a key is achieved, so the treadmill can be controlled through a few simple gestures. This frees the user from the restriction of the key panel and makes operation convenient.

Description

Treadmill capable of identifying gestures
[Technical Field]
The utility model relates to the field of treadmills, and in particular to a treadmill equipped with a real-time static gesture recognition interaction technique based on computer vision, belonging to the disciplines of artificial intelligence and pattern recognition.
[Background Art]
As an indoor fitness apparatus, the treadmill remedies the shortage of outdoor exercise space and the incompleteness of sports facilities, and has been a boon to people's health. In modern society, a treadmill is an almost indispensable piece of equipment in every fitness center, and some higher-income people own treadmills of their own.
With the growing number of users, a treadmill must provide a good operating platform to satisfy the demands of most users. Treadmills currently on the market essentially all require the user to send commands to the treadmill through keys on a panel. Pressing these keys is easy while standing still, but doing so while running on the treadmill is laborious, especially when the running position is far from the key panel: to operate the treadmill, the runner has to move back and forth on the belt so that his or her fingers can reach the keys.
[Contents of the Utility Model]
The purpose of the utility model is to solve the above problems in the prior art by proposing a treadmill capable of identifying gestures, which frees the user from the constraint of the key panel: a few simple gestures suffice to control the treadmill.
To achieve the above purpose, the utility model proposes a treadmill capable of identifying gestures, comprising a body and a motor mounted on the body, and further comprising a main control system and a camera, wherein the camera is connected to the input end of the main control system, and the output end of the main control system is connected to the motor.
Preferably, the camera is a 1.3-megapixel camera, which gives good recognition results at a favorable cost-performance ratio.
Beneficial effects of the utility model: based on computer vision, the utility model uses a camera as the tool for collecting the user's gestures. The user only needs to make a gesture facing the camera to achieve the same control effect as pressing a key; a few simple gestures suffice to control the treadmill, freeing the user from the constraint of the key panel and making operation convenient.
The features and advantages of the utility model will be described in detail below through embodiments in conjunction with the accompanying drawings.
[Description of Drawings]
Fig. 1 is a structural schematic diagram of the treadmill capable of identifying gestures according to the utility model;
Fig. 2 is a module diagram of the main control system in the treadmill capable of identifying gestures according to the utility model.
[Specific Embodiments]
As shown in Figs. 1 and 2, the treadmill capable of identifying gestures comprises a body 2, a motor (not shown) mounted on the body 2, a main control system, and a camera 1; the camera 1 is connected to the input end of the main control system, and the output end of the main control system is connected to the motor. The camera 1 is a 1.3-megapixel camera. The main control system comprises, connected in sequence, a real-time video input module 31, a key-frame extraction module 32, a skin-color segmentation module 33, a gesture region extraction module 34, a gesture feature extraction module 35, and a gesture recognition module 36. The main control system uses an Intel Atom embedded industrial control board with a main frequency of 1.6 GHz, running the Linux operating system. The skin-color segmentation module 33 computes the statistical distribution of skin color in the HSV color space and then segments the skin-color regions of the image according to that distribution. The gesture feature extraction module 35 uses the normalized moment of inertia (NMI) of the image and the Hu invariant moments as the features describing a gesture; the feature vector they form serves as the classification basis for the classifier. The gesture recognition module 36 uses a support vector machine (SVM) as the classifier; after the classifier has been trained on a large number of samples, the training result is saved to a file in XML format, and at the next recognition the result only needs to be read back from that file.
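The six-module chain described above can be sketched as a simple processing loop. All function names below are my own illustrative placeholders, not taken from the patent:

```python
def select_key_frames(frames, step=2):
    """Key-frame extraction (module 32): here, simply keep every `step`-th frame."""
    return frames[::step]

def run_pipeline(frames, segment, extract_region, features, classify):
    """Drive a batch of frames through modules 32-36 and collect control commands."""
    commands = []
    for frame in select_key_frames(frames):
        mask = segment(frame)            # skin-color segmentation (module 33)
        region = extract_region(mask)    # gesture region extraction (module 34)
        vec = features(region)           # NMI + Hu-moment features (module 35)
        commands.append(classify(vec))   # SVM gesture recognition (module 36)
    return commands
```

Each stage is passed in as a callable, so the placeholder key-frame rule or classifier can be swapped out without touching the loop.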
The gesture preprocessing stage comprises key-frame extraction, image filtering, and skin-color segmentation. Image filtering uses a median filter. Skin-color segmentation derives the skin-color distribution from a histogram of human skin color in the HSV color space, so that the gesture region in the image can be extracted accurately, preparing for the subsequent gesture recognition step.
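A minimal sketch of the skin-color segmentation step, assuming fixed HSV thresholds. The thresholds here are illustrative stand-ins: the patent instead derives the skin distribution from an HSV histogram of human-skin samples.

```python
import numpy as np

def skin_mask(frame_hsv, h_max=25, s_min=40, v_min=60):
    """Boolean mask of skin-coloured pixels in an H x W x 3 HSV image.

    Assumes OpenCV-style ranges: H in 0..179, S and V in 0..255.
    The fixed thresholds are illustrative, not the patent's learned distribution.
    """
    h, s, v = frame_hsv[..., 0], frame_hsv[..., 1], frame_hsv[..., 2]
    return (h <= h_max) & (s >= s_min) & (v >= v_min)
```

In a full pipeline the frame would first be median-filtered and the resulting mask passed to the gesture-region extraction step.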
The gesture recognition process comprises two steps: gesture feature extraction and gesture recognition. Feature extraction means extracting from a gesture a set of data that fully characterizes it, called the feature vector of the gesture. The utility model uses the normalized moment of inertia (NMI) of the image and the Hu invariant moments, which are invariant under rotation and translation, to describe the gesture features.
In the above method, the NMI is the functional relation between the moment of inertia J(cx, cy) about the image centroid (cx, cy) and the image mass m, defined by formula (1):

$$\mathrm{NMI}=\frac{J(cx,cy)}{m}=\frac{\sum_{x=1}^{M}\sum_{y=1}^{N}\left((x-cx)^{2}+(y-cy)^{2}\right)f(x,y)}{\sum_{x=1}^{M}\sum_{y=1}^{N}f(x,y)} \qquad (1)$$
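Formula (1) is straightforward to compute with NumPy; the following is a minimal sketch (my own implementation, not the patent's code), treating the pixel values f(x, y) as the image "mass":

```python
import numpy as np

def nmi(image):
    """Normalized moment of inertia of an image f(x, y), per formula (1):
    NMI = J(cx, cy) / m, where J is the moment of inertia about the
    centroid (cx, cy) and m is the total pixel mass, sum of f(x, y)."""
    f = np.asarray(image, dtype=float)
    m = f.sum()
    ys, xs = np.indices(f.shape)        # row (y) and column (x) index grids
    cx = (xs * f).sum() / m             # centroid x-coordinate
    cy = (ys * f).sum() / m             # centroid y-coordinate
    j = (((xs - cx) ** 2 + (ys - cy) ** 2) * f).sum()
    return j / m
```

Because J is measured about the centroid and normalized by the mass, translating the gesture region leaves the value unchanged, which is the invariance the feature relies on.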
In the above method, gesture recognition means having the machine recognize the human gesture. It is divided into two steps: machine learning and classification. The utility model uses a support vector machine (SVM) as the classification tool. At the learning stage the classifier is trained on feature values extracted from samples, and the parameters obtained from training are saved. At recognition time, according to the parameters obtained during training, the gesture can be assigned accurately to one of the predetermined classes, thereby recognizing the gesture.
In the above method, machine learning means enabling a computer to understand human behavior. It is the core of artificial intelligence and the fundamental way of making computers intelligent. In the utility model, the task of machine learning is to make the computer distinguish different gestures as a person would and assign them to different classes.
In the above method, the SVM is a machine learning method developed from statistics, first proposed by Vapnik et al. Its core idea is as follows: in a two-class problem, it constructs an optimal hyperplane that maximizes the margin between the features of the two classes, thereby achieving classification; for problems that are not linearly separable in a low-dimensional space, a suitable mapping function transforms them into a higher-dimensional space, where the original problem becomes linearly separable.
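A toy demonstration of the two-step train-then-classify workflow, using scikit-learn's SVM as a stand-in for the patent's embedded implementation. The 2-D feature vectors and the gesture labels (0 = "faster", 1 = "slower") are purely illustrative:

```python
import numpy as np
from sklearn.svm import SVC

# Training samples: 2-D stand-ins for the NMI + Hu-moment feature vectors.
X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y_train = np.array([0, 0, 1, 1])

# The RBF kernel performs the mapping to a higher-dimensional space
# mentioned above, handling data that is not linearly separable.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)                              # learning phase
pred = clf.predict(np.array([[0.15, 0.15], [0.85, 0.85]]))  # recognition phase
```

Each new feature vector is mapped to one of the predetermined gesture classes, which the control system would then translate into a treadmill command.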
After key-frame extraction, skin-color segmentation, gesture region extraction, feature selection, and machine recognition are performed on the real-time video captured by the camera 1, the meaning represented by the user's gesture is obtained. The system interprets this meaning as a concrete control command and uses it to control the treadmill.
For each of the above gestures, 200 sample gestures are selected as the material for machine learning, and the learning result is saved to an XML file; at the next recognition, the learning result only needs to be read from this file for the gesture to be identified accurately. The benefit is that learning samples need not be selected and learned anew for every treadmill: the learning result file saved the first time can simply be copied to the other machines.
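The "train once, copy the result file" workflow can be mimicked as follows. The patent stores its SVM parameters in an XML file; this sketch uses Python's pickle purely as a stand-in serializer, and the function names are my own:

```python
import pickle

def save_model(model, path):
    """Persist a trained classifier; the saved file can be copied to other
    treadmills so they need no retraining (the patent uses an XML file)."""
    with open(path, "wb") as fh:
        pickle.dump(model, fh)

def load_model(path):
    """Reload a previously saved training result for recognition."""
    with open(path, "rb") as fh:
        return pickle.load(fh)
```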
The gesture recognition accuracy of the utility model is 97%, and the system can process 14 video frames per second, ensuring real-time recognition with high accuracy. While exercising on the treadmill, the user can control the speed, incline, and other settings simply through a few gestures, which is convenient and practical.
The above embodiments are an illustration of the utility model, not a limitation of it; any scheme obtained by simple transformation of the utility model falls within its scope of protection.

Claims (2)

1. A treadmill capable of identifying gestures, comprising a body and a motor mounted on the body, characterized by further comprising a main control system and a camera, wherein the camera is connected to the input end of the main control system, and the output end of the main control system is connected to the motor.
2. The treadmill capable of identifying gestures according to claim 1, characterized in that the camera is a 1.3-megapixel camera.
CN2010202760637U 2010-07-30 2010-07-30 Treadmill capable of identifying gestures Expired - Fee Related CN201815058U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010202760637U CN201815058U (en) 2010-07-30 2010-07-30 Treadmill capable of identifying gestures


Publications (1)

Publication Number Publication Date
CN201815058U true CN201815058U (en) 2011-05-04

Family

ID=43913031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010202760637U Expired - Fee Related CN201815058U (en) 2010-07-30 2010-07-30 Treadmill capable of identifying gestures

Country Status (1)

Country Link
CN (1) CN201815058U (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9011293B2 (en) 2011-01-26 2015-04-21 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
US9987520B2 (en) 2011-01-26 2018-06-05 Flow Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
US12157035B2 (en) 2011-01-26 2024-12-03 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
EP2753242A4 (en) * 2011-09-08 2015-01-14 Paofit Holdings Pte Ltd Sensor device and system for fitness equipment
WO2019019105A1 (en) * 2017-07-27 2019-01-31 深圳市屹石科技股份有限公司 Gesture-controlled smart treadmill

Similar Documents

Publication Publication Date Title
CN101912676A (en) Treadmill capable of recognizing gesture
CN104143079B (en) The method and system of face character identification
CN103237201B (en) A kind of case video analysis method based on socialization mark
Guo et al. Automatic image cropping for visual aesthetic enhancement using deep neural networks and cascaded regression
Angelova et al. Image segmentation for large-scale subcategory flower recognition
CN109829467A (en) Image labeling method, electronic device and non-transient computer-readable storage medium
CN111027378A (en) Pedestrian re-identification method, device, terminal and storage medium
CN101930549B (en) Static Human Detection Method Based on the Second Generation Curvelet Transform
CN108846359A (en) It is a kind of to divide the gesture identification method blended with machine learning algorithm and its application based on skin-coloured regions
CN106097381B (en) A Target Tracking Method Based on Manifold Discriminant Non-negative Matrix Factorization
CN106203237A (en) The recognition methods of container-trailer numbering and device
CN103985130B (en) A kind of saliency analysis method for complex texture image
CN102208020A (en) Human face recognition method based on optimal dimension scale cutting criterion
CN104111733B (en) A kind of gesture recognition system and method
CN105975934A (en) Dynamic gesture identification method and system for augmented reality auxiliary maintenance
CN102194108A (en) Smiley face expression recognition method based on clustering linear discriminant analysis of feature selection
CN106096612A (en) Trypetid image identification system and method
CN201815058U (en) Treadmill capable of identifying gestures
CN109635811A (en) Image Analysis Methods of Space Plants
CN103020614A (en) Human movement identification method based on spatio-temporal interest point detection
Wang et al. Pig face recognition model based on a cascaded network
CN107533547A (en) Product index editing method and its system
CN104679967A (en) Method for judging reliability of psychological test
CN103077383B (en) Based on the human motion identification method of the Divisional of spatio-temporal gradient feature
CN108108648A (en) A kind of new gesture recognition system device and method

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C56 Change in the name or address of the patentee

Owner name: ZHEJIANG HIZHEN ELECTRONICS TECHNOLOGY CO., LTD.

Free format text: FORMER NAME: HUZHOU HAIZHEN ELECTRONIC TECHNOLOGY CO., LTD.

CP03 Change of name, title or address

Address after: 313100 Industrial Park, eight Li Town, Zhejiang, Huzhou

Patentee after: Zhejiang Haizhen Electronic Technology Co., Ltd.

Address before: 313000 Huzhou Wuxing Wuxing Industrial Park Huzhou Zhejiang vibration Electronic Technology Co., Ltd.

Patentee before: Huzhou Haizhen Electronic Scientific And Technology Co., Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110504

Termination date: 20140730

EXPY Termination of patent right or utility model