CN111144341A - Body-building action error correction method and system based on mobile platform


Info

Publication number
CN111144341A
Authority
CN
China
Prior art keywords
user
building
fitness
action
image
Prior art date
Legal status
Pending
Application number
CN201911395296.0A
Other languages
Chinese (zh)
Inventor
黄昌正
周言明
陈曦
Current Assignee
Huaibei Huanjing Intelligent Technology Co., Ltd.
Original Assignee
Huaibei Huanjing Intelligent Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huaibei Huanjing Intelligent Technology Co., Ltd.
Priority to CN201911395296.0A
Publication of CN111144341A

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/752 Contour matching
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/172 Classification, e.g. identification
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647 Visualisation of executed movements
    • A63B2071/065 Visualisation of specific exercise parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Social Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Psychiatry (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of computer vision and discloses a body-building action error correction method and system based on a mobile platform. The method comprises the following steps: shooting a fitness image of the user while moving; identifying the user's fitness action in the fitness image; detecting whether the fitness action matches the correct fitness action; and if not, outputting the correct fitness action to the user. The mobile platform comprises a mobile base, a pan-tilt head, a camera, a telescopic rod and a display screen. Because the platform shoots while moving, the user's fitness actions can be captured without blind spots and without deploying a large number of devices, so that the actions can be recognized and corrected; the implementation cost is low, and the scheme is easy to popularize and apply.

Description

Body-building action error correction method and system based on mobile platform
Technical Field
The invention relates to the technical field of computer vision, in particular to a body-building action error correction method and system based on a mobile platform.
Background
The recently released HUAWEI HiAI 3.0 AI computing platform provides open distributed computer vision and speech recognition capabilities, supports multiple terminals sharing AI computing power, and can acquire and process vision/speech data from a plurality of terminal devices. When this technology is applied to the field of sports and fitness, the cameras of different terminal devices in a user's home can be used for shooting, distributed computer vision can identify 3D limb nodes, the user's fitness actions can be determined and analyzed, and correct actions can be output for error correction through terminal devices such as screens.
Although such a fitness action error correction scheme is technically feasible, its implementation cost is too high: multiple cameras and display devices are required, their placement is constrained, and when the number of devices is insufficient, the user's fitness actions cannot be captured without blind spots.
Disclosure of Invention
The embodiment of the invention discloses a body-building action error correction method and system based on a mobile platform.
The embodiment of the invention discloses a body-building action error correction method based on a mobile platform in a first aspect, which comprises the following steps:
shooting a body-building image of the user while moving;
identifying the body-building action of the user in the body-building image;
detecting whether the fitness action matches a correct fitness action;
and if not, outputting the correct body-building action to the user.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before capturing the exercise image of the user, the method further includes:
capturing a positioning image including the user;
analyzing the positioning image to obtain the spatial scene of the user and the fitness items carried out by the user;
and determining a shooting path of the mobile shooting according to the space scene and the fitness items.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, the identifying the exercise motion of the user in the exercise image includes:
identifying a limb node of the user in each frame of the fitness image, the limb node being used to indicate spatial positions of the head, neck, shoulders, elbows, hands, hips, knees, and feet of the user;
processing the limb nodes by adopting a limb construction model to obtain the limb posture and the limb coordinates of the user in each frame of the fitness image;
and integrating the body posture and the body coordinates of the user in the body-building images of a plurality of consecutive frames to construct and obtain the body-building action of the user.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, if it is detected that the exercise motion does not match the correct exercise motion, outputting the correct exercise motion to the user includes:
identifying a face image of the user in the fitness image, and determining the face orientation of the user;
determining a display position matched with the face orientation in the shooting path;
controlling the mobile platform to move to the display position;
outputting the correct workout to the user.
The second aspect of the embodiment of the invention discloses a body-building action error correction system based on a mobile platform, which comprises:
the shooting unit is used for movably shooting the body-building image of the user;
the action identification unit is used for identifying the body building action of the user in the body building image;
a motion detection unit for detecting whether the body-building motion is matched with a correct body-building motion;
and the output unit is used for outputting the correct body building action to the user when the action detection unit detects that the body building action is not matched with the correct body building action.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the system further includes:
the shooting unit is further used for shooting a positioning image comprising the user;
the image analysis unit is used for analyzing the positioning image to acquire the space scene where the user is located and the fitness items performed by the user;
and the path planning unit is used for determining a shooting path of the mobile shooting according to the space scene and the fitness project.
As an optional implementation manner, in a second aspect of the embodiment of the present invention, the motion recognition unit includes:
a node identification subunit, configured to identify a limb node of the user in each frame of the fitness image, where the limb node is used to indicate spatial positions of the head, neck, shoulder, elbow, hand, hip, knee, and foot of the user;
the model processing subunit is used for processing the limb nodes by adopting a limb construction model to obtain the limb posture and the limb coordinates of the user in each frame of the fitness image;
and the action construction subunit is used for integrating the body posture and the body coordinates of the user in the body building images of a plurality of consecutive frames to construct and obtain the body building action of the user.
As an alternative implementation, in a second aspect of the embodiments of the present invention, the output unit includes:
the face identification subunit is used for identifying the face image of the user in the fitness image and determining the face orientation of the user;
a position determining subunit configured to determine, in the shooting path, a display position that matches the face orientation;
the mobile control subunit is used for controlling the mobile platform to move to the display position;
and the output subunit is used for outputting the correct body-building action to the user.
The third aspect of the embodiment of the invention discloses a body-building action error correction system based on a mobile platform, which comprises:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute part of the steps of the body-building action error correction method based on the mobile platform disclosed by the first aspect of the embodiment of the invention.
A fourth aspect of the embodiments of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program enables a computer to execute all or part of the steps of the method for correcting the fitness action based on the mobile platform disclosed in the first aspect of the embodiments of the present invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, the body-building image of the user is shot in a moving way; identifying the current body-building action of the user in the body-building image; detecting whether the fitness action matches a correct fitness action; and if not, outputting correct body building actions to the user. Therefore, a large number of shooting devices are not required to be arranged, the body-building images with multiple visual angles of the user can be completely shot and the body-building actions can be corrected by adopting a mobile shooting mode, the implementation cost is reduced, and the popularization and the application are convenient.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is an external view of a mobile platform according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for correcting exercise motions based on a mobile platform according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a mobile platform-based exercise performance correction system according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of another fitness motion correction system based on a mobile platform according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", "third" and "fourth" etc. in the description and claims of the present invention are used for distinguishing different objects, and are not used for describing a specific order. The terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solutions of the embodiments of the present invention will be further described with reference to the following embodiments and the accompanying drawings.
In order to better understand the fitness action error correction method based on a mobile platform disclosed in the embodiment of the present invention, the mobile platform disclosed in the embodiment of the present invention is described first. Referring to fig. 1, a mobile platform according to an embodiment of the present invention may include: a mobile base 10, a pan-tilt head 11, a camera 12, a telescopic rod 13, a display screen 14 and a processor 15. The mobile base 10 is fitted with a plurality of Mecanum wheels; the pan-tilt head 11 and the telescopic rod 13 are mounted on the mobile base 10; the camera 12 is mounted on top of the pan-tilt head 11; and the display screen 14 is mounted on top of the telescopic rod 13.
With the above structure, the mobile platform disclosed in the embodiment of the present invention works as follows: through the plurality of Mecanum wheels, the mobile base 10 can smoothly advance, retreat or turn inside a house and in other environments with various obstacles; the camera 12 mounted on the pan-tilt head 11 can capture images from different viewing angles through the movement of the mobile base 10 and the pitching/rotation of the pan-tilt head 11; in addition, through the movement of the mobile base 10 and the extension/rotation of the telescopic rod 13, the display screen 14 mounted at the top end of the telescopic rod 13 can face different areas for display.
In the mobile platform disclosed by the embodiment of the invention, the camera 12 on the pan-tilt head 11 can shoot multi-view fitness images by moving in cooperation with the pitching/rotation of the pan-tilt head 11; by recognizing these fitness images, the correct fitness action can be output to the user through the display screen 14 mounted at the top end of the telescopic rod 13, thereby helping the user correct the fitness action. The functions of image recognition, fitness action error correction, pan-tilt pitching/rotation and platform movement are realized by the processor 15, which performs recognition on the captured positioning and fitness images and sends control instructions to the mobile base 10, the pan-tilt head 11, the camera 12, the telescopic rod 13 and the display screen 14. Because a large amount of camera equipment does not need to be deployed, the overall implementation cost is reduced, which facilitates popularization and application.
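The disclosure does not define a software interface for these components. Purely as an illustration of how the processor 15 might coordinate them, the minimal Python sketch below uses hypothetical class and method names (MobilePlatformController, move_base_to, set_gimbal, and so on) that are not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class GimbalCommand:
    pitch_deg: float   # positive values tilt the camera downward
    yaw_deg: float     # rotation of the pan-tilt head around its vertical axis

class MobilePlatformController:
    """Hypothetical facade for the mobile base, pan-tilt head, telescopic rod
    and display screen that the on-board processor coordinates."""

    def move_base_to(self, x: float, y: float) -> None:
        # In a real system this would drive the Mecanum-wheel base.
        print(f"[base] moving to ({x:.2f}, {y:.2f}) m")

    def set_gimbal(self, cmd: GimbalCommand) -> None:
        print(f"[gimbal] pitch={cmd.pitch_deg} deg, yaw={cmd.yaw_deg} deg")

    def set_screen_height(self, height_m: float) -> None:
        print(f"[telescopic rod] screen raised to {height_m:.2f} m")

    def show_on_screen(self, message: str) -> None:
        print(f"[display] {message}")

# Example sequence: drive to a waypoint, aim the camera, show a correction hint.
ctrl = MobilePlatformController()
ctrl.move_base_to(1.5, 0.8)
ctrl.set_gimbal(GimbalCommand(pitch_deg=20.0, yaw_deg=90.0))
ctrl.set_screen_height(1.2)
ctrl.show_on_screen("Keep your back straight during the push-up")
```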
Example one
Referring to fig. 2, as shown in fig. 2, a method for correcting exercise motions based on a mobile platform according to an embodiment of the present invention may include the following steps.
201. Shooting a fitness image of the user while moving.
In the embodiment of the invention, the fitness image of the user is captured by moving around the user, so that recognition and detection can be carried out in the subsequent steps.
In the embodiment of the invention, before moving around the user and shooting, the position of the user and the current scene need to be identified.
As an optional implementation, before the fitness image of the user is taken, a positioning image including the user is taken; the positioning image is analyzed to obtain the spatial scene where the user is located and the fitness item being performed by the user; and a shooting path for the mobile shooting is determined according to the spatial scene and the fitness item. Specifically, the camera first shoots a panoramic image of the current scene as the positioning image. Based on an efficient preprocessing algorithm such as contour recognition, the user's body contour in the positioning image and the object contours in the surrounding spatial scene are quickly identified. A fitness item database, which stores fitness items and a plurality of standard fitness actions corresponding to each item, is then searched with the user's body contour: the fitness item whose standard fitness action is most similar to the body contour in the positioning image is selected as the fitness item the user is currently performing, and a shooting path that can capture the user's fitness actions completely is determined according to that item. In this way, the spatial scene where the user is located and the fitness item being performed can be conveniently obtained by quickly recognizing contours in the positioning image, and the shooting path can be determined.
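The patent does not name a specific contour-matching algorithm or database layout. The sketch below is a minimal illustration assuming OpenCV (version 4, opencv-python installed), Hu-moment shape matching, and an in-memory dictionary of standard-action silhouette masks; all of these are assumptions, not part of the disclosure.

```python
import cv2
import numpy as np

def largest_contour(mask: np.ndarray):
    """Return the largest external contour of a binary silhouette mask."""
    # OpenCV 4 return convention: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def identify_fitness_item(user_mask: np.ndarray, item_database: dict):
    """Pick the fitness item whose standard-action silhouette is most similar
    to the user's body contour in the positioning image.

    item_database maps an item name (e.g. "push-up") to a list of binary masks
    of its standard actions; this layout is an assumption for illustration.
    """
    user_contour = largest_contour(user_mask)
    if user_contour is None:
        return None, float("inf")

    best_item, best_score = None, float("inf")
    for item_name, standard_masks in item_database.items():
        for std_mask in standard_masks:
            std_contour = largest_contour(std_mask)
            if std_contour is None:
                continue
            # cv2.matchShapes compares Hu moments; 0 means identical shapes
            score = cv2.matchShapes(user_contour, std_contour,
                                    cv2.CONTOURS_MATCH_I1, 0.0)
            if score < best_score:
                best_item, best_score = item_name, score
    return best_item, best_score
```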
As another optional implementation, the shooting path that completely captures the user's fitness actions is determined according to the fitness item; specifically, the shooting path is determined according to the limb motion amplitude and the limb posture of each fitness item. For example, when the fitness item is the push-up, the distance between the user's hands and the straightness of the body need to be detected; in this case the pan-tilt head needs to tilt the camera downward to capture the user's hands and the user's side, and the shooting path cyclically moves back and forth between a position in front of the top of the user's head and a position at the side of the user's body. In this way, by reasonably planning the shooting path, the user's fitness actions can be captured completely with minimal power consumption.
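As an illustration of such item-specific path planning, the following Python sketch encodes the push-up example above as a small waypoint table; the table layout, the waypoint coordinates and the gimbal pitch values are illustrative assumptions rather than values given in the disclosure.

```python
# Waypoints are (x, y) offsets from the user in metres plus a gimbal pitch;
# the concrete numbers and the table itself are illustrative assumptions.
SHOOTING_PATHS = {
    "push-up": {
        "gimbal_pitch_deg": 30.0,   # tilt the camera downward toward hands/torso
        "waypoints": [(0.0, 1.2),   # in front of the top of the user's head
                      (1.2, 0.0)],  # at the side of the user's body
    },
    "squat": {
        "gimbal_pitch_deg": 0.0,
        "waypoints": [(1.5, 0.0), (0.0, 1.5)],
    },
}

def plan_shooting_path(fitness_item: str, cycles: int = 3):
    """Return a gimbal pitch and a back-and-forth waypoint list for the item."""
    spec = SHOOTING_PATHS[fitness_item]
    wpts = spec["waypoints"]
    path = []
    for i in range(cycles):
        # Alternate direction so the platform moves back and forth cyclically.
        path.extend(wpts if i % 2 == 0 else list(reversed(wpts)))
    return spec["gimbal_pitch_deg"], path

pitch, path = plan_shooting_path("push-up")
print(pitch, path)
```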
202. Identifying the user's fitness action in the fitness image.
In embodiments of the present invention, the exercise activity is identified in real time and detected in step 203.
As an optional implementation, the limb nodes of the user are identified in each frame of the fitness image, the limb nodes indicating the spatial positions of the user's head, neck, shoulders, elbows, hands, hips, knees and feet; the limb nodes are processed with a limb construction model to obtain the user's limb posture and limb coordinates in each frame; and the limb postures and limb coordinates of the user across a plurality of consecutive frames are integrated to construct the user's fitness action. Specifically, the limb nodes and their spatial positions in each frame are determined based on human body proportions; the limb construction model then connects the limb nodes to build the user's limb posture and the relative coordinates between limbs; and by continuously integrating the limb postures and limb coordinates across multiple frames shot from multiple viewing angles, the fitness action the user is currently performing is obtained. Because the fitness images have no shooting blind spots, the recognized fitness action is accurate.
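The limb construction model itself is not detailed in the disclosure. As a simplified stand-in, the sketch below reduces each frame's limb nodes to a few joint angles and stacks consecutive frames into an action sequence; the keypoint names and the angle-based representation are assumptions.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]   # 3D spatial position of one limb node

JOINT_TRIPLES = {
    # Angle measured at the middle node, e.g. the elbow angle spanned by
    # the shoulder-elbow and hand-elbow segments.
    "left_elbow":  ("left_shoulder", "left_elbow", "left_hand"),
    "right_elbow": ("right_shoulder", "right_elbow", "right_hand"),
    "left_knee":   ("left_hip", "left_knee", "left_foot"),
    "right_knee":  ("right_hip", "right_knee", "right_foot"),
}

def joint_angle(a: Point, b: Point, c: Point) -> float:
    """Angle (degrees) at node b formed by the segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    if n1 == 0 or n2 == 0:
        return 0.0
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

def frame_pose(limb_nodes: Dict[str, Point]) -> Dict[str, float]:
    """Reduce one frame's limb nodes to a small set of joint angles."""
    return {name: joint_angle(limb_nodes[a], limb_nodes[b], limb_nodes[c])
            for name, (a, b, c) in JOINT_TRIPLES.items()
            if a in limb_nodes and b in limb_nodes and c in limb_nodes}

def build_action(frames: List[Dict[str, Point]]) -> List[Dict[str, float]]:
    """Integrate consecutive frames into one fitness-action sequence."""
    return [frame_pose(nodes) for nodes in frames]
```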
203. It is detected whether the fitness action matches the correct fitness action.
In the embodiment of the invention, the fitness action identified in step 202 is verified for its matching degree against the plurality of standard fitness actions in the fitness item database. If the matching degree is lower than a preset matching threshold, it is determined that the fitness action made by the user is not standard, and the correct fitness action needs to be output to the user for error correction.
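One plausible way to compute such a matching degree, assuming the joint-angle representation sketched under step 202, is shown below; the linear angle-difference scoring and the 0.85 threshold are illustrative assumptions, not values from the disclosure.

```python
def matching_degree(action, standard_action) -> float:
    """Mean similarity (0..1) between two joint-angle sequences.

    Both arguments are lists of {joint_name: angle_deg} dicts, as produced by
    build_action() above; the linear 0-180 degree scaling is an assumption.
    """
    total, count = 0.0, 0
    for pose, std_pose in zip(action, standard_action):
        for joint, std_angle in std_pose.items():
            diff = abs(pose.get(joint, 0.0) - std_angle)
            total += max(0.0, 1.0 - diff / 180.0)
            count += 1
    return total / count if count else 0.0

MATCH_THRESHOLD = 0.85  # preset matching threshold; the value is illustrative

def needs_correction(action, standard_actions) -> bool:
    """True if even the best match against the standard actions falls below
    the preset threshold (standard_actions must be non-empty)."""
    best = max(matching_degree(action, std) for std in standard_actions)
    return best < MATCH_THRESHOLD
```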
204. Outputting the correct fitness action to the user.
In the embodiment of the present invention, when it is determined in step 203 that the exercise performed by the user is not standard, the correct exercise is output to the user.
As an optional implementation, if it is detected that the fitness action does not match the correct fitness action, outputting the correct fitness action to the user comprises: identifying the user's face image in the fitness image and determining the user's face orientation; determining, on the shooting path, a display position matching the face orientation; controlling the mobile platform to move to the display position; and outputting the correct fitness action to the user. Specifically, when it is determined that the fitness action made by the user is not standard, the user's face image and face orientation are identified in the fitness image, and the area of the shooting path toward which the user's face is oriented is set as the display position, so that an object at the display position can be observed directly by the user. The mobile platform is then controlled to move to the display position, the telescopic rod adjusts the orientation and height of the display screen according to the user's face orientation and face height, and the correct fitness action is output. In this way, the user can learn in real time during the workout that the action being performed is not standard, and can learn the correct fitness action, without having to adjust posture to look at the screen.
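A purely geometric way to pick a display position that matches the face orientation is sketched below; the criterion (choose the shooting-path waypoint closest to the user's line of sight) and the 2D floor-plane representation are assumptions for illustration only.

```python
import math

def choose_display_position(user_xy, face_dir_xy, shooting_path):
    """Pick the waypoint on the shooting path that lies closest to the user's
    line of sight, i.e. the point the user can see without turning.

    user_xy: the user's (x, y) position; face_dir_xy: a vector giving the face
    orientation in the floor plane; shooting_path: list of (x, y) waypoints.
    """
    def angle_to(waypoint):
        dx, dy = waypoint[0] - user_xy[0], waypoint[1] - user_xy[1]
        to_wp = math.atan2(dy, dx)
        facing = math.atan2(face_dir_xy[1], face_dir_xy[0])
        # Smallest absolute angle between the gaze direction and the waypoint.
        return abs(math.atan2(math.sin(to_wp - facing), math.cos(to_wp - facing)))

    return min(shooting_path, key=angle_to)

# Example: the user faces roughly +x, so the waypoint in front is selected.
print(choose_display_position((0.0, 0.0), (1.0, 0.1), [(1.2, 0.0), (0.0, 1.2)]))
```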
It can be seen that, by implementing the fitness action error correction method based on a mobile platform described in fig. 2, the camera on the pan-tilt head can shoot multi-view fitness images by moving in cooperation with the pitching/rotation of the pan-tilt head, and the correct fitness action is output to the user through the display screen mounted at the top end of the telescopic rod so as to correct the user's fitness actions. Because a large amount of camera equipment does not need to be deployed, the overall implementation cost is reduced, which facilitates popularization and application.
Example two
Referring to fig. 3, fig. 3 is a schematic structural diagram of a body-building action error correction system based on a mobile platform according to an embodiment of the present invention. The system may include:
the shooting unit 301 is used for movably shooting a body-building image of a user;
a motion recognition unit 302, configured to recognize a fitness motion of the user in the fitness image;
a motion detection unit 303 for detecting whether the fitness motion matches a correct fitness motion;
an output unit 304, configured to output a correct exercise motion to the user when the motion detection unit detects that the exercise motion does not match the correct exercise motion;
in addition, the shooting unit 301 is further configured to shoot a positioning image including the user before shooting the fitness image of the user;
the image analysis unit 305 is configured to analyze the positioning image to obtain a spatial scene where the user is located and a fitness project performed by the user;
the path planning unit 306 is used for determining a shooting path of the mobile shooting according to the space scene and the fitness project;
wherein the action recognition unit 302 includes:
a node identification subunit 3021, configured to identify a limb node of the user in each frame of the fitness image, where the limb node is used to indicate spatial positions of the head, neck, shoulder, elbow, hand, hip, knee, and foot of the user;
the model processing subunit 3022 is configured to process the limb node by using a limb construction model, so as to obtain a limb posture and a limb coordinate of the user in each frame of the fitness image;
the motion construction subunit 3023 is configured to synthesize the body postures and the body coordinates of the user in the consecutive fitness images to construct a fitness motion of the user;
and, the output unit 304 includes:
a face recognition subunit 3041, configured to recognize the face image of the user in the fitness image, and determine a face orientation of the user;
a position determination subunit 3042 for determining a display position matching the face orientation in the shooting path;
a movement control subunit 3043, for controlling the mobile platform to move to the display position;
an output subunit 3044, for outputting the correct fitness action to the user.
As an optional implementation, before the fitness image of the user is taken, the shooting unit 301 shoots a positioning image including the user; the image analysis unit 305 analyzes the positioning image to obtain the spatial scene where the user is located and the fitness item being performed by the user; and the path planning unit 306 determines a shooting path for the mobile shooting according to the spatial scene and the fitness item. Specifically, the camera first shoots a panoramic image of the current scene as the positioning image. Based on an efficient preprocessing algorithm such as contour recognition, the image analysis unit 305 quickly identifies the user's body contour in the positioning image and the object contours in the surrounding spatial scene, and searches a fitness item database with the user's body contour. The database stores fitness items and a plurality of standard fitness actions corresponding to each item, so the fitness item whose standard fitness action is most similar to the body contour in the positioning image is selected as the fitness item the user is currently performing, and a shooting path that can capture the user's fitness actions completely is determined according to that item. In this way, the spatial scene where the user is located and the fitness item being performed can be conveniently obtained by quickly recognizing contours in the positioning image, and the shooting path can be determined.
As another optional implementation, the path planning unit 306 determines the shooting path that completely captures the user's fitness actions according to the fitness item; specifically, the shooting path is determined according to the limb motion amplitude and the limb posture of each fitness item. For example, when the fitness item is the push-up, the distance between the user's hands and the straightness of the body need to be detected; in this case the pan-tilt head needs to tilt the camera downward to capture the user's hands and the user's side, and the shooting path cyclically moves back and forth between a position in front of the top of the user's head and a position at the side of the user's body. In this way, by reasonably planning the shooting path, the user's fitness actions can be captured completely with minimal power consumption.
As an optional implementation, the node identification subunit 3021 identifies the user's limb nodes in each frame of the fitness image, the limb nodes indicating the spatial positions of the user's head, neck, shoulders, elbows, hands, hips, knees and feet; the model processing subunit 3022 processes the limb nodes with a limb construction model to obtain the user's limb posture and limb coordinates in each frame; and the motion construction subunit 3023 integrates the limb postures and limb coordinates of the user across a plurality of consecutive frames to construct the user's fitness action. Specifically, the limb nodes and their spatial positions in each frame are determined based on human body proportions; the limb construction model then connects the limb nodes to build the user's limb posture and the relative coordinates between limbs; and by continuously integrating the limb postures and limb coordinates across multiple frames shot from multiple viewing angles, the fitness action the user is currently performing is obtained. Because the fitness images have no shooting blind spots, the recognized fitness action is accurate.
As an optional implementation, if the motion detection unit 303 detects that the fitness action does not match the correct fitness action, the output unit 304 outputs the correct fitness action to the user: the face recognition subunit 3041 identifies the user's face image in the fitness image and determines the user's face orientation; the position determination subunit 3042 determines, on the shooting path, a display position matching the face orientation; the movement control subunit 3043 controls the mobile platform to move to the display position; and the output subunit 3044 outputs the correct fitness action to the user. Specifically, when it is determined that the fitness action made by the user is not standard, the face recognition subunit 3041 identifies the user's face image and face orientation in the fitness image, and the position determination subunit 3042 sets the area of the shooting path toward which the user's face is oriented as the display position, so that an object at the display position can be observed directly by the user. The movement control subunit 3043 then controls the mobile platform to move to the display position and controls the telescopic rod to adjust the orientation and height of the display screen according to the user's face orientation and face height, and the output subunit 3044 outputs the correct fitness action. In this way, the user can learn in real time during the workout that the action being performed is not standard, and can learn the correct fitness action, without having to adjust posture to look at the screen.
It can be seen that, by implementing the exercise movement error correction system based on the mobile platform described in fig. 3, the camera 12 on the pan/tilt head 11 can shoot multi-view exercise images by moving and cooperating with the pan/tilt head 11 to perform pitching/rotating, and correct exercise movements can be output to the user through the display screen 14 arranged at the top end of the telescopic rod 13, so as to correct the exercise movements of the user.
Example four
Referring to fig. 4, fig. 4 is a schematic structural diagram of another fitness exercise error correction system based on a mobile platform according to an embodiment of the present invention. As shown in FIG. 4, the mobile platform based fitness activity correction system may include:
a memory 401 storing executable program code;
a processor 402 coupled with the memory 401;
wherein processor 402 invokes executable program code stored in memory 401 to perform some of the steps of a mobile platform based exercise motion correction method shown in figure 1.
The embodiment of the invention discloses a computer readable storage medium, which stores a computer program, wherein the computer program enables a computer to execute all or part of the steps of the fitness action error correction method based on a mobile platform, which is shown in fig. 1.
It will be understood by those skilled in the art that all or part of the steps in the methods of the embodiments described above may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-Time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, magnetic disk storage, magnetic tape storage, or any other medium that can be used to carry or store data and that can be read by a computer.
The method and system for correcting fitness actions based on a mobile platform disclosed by the embodiments of the invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A body-building action error correction method based on a mobile platform is characterized by comprising the following steps:
shooting a body-building image of the user while moving;
identifying the body-building action of the user in the body-building image;
detecting whether the fitness action matches a correct fitness action;
and if not, outputting the correct body-building action to the user.
2. The method of claim 1, wherein prior to said taking the user's fitness image, the method further comprises:
capturing a positioning image including the user;
analyzing the positioning image to obtain the spatial scene of the user and the fitness items carried out by the user;
and determining a shooting path of the mobile shooting according to the space scene and the fitness items.
3. The method of claim 2, wherein the identifying the user's workout activities in the workout image comprises:
identifying a limb node of the user in each frame of the fitness image, the limb node being used to indicate spatial positions of the head, neck, shoulders, elbows, hands, hips, knees, and feet of the user;
processing the limb nodes by adopting a limb construction model to obtain the limb posture and the limb coordinates of the user in each frame of the fitness image;
and integrating the body posture and the body coordinates of the user in the body-building images of a plurality of consecutive frames to construct and obtain the body-building action of the user.
4. The method of claim 2, wherein outputting the correct workout action to the user if it is detected that the workout action does not match the correct workout action comprises:
identifying a face image of the user in the fitness image, and determining the face orientation of the user;
determining a display position matched with the face orientation in the shooting path;
controlling the mobile platform to move to the display position;
outputting the correct workout to the user.
5. A mobile platform-based fitness activity correction system, comprising:
the shooting unit is used for movably shooting the body-building image of the user;
the action identification unit is used for identifying the body building action of the user in the body building image;
a motion detection unit for detecting whether the body-building motion is matched with a correct body-building motion;
and the output unit is used for outputting the correct body building action to the user when the action detection unit detects that the body building action is not matched with the correct body building action.
6. The system of claim 5, further comprising:
the shooting unit is further used for shooting a positioning image comprising the user before shooting the body-building image of the user;
the image analysis unit is used for analyzing the positioning image to acquire the space scene where the user is located and the fitness items performed by the user;
and the path planning unit is used for determining a shooting path of the mobile shooting according to the space scene and the fitness project.
7. The system of claim 6, wherein the action recognition unit comprises:
a node identification subunit, configured to identify a limb node of the user in each frame of the fitness image, where the limb node is used to indicate spatial positions of the head, neck, shoulder, elbow, hand, hip, knee, and foot of the user;
the model processing subunit is used for processing the limb nodes by adopting a limb construction model to obtain the limb posture and the limb coordinates of the user in each frame of the fitness image;
and the action construction subunit is used for integrating the body posture and the body coordinates of the user in the body building images of a plurality of consecutive frames to construct and obtain the body building action of the user.
8. The system of claim 6, wherein the output unit comprises:
the face identification subunit is used for identifying the face image of the user in the fitness image and determining the face orientation of the user;
a position determining subunit configured to determine, in the shooting path, a display position that matches the face orientation;
the mobile control subunit is used for controlling the mobile platform to move to the display position;
and the output subunit is used for outputting the correct body-building action to the user.
9. A mobile platform, comprising a mobile base, a pan-tilt head, a camera, a telescopic rod and a display screen, wherein the mobile platform executes the body-building action error correction method based on a mobile platform according to any one of claims 1 to 4.
10. A mobile platform, comprising a mobile base, a pan-tilt head, a camera, a telescopic rod and a display screen, wherein the mobile platform comprises the body-building action error correction system based on a mobile platform according to any one of claims 5 to 8.
CN201911395296.0A 2019-12-30 2019-12-30 Body-building action error correction method and system based on mobile platform Pending CN111144341A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911395296.0A CN111144341A (en) 2019-12-30 2019-12-30 Body-building action error correction method and system based on mobile platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911395296.0A CN111144341A (en) 2019-12-30 2019-12-30 Body-building action error correction method and system based on mobile platform

Publications (1)

Publication Number Publication Date
CN111144341A true CN111144341A (en) 2020-05-12

Family

ID=70521917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911395296.0A Pending CN111144341A (en) 2019-12-30 2019-12-30 Body-building action error correction method and system based on mobile platform

Country Status (1)

Country Link
CN (1) CN111144341A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112418046A (en) * 2020-11-17 2021-02-26 武汉云极智能科技有限公司 Fitness guidance method, storage medium and system based on cloud robot
CN112734799A (en) * 2020-12-14 2021-04-30 中国科学院长春光学精密机械与物理研究所 Body-building posture guidance system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110039659A1 (en) * 2009-08-13 2011-02-17 Sk C&C Co., Ltd. User-Participating Type Fitness Lecture System and Fitness Training Method Using the Same
CN108734104A (en) * 2018-04-20 2018-11-02 杭州易舞科技有限公司 Body-building action error correction method based on deep learning image recognition and system
CN110113532A (en) * 2019-05-08 2019-08-09 努比亚技术有限公司 A kind of filming control method, terminal and computer readable storage medium
CN110170159A (en) * 2019-06-27 2019-08-27 郭庆龙 A kind of human health's action movement monitoring system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110039659A1 (en) * 2009-08-13 2011-02-17 Sk C&C Co., Ltd. User-Participating Type Fitness Lecture System and Fitness Training Method Using the Same
CN108734104A (en) * 2018-04-20 2018-11-02 杭州易舞科技有限公司 Body-building action error correction method based on deep learning image recognition and system
CN110113532A (en) * 2019-05-08 2019-08-09 努比亚技术有限公司 A kind of filming control method, terminal and computer readable storage medium
CN110170159A (en) * 2019-06-27 2019-08-27 郭庆龙 A kind of human health's action movement monitoring system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐晓龙; 刘轶铭; 吴宁馨; 岳子谦; 路正莲: "Design of a fitness coach system based on the Kinect 3D motion-sensing camera" (基于Kinect 3D体感摄影机的健身教练系统设计) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112418046A (en) * 2020-11-17 2021-02-26 武汉云极智能科技有限公司 Fitness guidance method, storage medium and system based on cloud robot
CN112734799A (en) * 2020-12-14 2021-04-30 中国科学院长春光学精密机械与物理研究所 Body-building posture guidance system

Similar Documents

Publication Publication Date Title
US9224037B2 (en) Apparatus and method for controlling presentation of information toward human object
CN105654512B (en) A kind of method for tracking target and device
CN112464918B (en) Body-building action correcting method and device, computer equipment and storage medium
CN105930767B (en) A kind of action identification method based on human skeleton
US9154739B1 (en) Physical training assistant system
US9330470B2 (en) Method and system for modeling subjects from a depth map
CN110544301A (en) A three-dimensional human motion reconstruction system, method and motion training system
CN109923583A (en) A kind of recognition methods of posture, equipment and moveable platform
CN109274883B (en) Posture correction method, device, terminal and storage medium
EP2993894A1 (en) Image capturing method, panorama image generating method and electronic apparatus
CN110544302A (en) Human motion reconstruction system, method and motion training system based on multi-eye vision
CN106155315A (en) Method, device and mobile terminal for adding augmented reality effect in shooting
CN109961039A (en) A kind of individual's goal video method for catching and system
CN113408435B (en) A security monitoring method, device, equipment and storage medium
CN109117753A (en) Part identification method, device, terminal and storage medium
CN114187656A (en) Action detection method, device, equipment and storage medium
US12208309B2 (en) Method and device for recommending golf-related contents, and non-transitory computer-readable recording medium
CN111144341A (en) Body-building action error correction method and system based on mobile platform
CN114093030B (en) Shooting training analysis method based on human body posture learning
CN110334609A (en) A kind of real-time body-sensing method for catching of intelligence
US20220366716A1 (en) Person state detection apparatus, person state detection method, and non-transitory computer readable medium storing program
CN116703968B (en) Visual tracking method, device, system, equipment and medium for target object
CN114463850B (en) Human body action recognition system suitable for multiple application scenes
CN113114924A (en) Image shooting method and device, computer readable storage medium and electronic equipment
KR20160099289A (en) Method and system for video search using convergence of global feature and region feature of image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2020-05-12