CN113268014B - Carrier, facility control method, device, system and storage medium


Info

Publication number
CN113268014B
Authority
CN
China
Prior art keywords
users
behavior
user
adjustment parameter
facility
Prior art date
Legal status
Active
Application number
CN202010093466.6A
Other languages
Chinese (zh)
Other versions
CN113268014A (en)
Inventor
戴继松
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010093466.6A
Publication of CN113268014A
Application granted
Publication of CN113268014B

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 - Input/output
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/25 - Pc structure of the system
    • G05B2219/25257 - Microcontroller

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiments of the present application provide a carrier, a facility control method, a device, a system, and a storage medium. The carrier control method comprises the following steps: determining a sign difference between at least two users in the same behavior scene; according to the sign difference, respectively determining adjustment parameter values corresponding to the behavior carriers used by the at least two users; and according to the adjustment parameter values, adjusting the working states of the behavior carriers so that the behavior states of the at least two users are mutually adapted, wherein the working state of a behavior carrier influences the behavior state of the user it carries. In the embodiments of the present application, the behavior states of users in the same behavior scene can be mutually adapted by adjusting the working states of the behavior carriers, so that users do not feel unequal or disrespected in the behavior scene.

Description

Carrier, facility control method, device, system and storage medium
Technical Field
The application relates to the technical field of the Internet of Things, and in particular to a carrier, a facility control method, a device, a system, and a storage medium.
Background
Application scenarios involving independent small spaces, such as street-side singing bars and leisure bars, are becoming more and more common. Such an independent small space can provide a private space for roughly two users, allowing them to conveniently engage in entertainment, recreation, meetings, or communication in a relatively enclosed and quiet environment.
Currently, limited by the facilities in these small independent spaces, users may in many cases feel uncomfortable, unequal, or disrespected, resulting in a poor user experience.
Disclosure of Invention
Aspects of the present application provide a carrier, a facility control method, a device, a system, and a storage medium for improving the flexibility of facilities in a space and improving the user experience.
The embodiment of the application provides a carrier control method, which comprises the following steps:
determining a sign difference between at least two users in the same behavior scene;
according to the sign differences, respectively determining adjustment parameter values corresponding to the behavior carriers used by the at least two users;
According to the adjustment parameter values, adjusting the working states of the behavior carriers so as to enable the behavior states of the at least two users to be mutually adapted;
The working state of the behavior carrier influences the behavior state of a user borne by the behavior carrier.
The embodiment of the application also provides a facility control method, which comprises the following steps:
Determining a sign difference between at least two users using the same target facility;
Determining an adjustment parameter value corresponding to the target facility according to the sign difference between the at least two users;
and adjusting the working state of the target facility according to the adjustment parameter value so as to adapt to the signs of the at least two users.
The embodiment of the application also provides a facility control method, which comprises the following steps:
acquiring sign information of a user entering a target space;
According to the sign information of the user, determining an adjustment parameter value of at least one facility in the target space;
And respectively adjusting the working state of the at least one facility according to the adjusting parameter value of the at least one facility so as to adapt to the sign information of the user.
The embodiment of the application also provides a control system, which comprises: a controller and at least two behavior carriers;
the at least two behavior carriers are used for bearing users;
The controller is used for determining sign differences between at least two users in the same behavior scene; according to the sign differences, respectively determining adjustment parameter values corresponding to the behavior carriers used by the at least two users; according to the adjustment parameter values, adjusting the working states of the behavior carriers so as to enable the behavior states of the at least two users to be mutually adapted;
The working state of the behavior carrier influences the behavior state of a user borne by the behavior carrier.
The embodiment of the application also provides an intelligent facility, which comprises a facility body, a processor and a driving assembly;
the processor is used for determining sign differences among users in a behavior scene containing the users borne by the intelligent facility; determining an adjustment parameter value from the sign differences;
And adjusting the working state of the intelligent facility by utilizing the driving component according to the adjustment parameter value so as to enable the behavior state of the user borne by the intelligent facility to be matched with the behavior states of other users in the behavior scene.
The embodiment of the application also provides a control system, which comprises: a controller and a target facility;
The target facility is used for providing facility services for users;
the controller is configured to determine a sign difference between at least two users using the same target facility; determining an adjustment parameter value corresponding to the target facility according to the sign difference between the at least two users; and adjusting the working state of the target facility according to the adjustment parameter value so as to adapt to the signs of the at least two users.
The embodiment of the application also provides an intelligent facility, which comprises a facility body, a processor and a driving assembly;
The processor is configured to determine a sign difference between at least two users using the smart facility; determining an adjustment parameter value from the sign differences;
and adjusting the working state of the intelligent facility by utilizing the driving component according to the adjustment parameter value so as to adapt to the physical signs of the at least two users.
The embodiment of the application also provides a control system, which comprises: a controller and at least one facility located in the target space;
the at least one facility is configured to provide facility services to the user;
The controller is used for acquiring physical sign information of a user entering the target space; determining an adjustment parameter value of the at least one facility according to the sign information of the user; and respectively adjusting the working state of the at least one facility according to the adjusting parameter value of the at least one facility so as to adapt to the sign information of the user.
The embodiment of the application also provides a computing device, which comprises a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
determining a sign difference between at least two users in the same behavior scene;
according to the sign differences, respectively determining adjustment parameter values corresponding to the behavior carriers used by the at least two users;
According to the adjustment parameter values, adjusting the working states of the behavior carriers so as to enable the behavior states of the at least two users to be mutually adapted;
The working state of the behavior carrier influences the behavior state of a user borne by the behavior carrier.
The embodiment of the application also provides a computing device, which comprises a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
Determining a sign difference between at least two users using the same target facility;
Determining an adjustment parameter value corresponding to the target facility according to the sign difference between the at least two users;
and adjusting the working state of the target facility according to the adjustment parameter value so as to adapt to the signs of the at least two users.
The embodiment of the application also provides a computing device, which comprises a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
acquiring sign information of a user entering a target space;
According to the sign information of the user, determining an adjustment parameter value of at least one facility in the target space;
And respectively adjusting the working state of the at least one facility according to the adjusting parameter value of the at least one facility so as to adapt to the sign information of the user.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform the aforementioned control method.
In the embodiments of the present application, the sign difference between different users in the same behavior scene can be determined, and the working states of the behavior carriers used by the different users can be adjusted according to that sign difference, so that the behavior states of the users in the behavior scene are mutually adapted. Accordingly, in the embodiments of the present application, the behavior states of users in the same behavior scene are mutually adapted, and users do not feel unequal or disrespected in the behavior scene.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1a is a schematic diagram of a control system according to an exemplary embodiment of the present application;
FIG. 1b is a schematic diagram of an intelligent facility according to an exemplary embodiment of the present application;
FIGS. 2 a-2 c are schematic diagrams illustrating an application scenario provided in an exemplary embodiment of the present application;
FIG. 3a is a schematic diagram of another control system according to another exemplary embodiment of the present application;
FIG. 3b is a schematic diagram of an intelligent facility according to another exemplary embodiment of the present application;
fig. 4 a-4 c are schematic diagrams of another application scenario provided in another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a further control system according to a further exemplary embodiment of the present application;
Fig. 6 is a schematic flow chart of a carrier control method according to another exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a computing device according to yet another exemplary embodiment of the present application;
FIG. 8 is a flow chart of another facility control method according to still another exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of another computing device provided in accordance with yet another exemplary embodiment of the present application;
FIG. 10 is a flow chart of yet another facility control method provided by yet another exemplary embodiment of the present application;
FIG. 11 is a schematic diagram of a further computing device according to a further exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Currently, limited by the facilities in an independent small space, users may in many cases feel uncomfortable, unequal, or disrespected in the space, resulting in a poor user experience. Aiming at this technical problem, the embodiments of the present application provide a solution, the basic idea of which is as follows: the sign differences among different users in the same behavior scene can be determined, and the working states of the behavior carriers used by the different users can be adjusted according to those sign differences, so that the behavior states of the users in the behavior scene are mutually adapted. Accordingly, in the embodiments of the present application, the behavior states of users in the same behavior scene are mutually adapted, and users do not feel unequal or disrespected in the behavior scene.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
Fig. 1a is a schematic structural diagram of a control system according to an exemplary embodiment of the present application. As shown in fig. 1a, the control system comprises a controller 10 and at least two behavior carriers 20.
The control system provided in this embodiment may be applied in an independent small space scenario, for example: singing bar, leisure bar, conference bar, dining bar, etc. Of course, the method can also be applied to space scenes of other specifications, such as meeting rooms, KTV boxes, restaurants and the like. The present embodiment is not limited to the application scenario.
The control system provided by the embodiment mainly aims at the situation that at least two users participate in the same behavior scene, and provides a solution for mutually adapting the behavior states of the at least two users.
The behavior scene in this embodiment may be understood as an event process involving a user. The user may have active behavior or passive behavior in the behavior scene. The type of the behavior scene is not limited in this embodiment, and the behavior scene may be a chat scene, a dining scene, a conference scene, an entertainment scene, and the like. At least two users participate in the same behavior scenario means that at least two users participate in the same event together. For example, at least two users chat together, at least two users eat together, at least two users join a video conference together, at least two users singing together, and so forth. The present embodiment is not limited thereto.
For the controller 10, a sign difference between at least two users in the same behavior scene may be determined. A user's signs are the user's physical characteristics, including but not limited to height, weight, and gender. A sign difference can be understood as the difference between different users in the same sign, for example a height difference or a weight difference.
In this embodiment, the controller 10 may determine adjustment parameter values corresponding to the behavior carriers 20 used by at least two users according to the sign differences between the at least two users, where the adjustment parameter values are used to define the adjustment degree of the behavior carriers; and adjusting the working state of each behavior carrier 20 according to the adjustment parameter value so as to adapt the behavior states of the at least two users to each other.
In practice, only some of the behavior carriers 20 may need to be adjusted. Accordingly, in this embodiment, the adjustment parameter value may include 0: for behavior carriers 20 whose working state does not need to be adjusted, the adjustment parameter value may be configured as 0. When the adjustment parameter value corresponding to a behavior carrier is 0, the working state of that behavior carrier will not be adjusted.
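To make the per-carrier adjustment concrete, the following is a minimal, illustrative Python sketch rather than the patented implementation; the BehaviorCarrier class, the apply_adjustment interface, and the centimetre unit are assumptions introduced only for illustration, including the zero-value case in which a carrier's working state is left unchanged.

```python
# Illustrative sketch only: a minimal per-carrier adjustment loop. Names and
# signatures are assumptions, not the patented implementation.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class BehaviorCarrier:
    carrier_id: str

    def apply_adjustment(self, delta_cm: float) -> None:
        # In a real system this would drive an actuator; here it is a stub.
        print(f"{self.carrier_id}: adjust height by {delta_cm:+.1f} cm")


def adjust_carriers(adjustments: Dict[str, float],
                    carriers: List[BehaviorCarrier]) -> None:
    """Apply per-carrier adjustment parameter values; a value of 0 means no change."""
    for carrier in carriers:
        delta = adjustments.get(carrier.carrier_id, 0.0)
        if delta == 0.0:
            continue  # adjustment parameter value of 0: working state stays unchanged
        carrier.apply_adjustment(delta)
```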
The behavior carrier 20 can be understood as a facility that carries a user and supports the user's behavior. The working state of the behavior carrier 20 includes, but is not limited to, its height, hardness, and inclination. When the working state of a behavior carrier 20 changes, the behavior state of the user it carries in the behavior scene may also change.
For example, in the case where at least two users chat together, a chair used by each user may be used as the behavior carrier 20, and the height state of the chair may be adjusted, and when the height of the chair is changed, the chat view (as the behavior state) of the user will be changed accordingly.
For another example, in the case where at least two users have dinner together, a chair used by each user may be used as the behavior carrier 20, a table used by each user may be used as the behavior carrier 20, the height states of the chair and the table may be adjusted, and when the height of the chair or the table is changed, the dining posture (as the behavior state) of the user will be changed accordingly.
In this embodiment, the behavior carriers 20 used in different behavior scenarios may differ. A behavior carrier 20 may be a chair, a table, a sofa, a lift table, or the like. Of course, this is merely exemplary, and the behavior carrier 20 in this embodiment is not limited thereto.
In practical applications, sign differences among users may cause users to feel unequal or disrespected while participating in a behavior scene; the smaller the environment space corresponding to the behavior scene and the closer the users are to each other, the stronger this feeling becomes.
In this embodiment, the controller 10 may use the sign differences among different users as the basis for adjusting the working states of the behavior carriers 20 used by those users, and thereby adjust their behavior states, alleviating the feeling of inequality or disrespect caused by the sign differences in the behavior scene.
For example, when two users chat at close range, a difference in height results in a pitched line of sight, which can give the shorter party a sense of pressure and inequality. The controller 10 can adjust the heights of the chairs used by both sides so that the chair used by the shorter side sits higher than the chair used by the taller side, making the communication angles of both sides consistent and alleviating the aforementioned sense of pressure and inequality.
For another example, when two users chat at close range, a difference in weight may cause their seats to sink to different extents, which may be uncomfortable or embarrassing for the heavier party. The controller 10 can adjust the hardness of the seats used by both sides so that the seat used by the heavier side is harder than the seat used by the lighter side, reducing how far the heavier side's seat sinks and keeping the sitting postures of both sides consistent, thereby alleviating the aforementioned embarrassment.
For another example, when two users dine together, a difference in height leads to different dining postures: the shorter one may feel pressure and inequality, while the taller one may feel that the table is too low and dining is less comfortable. The controller 10 may adjust the heights of the chairs used by both sides and the height of the table they share, so that the chair used by the shorter side sits higher than the chair used by the taller side and the table height suits the adjusted dining heights of both sides, keeping the dining postures of both sides consistent and alleviating the aforementioned feelings of pressure and inequality.
A user's behavior state can be understood as the body posture used when participating in the behavior scene, such as standing posture, sitting posture, line-of-sight angle, head rotation angle, or arm posture. Making the behavior states of two users mutually adapted can be understood as improving the degree of matching between behavior states that is otherwise limited by their signs. In practical applications, perfect matching between behavior states may not be possible because of the adjustment limits of the behavior carrier 20 and ergonomic requirements; the goal of this embodiment is not perfect matching, but improving the degree of matching between behavior states as far as possible while satisfying these constraints.
For example, the behavior state mutual adaptation may be the communication view angle adaptation between different users, or may be that the heads of different users all enter the shooting range of the same video conference acquisition device.
In this embodiment, the sign differences between different users in the same behavior scene may be determined, and the working states of the behavior carriers 20 used by different users may be adjusted according to those sign differences, so that the behavior states of the users in the behavior scene are mutually adapted. Accordingly, in the embodiments of the present application, the behavior states of users in the same behavior scene are mutually adapted, and users do not feel unequal or disrespected in the behavior scene.
In the above or below embodiments, the controller 10 may detect the respective sign information of at least two users using the detection component, and determine the sign difference between the at least two users according to the respective sign information of the at least two users.
Wherein the detection component may be an image acquisition component, and the controller 10 may acquire images of at least two users using the image acquisition device; extracting respective sign information of at least two users from images of the at least two users; and determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In this embodiment, the image capturing apparatus captures images of at least two users as needed.
In one implementation, an image acquisition device may be deployed at the entrance of the environment space corresponding to the behavior scene. Since the at least two users participating in the behavior scene must pass through the entrance, an image acquisition device deployed there can reliably acquire images of the at least two users.
In this implementation, the image acquisition device may capture an image containing a single user at a time; of course, if several users pass through the entrance at the same time, the image acquisition device may also capture an image containing several users at once. This is not limited here.
In practical application, a reference object may be set in the acquisition range of the image acquisition device, the reference object may be included in at least two images of the users acquired by the image acquisition device, and the controller 10 may perform image analysis on the images acquired by the image acquisition device, and determine sign information of each user according to the reference object.
For example, a scale may be provided at the entrance. When a user passes the entrance, the image acquisition device may capture an image including the user and the scale, and the controller 10 may analyze which graduation of the scale the top of the user's head is level with, thereby determining the user's height.
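As an illustration of this image-based height estimation, the sketch below assumes a simple linear calibration between two known graduations of the reference scale and image rows; the graduation values, row numbers, and the availability of a head-top detector are assumptions, not details given in this description.

```python
# Illustrative sketch: mapping the image row of the top of a user's head to a
# height reading, assuming the reference scale has been calibrated with two
# known graduations. The head-top row would come from an external detector.
def calibrate_scale(row_at_low_mark: int, low_mark_cm: float,
                    row_at_high_mark: int, high_mark_cm: float):
    """Return a function mapping an image row to a height in centimetres."""
    cm_per_row = (high_mark_cm - low_mark_cm) / (row_at_high_mark - row_at_low_mark)

    def row_to_cm(row: int) -> float:
        return low_mark_cm + (row - row_at_low_mark) * cm_per_row

    return row_to_cm


row_to_cm = calibrate_scale(row_at_low_mark=900, low_mark_cm=150.0,
                            row_at_high_mark=300, high_mark_cm=190.0)
estimated_height_cm = row_to_cm(420)  # image row of the detected head top
```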
Of course, different references may be deployed for different signs; a reference is not always required, and some signs, such as gender, may not need one.
In another implementation, the image acquisition device may be deployed in a location where images containing at least two users can be acquired. For example, in a singing bar, it may be deployed in front of a plurality of seats.
At the initial stage, when the at least two users have boarded their corresponding behavior carriers 20, the image acquisition device may acquire an image containing the at least two users; at this initial stage each behavior carrier 20 is in its initial working state.
In this implementation, the initial stage of boarding the behavior carrier 20 refers to the stage before the controller 10 has adjusted the behavior carrier 20 according to the sign differences; at this stage the behavior carrier 20 may be kept in its initial working state. To make it easier for the controller 10 to analyze the sign differences, the initial working states of the multiple behavior carriers 20 may be kept consistent.
Also taking a singing bar as an example, multiple seats may be at the same initial height when no one is sitting. After the different users are seated respectively, the multiple seats may maintain the initial height awaiting subsequent adjustment actions by the controller 10. The image acquisition device can acquire images containing users after sitting.
In this implementation, the reference object may be deployed within the acquisition range of the image acquisition device, so that the image acquired by the image acquisition device includes at least two users and the reference object. The controller 10 may determine the sign information of each user based on the reference, thereby analyzing the sign differences between different users. Of course, the controller 10 may also directly compare the signs between different users in the image, so as to determine the sign differences between the different users.
In this implementation, since each user has already boarded a behavior carrier 20, the determined sign differences will more accurately reflect the differences in behavior state that the different users would exhibit when participating in the behavior scenario. This can provide a more accurate basis for the controller 10 to determine the adjustment parameter values.
Of course, in this embodiment, the detection component may further include other types of devices, for example, the detection component may include an infrared sensing device, the infrared sensing device may collect body height or weight information of the user, and different sign information may be collected by using different types of devices. The detecting element in the present embodiment is not limited thereto.
In the above or the following embodiments, before extracting the respective sign information of the at least two users from their images, the controller 10 may further identify the at least two users to determine whether any of them is a registered user. For a registered user, sign information is extracted from the user information corresponding to that registered user; for a non-registered user, the operation of extracting sign information from the images of the at least two users is performed.
In this embodiment, a pre-registration scheme may be provided. The pre-registration scheme is particularly suitable for a behavior scene in which the user group is relatively fixed. For example, a meeting bar, meeting room, etc. within an enterprise, the user group is an employee within the enterprise. Each user can register the user before participating in the behavior scene and submit the user information such as the identity information and the sign information of the user. The identity information may be a face image, fingerprint data, etc.
Accordingly, in this embodiment, the controller 10 may identify at least two users to determine registered users of the at least two users. Wherein an identification device, such as a face recognition device, a fingerprint acquisition device, etc., may be deployed in the control system, the controller 10 may utilize the identification device to identify a user, thereby determining a registered user of the at least two users.
For registered users, sign information may be extracted from user information of the registered users. In this way, the controller 10 can eliminate the need to perform the operation of extracting the sign information of at least two users from the images of at least two users mentioned in the foregoing embodiments.
For non-registered users, the controller 10 may perform the operations of extracting the sign information of at least two users from the images of at least two users mentioned in the foregoing embodiments to obtain the sign information of at least two users.
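The registered/unregistered branching can be illustrated with the following sketch; the registry dictionary and the extract_signs_from_image helper are hypothetical placeholders for the stored user information and the image-analysis path described above.

```python
# Illustrative sketch: preferring sign information from a pre-registered profile
# and falling back to image-based extraction for unregistered users.
from typing import Dict, Optional

registered_profiles: Dict[str, Dict[str, float]] = {
    # user_id -> pre-registered sign information, e.g. {"height_cm": 178.0}
}


def extract_signs_from_image(image) -> Dict[str, float]:
    # Placeholder for the image-analysis path described above.
    raise NotImplementedError


def get_sign_info(user_id: Optional[str], image) -> Dict[str, float]:
    if user_id is not None and user_id in registered_profiles:
        return registered_profiles[user_id]   # registered user: reuse stored signs
    return extract_signs_from_image(image)    # unregistered user: analyse the image
```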
Of course, the identification process mentioned in this embodiment is not essential. For a behavior scene in which the user group is not fixed, the user group may be too large and each user may participate only rarely; considering these factors, the identification process provided in this embodiment need not be performed.
In the above or below embodiments, the controller 10 may further determine the behavior carrier 20 used by each of the at least two users before determining the adjustment parameter values corresponding to the behavior carrier 20 used by each of the at least two users.
An exemplary implementation provided in this embodiment is as follows: determine the relative positions between the at least two users and each behavior carrier 20 in the preliminary stage of the behavior scene; determine the behavior carriers 20 selected by the at least two users according to those relative positions; the at least two users each take their place at a selected behavior carrier 20 during the preliminary stage.
In this implementation, each user takes his or her place at the selected behavior carrier 20 during the preliminary stage of the behavior scenario. Taking one's place means entering the use space of the behavior carrier 20 or being carried on it.
In order to determine the relative positions between the at least two users and each behavior carrier 20, an image acquisition device may be deployed in the control system; the image acquisition device of the previous embodiments may be reused, or a dedicated image acquisition device may be added, to acquire images reflecting the relative positions between the at least two users and each behavior carrier 20 in the preliminary stage of the behavior scene.
The controller 10 may perform image analysis on the images acquired during the preliminary stage to determine the relative positions between the at least two users and each behavior carrier 20.
In practical applications, if the acquired image does not include the behavior carriers 20, a reference object may be deployed, and the positions of the at least two users may be determined from their positions relative to the reference object. The relative positions between the at least two users and each behavior carrier 20 are then determined from the users' positions and the deployment position of each behavior carrier 20. The users' positions and the deployment positions of the behavior carriers 20 must use the same positional reference, for example the same coordinate system or the same reference object.
If the acquired image includes at least two users and each behavior carrier 20, the controller 10 can directly analyze the relative positions of the at least two users and each behavior carrier 20 from the image.
On the basis of determining the relative positions between the at least two users and each behavior carrier 20, the controller 10 may obtain the association relationship between the at least two users and each behavior carrier 20, so as to determine the behavior carrier 20 used by each of the at least two users in the behavior scenario. Here, the behavior carrier 20 selected by the user in the preliminary stage is considered to be the behavior carrier 20 that the user uses in the formal stage of the behavior scenario.
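One way to derive this association is a nearest-carrier assignment over positions expressed in the same coordinate system, as in the sketch below; the greedy matching rule and the 2D point representation are assumptions made only for illustration.

```python
# Illustrative sketch: associating each user with the nearest behavior carrier,
# assuming user and carrier positions share the same positional reference.
import math
from typing import Dict, Tuple

Point = Tuple[float, float]


def assign_carriers(user_positions: Dict[str, Point],
                    carrier_positions: Dict[str, Point]) -> Dict[str, str]:
    """Return a mapping user_id -> carrier_id based on the smallest distance."""
    assignment = {}
    free_carriers = dict(carrier_positions)
    for user_id, (ux, uy) in user_positions.items():
        nearest = min(free_carriers,
                      key=lambda cid: math.hypot(free_carriers[cid][0] - ux,
                                                 free_carriers[cid][1] - uy))
        assignment[user_id] = nearest
        free_carriers.pop(nearest)  # a carrier is used by at most one user
    return assignment
```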
If a user switches to a different behavior carrier 20 during the formal stage of the behavior scenario, the controller 10 may apply the adjustment parameter values determined from the sign differences between that user and the other users to the newly occupied behavior carrier 20, so as to adjust its working state for that user. The original behavior carrier 20 can be restored to its initial working state to await the next user.
Of course, the above implementation is merely exemplary, and other implementations may be used in this embodiment to determine the behavior carrier 20 used by each of the at least two users. For example, in the preliminary stage of the behavior scenario a behavior carrier 20 may be designated for each user, and the user guided to take his or her place at the designated carrier; the behavior carrier 20 used by each of the at least two users can then be recorded during the designation process. This embodiment is by no means limited to these exemplary implementations.
In the above or in the following embodiments, the types of adjustment parameters that can be adjusted by different types of behavior carriers 20 may be the same or different. The types of adjustment parameters include, but are not limited to, height, stiffness, inclination, and the like. For example, the type of adjustment parameter corresponding to the chair may include height, the type of adjustment parameter corresponding to the sofa may include hardness, and the type of adjustable parameter corresponding to the lift table may also include height.
Thus, to save processing resources, the controller 10 may focus on the differences in sign of different dimensions for different types of behavior carriers 20, thereby determining the adjustment parameter values corresponding to the behavior carriers 20. For example, for chairs, the difference in height between different users may be of great concern, while for sofas, the difference in weight between different users may be of great concern. Of course, the present embodiment is not limited thereto.
In this embodiment, based on the types of adjustment parameters that the different types of behavior carriers 20 can adjust, one exemplary implementation is as follows: the controller 10 may obtain the respective sign information of the at least two users and determine each user's ergonomic requirements from that sign information; for each behavior carrier 20 used by the at least two users, determine the value range of the adjustment parameters that meets the ergonomic requirements of its user; and, with the aim of mutually adapting the behavior states of the at least two users, select a target adjustment parameter value from the value range of each behavior carrier 20 as the adjustment parameter value corresponding to that behavior carrier 20.
In this implementation, a range of values for the adjustment parameters that meet the ergonomic requirements of its user is set for each behavioural carrier 20. The controller 10 will select the appropriate value within the range of values of the corresponding adjustment parameters of the respective behavioural carriers 20 with the aim of adapting the behavioural status between at least two users to each other.
This implementation minimizes the difference in behavior state between the at least two users caused by sign differences, on the premise that the working state of each behavior carrier 20 remains ergonomic. As explained above regarding the mutual adaptation of behavior states, this implementation does not pursue strict consistency of behavior states between different users; instead, ergonomic requirements are taken into account as well, which ensures the comfort of each individual user of a behavior carrier 20, and that comfort offsets part of the negative experience mentioned above.
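A possible concrete reading of this selection step is sketched below for chair heights: each chair's height is chosen to bring the users' eye levels as close together as possible while staying inside that chair's ergonomic range. The sitting-eye-height offsets, the range values, and the mid-range averaging rule are assumptions for illustration only.

```python
# Illustrative sketch: choosing chair heights so that users' eye levels are as
# close as possible while each chair stays inside its ergonomic height range.
from typing import Dict, Tuple


def adapt_chair_heights(sitting_eye_offset_cm: Dict[str, float],
                        ergonomic_range_cm: Dict[str, Tuple[float, float]]) -> Dict[str, float]:
    """Return a chair height per user aiming at a common eye level."""
    # Target eye level: the average of what each user would reach at mid-range.
    mid_eye_levels = [
        (lo + hi) / 2 + sitting_eye_offset_cm[uid]
        for uid, (lo, hi) in ergonomic_range_cm.items()
    ]
    target_eye_level = sum(mid_eye_levels) / len(mid_eye_levels)

    heights = {}
    for uid, (lo, hi) in ergonomic_range_cm.items():
        ideal = target_eye_level - sitting_eye_offset_cm[uid]
        heights[uid] = min(max(ideal, lo), hi)  # clamp to the ergonomic range
    return heights


heights = adapt_chair_heights(
    sitting_eye_offset_cm={"user_a": 80.0, "user_b": 65.0},  # taller vs shorter user
    ergonomic_range_cm={"user_a": (40.0, 55.0), "user_b": (40.0, 55.0)},
)
```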
Of course, in this embodiment, other implementation manners may also be used to determine the adjustment parameter values corresponding to the behavior carriers 20. Moreover, the processing logic adopted by the controller 10 in determining the adjustment parameter values corresponding to each behavior carrier 20 can be flexibly adjusted according to factors such as different behavior carrier 20 types, the concerned sign types, the behavior characteristics in the behavior scene and the like. This embodiment is not limited thereto.
In the above or below embodiments, each behavior carrier 20 may be associated with a respective drive component. The controller 10 may generate control commands corresponding to the behavior carriers 20 according to the adjustment parameter values corresponding to the behavior carriers 20 used by at least two users, respectively; and respectively sending the control commands corresponding to the behavior carriers 20 to the driving components associated with the behavior carriers 20 to control the driving components to adjust the working states of the behavior carriers 20 according to the adjustment parameter values corresponding to the behavior carriers 20.
The components comprised by the drive assembly may not be exactly the same for different types of behavioural carriers 20, as well as for different adjustment targets.
For example, for the adjustment target of height, the drive assembly may comprise a PLC (programmable logic controller) and a cylinder; the cylinder may be fixedly connected to the behavior carrier 20, and the PLC may be communicatively connected to the controller 10. In response to a control command sent by the controller 10, the PLC may drive the cylinder according to the adjustment parameter value in the command, thereby driving the behavior carrier 20 to change its height.
Of course, this is merely exemplary, and the implementation of the driving assembly is not limited in this embodiment. The drive assembly can be deployed according to actual requirements.
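The command path from controller to drive assembly might look like the following sketch; since the description does not fix a transport protocol, the ControlCommand structure, the JSON encoding, and the send_to_drive_assembly stub are assumptions.

```python
# Illustrative sketch: building and dispatching a control command per carrier.
# The communication link between controller and drive assembly (e.g. a PLC) is
# not specified here, so `send_to_drive_assembly` is a hypothetical stub.
import json
from dataclasses import dataclass, asdict


@dataclass
class ControlCommand:
    carrier_id: str
    parameter: str        # e.g. "height", "hardness", "inclination"
    adjustment_value: float


def send_to_drive_assembly(carrier_id: str, payload: str) -> None:
    # Placeholder for the actual transport (serial link, fieldbus, network ...).
    print(f"-> drive assembly of {carrier_id}: {payload}")


def dispatch_adjustments(adjustments: dict) -> None:
    for carrier_id, value in adjustments.items():
        if value == 0:
            continue  # no adjustment required for this carrier
        cmd = ControlCommand(carrier_id, "height", value)
        send_to_drive_assembly(carrier_id, json.dumps(asdict(cmd)))
```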
Fig. 2 a-2 c are schematic diagrams of an application scenario according to an exemplary embodiment of the present application. In fig. 2 a-2 c, two users of different heights enter the same scene space for chat.
As shown in fig. 2a, an image acquisition device is disposed at the entrance of the scene space, and images of two users can be acquired and provided to the controller in case that the two users enter the scene space. The controller can determine the height difference between the two users according to the images of the two users.
As shown in fig. 2b, two users are respectively seated in selected chairs during the preliminary phase of chat. A taller user selects the left chair and a shorter user selects the right chair. In addition, in the preliminary stage, the two chairs are at the same initial height.
As shown in fig. 2c, the controller may determine adjustment parameter values of two chairs respectively, with the chat view angles of the two users being mutually adapted as targets, according to the height difference between the two users. For example, the height difference between the two users is 30cm, and the controller may determine that the left chair is lowered by 15cm and the right chair is raised by 15cm. The controller may issue control commands to the drive assemblies associated with each of the two chairs and carry the adjustment parameter values in the control commands. The drive assembly will adjust the height of the chair in accordance with the control commands.
Before the driving component adjusts the height of a chair, an adjustment prompt may be sent to the user, for example as text on a display screen or as a voice prompt through an audio device in the scene space, so that the user can prepare for the passive movement.
As shown in fig. 2c, the left chair is lowered by 15 cm and the right chair is raised by 15 cm, so that the communication perspectives of the two users tend toward parallel.
In addition, the chair-height adjustment scheme of figs. 2a-2c can be applied not only to a chat scene but also to a video conference: by adjusting chair heights, the height difference between the two users is weakened in the video conference picture, so that the heads of the two users appear at roughly the same height within the acquisition range of the video conference acquisition device.
Fig. 1b is a schematic structural diagram of an intelligent facility according to an exemplary embodiment of the present application. As shown in fig. 1b, the smart utility may include a utility body 3, a processor 1, and a drive assembly 2;
the processor 1 is used for determining sign differences among users in a behavior scene containing the users carried by the intelligent facilities; determining an adjustment parameter value based on the sign differences;
and according to the adjustment parameter value, the driving component 2 is utilized to adjust the working state of the intelligent facility so as to enable the behavior state of the user borne by the intelligent facility to be matched with the behavior states of other users in the behavior scene.
The intelligent facility in this embodiment corresponds to the behavior carrier in the control system shown in fig. 1a. The difference between this embodiment and the control system shown in fig. 1a is that the functions of the controller in the control system are integrated into the intelligent facility. Accordingly, in this embodiment, the intelligent facility can autonomously determine the sign differences and adjust its working state.
In this embodiment, the processor 1 may communicate with the detection component 4 corresponding to the behavior scene, and acquire the sign information of each user in the behavior scene by using the detection component 4; and determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In order to achieve relative adjustment of the working states of different intelligent facilities in the same behavior scenario, the processors of the different intelligent facilities may employ the same processing rule in this embodiment. A single intelligent facility can take into account the sign information of every user in the behavior scene, derive the relative adjustment scheme for all behavior carriers in the scene, and then determine its own adjustment parameter value, which will therefore match the adjustment parameter values determined by the other intelligent facilities in the scene.
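One way to obtain such matching adjustments without a central controller is for every facility to evaluate the same deterministic rule over the full set of signs and keep only its own result, as in the sketch below; the height-equalising rule, the my_user_id parameter, and the adjustment limit are assumptions for illustration.

```python
# Illustrative sketch: each intelligent facility runs the same deterministic rule
# over the signs of all users in the scene and picks out only its own adjustment,
# so the adjustments of different facilities remain mutually consistent.
from typing import Dict


def my_adjustment(my_user_id: str, user_heights_cm: Dict[str, float],
                  max_adjust_cm: float = 20.0) -> float:
    """Raise the carrier of shorter users and lower that of taller users."""
    mean_height = sum(user_heights_cm.values()) / len(user_heights_cm)
    raw = (mean_height - user_heights_cm[my_user_id]) / 2  # half the gap to the mean
    return max(-max_adjust_cm, min(max_adjust_cm, raw))    # respect adjustment limits


# Every facility evaluates the same inputs, e.g. {"user_a": 185.0, "user_b": 155.0},
# and therefore obtains complementary adjustments (-7.5 cm and +7.5 cm here).
```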
In addition, for further technical details related to the present embodiment, reference may be made to the description related to the embodiments of the carrier control system, which is omitted herein for brevity and should not cause any loss of the protection scope of the present application.
Fig. 3a is a schematic structural diagram of another control system according to another exemplary embodiment of the present application. As shown in fig. 3a, the control system includes a controller 30 and a target facility 40.
The control system provided in this embodiment may be applied in an independent small-space scenario, for example a singing bar, leisure bar, conference bar, or dining bar. Of course, it can also be applied to spaces of other sizes, such as meeting rooms, KTV boxes, restaurants, and the like. This embodiment does not limit the application scenario.
The control system provided in this embodiment mainly addresses the situation in which at least two users share the same target facility 40, and provides a solution for adapting the working state of the target facility 40 to the signs of the at least two users.
For the controller 30, a sign difference between at least two users using the same target facility 40 may be determined. Wherein the user's signs include, but are not limited to, height, weight, gender, etc. Sign differences can be understood as the differences between different users under the same sign.
In this embodiment, the controller 30 may determine the adjustment parameter value corresponding to the target facility 40 according to the sign difference between at least two users; the operating state of the target facility 40 is adjusted according to the adjustment parameter values to adapt to the signs of at least two users.
The target facility 40 may be one or more of a camera, a display screen, a table, or a microphone, among others. These are merely exemplary, and the present embodiment does not limit the type of the target facility 40, and the target facility 40 may be any facility common to a plurality of users.
In practice, sign differences between users may cause users to feel unequal or disrespected while using the same target facility 40; the smaller the environment space and the closer the users are to each other, the stronger this feeling becomes.
In this embodiment, the controller 30 may adjust the working state of the target facility 40 according to the sign differences between the users, so that the working state of the target facility 40 is adapted to the signs of the different users, alleviating the feeling of inequality or disrespect caused by their sign differences.
For example, when two users sing at the same microphone stand, a difference in height gives them completely different experiences of using the microphone: the shorter user may feel the awkwardness of barely reaching the microphone, while the taller user may feel the discomfort of having to bend down. The controller 30 can adjust the microphone to a height acceptable to both sides, thereby alleviating the awkwardness and discomfort.
The adaptation of the operational status of the target facility 40 to the signs of at least two users may be understood as adjusting the target facility 40 to an operational status that balances the negative experiences experienced by different users using the same target facility 40. In practical applications, due to limitations such as adjustment limits of the target facility 40 and ergonomic requirements, negative experiences experienced by different users using the same target facility 40 may not be completely eliminated, and the objective of the present embodiment is not to require complete elimination of negative experiences, but rather to try to balance negative experiences experienced by different users using the same target facility 40 if the aforementioned various limitations are satisfied.
In this embodiment, a sign difference between at least two users using the same target facility 40 may be determined; determining an adjustment parameter value corresponding to the target facility 40 according to the sign difference between at least two users; the operating state of the target facility 40 is adjusted according to the adjustment parameter values to adapt to the signs of at least two users. Accordingly, the negative experiences experienced by different users using the same target facility 40 may be effectively balanced.
In the above or below embodiments, the controller 30 may acquire images of at least two users using the image acquisition device; extracting respective sign information of at least two users from images of the at least two users; and determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In this embodiment, the control system further includes an image capturing device. The image acquisition device acquires images of at least two users as needed.
In one implementation, the image acquisition device may be deployed at the entrance of the environment space in which the target facility 40 is located. Since the at least two users must pass through the entrance, an image acquisition device deployed there can reliably acquire images of the at least two users.
In this implementation, the image acquisition device may capture an image containing a single user at a time; of course, if several users pass through the entrance at the same time, the image acquisition device may also capture an image containing several users at once. This is not limited here.
In practical applications, a reference object may be set in the acquisition range of the image acquisition device, where the reference object is included in at least two images of the user acquired by the image acquisition device, and the controller 30 may perform image analysis on the images acquired by the image acquisition device, and determine sign information of each user according to the reference object.
For example, a scale may be provided at the entrance. When a user passes the entrance, the image acquisition device may capture an image including the user and the scale, and the controller 30 may analyze which graduation of the scale the top of the user's head is level with, thereby determining the user's height.
Of course, different references may be deployed for different signs; a reference is not always required, and some signs, such as gender, may not need one.
In another implementation, the image acquisition device may be deployed in a location where images containing at least two users can be acquired. For example, in a singing bar, it may be deployed in front of a plurality of seats.
The image acquisition device may acquire an image containing the at least two users when the at least two users are in place at their respective facility use positions; a facility use position is the position in the environment space at which a user uses the target facility 40.
Also taking a singing bar as an example, at least two users may be seated separately, and a plurality of seats may be at the same initial height. The image acquisition device can acquire images containing users after sitting.
In this implementation, the reference object may be deployed within the acquisition range of the image acquisition device, so that the image acquired by the image acquisition device includes at least two users and the reference object. The controller 30 may determine the sign information of each user based on the reference, thereby analyzing the sign differences between different users. Of course, the controller 30 may also directly compare the signs between different users in the image, thereby determining the sign differences between the different users.
In this implementation, since each user is already in place at the facility use location, the determined sign differences will more accurately reflect the differences in the operating status of different users when using the same target facility 40. This may provide a more accurate basis for the controller 30 to determine the adjustment parameter value.
Of course, in this embodiment, other implementation manners may be used to determine the sign difference between different users, for example, an infrared sensing device may be used to collect information such as height or weight of the user, and different sign information may be collected by different devices. The manner of determining the sign differences between different users in the present embodiment is by no means limited thereto.
In the above or the following embodiments, before extracting the respective sign information of the at least two users from their images, the controller 30 may further identify the at least two users to determine whether any of them is a registered user. For a registered user, sign information is extracted from the user information corresponding to that registered user; for a non-registered user, the operation of extracting sign information from the images of the at least two users is performed.
In this embodiment, a pre-registration scheme may be provided. The pre-registration scheme is particularly suitable for a behavior scene in which the user group is relatively fixed. For example, a meeting bar, meeting room, etc. within an enterprise, the user group is an employee within the enterprise. Each user may register the user before using the target facility 40 and submit his own user information such as identity information and sign information. The identity information may be a face image, fingerprint data, etc.
Accordingly, in this embodiment, the controller 30 may identify at least two users to determine registered users of the at least two users. Wherein an identification device, such as a face recognition device, a fingerprint acquisition device, etc., may be deployed in the control system, the controller 30 may utilize the identification device to identify a user, thereby determining a registered user of the at least two users.
For registered users, sign information may be extracted from user information of the registered users. In this way, the controller 30 may not need to perform the operation of extracting the sign information of at least two users from the images of at least two users mentioned in the foregoing embodiments.
For non-registered users, the controller 30 may perform the operations of extracting the sign information of at least two users from the images of at least two users mentioned in the foregoing embodiments to obtain the sign information of at least two users.
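A minimal sketch of the branching just described, assuming a hypothetical identification function, a registration store keyed by identity, and an image-based extractor; none of these interfaces is specified by the patent.

```python
# Illustrative only: the identification function, registration store, and
# image-based extractor are hypothetical stand-ins for the components above.

def get_sign_info(user_image, registry, identify, extract_signs_from_image):
    """Return a user's sign info, preferring pre-registered data when available."""
    user_id = identify(user_image)             # e.g. face or fingerprint match
    if user_id is not None and user_id in registry:
        # Registered user: reuse the signs submitted at registration time.
        return registry[user_id]["signs"]
    # Non-registered user: fall back to extracting signs from the image.
    return extract_signs_from_image(user_image)
```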
Of course, the identification process mentioned in this embodiment is not essential. Where the user group is not fixed, the group may be too large to register in advance, and individual users may use the target facility 40 only rarely; in view of these factors, the identification process provided in this embodiment may be omitted.
In the above or the following embodiments, the types of adjustment parameters that can be adjusted may be the same or different for different types of target facilities 40. The types of adjustment parameters include, but are not limited to, height, hardness, inclination, and the like. For example, the adjustment parameter corresponding to a microphone may include height, the adjustment parameter corresponding to a sofa may include hardness, and the adjustment parameter corresponding to a table may include inclination.
Thus, to conserve processing resources, the controller 30 may focus on the differences in sign in different dimensions for different types of target facilities 40 to determine the adjustment parameter values corresponding to the target facilities 40. For example, for microphones, the difference in height between different users may be of great concern, while for sofas, the difference in weight between different users may be of great concern. Of course, the present embodiment is not limited thereto.
In this embodiment, based on the types of adjustment parameters that can be adjusted for different types of target facilities 40, in one exemplary implementation the controller 30 may acquire the respective sign information of the at least two users and determine the respective ergonomic requirements of the at least two users from that sign information; determine, for each of the at least two users, the numerical range of the adjustment parameter that meets that user's ergonomic requirement; with the goal of adapting to the physical signs of the at least two users, select a target adjustment parameter value from each user's numerical range; and determine the adjustment parameter value corresponding to the target facility 40 from the target adjustment parameter values of the at least two users.
In this implementation, a numerical range of the adjustment parameter that meets each user's ergonomic requirement is determined for the target facility 40. With the goal of adapting to the physical signs of the at least two users, the controller 30 selects a suitable value within the numerical range corresponding to each user. For example, an intermediate value across all the numerical ranges may be determined, and within each user's range the value closest to that intermediate value may be selected as that user's target adjustment parameter value.
Based on this, the controller 30 may determine the adjustment parameter value corresponding to the target facility 40 from the plurality of target adjustment parameter values. For example, the controller 30 may select the median or the minimum of the target adjustment parameter values, or calculate their average, which is not limited here.
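The selection logic described above can be sketched as follows: each user contributes an ergonomic value range for the adjustment parameter, a per-user target is taken as the admissible value closest to the midpoint of the combined span, and the facility's final value is derived from those targets. The midpoint-plus-average rule is only one of the options the text mentions and is used here as an assumption.

```python
from statistics import mean

def select_adjustment_value(ranges):
    """ranges: list of (low, high) ergonomic intervals, one per user."""
    # Midpoint of the overall span covered by all users' ranges.
    overall_mid = (min(lo for lo, _ in ranges) + max(hi for _, hi in ranges)) / 2
    # Per-user target: the admissible value closest to that midpoint
    # (i.e. the midpoint clamped into the user's own range).
    targets = [min(max(overall_mid, lo), hi) for lo, hi in ranges]
    # Final facility value: here the average of the per-user targets
    # (the text also mentions taking a median or a minimum instead).
    return mean(targets)

# Two users whose comfortable microphone heights are assumed to be
# 140-155 cm and 160-175 cm respectively.
print(select_adjustment_value([(140, 155), (160, 175)]))  # -> 157.5
```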
On the premise that the adjusted operating state of the target facility 40 remains ergonomic, this implementation can minimize the difference in usage experience between the at least two users caused by their difference in physical signs; that is, differences in how different users experience the same target facility 40 are balanced under ergonomic constraints.
Of course, in this embodiment, other implementation manners may also be used to determine the adjustment parameter value corresponding to the target facility 40. Moreover, the processing logic employed by the controller 30 in determining the adjustment parameter values corresponding to the target facilities 40 may be flexibly adjusted according to factors such as the types of different target facilities 40, the types of physical signs of interest, and the like. This embodiment is not limited thereto.
In either the above or the following embodiments, the target facility 40 may be associated with a drive assembly. The controller 30 may generate a control command according to the adjustment parameter value corresponding to the target facility 40; the control command is sent to the driving component associated with the target facility 40 to control the driving component to adjust the working state of the target facility 40 according to the adjustment parameter value corresponding to the target facility 40.
The components contained in the drive assembly may differ for different types of target facilities 40 and for different types of adjustment parameters.
For example, where the height of the target facility 40 needs to be adjusted, the drive assembly may include a PLC (programmable logic controller) and a pneumatic cylinder; the cylinder may be connected to the target facility 40, and the PLC may be communicatively coupled to the controller 30. In response to the control command sent by the controller 30, the PLC drives the cylinder according to the adjustment parameter value carried in the command, so as to adjust the height of the target facility 40.
Of course, this is merely exemplary, and the implementation of the driving assembly is not limited in this embodiment. The drive assembly can be deployed according to actual requirements.
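As an illustration of the command path from the controller to a drive assembly, the sketch below packages an adjustment parameter value into a control command and hands it to a driver object standing in for the PLC. The JSON message format and the `PneumaticLiftDriver` class are assumptions for illustration; the patent only requires that the drive assembly adjust the facility according to the value carried in the command.

```python
import json
from dataclasses import dataclass

def make_height_command(facility_id, target_height_cm):
    """Pack the adjustment parameter value into a control command."""
    return json.dumps({"facility": facility_id, "target_height_cm": target_height_cm})

@dataclass
class PneumaticLiftDriver:
    """Stand-in for a PLC driving a pneumatic cylinder attached to the facility."""
    current_height_cm: float

    def apply(self, command):
        target = json.loads(command)["target_height_cm"]
        # A real PLC would ramp the cylinder; here we simply record the result.
        self.current_height_cm = target

driver = PneumaticLiftDriver(current_height_cm=170.0)
driver.apply(make_height_command("microphone-1", 160.0))
print(driver.current_height_cm)  # -> 160.0
```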
Figs. 4a-4c are schematic diagrams of another application scenario provided in another exemplary embodiment of the present application. In Figs. 4a-4c, two users of different heights sing using the same microphone.
As shown in Fig. 4a, an image acquisition device is disposed at the entrance of the scene space; when the two users enter the scene space, images of the two users can be acquired and provided to the controller. The controller can determine the height difference between the two users from these images.
As shown in Fig. 4b, in the preparation stage of singing, the two users are seated at their respective singing positions. The microphone is at its initial height of 170 cm; at this height, the shorter user cannot use the microphone properly.
As shown in Fig. 4c, the controller may determine the adjustment parameter value of the microphone from the height difference between the two users, with the goal of adapting to the heights of both users. For example, if the height difference between the two users is 30 cm, with the shorter user at 150 cm and the taller user at 180 cm, the controller may adjust the height of the microphone to 160 cm. The controller may issue a control command to the drive assembly associated with the microphone, carrying the adjustment parameter value in the command. The drive assembly then adjusts the height of the microphone according to the control command.
Before the drive assembly adjusts the height of the microphone, an adjustment prompt may be sent to the users, for example prompt text or prompt speech played through a display screen or audio device in the scene space, so that the users are prepared for the adjustment.
As shown in Fig. 4c, the height of the microphone is adjusted to 160 cm, improving the usage experience of both users.
Fig. 3b is a schematic structural diagram of an intelligent facility according to another exemplary embodiment of the present application. As shown in Fig. 3b, the intelligent facility may include a facility body 7, a processor 5, and a drive assembly 6.
The processor 5 is configured to determine a sign difference between at least two users using the intelligent facility, determine an adjustment parameter value according to the sign difference, and, according to the adjustment parameter value, adjust the working state of the intelligent facility by means of the drive assembly 6 so as to adapt to the physical signs of the at least two users.
The intelligent facility in this embodiment corresponds to the target facility in the control system shown in Fig. 3a. The difference from that control system is that the functions of its controller are integrated into the intelligent facility itself. Accordingly, in this embodiment, the intelligent facility can autonomously determine the sign differences and adjust its working state.
In this embodiment, the processor 5 may communicate with the detection component 8 corresponding to the space in which the intelligent facility is located, use the detection component 8 to acquire the sign information of each of the at least two users using the intelligent facility, determine the sign difference between the at least two users from that sign information, and then determine the adjustment parameter value according to the sign difference.
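A structural sketch of this self-contained variant: the facility owns its processing logic and drive assembly and only queries an external detection component for sign information. The class layout, method names, and the placeholder adjustment rule are assumptions used to contrast this variant with the externally controlled system of Fig. 3a, not details taken from the patent.

```python
class SmartFacility:
    """Facility body with an embedded processor and drive assembly (Fig. 3b)."""

    def __init__(self, detection_component, drive_assembly):
        self.detection = detection_component   # detection component 8 in the space
        self.drive = drive_assembly            # drive assembly 6 acting on the body

    def adapt_to_users(self, user_ids):
        # 1. Acquire each user's sign information from the detection component.
        signs = [self.detection.get_signs(uid) for uid in user_ids]
        # 2. Derive one adjustment value from the users' signs
        #    (placeholder rule: mean user height scaled to a working height).
        value = 0.9 * sum(s["height_cm"] for s in signs) / len(signs)
        # 3. Adjust the working state via the embedded drive assembly.
        self.drive.move_to(value)
        return value
```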
In addition, other technical details related to the present embodiment may refer to the related descriptions in the embodiments of the control system, which are not described herein for brevity, but should not cause a loss of the protection scope of the present application.
Fig. 5 is a schematic structural diagram of another control system according to still another exemplary embodiment of the present application. As shown in fig. 5, the control system includes a controller 50 and at least one facility 60.
The control system provided in this embodiment mainly addresses the situation in which a user enters a target space and uses at least one facility 60 in that space, and provides a solution for adapting the working state of the at least one facility 60 to the user's physical signs. The target space may be a restaurant, a conference room, a chat room, an entertainment room, or the like, and the specification of the target space is not limited in this embodiment. The target space may be named differently under different specifications; for example, under a small, separate-space specification, the target space may also be referred to as a singing bar, leisure bar, conference bar, dining bar, and so on.
The facility 60 in the target space may be a chair, a sofa, a camera, a display screen, a table, a microphone, or the like; these are merely examples, the type of the facility 60 is not limited in this embodiment, and the facility 60 may be any facility that a user may use in the target space.
The controller 50 may acquire the sign information of a user entering the target space. The sign information includes, but is not limited to, height, weight, or gender.
The controller 50 then determines an adjustment parameter value for at least one facility 60 in the target space according to the user's sign information, and adjusts the working state of the at least one facility 60 according to its adjustment parameter value so as to adapt to the user's sign information.
In this embodiment, the controller 50 may adjust at least one facility 60 to a working state adapted to the sign of the user by using the sign information of the user as an adjustment basis for adjusting at least one facility 60.
The types of adjustment parameters corresponding to different facilities 60 may not be identical. The types of adjustment parameters include, but are not limited to, height, hardness, inclination, and the like.
In this embodiment, the sign information of a user entering the target space is obtained; an adjustment parameter value for at least one facility 60 in the target space is determined according to that sign information; and the working state of the at least one facility 60 is adjusted according to its adjustment parameter value so as to adapt to the user's sign information. In this way, the user's experience of using the facilities 60 in the target space can be effectively improved.
In the above or below embodiments, the controller 50 may acquire an image of a user using the image acquisition device; and extracting the sign information of the user from the image of the user.
In this embodiment, the control system further includes an image acquisition device, which acquires an image of the user as needed.
In one implementation, the image acquisition device may be deployed at the entrance of the target space. Since the user must pass through the entrance, an image acquisition device deployed there can reliably capture an image of the user.
In practical applications, a reference object may be set within the acquisition range of the image acquisition device so that the acquired image of the user includes the reference object; the controller 50 may then perform image analysis on the acquired image and determine the user's sign information with the help of the reference object.
For example, a height scale may be provided at the entrance. As the user passes the entrance, the image acquisition device captures an image including both the user and the scale, and the controller 50 can analyze which scale marking the top of the user's head is level with to determine the user's height.
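A sketch of that scale-reading idea: given the pixel row of the top of the user's head and the pixel rows of the scale's graduations (whose real-world values are known from calibration), the user's height is read from the graduation the head is closest to. The graduation layout and pixel values are assumptions for illustration.

```python
def height_from_scale(head_top_row, graduations):
    """graduations maps an image pixel row to the scale marking (cm) drawn there."""
    # Take the graduation whose pixel row is closest to the top of the head.
    closest_row = min(graduations, key=lambda row: abs(row - head_top_row))
    return graduations[closest_row]

# Assumed calibration: 5 cm graduations between 150 cm and 190 cm drawn at
# known pixel rows in the entrance camera's image.
scale = {420: 190, 460: 185, 500: 180, 540: 175, 580: 170,
         620: 165, 660: 160, 700: 155, 740: 150}
print(height_from_scale(head_top_row=605, graduations=scale))  # -> 165
```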
Of course, different reference objects may be deployed for different signs, and a reference object is not always required; some signs, such as gender, can be determined without one.
Of course, in this embodiment, other implementations may also be used to determine the user's sign information; for example, an infrared sensing device may be used to collect sign information such as height or weight, and different kinds of sign information may be collected by different devices. The manner of determining the user's sign information in this embodiment is by no means limited thereto.
In the above or the following embodiments, before extracting the user's sign information from the user's image, the controller 50 may also identify the user to determine whether the user is a registered user; if the user is a registered user, the sign information is extracted from the user information corresponding to that registered user; if the user is a non-registered user, the operation of extracting the sign information from the user's image is performed.
In this embodiment, a pre-registration scheme may be provided. The pre-registration scheme is particularly suitable for a behavior scene in which the user group is relatively fixed, for example a meeting bar or meeting room within an enterprise, where the user group consists of the enterprise's employees. The user may register before using the facilities 60 and submit user information such as identity information and sign information. The identity information may be a face image, fingerprint data, and the like.
Accordingly, in this embodiment, the controller 50 can identify the user to determine whether the user is a registered user. An identification device, such as a face recognition device or a fingerprint acquisition device, may be deployed in the control system, and the controller 50 may use this identification device to identify the user and thereby determine whether the user is registered.
For registered users, sign information may be extracted from user information of the registered users. In this way, the controller 50 can eliminate the need to perform the operation of extracting the sign information of the user from the image of the user mentioned in the foregoing embodiment.
For non-registered users, the controller 50 may then perform the operations mentioned in the previous embodiments to extract the sign information of the user from the image of the user, so as to obtain the sign information of the user.
Of course, the identification process mentioned in this embodiment is not essential. Where the user group is not fixed, the group may be too large to register in advance, and individual users may enter the target space only rarely; in view of these factors, the identification process provided in this embodiment may be omitted.
In the above or the following embodiments, the types of adjustment parameters that can be adjusted may be the same or different for different types of facilities 60. The types of adjustment parameters include, but are not limited to, height, hardness, inclination, and the like. For example, the adjustment parameter corresponding to a microphone may include height, the adjustment parameter corresponding to a sofa may include hardness, and the adjustment parameter corresponding to a table may include inclination.
Thus, to conserve processing resources, the controller 50 may focus on different dimensions of the sign information for different types of facilities 60 when determining each facility 60's adjustment parameter value. For example, for a microphone the user's height may be of primary concern, while for a sofa the user's weight may be of primary concern. Of course, the present embodiment is not limited thereto.
In this embodiment, based on the types of adjustment parameters that can be adjusted for different types of facilities 60, in one exemplary implementation the controller 50 may determine, from the user's sign information, the user's ergonomic requirement under each of the at least one facility 60, and then determine the adjustment parameter value of each facility 60 according to the corresponding ergonomic requirement.
In this implementation, the user's ergonomic requirements are not exactly the same for different facilities 60. The controller 50 may adjust each of the at least one facility 60 to a working state that meets the user's ergonomic requirement for that facility.
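In contrast to the multi-user compromise of the earlier embodiments, the single-user case maps one set of sign information to one adjustment value per facility. The sketch below uses a table of per-facility rules; the particular formulas (seat height from body height, sofa firmness from weight) are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical per-facility ergonomic rules; each maps sign info to one value.
ERGONOMIC_RULES = {
    "chair": lambda signs: round(signs["height_cm"] * 0.25),  # seat height in cm
    "table": lambda signs: round(signs["height_cm"] * 0.43),  # tabletop height in cm
    "sofa":  lambda signs: "firm" if signs["weight_kg"] > 80 else "soft",
}

def adjustment_values_for_user(signs, facilities):
    """One adjustment value per facility, derived from a single user's signs."""
    return {f: ERGONOMIC_RULES[f](signs) for f in facilities if f in ERGONOMIC_RULES}

print(adjustment_values_for_user({"height_cm": 165, "weight_kg": 58},
                                 ["chair", "table", "sofa"]))
# -> {'chair': 41, 'table': 71, 'sofa': 'soft'}
```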
Of course, in this embodiment, other implementations may also be used to determine the adjustment parameter value corresponding to each facility 60. Moreover, the processing logic employed by the controller 50 in determining these adjustment parameter values may be flexibly adjusted according to factors such as the type of facility 60 and the type of physical sign of interest. This embodiment is not limited thereto.
In either the above or the below embodiments, at least one of the facilities 60 may each be associated with a drive assembly. The controller 50 may generate control commands corresponding to the at least one facility 60 according to the adjustment parameter values of the at least one facility 60, respectively; and respectively sending the control commands corresponding to the at least one facility 60 to the driving components associated with the at least one facility 60 to control the driving components associated with the at least one facility 60 to adjust the working state of the at least one facility 60 according to the adjusting parameter values of the at least one facility 60.
The components contained in the drive assembly may differ for different types of facilities 60 and for different types of adjustment parameters.
For example, where the height of a facility 60 needs to be adjusted, the drive assembly may include a PLC (programmable logic controller) and a pneumatic cylinder; the cylinder may be connected to the facility 60, and the PLC may be communicatively coupled to the controller 50. In response to the control command sent by the controller 50, the PLC drives the cylinder according to the adjustment parameter value carried in the command, so as to adjust the height of the facility 60.
Of course, this is merely exemplary, and the implementation of the driving assembly is not limited in this embodiment. The drive assembly can be deployed according to actual requirements.
Fig. 6 is a schematic flow chart of a carrier control method according to another exemplary embodiment of the present application.
As shown in fig. 6, the method includes:
step 600, determining a sign difference between at least two users in the same behavior scene;
Step 601, respectively determining adjustment parameter values corresponding to behavior carriers used by at least two users according to the sign differences;
Step 602, adjusting the working state of each behavior carrier according to the adjustment parameter value so as to enable the behavior states of at least two users to be mutually adapted;
The working state of the behavior carrier influences the behavior state of a user borne by the behavior carrier.
In an alternative embodiment, the step of determining the sign difference between at least two users in the same behavioral scenario comprises:
acquiring images of at least two users by using an image acquisition device;
extracting respective sign information of at least two users from images of the at least two users;
And determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In an alternative embodiment, the step of capturing images of at least two users with an image capturing device comprises:
and respectively acquiring, by using the image acquisition device, images of the at least two users at the entrance of the environment space corresponding to the behavior scene.
In an alternative embodiment, the step of capturing images of at least two users with an image capturing device comprises:
in an initial stage in which the at least two users are borne on the corresponding behavior carriers, acquiring, by using an image acquisition device, an image containing the at least two users;
wherein, in the initial stage in which the at least two users are borne on the corresponding behavior carriers, each behavior carrier is in an initial working state.
In an alternative embodiment, before extracting the respective sign information of the at least two users from the images of the at least two users, the steps further include:
identifying at least two users to determine whether registered users exist in the at least two users;
for a registered user, extracting the sign information from the user information corresponding to the registered user;
for a non-registered user, an operation of extracting respective sign information of at least two users from images of the at least two users is performed.
In an alternative embodiment, before determining the adjustment parameter values corresponding to the behavior carriers used by the at least two users respectively according to the sign differences, the steps further include:
determining the relative positions between at least two users and each behavior carrier in a preparation stage of the behavior scene;
Determining the behavior carriers selected by at least two users according to the relative positions between the at least two users and each behavior carrier;
wherein the at least two users are respectively seated on the behavior carriers they have selected during the preparation stage.
In an alternative embodiment, the step of determining, according to the sign differences, the adjustment parameter values corresponding to the behavior carriers used by the at least two users includes:
acquiring respective sign information of at least two users, and determining respective ergonomic requirements of the at least two users according to the sign information;
For the behavior carriers used by the at least two users, respectively determining the numerical ranges of the adjustment parameters conforming to the ergonomic requirements of the users;
and selecting, with the mutual adaptation of the behavior states of the at least two users as the goal, target adjustment parameter values from the value ranges of the adjustment parameters corresponding to the behavior carriers used by the at least two users, and using the target adjustment parameter values as the adjustment parameter values corresponding to the behavior carriers used by the at least two users.
In an alternative embodiment, each behavior carrier is associated with a driving component, and the step of adjusting the working state of each behavior carrier according to the adjustment parameter value includes:
According to the adjustment parameter values corresponding to the behavior carriers used by at least two users, respectively generating control commands corresponding to the behavior carriers;
and respectively sending the control commands corresponding to the behavior carriers to the driving components associated with the behavior carriers so as to control the driving components to adjust the working states of the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers.
In an alternative embodiment, the behavior carrier comprises one or more of a chair, a table, a sofa, or a lift.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the sign comprises one or more of height, weight, or gender.
In an alternative embodiment, the behavioral scenes include one or more of chat scenes, dining scenes, meeting scenes, or entertainment scenes.
It should be noted that, for the technical details related to the embodiments of the control method, reference may be made to the related description about the controller in the related embodiment of the control system shown in fig. 1a, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of the protection scope of the present application.
Fig. 7 is a schematic structural diagram of a computing device according to another exemplary embodiment of the present application. As shown in fig. 7, the computing device includes a memory 70 and a processor 71.
A processor 71 coupled to the memory 70 for executing a computer program in the memory 70 for:
determining a sign difference between at least two users in the same behavior scene;
according to the sign difference, respectively determining adjustment parameter values corresponding to behavior carriers used by at least two users;
according to the adjustment parameter values, the working state of each behavior carrier is adjusted so as to enable the behavior states of at least two users to be mutually adapted;
The working state of the behavior carrier influences the behavior state of a user borne by the behavior carrier.
In an alternative embodiment, the processor 71, when determining the sign difference between at least two users in the same behavioral scenario, is configured to:
acquiring images of at least two users by using an image acquisition device;
extracting respective sign information of at least two users from images of the at least two users;
And determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In an alternative embodiment, processor 71, when capturing images of at least two users using the image capturing device, is configured to:
and respectively acquiring, by using the image acquisition device, images of the at least two users at the entrance of the environment space corresponding to the behavior scene.
In an alternative embodiment, processor 71, when capturing images of at least two users using the image capturing device, is configured to:
in an initial stage in which the at least two users are borne on the corresponding behavior carriers, acquiring, by using an image acquisition device, an image containing the at least two users;
wherein, in the initial stage in which the at least two users are borne on the corresponding behavior carriers, each behavior carrier is in an initial working state.
In an alternative embodiment, the processor 71 is further configured to, prior to extracting the respective sign information of the at least two users from the images of the at least two users:
identifying at least two users to determine whether registered users exist in the at least two users;
for a registered user, extracting the sign information from the user information corresponding to the registered user;
for a non-registered user, an operation of extracting respective sign information of at least two users from images of the at least two users is performed.
In an alternative embodiment, the processor 71 is further configured to, before determining, according to the sign differences, the adjustment parameter values corresponding to the behavior carriers used by the at least two users:
determining the relative positions between at least two users and each behavior carrier in a preparation stage of the behavior scene;
Determining the behavior carriers selected by at least two users according to the relative positions between the at least two users and each behavior carrier;
wherein the at least two users are respectively seated on the behavior carriers they have selected during the preparation stage.
In an alternative embodiment, the processor 71 is configured to, when determining, according to the sign differences, the adjustment parameter values corresponding to the behavior carriers used by the at least two users:
acquiring respective sign information of at least two users, and determining respective ergonomic requirements of the at least two users according to the sign information;
For the behavior carriers used by the at least two users, respectively determining the numerical ranges of the adjustment parameters conforming to the ergonomic requirements of the users;
and selecting, with the mutual adaptation of the behavior states of the at least two users as the goal, target adjustment parameter values from the value ranges of the adjustment parameters corresponding to the behavior carriers used by the at least two users, and using the target adjustment parameter values as the adjustment parameter values corresponding to the behavior carriers used by the at least two users.
In an alternative embodiment, each behavior carrier is associated with a driving component, and the processor 71 is configured to, when adjusting the working state of each behavior carrier according to the adjustment parameter value:
According to the adjustment parameter values corresponding to the behavior carriers used by at least two users, respectively generating control commands corresponding to the behavior carriers;
and respectively sending the control commands corresponding to the behavior carriers to the driving components associated with the behavior carriers so as to control the driving components to adjust the working states of the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers.
In an alternative embodiment, the behavior carrier comprises one or more of a chair, a table, a sofa, or a lift.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the sign comprises one or more of height, weight, or gender.
In an alternative embodiment, the behavioral scenes include one or more of chat scenes, dining scenes, meeting scenes, or entertainment scenes.
It should be noted that, for the technical details related to the embodiments of the computing device, reference may be made to the related description of the controller in the related embodiment of the control system shown in fig. 1a, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of the protection scope of the present application.
Further, as shown in fig. 7, the computing device further includes: communication component 72, power component 73, and the like. Only some of the components are schematically shown in fig. 7, which does not mean that the computing device only includes the components shown in fig. 7.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed, is capable of implementing the steps of the method embodiments described above that are executable by a computing device.
Fig. 8 is a flow chart of another facility control method according to still another exemplary embodiment of the present application.
As shown in fig. 8, the method includes:
Step 800, determining a sign difference between at least two users using the same target facility;
Step 801, determining an adjustment parameter value corresponding to a target facility according to the sign difference between at least two users;
Step 802, adjusting the working state of the target facility according to the adjustment parameter value so as to adapt to the physical signs of at least two users.
In an alternative embodiment, the step of determining the difference in sign between at least two users using the same target facility comprises:
acquiring images of at least two users by using an image acquisition device;
extracting respective sign information of at least two users from images of the at least two users;
And determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In an alternative embodiment, the step of capturing images of at least two users with an image capturing device comprises:
and respectively acquiring, by using the image acquisition device, images of the at least two users at the entrance of the environment space where the target facility is located.
In an alternative embodiment, the step of capturing images of at least two users with an image capturing device comprises:
With at least two users in place at respective facility use locations, images containing the at least two users are acquired with an image acquisition device.
In an alternative embodiment, before extracting the respective sign information of the at least two users from the images of the at least two users, the steps further include:
identifying at least two users to determine whether registered users exist in the at least two users;
for a registered user, extracting the sign information from the user information corresponding to the registered user;
for a non-registered user, an operation of extracting respective sign information of at least two users from images of the at least two users is performed.
In an alternative embodiment, the step of determining the adjustment parameter value corresponding to the target facility according to the sign difference between the at least two users includes:
acquiring respective sign information of at least two users, and determining respective ergonomic requirements of the at least two users according to the sign information;
respectively determining the numerical ranges of the adjustment parameters meeting the ergonomic requirements of at least two users;
with the goal of adapting to the physical signs of the at least two users, respectively selecting target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users;
And determining the adjustment parameter value corresponding to the target facility according to the target adjustment parameter values corresponding to at least two users.
In an alternative embodiment, the target facility is associated with a driving component, and the step of adjusting the operating state of the target facility according to the adjustment parameter value includes:
Generating a control command according to the adjustment parameter value corresponding to the target facility;
and sending the control command to a driving component associated with the target facility so as to control the driving component to adjust the working state of the target facility according to the adjustment parameter value corresponding to the target facility.
In an alternative embodiment, the target facility includes one or more of a camera, a display screen, a table, or a microphone.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the sign comprises one or more of height, weight, or gender.
It should be noted that, for the technical details related to the embodiments of the control method, reference may be made to the related description about the controller in the related embodiment of the control system shown in fig. 3a, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of the protection scope of the present application.
Fig. 9 is a schematic structural diagram of another computing device according to still another exemplary embodiment of the present application. As shown in fig. 9, the computing device includes: a memory 90 and a processor 91.
A processor 91 coupled to the memory 90 for executing a computer program in the memory 90 for:
Determining a sign difference between at least two users using the same target facility;
determining an adjustment parameter value corresponding to the target facility according to the sign difference between at least two users;
and adjusting the working state of the target facility according to the adjustment parameter value so as to adapt to the physical signs of at least two users.
In an alternative embodiment, the processor 91, when determining the difference in sign between at least two users using the same target facility, is configured to:
acquiring images of at least two users by using an image acquisition device;
extracting respective sign information of at least two users from images of the at least two users;
And determining the sign difference between the at least two users according to the sign information of each of the at least two users.
In an alternative embodiment, processor 91, when capturing images of at least two users using an image capturing device, is configured to:
and respectively acquiring, by using the image acquisition device, images of the at least two users at the entrance of the environment space where the target facility is located.
In an alternative embodiment, processor 91, when capturing images of at least two users using an image capturing device, is configured to:
With at least two users in place at respective facility use locations, images containing the at least two users are acquired with an image acquisition device.
In an alternative embodiment, the processor 91 is further configured to, prior to extracting the respective sign information of the at least two users from the images of the at least two users:
identifying at least two users to determine whether registered users exist in the at least two users;
for a registered user, extracting the sign information from the user information corresponding to the registered user;
for a non-registered user, an operation of extracting respective sign information of at least two users from images of the at least two users is performed.
In an alternative embodiment, the processor 91 is configured to, when determining the adjustment parameter value corresponding to the target facility based on the difference in sign between the at least two users:
acquiring respective sign information of at least two users, and determining respective ergonomic requirements of the at least two users according to the sign information;
respectively determining the numerical ranges of the adjustment parameters meeting the ergonomic requirements of at least two users;
with the goal of adapting to the physical signs of the at least two users, respectively selecting target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users;
And determining the adjustment parameter value corresponding to the target facility according to the target adjustment parameter values corresponding to at least two users.
In an alternative embodiment, the target facility has associated with it a drive assembly, and the processor 91 is configured to, when adjusting the operating state of the target facility in accordance with the adjustment parameter value:
Generating a control command according to the adjustment parameter value corresponding to the target facility;
and sending the control command to a driving component associated with the target facility so as to control the driving component to adjust the working state of the target facility according to the adjustment parameter value corresponding to the target facility.
In an alternative embodiment, the target facility includes one or more of a camera, a display screen, a table, or a microphone.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the sign comprises one or more of height, weight, or gender.
It should be noted that, for the technical details related to the embodiments of the computing device, reference may be made to the related description of the controller in the related embodiment of the control system shown in fig. 3a, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of the protection scope of the present application.
Further, as shown in fig. 9, the computing device further includes: a communication component 92, a power supply component 93, and the like. Only some of the components are schematically shown in fig. 9, which does not mean that the computing device only includes the components shown in fig. 9.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed, is capable of implementing the steps of the method embodiments described above that are executable by a computing device.
Fig. 10 is a flow chart of yet another facility control method according to yet another exemplary embodiment of the present application.
As shown in fig. 10, the method includes:
step 100, acquiring sign information of a user entering a target space;
Step 101, determining an adjustment parameter value of at least one facility in a target space according to sign information of a user;
Step 102, respectively adjusting the working state of at least one facility according to the adjustment parameter value of the at least one facility so as to adapt to the sign information of the user.
In an alternative embodiment, the step of acquiring the sign information of the user entering the target space comprises:
Collecting an image of a user by using an image collecting device;
and extracting the sign information of the user from the image of the user.
In an alternative embodiment, the step of capturing an image of the user with the image capturing device comprises:
An image of a user is acquired at an entrance to a target space using an image acquisition device.
In an alternative embodiment, before extracting the sign information of the user from the image of the user, the steps further include:
Identifying the user to determine whether the user is a registered user;
If the user is a registered user, extracting sign information from user information corresponding to the registered user;
And if the user is a non-registered user, executing the operation of extracting the sign information of the user from the image of the user.
In an alternative embodiment, the step of determining the adjustment parameter value of at least one facility in the target space according to the sign information of the user includes:
respectively determining the corresponding ergonomic requirements of the user under at least one facility according to the physical sign information of the user;
and respectively determining the respective adjustment parameter values of the at least one facility according to the respective ergonomic requirements of the at least one facility.
In an alternative embodiment, the at least one facility is associated with a driving assembly, and the step of adjusting the operating state of the at least one facility according to the adjustment parameter value of the at least one facility includes:
respectively generating control commands corresponding to at least one facility according to the adjustment parameter values of the at least one facility;
and respectively sending the control commands corresponding to the at least one facility to the driving components associated with the at least one facility so as to control the driving components associated with the at least one facility to adjust the working state of the at least one facility according to the adjustment parameter values of the at least one facility.
In an alternative embodiment, the facility is a chair, sofa, camera, display, table, or microphone.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the vital sign information includes one or more of height, weight, or gender information.
In an alternative embodiment, the target space is a restaurant, conference room, chat room, or entertainment room.
It should be noted that, for the technical details related to the embodiments of the control method, reference may be made to the related description about the controller in the related embodiment of the control system shown in fig. 5, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of the protection scope of the present application.
FIG. 11 is a schematic diagram of a further computing device according to a further exemplary embodiment of the present application. As shown in fig. 11, the computing device includes a memory 110 and a processor 111.
A processor 111 coupled to the memory 110 for executing a computer program in the memory 110 for:
acquiring sign information of a user entering a target space;
determining an adjustment parameter value of at least one facility in the target space according to the sign information of the user;
and respectively adjusting the working state of at least one facility according to the adjusting parameter value of the at least one facility so as to adapt to the sign information of the user.
In an alternative embodiment, the processor 111, when acquiring the sign information of the user entering the target space, is configured to:
Collecting an image of a user by using an image collecting device;
and extracting the sign information of the user from the image of the user.
In an alternative embodiment, processor 111, when capturing an image of a user with an image capturing device, is configured to:
An image of a user is acquired at an entrance to a target space using an image acquisition device.
In an alternative embodiment, the processor 111 is further configured to, prior to extracting the user's sign information from the user's image:
Identifying the user to determine whether the user is a registered user;
If the user is a registered user, extracting sign information from user information corresponding to the registered user;
And if the user is a non-registered user, executing the operation of extracting the sign information of the user from the image of the user.
In an alternative embodiment, the processor 111 is configured to, when determining the adjustment parameter value of at least one facility in the target space according to the sign information of the user:
respectively determining the corresponding ergonomic requirements of the user under at least one facility according to the physical sign information of the user;
and respectively determining the respective adjustment parameter values of the at least one facility according to the respective ergonomic requirements of the at least one facility.
In an alternative embodiment, the at least one facility is associated with a driving assembly, and the processor 111 is configured to, when adjusting the operating state of the at least one facility according to the adjustment parameter values of the at least one facility, respectively:
respectively generating control commands corresponding to at least one facility according to the adjustment parameter values of the at least one facility;
and respectively sending the control commands corresponding to the at least one facility to the driving components associated with the at least one facility so as to control the driving components associated with the at least one facility to adjust the working state of the at least one facility according to the adjustment parameter values of the at least one facility.
In an alternative embodiment, the facility is a chair, sofa, camera, display, table, or microphone.
In an alternative embodiment, the type of adjustment parameter includes one or more of height, inclination, or hardness.
In an alternative embodiment, the vital sign information includes one or more of height, weight, or gender information.
In an alternative embodiment, the target space is a restaurant, conference room, chat room, or entertainment room.
It should be noted that, for the technical details related to the embodiments of the computing device, reference should be made to the related description about the controller in the related embodiment of the control system shown in fig. 5, which is not described in detail herein for the sake of brevity, but should not be construed as a loss of protection scope of the present application.
Further, as shown in fig. 11, the computing device further includes: a communication component 112, a power supply component 113, and the like. Only some of the components are schematically shown in fig. 11, which does not mean that the computing device only includes the components shown in fig. 11.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed, is capable of implementing the steps of the method embodiments described above that are executable by a computing device.
It should be noted that, the execution subjects of each step of the method provided in the above embodiment may be the same device, or the method may also be executed by different devices.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations appearing in a specific order are included, but it should be clearly understood that the operations may be performed out of the order in which they appear herein or performed in parallel, the sequence numbers of the operations such as 800, 801, etc. are merely used to distinguish between the various operations, and the sequence numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first" and "second" herein are used to distinguish different messages, devices, modules, etc., and do not represent a sequence, and are not limited to the "first" and the "second" being different types.
Wherein the memories of fig. 7,9 and 11 are used to store computer programs and may be configured to store various other data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on a computing platform, contact data, phonebook data, messages, pictures, videos, and the like. The memory may be implemented by any type of volatile or nonvolatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
Wherein the communication components of fig. 7, 9 and 11 are configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device where the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G/LTE, 5G, etc., or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further comprises a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The power supply assembly in fig. 7, 9 and 11 provides power for various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices in which the power components are located.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
Computer readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing descriptions are merely embodiments of the present application and are not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (34)

1. A carrier control method, characterized by comprising:
determining a sign difference between at least two users in the same behavior scene;
acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information;
for the behavior carriers used by the at least two users, respectively determining numerical ranges of adjustment parameters that conform to the ergonomic requirements of the users;
selecting, with the goal of mutually adapting the behavior states of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the behavior carriers used by the at least two users, respectively, as the adjustment parameter values corresponding to the behavior carriers used by the at least two users;
adjusting working states of the behavior carriers according to the adjustment parameter values, so that the behavior states of the at least two users are mutually adapted;
wherein the working state of a behavior carrier influences the behavior state of the user carried by the behavior carrier.
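
To make the selection step of claim 1 concrete, the following Python sketch shows one possible reading. It is only an illustration under assumed data: the sign information (body height), the ergonomic seat-height rule, and the seated eye-height model are hypothetical simplifications introduced here, not values taken from the patent.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    height_cm: float  # sign information (hypothetical input)

def ergonomic_seat_range(user: User) -> tuple[float, float]:
    # Hypothetical ergonomic rule: a comfortable seat height is
    # roughly 23%-27% of body height.
    return 0.23 * user.height_cm, 0.27 * user.height_cm

def seated_eye_height(user: User, seat_height: float) -> float:
    # Hypothetical body model: the seated torso and head add about
    # 48% of body height above the seat.
    return seat_height + 0.48 * user.height_cm

def select_seat_heights(users: list[User]) -> dict[str, float]:
    # Pick a seat height inside each user's ergonomic range so that the
    # users' seated eye heights end up as close together as possible
    # (the mutually adapted behavior states of claim 1).
    ranges = {u.name: ergonomic_seat_range(u) for u in users}
    eye_ranges = [(seated_eye_height(u, ranges[u.name][0]),
                   seated_eye_height(u, ranges[u.name][1])) for u in users]
    # Aim for an eye height every user can reach, or the best compromise.
    target = (max(lo for lo, _ in eye_ranges) + min(hi for _, hi in eye_ranges)) / 2
    result = {}
    for u in users:
        lo, hi = ranges[u.name]
        wanted = target - 0.48 * u.height_cm       # seat height giving the target eye height
        result[u.name] = min(max(wanted, lo), hi)  # clamp into the ergonomic range
    return result

print(select_seat_heights([User("tall_user", 190.0), User("short_user", 155.0)]))

When the two ranges cannot yield equal eye heights, the sketch simply pushes each chair as far toward the other user as its ergonomic range allows, mirroring the constraint in claim 1 that the target values stay inside the ergonomically acceptable ranges.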
2. The method of claim 1, wherein the determining a sign difference between at least two users in the same behavior scene comprises:
acquiring images of the at least two users by using an image acquisition device;
extracting sign information of each of the at least two users from the images of the at least two users;
and determining the sign difference between the at least two users according to the sign information of each of the at least two users.
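
As an illustration of the image-based determination in claim 2, a minimal sketch follows. The pixel-height measurement and the calibration factor of the entrance camera are hypothetical stand-ins for whatever computer-vision pipeline an implementation would actually use; neither is specified by the patent.

def estimate_height_cm(person_pixel_height: float, cm_per_pixel: float) -> float:
    # Hypothetical estimator: scale the detected pixel height by a
    # calibration factor of an assumed fixed entrance camera.
    return person_pixel_height * cm_per_pixel

def pairwise_sign_differences(pixel_heights: dict[str, float],
                              cm_per_pixel: float = 0.25) -> dict[tuple[str, str], float]:
    # Extract per-user sign information (height) and compute the pairwise
    # differences that later drive the carrier adjustment.
    heights = {u: estimate_height_cm(p, cm_per_pixel) for u, p in pixel_heights.items()}
    users = sorted(heights)
    return {(a, b): abs(heights[a] - heights[b])
            for i, a in enumerate(users) for b in users[i + 1:]}

# Pixel heights measured from an entrance image (made-up numbers).
print(pairwise_sign_differences({"user_a": 720.0, "user_b": 610.0}))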
3. The method of claim 2, wherein the acquiring images of the at least two users by using the image acquisition device comprises:
acquiring, by using the image acquisition device, the images of the at least two users respectively at an entrance of an environment space corresponding to the behavior scene.
4. The method of claim 2, wherein the acquiring images of the at least two users by using the image acquisition device comprises:
acquiring, by using the image acquisition device, an image containing the at least two users at an initial stage at which the at least two users are carried on the corresponding behavior carriers;
wherein, at the initial stage at which the at least two users are carried on the corresponding behavior carriers, each behavior carrier is in an initial working state.
5. The method of claim 2, wherein before the extracting sign information of each of the at least two users from the images of the at least two users, the method further comprises:
identifying the at least two users to determine whether a registered user exists among the at least two users;
for a registered user, extracting the sign information from user information corresponding to the registered user;
and for a non-registered user, performing the operation of extracting the sign information of each of the at least two users from the images of the at least two users.
6. The method of claim 1, wherein before the determining the adjustment parameter values corresponding to the behavior carriers used by the at least two users according to the sign difference, the method further comprises:
determining relative positions between the at least two users and each behavior carrier in a preparation stage of the behavior scene;
determining the behavior carriers selected by the at least two users according to the relative positions between the at least two users and the behavior carriers;
wherein, in the preparation stage, the at least two users are respectively seated on the behavior carriers they selected.
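
The relative-position step of claim 6 amounts to matching each user to the nearest unoccupied carrier. A minimal sketch follows, assuming 2-D positions in meters taken from the preparation-stage image; the coordinates and names are invented for illustration.

import math

def assign_carriers(user_positions: dict[str, tuple[float, float]],
                    carrier_positions: dict[str, tuple[float, float]]) -> dict[str, str]:
    # Greedy nearest-carrier matching: pair each user with the closest
    # carrier that has not been taken yet.
    free = dict(carrier_positions)
    assignment = {}
    for user, (ux, uy) in user_positions.items():
        nearest = min(free, key=lambda c: math.hypot(free[c][0] - ux, free[c][1] - uy))
        assignment[user] = nearest
        del free[nearest]
    return assignment

# Hypothetical layout of a chat room: two users, two chairs.
print(assign_carriers({"user_a": (1.0, 2.0), "user_b": (3.5, 2.2)},
                      {"chair_1": (1.2, 2.1), "chair_2": (3.4, 2.0)}))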
7. The method of claim 1, wherein each behavior carrier is associated with a driving component, and the adjusting the working states of the behavior carriers according to the adjustment parameter values comprises:
respectively generating control commands corresponding to the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers used by the at least two users;
and respectively sending the control commands corresponding to the behavior carriers to the driving components associated with the behavior carriers, so as to control the driving components to adjust the working states of the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers.
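
Claim 7 splits the adjustment into generating one control command per carrier and handing it to that carrier's driving component. The sketch below shows that dispatch step; the command format and the DrivingComponent interface are assumptions made for illustration, not part of the patent.

from dataclasses import dataclass

@dataclass
class ControlCommand:
    carrier_id: str
    parameter: str       # e.g. "height"
    target_value: float

class DrivingComponent:
    # Stand-in for the actuator attached to one behavior carrier.
    def __init__(self, carrier_id: str):
        self.carrier_id = carrier_id
        self.state: dict[str, float] = {}

    def apply(self, command: ControlCommand) -> None:
        # A real component would drive a motor; here we only record the state.
        self.state[command.parameter] = command.target_value
        print(f"{self.carrier_id}: {command.parameter} -> {command.target_value}")

def adjust_carriers(parameter_values: dict[str, float],
                    drivers: dict[str, DrivingComponent],
                    parameter: str = "height") -> None:
    # Generate one control command per carrier and send it to the
    # associated driving component.
    for carrier_id, value in parameter_values.items():
        drivers[carrier_id].apply(ControlCommand(carrier_id, parameter, value))

drivers = {c: DrivingComponent(c) for c in ("chair_1", "chair_2")}
adjust_carriers({"chair_1": 43.7, "chair_2": 41.9}, drivers)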
8. The method of claim 1, wherein the behavior carrier comprises one or more of a chair, a table, a sofa, or a lift;
the type of the adjustment parameter includes one or more of height, inclination, or hardness;
the sign includes one or more of height, weight, or gender.
9. The method of claim 1, wherein the behavior scene comprises one or more of a chat scene, a dining scene, a meeting scene, or an entertainment scene.
10. A facility control method, characterized by comprising:
determining a sign difference between at least two users using the same target facility;
acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information;
respectively determining numerical ranges of adjustment parameters that meet the ergonomic requirements of the at least two users;
selecting, with the goal of adapting to the signs of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users, respectively;
determining an adjustment parameter value corresponding to the target facility according to the target adjustment parameter values corresponding to the at least two users;
and adjusting a working state of the target facility according to the adjustment parameter value, so as to adapt to the signs of the at least two users.
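
The selection in claim 10 differs from claim 1 in that a single facility has to suit both users at once. One simple reading, sketched below with made-up table-height ranges, is to take the middle of the intersection of the two users' acceptable ranges when it exists, and an even compromise between the ranges otherwise.

def shared_facility_value(range_a: tuple[float, float],
                          range_b: tuple[float, float]) -> float:
    # Pick one adjustment parameter value for a facility used by two users.
    lo = max(range_a[0], range_b[0])
    hi = min(range_a[1], range_b[1])
    # If the ranges overlap this is the middle of the overlap; if they do
    # not, it is the midpoint of the gap between them, i.e. a compromise.
    return (lo + hi) / 2

# Hypothetical table-height ranges (cm) derived from each user's signs.
print(shared_facility_value((72.0, 78.0), (75.0, 82.0)))  # overlapping ranges -> 76.5
print(shared_facility_value((70.0, 73.0), (78.0, 82.0)))  # disjoint ranges -> 75.5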
11. The method of claim 10, wherein the determining a sign difference between at least two users using the same target facility comprises:
acquiring images of the at least two users by using an image acquisition device;
extracting sign information of each of the at least two users from the images of the at least two users;
and determining the sign difference between the at least two users according to the sign information of each of the at least two users.
12. The method of claim 11, wherein the acquiring images of the at least two users by using the image acquisition device comprises:
acquiring, by using the image acquisition device, the images of the at least two users respectively at an entrance of an environment space where the target facility is located.
13. The method of claim 11, wherein the acquiring images of the at least two users by using the image acquisition device comprises:
acquiring, by using the image acquisition device, an image containing the at least two users when the at least two users are in place at their respective facility use locations.
14. The method of claim 11, wherein before the extracting sign information of each of the at least two users from the images of the at least two users, the method further comprises:
identifying the at least two users to determine whether a registered user exists among the at least two users;
for a registered user, extracting the sign information from user information corresponding to the registered user;
and for a non-registered user, performing the operation of extracting the sign information of each of the at least two users from the images of the at least two users.
15. The method of claim 10, wherein the target facility is associated with a driving component, and the adjusting the working state of the target facility according to the adjustment parameter value comprises:
generating a control command according to the adjustment parameter value corresponding to the target facility;
and sending the control command to the driving component associated with the target facility, so as to control the driving component to adjust the working state of the target facility according to the adjustment parameter value corresponding to the target facility.
16. The method of claim 10, wherein the target facility comprises one or more of a camera, a display screen, a table, or a microphone;
the type of the adjustment parameter includes one or more of height, inclination, or hardness;
the sign includes one or more of height, weight, or gender.
17. A facility control method, characterized by comprising:
acquiring sign information of a user entering a target space;
respectively determining, according to the sign information of the user, ergonomic requirements of the user corresponding to various facilities in the target space;
respectively determining adjustment parameter values corresponding to the various facilities according to the ergonomic requirements corresponding to the various facilities;
and respectively adjusting working states of the various facilities according to the adjustment parameter values of the various facilities, so as to adapt to the sign information of the user.
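
Claim 17 handles one user and several facility types at once: the same sign information is run through a separate ergonomic rule per facility. Below is a minimal sketch under assumed rules; the fractions of body height used for each facility type are illustrative guesses, not figures from the patent.

# Hypothetical ergonomic rules: target value as a fraction of body height.
ERGONOMIC_RULES = {
    "chair_height_cm": 0.25,
    "table_height_cm": 0.45,
    "display_height_cm": 0.70,
    "microphone_height_cm": 0.78,
}

def facility_adjustments(user_height_cm: float,
                         rules: dict[str, float] = ERGONOMIC_RULES) -> dict[str, float]:
    # Derive one adjustment parameter value per facility in the target
    # space from a single user's sign information.
    return {facility: round(fraction * user_height_cm, 1)
            for facility, fraction in rules.items()}

print(facility_adjustments(172.0))  # e.g. {'chair_height_cm': 43.0, 'table_height_cm': 77.4, ...}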
18. The method of claim 17, wherein the acquiring sign information of a user entering the target space comprises:
acquiring an image of the user by using an image acquisition device;
and extracting the sign information of the user from the image of the user.
19. The method of claim 18, wherein the acquiring an image of the user by using the image acquisition device comprises:
acquiring the image of the user at an entrance of the target space by using the image acquisition device.
20. The method of claim 18, wherein before the extracting the sign information of the user from the image of the user, the method further comprises:
identifying the user to determine whether the user is a registered user;
if the user is a registered user, extracting the sign information from user information corresponding to the registered user;
and if the user is a non-registered user, performing the operation of extracting the sign information of the user from the image of the user.
21. The method of claim 17, wherein each of the various facilities is associated with a driving component, and the adjusting the working states of the various facilities according to the adjustment parameter values of the various facilities comprises:
respectively generating control commands corresponding to the various facilities according to the adjustment parameter values of the various facilities;
and respectively sending the control commands corresponding to the various facilities to the driving components associated with the various facilities, so as to control the driving components associated with the various facilities to adjust the working states of the various facilities according to the adjustment parameter values of the various facilities.
22. The method of claim 17, wherein the facility is a chair, a sofa, a camera, a display screen, a table, or a microphone;
the type of the adjustment parameter includes one or more of height, inclination, or hardness;
the sign information includes one or more of height, weight, or gender information;
and the target space is a restaurant, a conference room, a chat room, or an entertainment room.
23. A control system, comprising: a controller and at least two behavior carriers;
the at least two behavior carriers are used for carrying users;
The controller is used for determining a sign difference between at least two users in the same behavior scene; acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information; for the behavior carriers used by the at least two users, respectively determining numerical ranges of adjustment parameters that conform to the ergonomic requirements of the users; selecting, with the goal of mutually adapting the behavior states of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the behavior carriers used by the at least two users, respectively, as the adjustment parameter values corresponding to the behavior carriers used by the at least two users; and adjusting working states of the behavior carriers according to the adjustment parameter values, so that the behavior states of the at least two users are mutually adapted;
wherein the working state of a behavior carrier influences the behavior state of the user carried by the behavior carrier.
24. The system of claim 23, further comprising a detection component;
the detection component is used for detecting sign information of at least two users in the same behavior scene and providing the sign information to the controller so that the controller can determine the sign difference between the at least two users.
25. The system of claim 23, wherein each of the at least two behavior carriers is associated with a driving component;
the controller is used for respectively generating control commands corresponding to the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers used by the at least two users; and respectively sending the control commands corresponding to the behavior carriers to the driving components associated with the behavior carriers so as to control the driving components to adjust the working states of the behavior carriers according to the adjustment parameter values corresponding to the behavior carriers.
26. An intelligent facility, characterized by comprising a facility body, a processor and a driving component;
wherein the processor is used for determining a sign difference between at least two users in a behavior scene containing the user carried by the intelligent facility; acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information; for the intelligent facilities used by the at least two users, respectively determining numerical ranges of adjustment parameters that conform to the ergonomic requirements of the users; and selecting, with the goal of mutually adapting the behavior states of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the intelligent facilities used by the at least two users, respectively, as the adjustment parameter values corresponding to the intelligent facilities used by the at least two users;
and adjusting, by using the driving component, the working state of the intelligent facility according to the adjustment parameter values, so that the behavior state of the user carried by the intelligent facility is adapted to the behavior states of the other users in the behavior scene.
27. The intelligent facility of claim 26, wherein the processor is specifically configured to:
acquire sign information of each of the at least two users in the behavior scene by using a detection component corresponding to the behavior scene;
and determine the sign difference between the at least two users according to the sign information of each of the at least two users.
28. A control system, comprising: a controller and a target facility;
The target facility is used for providing facility services for users;
The controller is configured to: determine a sign difference between at least two users using the same target facility; acquire sign information of each of the at least two users, and determine an ergonomic requirement of each of the at least two users according to the sign information; respectively determine numerical ranges of adjustment parameters that meet the ergonomic requirements of the at least two users; select, with the goal of adapting to the signs of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users, respectively; determine an adjustment parameter value corresponding to the target facility according to the target adjustment parameter values corresponding to the at least two users; and adjust a working state of the target facility according to the adjustment parameter value, so as to adapt to the signs of the at least two users.
29. An intelligent facility, characterized by comprising a facility body, a processor and a driving component;
wherein the processor is configured to: determine a sign difference between at least two users using the intelligent facility; acquire sign information of each of the at least two users, and determine an ergonomic requirement of each of the at least two users according to the sign information; respectively determine numerical ranges of adjustment parameters that meet the ergonomic requirements of the at least two users; select, with the goal of adapting to the signs of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users, respectively; and determine an adjustment parameter value corresponding to the intelligent facility according to the target adjustment parameter values corresponding to the at least two users;
and adjust, by using the driving component, a working state of the intelligent facility according to the adjustment parameter value, so as to adapt to the signs of the at least two users.
30. A control system, comprising: a controller and a plurality of facilities located in the target space;
the plurality of facilities are used for providing facility services for users;
The controller is used for acquiring sign information of a user entering the target space; respectively determining, according to the sign information of the user, ergonomic requirements of the user corresponding to the various facilities; respectively determining adjustment parameter values corresponding to the various facilities according to the ergonomic requirements corresponding to the various facilities; and respectively adjusting working states of the various facilities according to the adjustment parameter values of the various facilities, so as to adapt to the sign information of the user.
31. A computing device comprising a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
determining a sign difference between at least two users in the same behavior scene;
acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information;
for the behavior carriers used by the at least two users, respectively determining numerical ranges of adjustment parameters that conform to the ergonomic requirements of the users;
selecting, with the goal of mutually adapting the behavior states of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the behavior carriers used by the at least two users, respectively, as the adjustment parameter values corresponding to the behavior carriers used by the at least two users;
adjusting working states of the behavior carriers according to the adjustment parameter values, so that the behavior states of the at least two users are mutually adapted;
wherein the working state of a behavior carrier influences the behavior state of the user carried by the behavior carrier.
32. A computing device comprising a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
determining a sign difference between at least two users using the same target facility;
acquiring sign information of each of the at least two users, and determining an ergonomic requirement of each of the at least two users according to the sign information;
respectively determining numerical ranges of adjustment parameters that meet the ergonomic requirements of the at least two users;
selecting, with the goal of adapting to the signs of the at least two users, target adjustment parameter values from the numerical ranges of the adjustment parameters corresponding to the at least two users, respectively;
determining an adjustment parameter value corresponding to the target facility according to the target adjustment parameter values corresponding to the at least two users;
and adjusting a working state of the target facility according to the adjustment parameter value, so as to adapt to the signs of the at least two users.
33. A computing device comprising a memory and a processor;
the memory is used for storing one or more computer instructions;
the processor is coupled to the memory for executing the one or more computer instructions for:
acquiring sign information of a user entering a target space;
respectively determining, according to the sign information of the user, ergonomic requirements of the user corresponding to various facilities in the target space;
respectively determining adjustment parameter values corresponding to the various facilities according to the ergonomic requirements corresponding to the various facilities;
and respectively adjusting working states of the various facilities according to the adjustment parameter values of the various facilities, so as to adapt to the sign information of the user.
34. A computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform the carrier control method of any one of claims 1-9 or the facility control method of any one of claims 10-22.
CN202010093466.6A 2020-02-14 2020-02-14 Carrier, facility control method, device, system and storage medium Active CN113268014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010093466.6A CN113268014B (en) 2020-02-14 2020-02-14 Carrier, facility control method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010093466.6A CN113268014B (en) 2020-02-14 2020-02-14 Carrier, facility control method, device, system and storage medium

Publications (2)

Publication Number Publication Date
CN113268014A CN113268014A (en) 2021-08-17
CN113268014B true CN113268014B (en) 2024-07-09

Family

ID=77227309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010093466.6A Active CN113268014B (en) 2020-02-14 2020-02-14 Carrier, facility control method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN113268014B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110693654A (en) * 2019-10-15 2020-01-17 北京小米移动软件有限公司 Method and device for adjusting intelligent wheelchair and electronic equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9700240B2 (en) * 2012-12-14 2017-07-11 Microsoft Technology Licensing, Llc Physical activity inference from environmental metrics
US20160089294A1 (en) * 2014-09-26 2016-03-31 Marc Jonas Guillaume Dual-chair assembly
US9916537B2 (en) * 2015-03-03 2018-03-13 Pynk Systems, S.L. Smart office desk interactive with the user
CN106354251B (en) * 2016-08-17 2019-04-02 深圳前海小橙网科技有限公司 A kind of model system and method that virtual scene is merged with real scene
CN106724376A (en) * 2016-12-06 2017-05-31 南京九致信息科技有限公司 Height Adjustable intelligent table, chair and height adjusting method
CN208541010U (en) * 2017-10-30 2019-02-26 北京广研广播电视高科技中心 A kind of showing stand for client connection
CN108958686A (en) * 2018-07-23 2018-12-07 广州维纳斯家居股份有限公司 Intelligent elevated table synergetic office work method, apparatus, intelligent elevated table and storage medium
CN108965443A (en) * 2018-07-23 2018-12-07 广州维纳斯家居股份有限公司 Intelligent elevated table height adjusting method, device, intelligent elevated table and storage medium
CN110515510B (en) * 2019-08-20 2021-03-02 北京小米移动软件有限公司 Data processing method, apparatus, equipment and storage medium
CN110531872B (en) * 2019-08-20 2025-11-07 北京小米移动软件有限公司 Input method window display method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN113268014A (en) 2021-08-17

Similar Documents

Publication Publication Date Title
US11647172B2 (en) Content presentation method, content presentation mode push method, and intelligent terminal
US10708544B2 (en) Group and conversational framing for speaker tracking in a video conference system
US12022224B2 (en) Image capturing method and apparatus, computer device, and storage medium
US11800235B2 (en) Dual exposure control in a camera system
RU2578210C1 (en) Method and device for correcting skin colour
CN103945121A (en) Information processing method and electronic equipment
US20200058302A1 (en) Lip-language identification method and apparatus, and augmented reality device and storage medium
CN102378097A (en) Microphone control system and method
US12430789B2 (en) Image processing method and apparatus, and computer-readable storage medium
CN105960801B (en) Enhancing video conferencing
KR20150031896A (en) Speech recognition device and the operation method
US11695992B2 (en) Content recommendations for users with disabilities
AU2014339827B2 (en) Generating image compositions
CN105892854A (en) Photographing parameter menu loading method and device
US9491263B2 (en) Systems and methods for automatically modifying a picture or a video containing a face
CN110345610B (en) Control method and device of air conditioner and air conditioning equipment
CN107578777A (en) Word-information display method, apparatus and system, audio recognition method and device
JP2020136971A (en) Back viewing system and back viewing method
US20250119610A1 (en) Systems and methods for providing media content for an exhibit or display
US20220277528A1 (en) Virtual space sharing system, virtual space sharing method, and virtual space sharing program
CN113268014B (en) Carrier, facility control method, device, system and storage medium
US20120281114A1 (en) System, method and apparatus for providing an adaptive media experience
JP6829348B1 (en) Shooting control method, information processing device, program, and shooting system
US20170140789A1 (en) Image processing apparatus that selects images, image processing method, and storage medium
US12464084B2 (en) Contactless photo system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant