Laser induction system and laser induction method

Document No.: 1278879 · Publication date: 2020-08-28

Reading note: This technology, "Laser induction system and laser induction method" (一种激光感应系统及激光感应方法), was designed and created by 韩超 on 2020-04-02. Abstract: The application provides a laser sensing system and a laser sensing method, comprising: a handheld laser transmitter for transmitting a first laser signal to a terminal device according to a trigger instruction of a user; wearable laser transmitters for transmitting a plurality of second laser signals to the terminal device; a camera device for sending captured action images of the user to the terminal device; and the terminal device, for determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal, determining the first laser irradiation point as the shooting position of a virtual character, determining a target movement posture and a movement direction for the virtual character based on each second laser irradiation point and the action images, and controlling the virtual character to move in the movement direction according to the target movement posture. This increases the user's shooting hit rate, reduces the difficulty of the shooting operation, and helps improve the user experience.

1. A laser sensing system, characterized in that the laser sensing system comprises a handheld laser transmitter, a plurality of wearable laser transmitters, a camera device, and a terminal device, wherein:

the handheld laser transmitter is used for transmitting a first laser signal to the terminal equipment according to a trigger instruction sent by a user;

the wearable laser transmitters are respectively worn at different parts of the body of the user and used for transmitting a plurality of second laser signals to the terminal equipment according to preset time intervals;

the camera device is used for acquiring the action image of the user according to a preset time interval and sending the acquired action image to the terminal equipment;

the terminal device is configured to display a virtual character corresponding to the user and a scene corresponding to the virtual character, determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each of the second laser signals in the plurality of second laser signals, determine the first laser irradiation point as a shooting position of the virtual character, determine a target movement posture and a movement direction corresponding to the virtual character based on each second laser irradiation point and the motion image, and control the virtual character to move in the movement direction according to the target movement posture.

2. The laser sensing system of claim 1, wherein the terminal device comprises a signal receiving module, a temperature identification module, and a signal identification module:

the signal receiving module is used for receiving a first laser signal emitted by the handheld laser emitter and a second laser signal emitted by each wearable laser emitter in the plurality of wearable laser emitters;

the temperature identification module is used for determining a plurality of laser irradiation points to be identified corresponding to the first laser signals and the plurality of second laser signals in a display screen of the terminal device according to a temperature difference between an irradiated area and an unirradiated area in the display screen;

the signal identification module is used for identifying a first laser irradiation point corresponding to the handheld laser transmitter and a second laser irradiation point corresponding to each wearable laser transmitter in the laser irradiation points to be identified according to the irradiation frequency of the laser irradiation points to be identified, and determining the first laser irradiation point as the shooting position of the virtual character.

3. The laser sensing system of claim 1, wherein the terminal device further comprises a location identification module, an action identification module, and a function control module:

the position identification module is used for receiving the action images sent by the camera device and determining the moving direction of the user based on at least two received action images;

the motion recognition module is used for determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image;

and the function control module is used for controlling the virtual character to move towards the moving direction according to the target moving posture.

4. The laser sensing system of claim 3, wherein the motion recognition module comprises a part determination unit, a movement gesture determination unit, and a gesture conversion unit:

the part determining unit is used for determining a wearable laser transmitter corresponding to each second laser irradiation point and determining a wearing part of the wearable laser transmitter;

the movement posture determining unit is used for determining an initial movement posture corresponding to the user from the action image and adjusting the initial movement posture according to each wearing part;

and the posture conversion unit is used for performing coordinate conversion on the adjusted initial movement posture and converting the initial movement posture into a target movement posture corresponding to the virtual character.

5. The laser sensing system of claim 1, wherein the terminal device further comprises a gesture correction module:

the gesture correction module is configured to determine, for each second laser irradiation point, whether the second laser irradiation point is located within a corresponding preset standard range, and if the second laser irradiation point is not located within the preset standard range, display the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in a display screen of the terminal device.

6. The laser sensing system according to claim 1, wherein a display screen of the terminal device is covered with a transparent optical laser sensing film, and the transparent optical laser sensing film is used for sensing the first laser signal and each of the second laser signals.

7. A laser sensing method applied to the terminal device according to any one of claims 1 to 6, the laser sensing method comprising:

after receiving a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter, respectively determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal;

determining the first laser irradiation point as a shooting position of a virtual character;

determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received action image of the user sent by the camera device;

and controlling the virtual character to move in the moving direction according to the target moving posture.

8. The laser sensing method according to claim 7, wherein the determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter respectively comprises:

after a first laser signal transmitted by the handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, determining a plurality of laser irradiation points to be identified corresponding to the first laser signal and each second laser signal in a display screen of the terminal device according to a temperature difference between an irradiation point and an unirradiated area in the display screen;

according to the irradiation frequency of the laser irradiation points to be identified, identifying a first laser irradiation point corresponding to the handheld laser emitter in the laser irradiation points to be identified and a second laser irradiation point corresponding to each wearable laser emitter.

9. The laser sensing method according to claim 7, wherein the determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal device based on each of the second laser irradiation points and the received motion image of the user transmitted by the camera device includes:

receiving the action images sent by the camera device, and determining the moving direction of the user based on at least two received action images;

and determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image.

10. The laser sensing method according to claim 9, wherein the determining a target movement posture of the virtual character based on each of the second laser irradiation points and the received motion image includes:

for each second laser irradiation point, determining a wearable laser transmitter corresponding to the second laser irradiation point, and determining a wearing part of the wearable laser transmitter;

determining an initial movement posture of the user from the motion image, and adjusting the initial movement posture based on each wearing part;

and performing coordinate transformation on the adjusted initial movement posture, and converting the initial movement posture into a target movement posture corresponding to the virtual character.

11. The laser sensing method of claim 7, wherein after the controlling the virtual character to move in the moving direction according to the target movement gesture, the laser sensing method further comprises:

for each second laser irradiation point, determining whether the second laser irradiation point is located within a corresponding preset standard range;

and if the second laser irradiation point is not located in the preset standard range, displaying the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in a display screen of the terminal equipment.

Technical Field

The application relates to the technical field of simulation games, in particular to a laser sensing system and a laser sensing method.

Background

With the continuous development of science and technology and the improvement of people's living standards, people often choose to play games and exercise in their leisure time. For convenience of viewing and to enhance the experience, some shooting games use information acquisition devices to simulate a relatively realistic shooting scene for the player; that is, the player can control the shooting direction of the virtual shooting equipment in use.

However, this game mode is very monotonous for some players: a player often finds that the virtual character's shooting position is not the position best suited for shooting, yet cannot change it. As a result, the shooting operation is difficult for the player, the hit rate is low, operation fatigue sets in easily, and the user experience suffers.

Disclosure of Invention

In view of this, an object of the present application is to provide a laser sensing system and a laser sensing method, which can control a virtual character corresponding to a user to move according to a target movement posture and a movement direction of the user, increase a hit rate of shooting by the user, reduce an operation difficulty of shooting, and contribute to improving a user experience.

An embodiment of the present application provides a laser sensing system, which includes a handheld laser transmitter, a plurality of wearable laser transmitters, a camera device, and a terminal device:

the handheld laser transmitter is used for transmitting a first laser signal to the terminal equipment according to a trigger instruction sent by a user;

the wearable laser transmitters are respectively worn at different parts of the body of the user and used for transmitting a plurality of second laser signals to the terminal equipment according to preset time intervals;

the camera device is used for acquiring the action image of the user according to a preset time interval and sending the acquired action image to the terminal equipment;

the terminal device is configured to display a virtual character corresponding to the user and a scene corresponding to the virtual character, determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each of the second laser signals in the plurality of second laser signals, determine the first laser irradiation point as a shooting position of the virtual character, determine a target movement posture and a movement direction corresponding to the virtual character based on each second laser irradiation point and the motion image, and control the virtual character to move in the movement direction according to the target movement posture.

Further, the terminal device comprises a signal receiving module, a temperature identification module and a signal identification module:

the signal receiving module is used for receiving a first laser signal emitted by the handheld laser emitter and a second laser signal emitted by each wearable laser emitter in the plurality of wearable laser emitters;

the temperature identification module is used for determining a plurality of laser irradiation points to be identified corresponding to the first laser signals and the plurality of second laser signals in a display screen of the terminal device according to a temperature difference between an irradiated area and an unirradiated area in the display screen;

the signal identification module is used for identifying a first laser irradiation point corresponding to the handheld laser transmitter and a second laser irradiation point corresponding to each wearable laser transmitter in the laser irradiation points to be identified according to the irradiation frequency of the laser irradiation points to be identified, and determining the first laser irradiation point as the shooting position of the virtual character.

Further, the terminal device further comprises a position identification module, an action identification module and a function control module:

the position identification module is used for receiving the action images sent by the camera device and determining the moving direction of the user based on at least two received action images;

the motion recognition module is used for determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image;

and the function control module is used for controlling the virtual character to move towards the moving direction according to the target moving posture.

Further, the motion recognition module includes a part determination unit, a movement posture determination unit, and a posture conversion unit:

the part determining unit is used for determining a wearable laser transmitter corresponding to each second laser irradiation point and determining a wearing part of the wearable laser transmitter;

the movement posture determining unit is used for determining an initial movement posture corresponding to the user from the action image and adjusting the initial movement posture according to each wearing part;

and the posture conversion unit is used for performing coordinate conversion on the adjusted initial movement posture and converting the initial movement posture into a target movement posture corresponding to the virtual character.
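A minimal sketch of such a coordinate conversion is given below, assuming a uniform scale and a translation between the user's coordinate frame and the virtual character's frame; the disclosure does not specify the actual transform, so the parameters and keypoint names are illustrative only.

```python
def to_character_coords(posture, scale, offset):
    """Map user-space posture keypoints (name -> (x, y)) into the virtual
    character's coordinate frame with a uniform scale and a translation.
    The affine parameters are illustrative assumptions."""
    ox, oy = offset
    return {name: (x * scale + ox, y * scale + oy)
            for name, (x, y) in posture.items()}

# A single hypothetical keypoint mapped into the character's frame:
print(to_character_coords({"left_wrist": (2.0, 4.0)}, scale=0.5, offset=(10, 20)))
# → {'left_wrist': (11.0, 22.0)}
```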

Further, the terminal device further comprises a gesture correction module:

the gesture correction module is configured to determine, for each second laser irradiation point, whether the second laser irradiation point is located within a corresponding preset standard range, and if the second laser irradiation point is not located within the preset standard range, display the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in a display screen of the terminal device.
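One plausible form of the preset standard range check is sketched below, assuming each range is an axis-aligned rectangle on the display screen; the disclosure does not specify the range's shape, so this is an illustrative reading only. When the check fails, the terminal device would display both the irradiation point and its range, as described above.

```python
def check_point(point, standard_range):
    """Return True when a second laser irradiation point lies inside its
    preset standard range. The range is modeled here as an axis-aligned
    rectangle (xmin, ymin, xmax, ymax) -- an assumption, since the
    disclosure does not specify the range's geometry."""
    (x, y) = point
    (xmin, ymin, xmax, ymax) = standard_range
    return xmin <= x <= xmax and ymin <= y <= ymax

# Point inside vs. outside a hypothetical 10x10 standard range:
print(check_point((5, 5), (0, 0, 10, 10)))   # → True
print(check_point((15, 5), (0, 0, 10, 10)))  # → False
```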

Further, a display screen of the terminal device is covered with a transparent optical laser sensing film, and the transparent optical laser sensing film is used for sensing the first laser signal and each of the second laser signals.

Further, the camera device is arranged above the terminal equipment.

The embodiment of the application further provides a laser sensing method, which is applied to the terminal device, and the laser sensing method comprises the following steps:

after receiving a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter, respectively determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal;

determining the first laser irradiation point as a shooting position of a virtual character;

determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received action image of the user sent by the camera device;

and controlling the virtual character to move in the moving direction according to the target moving posture.

Further, after receiving a first laser signal transmitted by the handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter, determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal, respectively, includes:

after a first laser signal transmitted by the handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, determining a plurality of laser irradiation points to be identified corresponding to the first laser signal and each second laser signal in a display screen of the terminal device according to a temperature difference between an irradiation point and an unirradiated area in the display screen;

according to the irradiation frequency of the laser irradiation points to be identified, identifying a first laser irradiation point corresponding to the handheld laser emitter in the laser irradiation points to be identified and a second laser irradiation point corresponding to each wearable laser emitter.

Further, the determining, based on each of the second laser irradiation points and the received motion image of the user sent by the camera device, a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal device includes:

receiving the action images sent by the camera device, and determining the moving direction of the user based on at least two received action images;

and determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image.

Further, the determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image includes:

for each second laser irradiation point, determining a wearable laser transmitter corresponding to the second laser irradiation point, and determining a wearing part of the wearable laser transmitter;

determining an initial movement posture of the user from the motion image, and adjusting the initial movement posture based on each wearing part;

and performing coordinate transformation on the adjusted initial movement posture, and converting the initial movement posture into a target movement posture corresponding to the virtual character.

Further, after the controlling the virtual character to move in the moving direction according to the target movement gesture, the laser sensing method further includes:

for each second laser irradiation point, determining whether the second laser irradiation point is located within a corresponding preset standard range;

and if the second laser irradiation point is not located in the preset standard range, displaying the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in a display screen of the terminal equipment.

The laser sensing system and laser sensing method provided by the embodiments of the present application receive a first laser signal emitted by a handheld laser transmitter according to a trigger instruction sent by a user, and a plurality of second laser signals sent by wearable laser transmitters at a preset time interval; respectively determine, in a display screen of a terminal device, a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal; determine the first laser irradiation point as the shooting position of a virtual character; determine a target movement posture and a movement direction of the virtual character based on the plurality of second laser irradiation points and the action images of the user collected by a camera device; and control the virtual character to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to the user's own shooting habits, which can increase the user's shooting hit rate, reduce the difficulty of the shooting operation, and improve the user experience.

In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.

Drawings

In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.

Fig. 1 is a schematic structural diagram of a laser sensing system according to an embodiment of the present disclosure;

fig. 2 is one of the schematic structural diagrams of the terminal device shown in fig. 1;

FIG. 3 is a second schematic structural diagram of the terminal device shown in FIG. 1;

FIG. 4 is a schematic diagram of the structure of the motion recognition module shown in FIG. 3;

FIG. 5 is a third schematic structural diagram of the terminal device shown in FIG. 1;

FIG. 6 is a flow chart of a laser sensing method provided in an embodiment of the present application;

fig. 7 is a flowchart of a laser sensing method according to another embodiment of the present disclosure.

Icon: 100-laser induction system; 110-a hand-held laser transmitter; 120-a wearable laser transmitter; 130-a camera device; 140-terminal device; 141-a signal receiving module; 142-a temperature identification module; 143-a signal identification module; 144-a location identification module; 145-action recognition module; 146-a function control module; 147-a posture correction module; 1451-a location determination unit; 1452-a movement gesture determination unit; 1453-gesture conversion unit.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. Every other embodiment that can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present application falls within the protection scope of the present application.

Research shows that the current game mode is very monotonous for some players: a player often finds that the virtual character's shooting position is not the position best suited for shooting, yet cannot change it. As a result, the shooting operation is difficult for the player, the hit rate is low, operation fatigue sets in easily, and the user experience suffers.

In view of this, an object of the present disclosure is to provide a laser sensing system and a laser sensing method, in which the laser sensing system 100 receives a first laser signal emitted by a user through a handheld laser emitter 110 and second laser signals emitted by wearable laser emitters 120, recognizes the shooting position, target movement posture, and movement direction of the virtual character corresponding to the user, and controls the virtual character to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to the user's own shooting habits, which can increase the user's shooting hit rate, reduce the difficulty of the shooting operation, and improve the user experience.

First, a laser sensing system 100 disclosed in the present application will be described.

Referring to fig. 1, fig. 1 is a schematic structural diagram of a laser sensing system 100 according to an embodiment of the present disclosure. The laser sensing system 100 includes a handheld laser transmitter 110, a plurality of wearable laser transmitters 120, a camera 130, and a terminal device 140:

After receiving a trigger instruction sent by the user, the handheld laser transmitter 110 transmits a first laser signal to the terminal device 140.

The user wears wearable laser emitter 120 on different parts of the user's body, e.g., wearable laser emitter a on the left wrist, wearable laser emitter B on the right wrist, wearable laser emitter C on the left ankle, wearable laser emitter D on the right foot, etc.

The wearing rule of the wearable laser emitter 120 may be preset by a technician, for example, the wearable laser emitter a must be worn on the left wrist; or after the user wears the wearable laser emitter 120 randomly, the wearing part of the wearable laser emitter 120 is determined according to the position information of the wearable laser emitter 120.
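As an illustration of the preset wearing rule, the lookup below maps transmitter identifiers to the body parts named in the example above; the identifiers and the error handling are assumptions added for the sketch, not details from the disclosure.

```python
# Hypothetical preset wearing rule: each wearable laser transmitter ID
# maps to the body part it must be worn on, matching the example in the
# text (A: left wrist, B: right wrist, C: left ankle, D: right foot).
WEARING_RULE = {
    "A": "left wrist",
    "B": "right wrist",
    "C": "left ankle",
    "D": "right foot",
}

def wearing_part(transmitter_id):
    """Return the body part assigned to a transmitter under the preset rule."""
    try:
        return WEARING_RULE[transmitter_id]
    except KeyError:
        raise ValueError(f"unknown transmitter: {transmitter_id}")

print(wearing_part("A"))  # → left wrist
```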

After the user wears the wearable laser transmitter 120 on a different part of the body, the wearable laser transmitter 120 can send a plurality of second laser signals to the terminal device 140 at preset time intervals.

The camera device 130 may be fixedly disposed above the terminal device 140, and configured to collect the motion image of the user according to a preset time interval, and send the collected motion image of the user to the terminal device 140.

The terminal device 140 can display the virtual character corresponding to the user and the scene in which the virtual character is located. The display screen of the terminal device 140 is covered with a transparent optical laser sensing film, which can sense the first laser signal and the plurality of second laser signals. When the first laser signal or a second laser signal irradiates the transparent optical laser sensing film, it causes a certain temperature change; the resulting temperature difference can therefore be used to determine the first laser irradiation point and the second laser irradiation points, i.e., to determine their positions in the display screen of the terminal device 140.

After receiving the first laser signal transmitted by the handheld laser transmitter 110 and the second laser signal transmitted by each wearable laser transmitter 120, the terminal device 140 respectively determines, in its display screen, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal.

The terminal device 140 determines the first laser irradiation point corresponding to the first laser signal as the shooting position of the virtual character. It then determines the target movement posture and the movement direction of the virtual character based on the second laser irradiation point corresponding to each second laser signal and the received motion image of the user transmitted by the camera device 130, and controls the virtual character corresponding to the user to move in the movement direction according to the target movement posture.

The terminal device 140 can simultaneously recognize the first laser irradiation point and the plurality of second laser irradiation points, that is, the virtual character can shoot while moving, and the posture at the time of shooting can be the determined target movement posture.

In this way, the terminal device 140 may determine, according to the received first laser signal transmitted by the handheld laser transmitter 110, the second laser signal transmitted by each wearable laser transmitter 120, and the motion image sent by the camera device 130, a shooting position, a target movement posture, and a movement direction of the virtual character corresponding to the user, and control the virtual character to move to the movement direction according to the target movement posture.

Further, referring to fig. 2, fig. 2 is a schematic structural diagram of the terminal device 140 shown in fig. 1, where the terminal device 140 includes a signal receiving module 141, a temperature identification module 142, and a signal identification module 143.

The signal receiving module 141 is configured to receive a first laser signal emitted by the handheld laser emitter 110 and a second laser signal emitted by each of the plurality of wearable laser emitters 120.

When the signal receiving module 141 receives the first laser signal and each of the second laser signals, the temperature identification module 142 may determine a plurality of to-be-identified laser irradiation points irradiated by the first laser signal and the plurality of second laser signals in the display screen according to a temperature difference between an irradiated point and an unirradiated area in the display screen of the terminal device 140.
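As one illustrative reading of this temperature-difference step, the sketch below scans a per-cell temperature grid (a hypothetical output of the sensing film) and flags cells measurably hotter than the unirradiated ambient level; the grid layout, ambient value, and threshold are assumptions, not details from the disclosure.

```python
def find_irradiation_points(temps, ambient, threshold):
    """Return (row, col) cells whose temperature exceeds the ambient
    (unirradiated) level by more than `threshold` degrees -- a minimal
    stand-in for the temperature identification module."""
    points = []
    for r, row in enumerate(temps):
        for c, t in enumerate(row):
            if t - ambient > threshold:  # irradiated vs. unirradiated difference
                points.append((r, c))
    return points

# Hypothetical temperature readings from the sensing film (degrees C):
grid = [
    [25.0, 25.1, 25.0],
    [25.0, 31.5, 25.2],
    [29.8, 25.0, 25.1],
]
print(find_irradiation_points(grid, ambient=25.0, threshold=2.0))
# → [(1, 1), (2, 0)]
```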

The signal recognition module 143 may distinguish, among the plurality of laser irradiation points to be recognized and according to the irradiation frequency of each point, the first laser irradiation point produced by the first laser signal emitted by the handheld laser emitter 110 (that is, the first laser irradiation point corresponding to the handheld laser emitter 110) from the second laser irradiation points produced by the second laser signals emitted by the wearable laser emitters 120 (that is, the second laser irradiation point corresponding to each wearable laser emitter 120). The signal recognition module 143 then determines the first laser irradiation point as the shooting position of the virtual character: when the user sends a shooting instruction through the handheld laser transmitter 110, the virtual character shoots at the position corresponding to the first laser irradiation point.
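The frequency-based identification might look like the sketch below. The patent only states that emitters are told apart by irradiation frequency; the specific frequency plan, the tolerance, and the emitter labels are illustrative assumptions.

```python
# Assumed frequency plan: each emitter pulses at its own rate.
EMITTER_BY_FREQ_HZ = {
    30: "handheld",              # first laser signal -> shooting position
    40: "wearable_left_wrist",   # second laser signals -> posture tracking
    50: "wearable_right_wrist",
}

def classify_point(measured_hz, tolerance=2):
    """Match a point's measured pulse rate to the nearest known emitter."""
    for freq, emitter in EMITTER_BY_FREQ_HZ.items():
        if abs(measured_hz - freq) <= tolerance:
            return emitter
    return None  # unrecognized frequency: not one of our emitters

print(classify_point(41))  # wearable_left_wrist
```

A point classified as "handheld" becomes the first laser irradiation point (the shooting position); the remainder are second laser irradiation points attributed to specific wearable emitters.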

Further, referring to fig. 3, fig. 3 is a second schematic structural diagram of the terminal device 140 shown in fig. 1, and the terminal device 140 further includes a location identification module 144, an action identification module 145, and a function control module 146.

The position recognition module 144 is configured to receive the motion image of the user sent by the camera device 130, recognize a position area of the user in each motion image based on at least two received motion images, and determine a moving direction of the user according to a time sequence of each motion image.

For example, suppose the camera device 130 is fixedly installed in advance and the center of the collected motion image is taken as the origin. If the user is located to the left of the origin in the first frame of motion image (that is, at the left end of the image) and to the right of the origin in the second frame collected after the preset time interval (that is, at the right end of the image), the user has evidently moved rightward. As another example, if the upper body of the user is located in the upper half of the image in the first frame and in the lower half of the image in the second frame acquired after the preset time interval, the user has evidently crouched or moved downward.
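The two-frame comparison above can be sketched as a sign check on centred coordinates. This is a minimal illustration assuming the image centre is the origin; the function name and coordinate values are not from the patent.

```python
def move_direction(x_prev, x_curr):
    """Classify horizontal motion from centred x-coordinates of two frames.

    Negative x = left of the image centre (origin), non-negative = right of it.
    """
    if x_prev < 0 <= x_curr:
        return "right"
    if x_prev >= 0 > x_curr:
        return "left"
    return "none"

# User left of centre in frame 1, right of centre in frame 2: moved rightward.
print(move_direction(-120, 80))  # right
```

The same sign check on a centred y-coordinate would detect the crouching/downward case in the second example.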

The motion recognition module 145 is configured to determine, according to each second laser irradiation point and the received motion image of the user sent by the camera device 130, a target movement posture of the virtual character corresponding to the user displayed by the terminal device 140.

The function control module 146 is configured to control the virtual character to move in the determined moving direction according to the determined target movement posture, for example, to control the virtual character to move downward in the target movement posture.

Further, referring to fig. 4, fig. 4 is a schematic structural diagram of the motion recognition module 145 shown in fig. 3, and the motion recognition module 145 includes a part determination unit 1451, a movement posture determination unit 1452, and a posture conversion unit 1453.

The part determining unit 1451 is configured to determine, for each second laser irradiation point, the wearable laser transmitter 120 corresponding to the laser irradiation point, and determine the wearing part of the wearable laser transmitter 120 on the user.

Specifically, the wearable laser transmitter 120 corresponding to each second laser irradiation point is determined according to the irradiation frequency of that point, and the part of the user's body on which that wearable laser transmitter 120 is worn is thereby determined.

The wearing position of each wearable laser emitter 120 may be set by a technician in advance; for example, wearable laser emitter A must be worn on the left wrist. Alternatively, after the user puts on the wearable laser emitters 120 at random, the wearing part of each wearable laser emitter 120 is determined according to its position information.

The movement posture determination unit 1452 is configured to determine an initial movement posture of the user from the motion image. When part of the user's body is occluded in the motion image, only the right arm, or even only the right forearm, may be recognizable from the captured image; the initial movement posture is then incomplete and cannot be used to reproduce the user's movement. In that case, the initial movement posture is adjusted using the wearing parts corresponding to the determined second laser irradiation points to obtain a complete posture. For example, when only the right forearm can be recognized, the position of the user's shoulder can be determined from the second laser irradiation point of the wearable laser emitter 120 worn on the shoulder, without determining the shoulder position from the motion image, thereby completing the initial movement posture. Similarly, when the initial movement posture recognized from the motion image contains a deviation, for example a false recognition caused by the background in which the hand of user 1 is recognized as the hand of user 2, the posture can be corrected according to the wearable laser transmitters 120 actually worn by user 1 and user 2.
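The occlusion-filling step might be sketched as below: joints missing from the image-derived pose are filled from the irradiation point of the emitter worn on that body part. All joint names, emitter IDs, and coordinates are illustrative, not taken from the patent.

```python
def complete_pose(image_pose, emitter_points, wearing_parts):
    """Fill joints absent from image_pose using per-emitter irradiation points.

    image_pose:     {joint_name: (x, y)} recognized from the motion image
    emitter_points: {emitter_id: (x, y)} second laser irradiation points
    wearing_parts:  {emitter_id: joint_name} where each emitter is worn
    """
    pose = dict(image_pose)
    for emitter_id, point in emitter_points.items():
        part = wearing_parts[emitter_id]
        if part not in pose:  # joint occluded in the motion image
            pose[part] = point
    return pose

image_pose = {"right_forearm": (220, 140)}   # shoulder occluded in the image
emitter_points = {"A": (200, 90)}            # emitter A's irradiation point
wearing_parts = {"A": "right_shoulder"}      # emitter A worn on the shoulder
print(complete_pose(image_pose, emitter_points, wearing_parts))
# {'right_forearm': (220, 140), 'right_shoulder': (200, 90)}
```

A joint already recognized from the image is kept as-is; only the gaps are filled from the wearable emitters.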

Since the motion image captured by the camera device 130 is a front view of the user, while the virtual character corresponding to the user is shown from the back in the game, the posture conversion unit 1453 performs a coordinate conversion on the adjusted initial movement posture: the user's front-view movement posture is converted into the corresponding back-view movement posture of the virtual character, that is, into the target movement posture.
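One plausible form of this front-to-back conversion is a mirror about the vertical axis through the image centre, swapping left/right joint labels. The patent does not give the exact transform, so the image width and joint naming below are assumptions.

```python
def front_to_back(pose, image_width):
    """Mirror joint x-coordinates and swap left/right labels (front -> back view)."""
    def flip(name):
        if name.startswith("left_"):
            return "right_" + name[len("left_"):]
        if name.startswith("right_"):
            return "left_" + name[len("right_"):]
        return name  # midline joints (head, spine) keep their label
    return {flip(k): (image_width - x, y) for k, (x, y) in pose.items()}

front = {"left_wrist": (100, 300), "head": (320, 80)}
print(front_to_back(front, image_width=640))
# {'right_wrist': (540, 300), 'head': (320, 80)}
```

The user's camera-facing left wrist thus drives the avatar's right wrist in the back view, which matches what a player would expect when mimicking their character.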

Further, referring to fig. 5, fig. 5 is a third schematic structural diagram of the terminal device 140 shown in fig. 1, and the terminal device 140 further includes a posture correction module 147.

The posture correction module 147 is configured to determine, for each second laser irradiation point, whether that point is located within a preset standard range. If a second laser irradiation point is detected to be outside its preset standard range, that point and its preset standard range are displayed on the display screen of the terminal device 140, so that the user can adjust his or her movement posture accordingly.

For example, while the user moves, it may be detected in real time whether the user's limbs reach a preset standard range, such as whether the right hand reaches a preset height. Specifically, it is detected whether the second laser irradiation point corresponding to the wearable laser emitter 120 worn on the right wrist is within the preset standard range; if it is not, that second laser irradiation point and its corresponding preset standard range are displayed on the display screen of the terminal device 140.
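The posture check can be sketched as a box test per irradiation point. The patent does not define the shape of a "preset standard range"; an axis-aligned box, along with all names and coordinates below, is an illustrative assumption.

```python
def out_of_range_points(points, standard_ranges):
    """Return the parts whose irradiation point falls outside its preset box.

    points:          {part: (x, y)} second laser irradiation points on screen
    standard_ranges: {part: (x0, y0, x1, y1)} scene-specific preset boxes
    """
    flagged = []
    for part, (x, y) in points.items():
        x0, y0, x1, y1 = standard_ranges[part]
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            flagged.append(part)  # to be displayed alongside its preset range
    return flagged

points = {"right_wrist": (400, 120), "left_wrist": (150, 300)}
ranges = {"right_wrist": (350, 50, 450, 100),  # right hand must stay above y=100
          "left_wrist": (100, 250, 200, 350)}
print(out_of_range_points(points, ranges))  # ['right_wrist']
```

Flagged parts would be rendered on the display together with their preset standard range, prompting the user to correct the posture; the ranges themselves would be swapped as the scene content switches.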

The preset standard range is set by a technician in advance and differs according to the application environment or scene. After the user selects a scene, the terminal device 140 determines the preset standard range corresponding to each second laser irradiation point in that scene; this range changes as the content of the scene is switched.

In this way, the terminal device 140 can detect the posture of the user in real time through the posture correction module 147, and display the second laser irradiation point that does not reach the preset standard range and the preset standard range corresponding to the second laser irradiation point on the display screen, so that the user can be reminded to adjust the posture of the user when the posture of the user is not standard.

The laser sensing system 100 provided in the embodiment of the present application receives a first laser signal emitted by the handheld laser emitter 110 according to a trigger instruction sent by the user, and a plurality of second laser signals emitted by the wearable laser emitters 120 at preset time intervals. It determines, on the display screen of the terminal device 140, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal, and takes the first laser irradiation point as the shooting position of the virtual character. It then determines the target movement posture and movement direction of the virtual character based on the plurality of second laser irradiation points and the motion image of the user acquired by the camera device 130, and controls the virtual character to move in the movement direction according to the target movement posture.

Therefore, the user can adjust the shooting position, moving direction, and movement posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's shooting hit rate, reduces the operation difficulty of shooting, and helps improve the user experience.

Referring to fig. 6, fig. 6 is a flowchart of a laser sensing method provided in an embodiment of the present application and applied to a terminal device. The laser sensing method includes:

S601, after receiving a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter, respectively determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal.

In this step, after the terminal device receives the first laser signal transmitted by the handheld laser transmitter and the second laser signal transmitted by each wearable laser transmitter, it determines, on its display screen, the first laser irradiation point corresponding to the first laser signal, that is, the first laser irradiation point corresponding to the handheld laser transmitter, and the second laser irradiation point corresponding to each second laser signal, that is, the second laser irradiation point corresponding to each wearable laser transmitter.

And S602, determining the first laser irradiation point as a shooting position of the virtual character.

In the step, after a first laser irradiation point corresponding to the handheld laser transmitter is determined, the determined first laser irradiation point is determined as a shooting position of a virtual character corresponding to a user.

Therefore, the position where the user wants to shoot can be determined according to the received first laser signal sent by the handheld laser transmitter, and the virtual character can be controlled to shoot at the position where the user wants to shoot.

And S603, determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received motion image of the user sent by the camera device.

In this step, a target movement posture and a movement direction corresponding to the virtual character corresponding to the user displayed on the display screen of the terminal device are determined based on the second laser irradiation point corresponding to each determined second laser signal and the motion image of the user received from the image pickup device.

And S604, controlling the virtual character to move towards the moving direction according to the target moving posture.

In this step, after determining the target movement posture and the movement direction corresponding to the virtual character, the terminal device controls the virtual character to move in the movement direction according to the target movement posture.

It should be noted that the terminal device can simultaneously determine the shooting position of the virtual character and the target movement posture and the movement direction corresponding to the virtual character, so that the virtual character can shoot in the moving process and can shoot according to the target movement posture.
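Steps S601 to S604 running together in one update cycle might be sketched as follows. Every function name here stands in for a module described above and is purely illustrative; the posture and direction heuristics are placeholders, not the patent's method.

```python
def infer_direction(centred_xs):
    """Placeholder for S603's direction part: sign change of centred x-coords."""
    return "right" if centred_xs[0] < 0 <= centred_xs[-1] else "left"

def infer_posture(second_points, centred_xs):
    """Placeholder for S603's posture part: trivially picks a named posture."""
    return "run" if len(second_points) >= 2 else "stand"

def update_cycle(first_point, second_points, centred_xs):
    """One terminal-device cycle: shooting position plus movement command."""
    shooting_position = first_point                          # S602
    direction = infer_direction(centred_xs)                  # S603 (direction)
    posture = infer_posture(second_points, centred_xs)       # S603 (posture)
    return {"shoot_at": shooting_position,                   # S604: apply both,
            "move": (posture, direction)}                    # so move + shoot concurrently

print(update_cycle((320, 240), {"A": (1, 2), "B": (3, 4)}, [-50, 60]))
# {'shoot_at': (320, 240), 'move': ('run', 'right')}
```

The key point the sketch mirrors is that the shooting position and the movement command are produced in the same cycle, so the character can shoot while moving.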

Further, step S601 includes: after a first laser signal transmitted by the handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, determining a plurality of laser irradiation points to be identified corresponding to the first laser signal and each second laser signal in a display screen of the terminal device according to a temperature difference between an irradiation point and an unirradiated area in the display screen; according to the irradiation frequency of the laser irradiation points to be identified, identifying a first laser irradiation point corresponding to the handheld laser emitter in the laser irradiation points to be identified and a second laser irradiation point corresponding to each wearable laser emitter.

After a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, determining laser irradiation points to be identified corresponding to the first laser signal and each second laser signal in a display screen of the terminal equipment according to the temperature difference between an irradiation point and an unirradiated area in the display screen of the terminal equipment; and then according to the irradiation frequency of the laser irradiation points to be identified, respectively determining a first laser irradiation point corresponding to the first laser signal, namely a first laser irradiation point corresponding to the handheld laser transmitter, and a second laser irradiation point corresponding to each second laser signal, namely a second laser irradiation point corresponding to the wearable laser transmitter.

Further, step S603 includes: receiving the action images sent by the camera device, and determining the moving direction of the user based on at least two received action images; and determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion image.

After receiving the motion images sent by the camera device, identifying the position area of the user in each motion image, and then determining the moving direction of the user according to the acquisition time sequence of each motion image; and determining the target movement posture of the virtual character according to each second laser irradiation point and the received motion image.

For example, suppose the camera device is fixedly installed in advance and the center of the collected motion image is taken as the origin. If the user is located to the left of the origin in the first frame of motion image (that is, at the left end of the image) and to the right of the origin in the second frame collected after the preset time interval (that is, at the right end of the image), the user has evidently moved rightward. As another example, if the upper body of the user is located in the upper half of the image in the first frame and in the lower half of the image in the second frame acquired after the preset time interval, the user has evidently crouched or moved downward.

Further, the determining the target movement posture of the virtual character based on each of the second laser irradiation points and the motion image includes: for each second laser irradiation point, determining a wearable laser transmitter corresponding to the second laser irradiation point, and determining a wearing part of the wearable laser transmitter; determining an initial movement posture of the user from the motion image, and adjusting the initial movement posture based on each wearing part; and performing coordinate transformation on the adjusted initial movement posture, and converting the initial movement posture into a target movement posture corresponding to the virtual character.

In this step, the wearable laser transmitter corresponding to each second laser irradiation point is determined according to the irradiation frequency of that point, and the part of the user's body on which that wearable laser transmitter is worn is thereby determined. The wearing part may be preset by a technician; for example, wearable laser emitter A is required to be worn on the left wrist. Alternatively, after the user puts on the wearable laser transmitters at random, the wearing part of each transmitter is determined according to its position information. Next, the initial movement posture of the user is determined from the motion image. When the user's motion is occluded in the motion image, only the right arm, or even only the right forearm, may be recognizable from the collected image; the initial movement posture is then incomplete and cannot reproduce the user's movement, so it is adjusted using the wearing parts corresponding to the determined second laser irradiation points to obtain a complete posture. For example, when only the right forearm can be recognized, the position of the user's shoulder can be determined from the second laser irradiation point of the wearable laser emitter worn on the shoulder, thereby completing the initial movement posture. Similarly, when the initial movement posture recognized from the motion image contains a deviation, for example a false recognition caused by the background in which the hand of user 1 is recognized as the hand of user 2, the posture can be corrected according to the wearable laser transmitters actually worn by user 1 and user 2.

Finally, since the motion image acquired by the camera device is a front image of the user and the virtual character corresponding to the user should be a back image in the game, after the adjusted initial movement posture is obtained, coordinate conversion needs to be performed on the adjusted initial movement posture, and the front movement posture of the user is determined as a back movement posture corresponding to the virtual character, that is, the adjusted initial movement posture is converted into a target movement posture corresponding to the virtual character.

According to the laser sensing method provided by the embodiment of the application, after a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal are respectively determined; determining the first laser irradiation point as a shooting position of a virtual character; determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received action image of the user sent by the camera device; and controlling the virtual character to move to the moving direction according to the target moving posture.

In this way, the present application determines the shooting position, moving direction, and target movement posture of the virtual character corresponding to the user through the handheld laser transmitter held by the user and the wearable laser transmitters worn by the user, so that the virtual character can be controlled to move in the moving direction according to the target movement posture, which increases the user's shooting hit rate, reduces the operation difficulty of shooting, and helps improve the user experience.

Referring to fig. 7, fig. 7 is a flowchart of a laser sensing method according to another embodiment of the present application. As shown in fig. 7, a laser sensing method provided in an embodiment of the present application includes:

S701, after a first laser signal transmitted by the handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal are respectively determined.

And S702, determining the first laser irradiation point as a shooting position of the virtual character.

And S703, determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received motion image of the user sent by the camera device.

And S704, controlling the virtual character to move to the moving direction according to the target moving posture.

S705, for each of the second laser irradiation points, determining whether the second laser irradiation point is located within a corresponding preset standard range.

In this step, for each determined second laser irradiation point, it is determined whether that point is located within its preset standard range. The preset standard range is set by a technician in advance and differs according to the application environment or scene; after the user selects a scene, the terminal device determines the preset standard range corresponding to each second laser irradiation point in that scene, and this range changes as the content of the scene is switched.

S706, if the second laser irradiation point is not located within the preset standard range, displaying the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in the terminal device.

In this step, if it is determined that the second laser irradiation point is not located within the preset standard range, the second laser irradiation point that is not within the preset standard range and the preset standard range corresponding to the second laser irradiation point are displayed on the display screen of the terminal device to remind the user of adjusting the posture of the user.

For example, when the user moves, it is detected in real time whether limbs of the user reach a preset standard range, for example, whether a right hand reaches a preset height, and the like, specifically, whether a second laser irradiation point corresponding to a wearable laser emitter worn on a right wrist is within the preset standard range is detected, and if it is detected that the second laser irradiation point is not within the preset standard range, a second laser irradiation point corresponding to the wearable laser emitter on the right wrist that is not within the preset standard range and a preset standard range corresponding to the second laser irradiation point are displayed in a display screen of the terminal device.

Therefore, the terminal equipment can detect the posture of the user in real time through each second laser irradiation point, and display the second laser irradiation points which do not reach the preset standard range and the preset standard range corresponding to the second laser irradiation points on the display screen, so that the user can be reminded to adjust the posture of the user under the condition that the posture of the user is not standard.

The descriptions of S701 to S704 may refer to the descriptions of S601 to S604, and the same technical effect can be achieved, which is not described in detail herein.

According to the laser sensing method provided by the embodiment of the application, after a first laser signal transmitted by a handheld laser transmitter and a second laser signal transmitted by each wearable laser transmitter are received, a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal are respectively determined; determining the first laser irradiation point as a shooting position of a virtual character; determining a target movement posture and a movement direction corresponding to the virtual character displayed by the terminal equipment based on each second laser irradiation point and the received action image of the user sent by the camera device; controlling the virtual character to move to the moving direction according to the target moving posture; for each second laser irradiation point, determining whether the second laser irradiation point is located within a corresponding preset standard range; and if the second laser irradiation point is not located in the preset standard range, displaying the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point in a display screen of the terminal equipment.

In this way, the present application determines the shooting position, moving direction, and target movement posture of the virtual character corresponding to the user through the handheld laser transmitter held by the user and the wearable laser transmitters worn by the user, so that the virtual character can be controlled to move in the moving direction according to the target movement posture. At the same time, the user's posture can be monitored in real time through the second laser irradiation points, and the user can be reminded to adjust his or her posture when it is not standard, which increases the user's shooting hit rate, reduces the operation difficulty of shooting, and helps improve the user experience.

It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.

In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.

The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present application, and are used for illustrating the technical solutions of the present application, but not limiting the same, and the scope of the present application is not limited thereto, and although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope disclosed in the present application; such modifications, changes or substitutions do not depart from the spirit and scope of the exemplary embodiments of the present application, and are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
