Projection touch processing method and device and computer readable storage medium

Document No.: 1936627 · Publication date: 2021-12-07

Reading note: This technique, Projection touch processing method and device and computer-readable storage medium (投影触控处理方法、装置及计算机可读存储介质), was designed and created by 金凌琳, 李志� and 郭鹏亮 on 2021-08-05. Its main content is as follows: The application discloses a projection touch processing method and device and a computer-readable storage medium. The method includes: detecting whether an operator is present on the side of the projection surface facing the projector; if an operator is present on that side, detecting whether the distance between the operator's hand and the projection surface is smaller than a preset sensing distance; if it is smaller than the preset sensing distance, acquiring the hovering position of the operator's hand on the projection surface; projecting a first infrared point and a second infrared point to the hovering position; acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand; acquiring the touch distance from the first infrared point to the hovering position according to the installation distance, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position; and, if the touch distance is smaller than a preset hand width, responding to the touch operation of the operator's hand. The application realizes highly accurate touch interaction between the user and the projector without a large increase in hardware cost.

1. A projection touch processing method based on a mobile device is characterized by comprising the following steps:

detecting, based on a depth camera of a mobile device, whether an operator is present on the side of the projection surface facing the projector, wherein the mobile device is mounted side by side with the projector and parallel to the projection surface;

if the operator is present, detecting, based on the depth camera, whether the distance between the operator's hand and the projection surface is smaller than a preset sensing distance;

if the distance is smaller than the preset sensing distance, acquiring the hovering position of the hand of the operator on the projection surface;

controlling a first infrared emitter of the mobile device and a second infrared emitter of the projector to respectively project a first infrared point and a second infrared point to the hovering position;

acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand;

acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position;

and if the touch distance is smaller than a preset hand width, responding to the touch operation of the operator's hand.

2. The projection touch processing method according to claim 1, wherein the projection touch processing method is applied to a projection touch system, the projection touch system comprises a mobile device and a projector communicatively connected to each other, a shooting direction of the mobile device is consistent with a projection direction of the projector and both face the projection surface, the mobile device comprises a depth camera, a first infrared emitter and an infrared camera, and the projector comprises an optical engine and a second infrared emitter.

3. The projection touch processing method of claim 1, wherein the projection touch processing method comprises:

after the projection surface, the projector and the mobile device are installed, acquiring an installation distance between the installation position of the mobile device and the installation position of the projector;

and acquiring the projection distance from the installation position of the mobile device to each preset position point on the projection surface, and establishing a mapping table between the preset position points on the projection surface and their projection distances from the mobile device.

4. The projection touch processing method of claim 3, wherein the step of detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector comprises:

detecting, based on the depth camera of the mobile device, whether a human skeleton key-point frame is present on the side of the projection surface facing the projector;

if a human skeleton key-point frame is present, identifying whether the human skeleton key-point frame exhibits an arm-indicating action;

if the arm-indicating action is present, determining that an operator is present.

5. The projection touch processing method of claim 4, wherein, before the step of detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector, the method comprises:

establishing a wireless connection between the mobile device and the projector; controlling the first infrared emitter and the second infrared emitter to emit, one by one, to preset position points on the projection surface; and acquiring, for each preset position point and at each reference distance within a preset sensing distance from the projection surface, an image light spot distance and a real-scene light spot distance, wherein the image light spot distance is measured from an image captured by the infrared camera of the mobile device and the real-scene light spot distance is measured between the two real light spots, so as to establish a second mapping table of the image light spot distance and the real-scene light spot distance for each preset position point on the projection surface at each reference distance within the preset sensing distance from the projection surface;

the step of acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand comprises:

acquiring, based on the infrared camera of the mobile device, the current image light spot distance between the first infrared point and the second infrared point on the operator's hand;

and querying, in the second mapping table, the current real-scene light spot distance associated with the current image light spot distance at the preset position point corresponding to the hovering position, and taking the current real-scene light spot distance as the spot distance.

6. The projection touch processing method of claim 5, wherein the step of acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position comprises:

denoting the installation distance between the installation positions of the mobile device and the projector as L, the spot distance as I, the actual projection distance as S, and the touch distance as X,

obtaining the touch distance based on the formula I/L = X/S, i.e., X = (I/L) · S.

7. The projection touch processing method of claim 6, further comprising, before the step of responding to the touch operation of the operator's hand:

controlling the projector to project a preset ripple area at the hovering position, wherein the preset ripple area is centered on the hovering position and continuously emits transparent ripples outward.

8. The projection touch processing method of claim 7, further comprising, after the step of controlling the projector to project the preset ripple area at the hovering position:

if the operator's hand is detected to move, taking the hovering position of the operator's hand after the movement as the new hovering position.

9. A projection touch processing device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the projection touch processing method of any one of claims 1 to 8.

10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the projection touch processing method of any one of claims 1 to 8.

Technical Field

The present disclosure relates to the field of projection touch recognition technologies, and in particular, to a projection touch processing method and apparatus, and a computer-readable storage medium.

Background

With the development of projection technology, projectors offer ever better value for money, and more and more users are adopting smart projectors. Projectors are increasingly used in scenarios such as children's education, interactive games, and workplace communication, but current projectors generally support interaction only through a remote control and cannot support touch interaction directly on the projected picture. Realizing touch interaction would require adding a 3D camera module to the hardware, improving the performance of the projector's own CPU (Central Processing Unit), and increasing its RAM (Random Access Memory) configuration, all of which is very costly. How to realize touch interaction without greatly increasing the hardware cost is therefore an urgent technical problem to be solved.

Disclosure of Invention

Embodiments of the present application provide a projection touch processing method and apparatus, and a computer-readable storage medium, which aim to implement touch interaction with a projector without greatly increasing hardware cost.

In order to achieve the above object, an embodiment of the present application provides a projection touch processing method based on a mobile device, where the projection touch processing method includes the following steps:

detecting, based on a depth camera of a mobile device, whether an operator is present on the side of the projection surface facing the projector, wherein the mobile device is mounted side by side with the projector and parallel to the projection surface;

if the operator is present, detecting, based on the depth camera, whether the distance between the operator's hand and the projection surface is smaller than a preset sensing distance;

if the distance is smaller than the preset sensing distance, acquiring the hovering position of the hand of the operator on the projection surface;

controlling a first infrared emitter of the mobile device and a second infrared emitter of the projector to respectively project a first infrared point and a second infrared point to the hovering position;

acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand;

acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position;

and if the touch distance is smaller than a preset hand width, responding to the touch operation of the operator's hand.

Optionally, the projection touch processing method is applied to a projection touch system, the projection touch system includes a mobile device and a projector communicatively connected to each other, the shooting direction of the mobile device is consistent with the projection direction of the projector and both face the projection surface, the mobile device includes a depth camera, a first infrared emitter and an infrared camera, and the projector includes an optical engine and a second infrared emitter.

Optionally, the projection touch processing method includes:

after the projection surface, the projector and the mobile device are installed, obtaining an installation distance between the installation position of the mobile device and the installation position of the projector;

and acquiring the projection distance from the installation position of the mobile device to each preset position point on the projection surface, and establishing a mapping table between the preset position points on the projection surface and their projection distances from the mobile device.

Optionally, the step of detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector includes:

detecting, based on the depth camera of the mobile device, whether a human skeleton key-point frame is present on the side of the projection surface facing the projector;

if a human skeleton key-point frame is present, identifying whether the human skeleton key-point frame exhibits an arm-indicating action;

if the arm-indicating action is present, determining that an operator is present.

Optionally, before the step of detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector, the method includes:

establishing a wireless connection between the mobile device and the projector; controlling the first infrared emitter and the second infrared emitter to emit, one by one, to preset position points on the projection surface; and acquiring, for each preset position point and at each reference distance within the preset sensing distance from the projection surface, an image light spot distance and a real-scene light spot distance, wherein the image light spot distance is measured from an image captured by the infrared camera of the mobile device and the real-scene light spot distance is measured between the two real light spots, so as to establish a second mapping table of the image light spot distance and the real-scene light spot distance for each preset position point on the projection surface at each reference distance within the preset sensing distance from the projection surface;

the step of acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand includes:

acquiring, based on the infrared camera of the mobile device, the current image light spot distance between the first infrared point and the second infrared point on the operator's hand;

and querying, in the second mapping table, the current real-scene light spot distance associated with the current image light spot distance at the preset position point corresponding to the hovering position, and taking the current real-scene light spot distance as the spot distance.

Optionally, the step of acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position includes:

denoting the installation distance between the installation positions of the mobile device and the projector as L, the spot distance as I, the actual projection distance as S, and the touch distance as X,

obtaining the touch distance based on the formula I/L = X/S, i.e., X = (I/L) · S.

Optionally, before the step of responding to the touch operation of the operator's hand, the method further includes:

controlling the projector to project a preset ripple area at the hovering position, wherein the preset ripple area is centered on the hovering position and continuously emits transparent ripples outward.

Optionally, after the step of controlling the projector to project the preset ripple area at the hovering position, the method further includes:

if the operator's hand is detected to move, taking the hovering position of the operator's hand after the movement as the new hovering position.

Optionally, after detecting that the operator is present on the side of the projection surface facing the projector, the method further includes:

and carrying out face recognition or gesture recognition preposed authority verification on the operator, and if the preposed authority verification is passed, executing the step of detecting whether the distance between the hand of the operator and the projection plane is smaller than a preset induction distance or not based on the depth camera.

In order to achieve the above object, the present application further provides a projection touch processing device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the projection touch processing method described above.

To achieve the above object, the present application further provides a readable storage medium storing a computer program which, when executed by a processor, implements the steps of the projection touch processing method described above.

The method uses the depth camera of the mobile device to judge whether an operator is present and whether the distance between the operator's hand and the projection surface is smaller than a preset sensing distance, and then determines the hovering position. It then acquires the spot distance between the first infrared point and the second infrared point on the operator's hand, and obtains the touch distance from the first infrared point to the hovering position according to the installation distance between the installation positions of the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position. Only when the touch distance is smaller than the preset hand width is the touch operation of the operator's hand responded to. In this way, a first-level identification of the hand-to-surface distance is performed with the depth camera of the mobile device, a second-level identification of the first and second infrared points on the operator's hand yields the spot distance and then the touch distance, and a third-level identification against the preset hand width finally gates an accurate response to the operator's touch. The projector needs no high-performance CPU or additional RAM, so touch interaction between the user and the projector is realized with high accuracy and without a large increase in hardware cost.

Drawings

Fig. 1 is a schematic hardware configuration diagram of a projector according to an alternative embodiment of the present application;

fig. 2 is a schematic flowchart of a projection touch processing method according to the present application;

fig. 3 is a scene schematic diagram of a projection touch system applied to the projection touch processing method of the present application;

fig. 4 is a schematic view of a scene in which the mobile device and the projector respectively project a first infrared point and a second infrared point that image on the hand of an operator, according to an embodiment of the projection touch processing method;

fig. 5 is a scene diagram of an implementation form of a human skeleton key point.

The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.

Detailed Description

It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.

In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning by themselves. Thus, "module", "component" or "unit" may be used mixedly.

The implementation device of the projection touch processing method based on the mobile device may be a projector, and the projector may include: a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display screen (Display) and a touch screen; optionally, the user interface 1003 may further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.

Optionally, the projector may further include a camera, RF (Radio Frequency) circuitry, sensors, audio circuitry, a WiFi module, a second infrared emitter, and so on. The sensors include, for example, light sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display screen according to the brightness of ambient light, and a proximity sensor that turns off the display screen and/or the backlight when the projector moves away from the user.

Those skilled in the art will appreciate that the projector configuration shown in fig. 1 does not constitute a limitation of the projector, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.

As shown in fig. 1, the memory 1005, as a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and a projection touch processing program.

In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with it; the user interface 1003 is mainly used for connecting to a client (user side) and performing data communication with it; and the processor 1001 may be configured to invoke the projection touch processing program stored in the memory 1005 and perform the following operations:

establishing a wireless connection between the mobile device and the projector, and detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector;

if an operator is present on the side of the projection surface facing the projector, detecting, based on the depth camera, whether the distance between the operator's hand and the projection surface is smaller than a preset sensing distance;

if the distance between the hand of the operator and the projection surface is smaller than the preset sensing distance, acquiring the hovering position of the hand of the operator on the projection surface;

controlling the first infrared emitter and the second infrared emitter to respectively project a first infrared point and a second infrared point to the hovering position, wherein the first infrared point and the second infrared point coincide at the hovering position on the projection surface;

acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand based on the infrared camera of the mobile device;

acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the installation positions of the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position;

and if the touch distance is smaller than the preset hand width, responding to the touch operation of the operator's hand.

In this application, a mobile device (such as a smartphone) cooperates with the projector over a communication link. The depth camera of the mobile device and an AI algorithm roughly determine that the hand of an operator in front of the projection surface is close to, or touching, the projection surface, and the hovering position at which the operator's hand would touch the projection surface is computed on the mobile device. The mobile device transmits this information to the projector via Bluetooth or Wi-Fi, so the projector learns that the operator's hand intends to touch the projection surface and obtains the hovering position of the operator's hand. The first infrared emitter of the mobile device and the second infrared emitter of the projector are then controlled to project a first infrared point and a second infrared point, respectively, to the hovering position on the projection surface, where the two points coincide. Next, the spot distance between the first infrared point and the second infrared point on the operator's hand is acquired based on the infrared camera of the mobile device. Then the touch distance from the first infrared point to the hovering position is obtained according to the installation distance between the installation positions of the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position. When the touch distance is smaller than the preset hand width, the operator's hand is touching the hovering position on the projection surface with very high probability, and the projector responds to the touch operation of the operator's hand.

The application provides a projection touch processing method based on a mobile device. The method is applied to a projection touch system that includes a mobile device and a projector communicatively connected to each other; the shooting direction of the mobile device is consistent with the projection direction of the projector, and both face the projection surface; the mobile device includes a depth camera, a first infrared emitter and an infrared camera, and the projector includes an optical engine and a second infrared emitter. Referring to fig. 2, the projection touch processing method includes the following steps:

step S10, detecting whether an operator exists on the side of the projection surface facing the projector based on the depth camera of the mobile equipment, wherein the mobile equipment is arranged in parallel with the projector and is parallel to the projection surface;

Specifically, step S10 is: establishing a wireless connection between the mobile device and the projector, and detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector.

the mobile device can be a smart phone, the mobile device and the projector are wirelessly connected through Bluetooth or wifi, and data interaction can be carried out between the mobile device and the projector. The mobile device is equipped with a depth camera that can detect the approximate distance of a person or object in the shooting direction, typically to a depth determination of the order of 10 cm. The distance measurement principle of the depth camera at least comprises a binocular matching method, a double RGB camera (red, green and blue three-primary color camera) and an optional lighting system are utilized, the binocular matching adopts a triangulation principle, namely, the difference (namely parallax) between horizontal coordinates of a target point imaged in left and right two views imaged by the two RGB cameras is in inverse proportion to the distance of an imaging plane of the target point, and then depth information is obtained based on the inverse proportion relation, namely, an image processing technology is utilized, and matching points are obtained by searching the same characteristic points in the two images, so that the depth value is obtained.

Since the depth camera includes RGB cameras, and both the shooting direction of the depth camera of the mobile device and the projection direction of the projector face the projection surface, the depth camera can be used to determine whether a user intending to perform touch operations on the projection surface (i.e., an operator) is present in front of it; this can be judged by recognizing a human skeleton key-point frame.

Specifically, the step in S10 of detecting, based on the depth camera of the mobile device, whether an operator is present on the side of the projection surface facing the projector includes:

Step A1, detecting, based on the depth camera of the mobile device, whether a human skeleton key-point frame is present on the side of the projection surface facing the projector;

the human skeleton key point frame is a frame which is formed by splicing skeleton key points into skeleton key points similar to match people according to the human body structure characteristics, and the human skeleton key point frame can comprise the following skeleton key points: crown, neck, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hip, right hip, left knee, right knee, left ankle, right ankle, as shown in fig. 5.

Step A2, if a human skeleton key-point frame is present, identifying whether the human skeleton key-point frame exhibits an arm-indicating action;

Step A3, if the arm-indicating action is present, determining that an operator is present.

If a human skeleton key-point frame is detected on the side of the projection surface facing the projector, a user who may want to perform touch operations on the projection surface is present in front of it, i.e., a human-shaped operator profile exists. In this case, it can further be recognized whether the human skeleton key-point frame exhibits an arm-indicating action; if it does, it is determined that an operator is present, and if it does not, it is determined that no operator is present.

An arm-indicating action requires the skeleton key points to satisfy a first condition, a second condition and a third condition simultaneously (a code sketch checking the three conditions follows this list):

The first condition is: a first line from the neck key point to the left shoulder key point and a second line from the left shoulder key point to the left elbow key point are not at a right angle (a right angle would mean the first line is perpendicular to the second and the operator's left arm is hanging by the body, not raised);

The second condition is: the second line from the left shoulder key point to the left elbow key point is not parallel to a third line from the left elbow key point to the left wrist key point (parallel lines would mean the operator's left arm is merely raised flat, with no touch action);

The third condition is: the first line, the second line and the third line are not in the same plane (indicating that the operator's arm is bent out of plane and is likely raising the left arm to click the projection surface);

or, alternatively:

The first condition is: a first line from the neck key point to the right shoulder key point and a second line from the right shoulder key point to the right elbow key point are not at a right angle (a right angle would mean the first line is perpendicular to the second and the operator's right arm is hanging by the body, not raised);

The second condition is: the second line from the right shoulder key point to the right elbow key point is not parallel to a third line from the right elbow key point to the right wrist key point (parallel lines would mean the operator's right arm is merely raised flat, with no touch action);

The third condition is: the first line, the second line and the third line are not in the same plane (indicating that the operator's arm is bent out of plane and is likely raising the right arm to click the projection surface).
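A minimal sketch of checking the three conditions on 3D skeleton key points is given below; the numpy representation, the angle tolerance and the epsilon thresholds are illustrative assumptions, not part of this application:

```python
import numpy as np

def is_arm_indicating(neck, shoulder, elbow, wrist,
                      angle_tol_deg=10.0, eps=1e-3):
    """Check the three arm-indication conditions on one arm's 3D key points (metres)."""
    v1 = shoulder - neck    # first line: neck -> shoulder
    v2 = elbow - shoulder   # second line: shoulder -> elbow
    v3 = wrist - elbow      # third line: elbow -> wrist

    # Condition 1: first and second lines are NOT at a right angle
    # (a right angle would mean the arm hangs by the body, not raised).
    cos12 = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle12 = np.degrees(np.arccos(np.clip(cos12, -1.0, 1.0)))
    if abs(angle12 - 90.0) < angle_tol_deg:
        return False

    # Condition 2: second and third lines are NOT parallel
    # (parallel would mean the arm is merely raised flat, no touch action).
    if np.linalg.norm(np.cross(v2, v3)) < eps:
        return False

    # Condition 3: the three lines are NOT coplanar; a scalar triple
    # product far from zero means the arm is bent out of plane.
    if abs(np.dot(v1, np.cross(v2, v3))) < eps:
        return False

    return True

# Illustrative key points: an arm raised forward and bent out of plane.
print(is_arm_indicating(np.array([0.0, 1.5, 0.0]),      # neck
                        np.array([0.2, 1.45, 0.0]),     # shoulder
                        np.array([0.35, 1.3, 0.2]),     # elbow
                        np.array([0.45, 1.35, 0.45])))  # wrist -> True
```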

step S20, if an operator exists, detecting whether the distance between the hand of the operator and the projection plane is smaller than a preset sensing distance based on the depth camera;

Specifically, step S20 is: if an operator is present on the side of the projection surface facing the projector, detecting, based on the depth camera, whether the distance between the operator's hand and the projection surface is smaller than the preset sensing distance.

further, after the projection surface, the projector, and the mobile device are installed, before step S10, the projection touch processing method includes:

Step B1, acquiring the installation distance between the installation position of the mobile device and the installation position of the projector;

Step B2, acquiring the projection distance from the installation position of the mobile device to each preset position point on the projection surface, and establishing a mapping table between the preset position points on the projection surface and their projection distances from the mobile device.

As shown in fig. 3, the projection surface, the projector and the mobile device are arranged and installed correspondingly; as shown in fig. 4, the installation distance is L and the projection distance is S (in fig. 4 the preset position point is point A, whose projection distance is S). A number of preset position points can be planned in advance on the projection surface, for example position points arranged in an array; the more preset position points are set, the more accurate the estimate, based on the mapping table of preset position points and projection distances, of the projection distance for any position on the projection surface. For example, if the preset position points form an array of 10 rows by 10 columns, a mapping table from the mobile device to these 100 preset position points and their projection distances can be established; once a preset position point is known, its projection distance S is found by querying the table.
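A minimal sketch of building and querying such a mapping table, assuming a 10 × 10 grid and a hypothetical measurement helper supplied at installation time:

```python
ROWS, COLS = 10, 10  # assumed grid of preset position points

def calibrate_projection_distances(measure_distance_m):
    """Build the mapping table once at installation time; measure_distance_m
    is a hypothetical helper returning the measured distance in metres from
    the mobile device's installation position to grid point (row, col)."""
    return {(r, c): measure_distance_m(r, c)
            for r in range(ROWS) for c in range(COLS)}

def projection_distance(table, hover_rc):
    """Look up S for the preset point nearest the hover position (row, col)."""
    nearest = min(table, key=lambda p: (p[0] - hover_rc[0]) ** 2
                                     + (p[1] - hover_rc[1]) ** 2)
    return table[nearest]

# Usage with a fake flat-wall measurement of 2 m everywhere:
table = calibrate_projection_distances(lambda r, c: 2.0)
print(projection_distance(table, (3.4, 7.8)))  # -> 2.0
```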

Step S30, if the distance is smaller than the preset sensing distance, the hovering position of the hand of the operator on the projection surface is obtained;

specifically, step S30 is: if the distance between the hand of the operator and the projection surface is smaller than the preset sensing distance, acquiring the hovering position of the hand of the operator on the projection surface;

Moreover, the depth camera can estimate whether the operator's hand is close to the projection surface. Specifically, when an operator is present on the side of the projection surface facing the projector, whether the distance between the operator's hand and the projection surface is smaller than the preset sensing distance is detected based on the depth camera. If it is, the operator's hand is close to the projection surface, but it still cannot be finally determined that the hand touches it; the hovering position of the operator's hand on the projection surface is therefore further acquired, the hovering position being the orthographic projection of the operator's hand onto the projection surface.

Step S40, controlling a first infrared emitter of the mobile device and a second infrared emitter of the projector to respectively project a first infrared point and a second infrared point to the hovering position;

Specifically, step S40 is: controlling the first infrared emitter and the second infrared emitter to respectively project the first infrared point and the second infrared point to the hovering position, wherein the first infrared point and the second infrared point coincide at the hovering position on the projection surface.

after the hovering position of the hand of the operator on the projection surface is determined, controlling a first infrared emitter of the mobile device and a second infrared emitter of the projector to project a first infrared point and a second infrared point to the hovering position of the projection surface respectively, wherein the hovering positions of the first infrared point and the second infrared point on the projection surface are overlapped, and as shown in fig. 4, the hovering positions a of the first infrared point and the second infrared point on the projection surface are overlapped.

Step S50, acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand;

Specifically, step S50 is: acquiring the spot distance between the first infrared point and the second infrared point on the operator's hand based on the infrared camera of the mobile device.

the method comprises the following steps before the step of detecting whether an operator exists on the side of the projection surface facing the projector based on the depth camera of the mobile device in step S10:

Step C, establishing a wireless connection between the mobile device and the projector; controlling the first infrared emitter and the second infrared emitter to emit, one by one, to the preset position points on the projection surface; and acquiring, for each preset position point and at each reference distance within the preset sensing distance from the projection surface, an image light spot distance and a real-scene light spot distance, wherein the image light spot distance is measured from an image captured by the infrared camera of the mobile device and the real-scene light spot distance is measured between the two real light spots, so as to establish a second mapping table of the image light spot distance and the real-scene light spot distance for each preset position point on the projection surface at each reference distance within the preset sensing distance from the projection surface.

for example, the preset sensing distance is 10cm, the reference distances within the preset sensing distance include 2cm, 4cm, 6cm, 8cm and 10cm, the preset position points include 16 points distributed on the projection surface in an array of 4 × 4, real scene light spot distances at five reference distances of 2cm, 4cm, 6cm, 8cm and 10cm when the first infrared emitter and the second infrared emitter emit to the 16 preset position points are actually measured respectively, and the projection surface or another projection surface can be placed at the five reference distances to display the first infrared light spot and the second infrared light spot. And simultaneously, image light spot distances at five reference distances of 2cm, 4cm, 6cm, 8cm and 10cm when the first infrared emitter and the second infrared emitter emit to 16 preset position points are respectively obtained through an infrared camera of the mobile equipment. The second mapping table is built with image spot distances and live-action spot distances corresponding to five reference distances with one preset position point, i.e. in this example the second mapping table comprises records of 16 preset position points at image spot distances and live-action spot distances of five reference distances, the mapping table comprises 80 records.

Meanwhile, step S50 includes:

Step D1, acquiring, based on the infrared camera of the mobile device, the current image light spot distance between the first infrared point and the second infrared point on the operator's hand;

Step D2, querying, in the second mapping table, the current real-scene light spot distance associated with the current image light spot distance at the preset position point corresponding to the hovering position, and taking the current real-scene light spot distance as the spot distance.

Because the operator's hand hovers in front of the projection surface, the first infrared point and the second infrared point are projected onto the operator's hand. The infrared camera of the mobile device therefore captures an infrared image of the operator's hand, and the light spots in the infrared image are measured to obtain the current image light spot distance between the first and second infrared points. The second mapping table is then looked up to obtain the current real-scene light spot distance corresponding to the current image light spot distance, which is taken as the spot distance. As shown in fig. 4, the spot distance is line segment DE.

Step S60, acquiring a touch distance from the first infrared point to the hovering position according to the installation distance between the mobile device and the projector, the light spot distance and the actual projection distance from the mobile device to the hovering position;

Specifically, step S60 is: acquiring the touch distance from the first infrared point to the hovering position according to the installation distance between the installation positions of the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position.

Specifically, referring to fig. 4, the installation distance (line segment BC) is denoted L, the spot distance (line segment DE) is denoted I, the actual projection distance (line segment AB) is denoted S, and the touch distance (line segment AD) is denoted X.

Based on the formula I/L = X/S, the touch distance X = (I/L) · S is obtained.

Since triangle ADE is similar to triangle ABC,

we have AD/AB = DE/BC, i.e., X/S = I/L, and therefore the touch distance X = (I/L) · S.
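Written out in full, the similarity argument is:

```latex
\triangle ADE \sim \triangle ABC
\;\Longrightarrow\;
\frac{AD}{AB} = \frac{DE}{BC}
\;\Longleftrightarrow\;
\frac{X}{S} = \frac{I}{L}
\;\Longrightarrow\;
X = \frac{I}{L}\, S .
```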

Step S70, if the touch distance is smaller than the preset hand width, responding to the touch operation of the operator's hand.

The preset hand width is the typical thickness of a human hand, for example 2 cm. If the touch distance is smaller than the preset hand width, the operator's hand is against, or essentially against, the projection surface and is touching it with high probability, so the touch operation of the operator's hand at the hovering position on the projection surface is responded to.
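A hedged numeric sketch of this final check (the values of L, I and S below are illustrative assumptions, not measurements from this application):

```python
PRESET_HAND_WIDTH_M = 0.02  # assumed ~2 cm hand thickness

def touch_distance_m(install_l_m, spot_i_m, projection_s_m):
    """X = (I / L) * S from the similar-triangles relation above."""
    return (spot_i_m / install_l_m) * projection_s_m

# With L = 0.30 m and S = 2.0 m:
print(touch_distance_m(0.30, 0.012, 2.0))  # 0.08 m  -> hand still hovering
print(touch_distance_m(0.30, 0.002, 2.0) < PRESET_HAND_WIDTH_M)  # True -> respond
```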

In this embodiment, the depth camera of the mobile device judges whether an operator is present and whether the distance between the operator's hand and the projection surface is smaller than the preset sensing distance; the hovering position is then determined, and the spot distance between the first infrared point and the second infrared point on the operator's hand is acquired. From the installation distance between the installation positions of the mobile device and the projector, the spot distance, and the actual projection distance from the installation position of the mobile device to the hovering position, the touch distance from the first infrared point to the hovering position is obtained, and the touch operation of the operator's hand is responded to only when the touch distance is smaller than the preset hand width. A first-level identification of the hand-to-surface distance is thus performed by the depth camera of the mobile device, a second-level identification of the first and second infrared points on the operator's hand yields the spot distance and then the touch distance, and a third-level identification against the preset hand width finally gates an accurate response to the operator's touch. The projector needs no high-performance CPU or additional RAM, so touch interaction between the user and the projector is realized with high accuracy and without a large increase in hardware cost.

Further, in another embodiment of the projection touch processing method of the present application, before the step of responding to the touch operation of the operator's hand in step S70, the method further includes:

Step E1, controlling the projector to project a preset ripple area at the hovering position, wherein the preset ripple area is centered on the hovering position and continuously emits transparent ripples outward.

The projected content inside the preset ripple area is distorted, but its color is unchanged, producing a transparent water-ripple effect. The preset ripple area reminds the operator that the projection surface has been touched and that the touch operation corresponding to the hovering position is about to start, preventing the operator from operating on the projection surface repeatedly.

Specifically, before the step of responding to the touch operation of the hand of the operator at step S70, the projection touch processing method further includes:

Step E2, after determining that the touch distance is smaller than the preset hand width, controlling the projector to project the preset ripple area at the hovering position, wherein the preset ripple area is centered on the hovering position and continuously emits transparent ripples outward;

Step E3, if the operator's hand moves within a preset time unit, re-executing the step of acquiring the hovering position of the operator's hand on the projection surface, so as to take the hovering position after the movement as the new hovering position; if the operator's hand does not move within the preset time unit, responding to the touch operation of the operator's hand.

After the touch distance is determined to be smaller than the preset hand width, the operator's hand is most likely touching the projection surface, so the preset ripple area is projected at the hovering position to ask, in effect, whether the operator intends a touch operation at the current hovering position. The hand position is then tracked with the depth camera of the mobile device. If the hand moves within a preset time unit (for example, 1 second), the operator does not intend a touch operation at that hovering position; the step of acquiring the hovering position of the operator's hand on the projection surface is therefore executed again, the hovering position after the movement is taken as the new hovering position, and the subsequent steps continue from there. If the hand does not move within the preset time unit, the operator does intend a touch operation at the current hovering position, and the touch operation of the operator's hand is responded to. The preset ripple area thus verifies the operator's touch intention once more, further improving the accuracy of responding to the operator's touch operations.
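A minimal sketch of this confirmation loop; every callable passed in below is a hypothetical placeholder for the application's own detection, projection, and response logic:

```python
import time

PRESET_TIME_UNIT_S = 1.0  # assumed value of the preset time unit

def confirm_touch(get_hover_position, project_ripple_at, hand_moved,
                  respond_to_touch):
    """Project the ripple area, wait one time unit; if the hand moved,
    restart from the new hovering position, otherwise fire the touch."""
    hover = get_hover_position()
    while True:
        project_ripple_at(hover)
        time.sleep(PRESET_TIME_UNIT_S)
        if not hand_moved():
            respond_to_touch(hover)
            return
        hover = get_hover_position()  # hand moved: take the new hovering position
```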

In addition, in a scene where several people stand in front of the projection surface, in order to improve the response speed to the operator's touch operations and to save the computing resources consumed by image recognition on the depth camera of the mobile device, after detecting that an operator is present on the side of the projection surface facing the projector, the method further includes:

and F, carrying out front authority verification of face recognition or gesture recognition on the operator, and if the front authority verification is passed, executing the step of detecting whether the distance between the hand of the operator and the projection plane is smaller than a preset sensing distance based on the depth camera.

Face recognition or mid-air gesture recognition determines whether an operator on the side of the projection surface facing the projector has permission for touch interaction with the projection surface (i.e., the pre-authorization verification). Only the hand of an operator who passes the pre-authorization verification is considered; that is, whether the distance between the hand and the projection surface is smaller than the preset sensing distance is detected, based on the depth camera, only for such an operator.

The present application further provides a projection touch processing device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the projection touch processing method described above.

The present application further provides a readable storage medium on which a computer program is stored; the computer program, when executed by a processor, implements the steps of the projection touch processing method described above.

It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the recitation of an element by the phrase "comprising a" does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element. Further, similarly named elements, features, or components in different embodiments of the disclosure may have the same meaning or different meanings; the particular meaning is determined by its interpretation in that embodiment or by the context of that embodiment.

It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "upon," "when," or "in response to a determination," depending on the context. Also, as used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C." An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.

It should be understood that, although the steps in the flowcharts of the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and not necessarily in sequence; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.

It should be noted that step numbers such as S10 and S20 are used herein for the purpose of more clearly and briefly describing the corresponding content, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S20 first and then S10 in specific implementation, which should be within the scope of the present application.

The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.

Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.

While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
