User signal processing method and device for executing same

Document No.: 1409658  Publication date: 2020-03-06

Reading note: This technique, "User signal processing method and device for executing same", was designed and created by 金勇局, 赵成来, 金勇振, and 金峻世 on 2017-12-12. Its main content is as follows: The invention relates to a user signal processing method and a device for executing the method. The user signal processing method can include: a step in which the image processing device identifies a user signal from a plurality of defined user signals; and a step in which the image processing device controls the omnidirectional image processing device in accordance with a control signal corresponding to the user signal, each of the plurality of user signals corresponding to a respective one of a plurality of different hand images.

1. A method for processing a user signal, comprising:

a step in which the image processing device identifies a user signal from a plurality of defined user signals; and

a step of controlling the omnidirectional image processing device by the image processing device according to a control signal corresponding to the user signal,

each of the plurality of user signals corresponds to a respective one of a plurality of different hand images.

2. The user signal processing method of claim 1,

the user signal corresponds to one of the plurality of different hand images that matches a hand image to be recognized,

the hand image to be recognized being included in an omnidirectional image photographed by the omnidirectional image processing device.

3. The user signal processing method of claim 2,

the hand image to be recognized is a hand image for a specific object in the omnidirectional image,

the control signal corresponding to the hand image to be recognized is used to instruct tracking shooting of the object or shooting of a still image of the object.

4. The user signal processing method of claim 2,

the hand image to be recognized is a hand image defined for stopping or resuming the photographing of the omnidirectional image,

and the control signal corresponding to the hand image to be recognized instructs stopping or resuming the photographing of the omnidirectional image.

5. The user signal processing method of claim 2,

the hand image to be recognized is a hand image defined for changing a photographing angle of view of the omnidirectional image,

and the control signal corresponding to the hand image to be recognized controls the photographing angle of view.

6. An image processing apparatus for processing a user signal, the image processing apparatus comprising:

a communication section for communicating with an external device; and

a processor operatively connected with the communication section,

the processor is configured to identify a user signal from a plurality of defined user signals and to control the omnidirectional image processing apparatus according to a control signal corresponding to the user signal,

each of the plurality of user signals corresponds to a respective one of a plurality of different hand images.

7. The image processing apparatus according to claim 6,

the user signal corresponds to one of the plurality of different hand images that matches a hand image to be recognized,

the hand image to be recognized is included in the omnidirectional image photographed by the omnidirectional image processing apparatus.

8. The image processing apparatus according to claim 7,

the hand image to be recognized is a hand image for a specific object in the omnidirectional image,

the control signal corresponding to the hand image to be recognized is used to instruct tracking shooting of the object or shooting of a still image of the object.

9. The image processing apparatus according to claim 7,

the hand image to be recognized is a hand image defined for stopping or resuming the photographing of the omnidirectional image,

and the control signal corresponding to the hand image to be recognized instructs stopping or resuming the photographing of the omnidirectional image.

10. The image processing apparatus according to claim 7,

the hand image to be recognized is a hand image defined for changing a photographing angle of view of the omnidirectional image,

and the control signal corresponding to the hand image to be recognized controls the photographing angle of view.

11. A computer-readable recording medium on which a computer program for executing the user signal processing method according to any one of claims 1 to 5 is recorded.

Technical Field

The present invention relates to a user signal processing method and an apparatus for performing the same. More particularly, the present invention relates to a method and apparatus for controlling an image processing apparatus more conveniently by recognizing a user signal and performing an operation according to the user signal.

Background

An omnidirectional (omni) imaging system is a system that records image information in all directions (360 degrees) from a specific viewpoint. Because it obtains a field-of-view image with a much wider viewing angle than a conventional imaging system, its range of applications has recently been expanding in research fields such as computer vision and mobile robotics, and in practical fields such as surveillance systems, virtual reality systems, pan-tilt-zoom (PTZ) cameras, and video conferencing.

Various methods can be used to obtain an omnidirectional image. For example, an omnidirectional image can be generated by rotating a single camera about an optical axis (optical-axis) that satisfies a single viewpoint (single viewpoint) and stitching the obtained images. Alternatively, a plurality of cameras may be arranged in a ring structure and the images obtained by the respective cameras combined. To obtain such omnidirectional images, a user generates an omnidirectional image using an omnidirectional image processing apparatus (also called an omnidirectional camera or 360-degree camera).

When an omnidirectional image is photographed with an omnidirectional image processing apparatus, a method is required for capturing the omnidirectional image conveniently by controlling the omnidirectional image processing apparatus more quickly.

Disclosure of Invention

Technical problem

The present invention aims to solve all the problems described above.

Another object of the present invention is to provide a user signal to an image processing apparatus and to more conveniently control the image processing apparatus according to the user signal.

Another object of the present invention is to allow a user device to check the photographed omnidirectional image in real time and to control the photographing of the omnidirectional image more conveniently.

Technical solution

The present invention for achieving the above object has the following representative configurations.

According to an aspect of the present invention, a user signal processing method can include: a step in which the image processing device identifies a user signal from a plurality of defined user signals; and a step in which the image processing device controls the omnidirectional image processing device in accordance with a control signal corresponding to the user signal, wherein each of the plurality of user signals corresponds to a respective one of a plurality of different hand images.

According to another aspect of the present invention, an image processing apparatus for processing a user signal may include: a communication unit for communicating with an external apparatus; and a processor operatively connected to the communication unit, the processor being configured to identify a user signal from a plurality of defined user signals and to control the omnidirectional image processing apparatus in accordance with a control signal corresponding to the user signal, wherein each of the plurality of user signals corresponds to a respective one of a plurality of different hand images.

Effects of the invention

According to the present invention, a user signal is provided to an image processing apparatus, and the image processing apparatus is more conveniently controlled according to the user signal.

In addition, the photographed omnidirectional image can be checked in real time through the user device, and the photographing of the omnidirectional image can be controlled more conveniently.

Drawings

Fig. 1 is a conceptual diagram illustrating an omnidirectional image processing apparatus according to an embodiment of the present invention.

Fig. 2 is a conceptual diagram illustrating characteristics of a plurality of image capturing units located in an omnidirectional image processing apparatus according to an embodiment of the present invention.

Fig. 3 is a conceptual diagram illustrating imaging lines of a plurality of image capturing units according to an embodiment of the present invention.

Fig. 4 is a conceptual diagram showing imaging lines of a plurality of image capturing units according to an embodiment of the present invention.

Fig. 5 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

Fig. 6 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

Fig. 7 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

Fig. 8 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

Fig. 9 is a conceptual diagram illustrating a method for confirming an omnidirectional image according to an embodiment of the present invention.

Fig. 10 is a conceptual diagram illustrating control of an omnidirectional video processing apparatus by a user apparatus according to an embodiment of the present invention.

Detailed Description

In describing the present invention in detail, reference is made to the accompanying drawings, which show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to fully practice the invention. The various embodiments of the invention are different from each other but are not necessarily mutually exclusive. For example, the specific shape, structure, and characteristics described in this specification may be changed from one embodiment to another without departing from the spirit and scope of the present invention. Moreover, the position or arrangement of individual components in each embodiment may be changed without departing from the spirit and scope of the present invention. The description set forth below is therefore not to be taken in a limiting sense, and the scope of the present invention includes the scope of the appended claims, along with the full scope of equivalents to which such claims are entitled. In the drawings, like reference characters designate the same or similar components throughout.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art to which the present invention pertains can easily carry out the present invention.

Hereinafter, the image processing apparatus according to the embodiment of the invention may include an omnidirectional image processing apparatus. The omnidirectional image processing apparatus may include an omnidirectional camera (360-degree camera) that can capture an omnidirectional image (or 360-degree image).

In addition, the image information and the video information disclosed in the embodiments of the present invention may include an omnidirectional image (or a 360-degree image).

Fig. 1 is a conceptual diagram illustrating an omnidirectional image processing apparatus according to an embodiment of the present invention.

Fig. 1 discloses a structure of an omnidirectional image processing apparatus.

Referring to fig. 1, the omnidirectional image processing apparatus 100 has a wearable structure with a shape similar to a necklace worn on the neck of a user. As shown in fig. 1, the omnidirectional image processing apparatus 100 may have the shape of a necklace with one side open or with no side open. Hereinafter, in the embodiment of the present invention, it is assumed that the omnidirectional image processing apparatus 100 has a U-shape with one side open. The U-shaped omnidirectional image processing apparatus 100 is worn on the neck of a user as a wearable device to capture an omnidirectional image.

In the embodiment of the present invention, for convenience of explanation, it is assumed that the omnidirectional image processing apparatus 100 is in a necklace shape (or a necklace shape with one side open, or a U-shape) and is worn on the neck of the user. However, the omnidirectional image processing apparatus 100 is not simply shaped to be worn around the neck of the user. For example, the omnidirectional image processing apparatus 100 can obtain omnidirectional images by being installed in various shapes that can be hooked or attached to different body parts of the user or external objects (or individuals)/devices/structures or the like.

The user wears the omnidirectional image processing apparatus 100 implemented as a wearable device on the neck, and obtains a plurality of images for generating an omnidirectional image in a state where both hands are free.

The omnidirectional image processing apparatus 100 includes a plurality of image capturing units. The plurality of image capturing units are provided in the omnidirectional image processing apparatus at a specific pitch (or a predetermined pitch), and each individually captures an image according to its own angle of view and imaging line. The positions of the plurality of image capturing units may be fixed in the omnidirectional image processing apparatus 100, or the plurality of image capturing units may be movable so that their positions can be changed.

For example, the omnidirectional image processing apparatus 100 may include three image capturing units, each of which captures the omnidirectional image at a certain angle of view (e.g., 120 degrees to 180 degrees). The three image capturing units are the first image capturing unit 110, the second image capturing unit 120, and the third image capturing unit 130.

For convenience of explanation, a configuration in which three image capturing units are included in the omnidirectional image processing apparatus 100 is disclosed below. However, instead of three, a different number of (for example, 2, 4, 5, or 6) image capturing units may be included in the omnidirectional image processing apparatus 100 to form an omnidirectional image, and such configurations are also included in the scope of the claims of the present invention.

The first image capturing unit 110, the second image capturing unit 120, and the third image capturing unit 130 capture images according to their angles of view. On the same time resource, the first image capturing unit 110 generates a first image, the second image capturing unit 120 generates a second image, and the third image capturing unit 130 generates a third image. The first image capturing unit 110, the second image capturing unit 120, and the third image capturing unit 130 each have an angle of view of 120 degrees, and the first image, the second image, and the third image have overlapping imaging regions. The omnidirectional image processing apparatus 100 then stitches/corrects the first image, the second image, and the third image captured on the same time resource, thereby generating an omnidirectional image. The stitching and/or correction of the plurality of images may be executed by the omnidirectional image processing apparatus itself, or by a user device (e.g., a smartphone) capable of communicating with the omnidirectional image processing apparatus 100. That is, the additional image processing for the generated plurality of images may be executed by the omnidirectional image processing apparatus 100 and/or another image processing device (a smartphone, a personal computer, or the like).
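The stitching step described above can be illustrated with a deliberately simplified sketch (an illustration under assumptions, not the apparatus's actual algorithm): each capturing unit is modeled as covering a range of angles around the wearer, and pixel values in the overlapping angular ranges are averaged when the captures are merged into one 360-degree result.

```python
# Illustrative sketch (not from the patent): merging three captures whose
# angular ranges overlap, by averaging pixel values in the overlap regions.
# The camera layout and the angle -> pixel-value model are assumptions.

def merge_captures(captures):
    """captures: list of dicts mapping angle (deg) -> pixel value."""
    merged = {}
    for cap in captures:
        for angle, value in cap.items():
            merged.setdefault(angle % 360, []).append(value)
    # Average contributions where two fields of view overlap (stitching).
    return {a: sum(v) / len(v) for a, v in sorted(merged.items())}

# Three units, each covering 150 degrees -> 30-degree overlaps.
cam1 = {a: 1.0 for a in range(0, 150)}
cam2 = {a: 2.0 for a in range(120, 270)}
cam3 = {a: 3.0 for a in range(240, 390)}   # wraps past 360 degrees
omni = merge_captures([cam1, cam2, cam3])
```

In a real apparatus the overlap regions are matched and blended per pixel rather than averaged per angle, but the structure is the same: separate captures, shared overlap, one merged omnidirectional result.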

Hereinafter, specific features of the omnidirectional image processing apparatus and the omnidirectional image generating method are described in more detail.

Fig. 2 is a conceptual diagram illustrating characteristics of a plurality of image capturing units located in an omnidirectional image processing apparatus according to an embodiment of the present invention.

Fig. 2 discloses the features of a plurality of image capturing units of a U-shaped omnidirectional image processing apparatus. The position of the image capturing section in fig. 2 is exemplary. The plurality of image capturing units are respectively located at different positions on the omnidirectional image processing apparatus to capture a plurality of images for generating the omnidirectional image.

The rear part of the omnidirectional image processing apparatus is shown at the upper end of fig. 2.

The first image capturing unit 210 and the second image capturing unit 220 included in the omnidirectional image processing apparatus are located at the curved portion of the omnidirectional image processing apparatus. Specifically, when the user wears the omnidirectional image processing apparatus on the neck as a wearable device, the first image capturing unit 210 and the second image capturing unit 220 are provided in the curved region that contacts the back of the neck. For example, the first image capturing unit 210 and the second image capturing unit 220 are disposed at a fixed distance from each other on either side of the point of maximum curvature (e.g., the middle portion of the U-shape) of the U-shaped omnidirectional image processing apparatus.

The first image capturing unit 210 captures an image of a region including a rear-left blind-spot region with reference to the user's line-of-sight (sight) direction. The second image capturing unit 220 captures an image of a region including a rear-right blind-spot region with reference to the user's line of sight. Specifically, the first image capturing unit 210 has a first angle of view and captures a region corresponding to the first angle of view, and the second image capturing unit 220 has a second angle of view and captures a region corresponding to the second angle of view. For example, the first angle of view and the second angle of view are 120 to 180 degrees.

When imaging is performed by the first image capturing unit 210 and the second image capturing unit 220, a first overlap region 215 is generated where the first angle of view and the second angle of view overlap. An omnidirectional image is then generated by stitching based on the overlap regions.

The lower end of fig. 2 shows the front of the omnidirectional image processing apparatus.

A third image capturing unit 230 is provided at the front of the omnidirectional image processing apparatus. Specifically, the third image capturing unit 230 is provided at a distal end portion (an end of the U-shape) of the omnidirectional image processing apparatus. When the user wears the omnidirectional image processing apparatus on the neck as a wearable device, the distal end portion of the U-shaped omnidirectional image processing apparatus is positioned in the user's front direction (the direction of the user's gaze). The omnidirectional image processing apparatus has a first distal end portion and a second distal end portion, and the third image capturing unit 230 is disposed at one of the two.

The third image capturing unit 230 captures images in the same direction as the user's line of sight, thereby imaging a region corresponding to the user's line of sight.

Specifically, the third image capturing unit 230 has a third angle of view and captures a region corresponding to the third angle of view. For example, the third angle of view is 120 degrees to 180 degrees. When imaging is performed by the third image capturing unit 230, a second overlap region 225 is generated where the first angle of view of the first image capturing unit 210 and the third angle of view of the third image capturing unit 230 overlap, and a third overlap region 235 is generated where the second angle of view of the second image capturing unit 220 and the third angle of view of the third image capturing unit 230 overlap.
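As a back-of-envelope illustration (the even 120-degree spacing and the formula below are simplifying assumptions, not taken from the patent, whose units sit at different heights and angles), the angular width of each overlap region for evenly spaced capturing units can be estimated as the angle of view minus the spacing:

```python
# Sketch under assumptions: three image capturing units spaced 120 degrees
# apart, each with field of view `fov_deg`; adjacent views then overlap by
# fov_deg - 120 degrees (when fov_deg exceeds 120).

def overlap_between_adjacent(fov_deg, spacing_deg=120):
    """Angular width (deg) of the overlap region between adjacent units."""
    return max(0, fov_deg - spacing_deg)

# A 150-degree angle of view leaves 30 degrees of overlap for stitching;
# exactly 120 degrees would leave no overlap at all.
```

This is why the angles of view quoted in the text (120 to 180 degrees) matter: anything above 120 degrees guarantees overlap regions on which the stitching can operate.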

When the omnidirectional image processing apparatus is worn on the neck as a wearable device, the first image capturing unit 210 and the second image capturing unit 220 are located relatively higher than the third image capturing unit 230 with respect to the ground, and the third image capturing unit 230 is located at one distal end portion.

In a conventional omnidirectional image processing apparatus, the plurality of image capturing units are located at the same height and at fixed angles, whereas the plurality of image capturing units of the omnidirectional image processing apparatus according to the embodiment of the present invention are at different angles and different heights. Therefore, the first overlap region 215, the second overlap region 225, and the third overlap region 235 of the plurality of images generated by the plurality of image capturing units differ from one another in size and shape.

Then, the omnidirectional image is generated by an image processing procedure (stitching/correction, etc.) applied, based on the first overlap region 215, the second overlap region 225, and the third overlap region 235, to the first image, the second image, and the third image generated by the first image capturing unit 210, the second image capturing unit 220, and the third image capturing unit 230, respectively.

The sizes of the first angle of view, the second angle of view, and the third angle of view may be set to be the same or different from one another, and both configurations are included in the embodiments and the scope of the claims of the present invention.

Fig. 3 is a conceptual diagram illustrating imaging lines of a plurality of image capturing units according to an embodiment of the present invention.

Fig. 3 shows the imaging lines of the plurality of image capturing units provided in the omnidirectional image processing apparatus. Assuming that the ground is parallel to the XZ plane formed by the X axis and the Z axis, an imaging line is defined as a line that vertically passes through the center of the lens of each of the plurality of image capturing units in the space represented by the X axis, the Y axis, and the Z axis.

In a conventional omnidirectional image processing apparatus, a plurality of image capturing units are disposed at the same height at a predetermined angle (for example, 120 degrees). In this case, the plurality of imaging lines included in the plurality of image capturing units of the conventional omnidirectional image processing apparatus are lines which are parallel to the ground (or XZ plane) and have a certain angle (for example, 120 degrees) between the plurality of imaging lines.

As described above, in the omnidirectional image processing apparatus according to the embodiment of the present invention, the heights of the plurality of image capturing units (or the positions where the plurality of image capturing units are realized) and the angles between the plurality of image capturing units (or the angles formed between the imaging lines) are different from each other at the time of image capturing. Therefore, the characteristics of the imaging line of the omnidirectional image processing apparatus according to the embodiment of the present invention are different from those of the conventional omnidirectional image processing apparatus.

The imaging lines of the plurality of image capturing units illustrated in fig. 3 serve to show the differences in characteristics (e.g., height, angle) between the imaging lines that result from the characteristics of the wearable device. The imaging lines shown in fig. 3 are those obtained when the user wearing the omnidirectional image processing apparatus does not move, or when the omnidirectional image processing apparatus is fixed in a specific state.

The upper end of fig. 3 discloses the imaging lines of the first image capturing unit 310 and the second image capturing unit 320.

The first image capturing unit 310 and the second image capturing unit 320 are disposed at a relatively higher position than the third image capturing unit 330. Assuming that the vertical direction of the user wearing the omnidirectional image processing apparatus is the Y-axis direction, in the structure of the wearable device worn on the neck, the curved portion (the curve/central portion of the U-shape) where the first image capturing unit 310 and the second image capturing unit 320 are located is relatively raised, and the leg portion (the end portion of the U-shape) where the third image capturing unit 330 is located is relatively lowered.

For example, the first imaging line 315 of the first image capturing unit 310 is parallel to the XZ plane and, at the Y-axis coordinate a, forms a 1st angle with the X axis, a 2nd angle with the Y axis, and a 3rd angle with the Z axis.

The second imaging line 325 of the second image capturing unit 320 is parallel to the XZ plane and, at the Y-axis coordinate a, forms a 4th angle with the X axis, a 5th angle with the Y axis, and a 6th angle with the Z axis.

Referring to the lower end of fig. 3, the third imaging line 335 of the third image capturing unit 330 is parallel to the XZ plane and, at the Y-axis coordinate b, forms a 7th angle with the X axis, an 8th angle with the Y axis, and a 9th angle with the Z axis. Here, b is a value less than a. The third imaging line 335 of the third image capturing unit 330 is parallel to the XZ plane and faces forward as viewed from the user (e.g., in a direction perpendicular to the XY plane).

That is, the first imaging line 315 and the second imaging line 325 have the same height with respect to the Y axis, and the third imaging line 335 is located relatively lower than the first and second imaging lines with respect to the Y axis. The first imaging line 315, the second imaging line 325, and the third imaging line 335 disclosed in fig. 3 are only one example of imaging lines having different characteristics; various other imaging lines may be defined to capture an omnidirectional image.

Fig. 4 is a conceptual diagram showing imaging lines of a plurality of image capturing units according to an embodiment of the present invention.

Fig. 4 discloses a plurality of imaging lines of the image capturing unit different from those in fig. 3. Similarly, in fig. 4, the ground is assumed to be parallel to the XZ plane formed by the X axis and the Z axis.

The upper end of fig. 4 shows the imaging lines of the first image capturing unit 410 and the second image capturing unit 420.

The first image capturing unit 410 and the second image capturing unit 420 are located at positions relatively higher than the third image capturing unit 430. Similarly, assuming that the user's vertical direction is the Y-axis direction, the omnidirectional image processing apparatus worn on the neck as a wearable device captures images with the curved portion (the curved portion of the U-shape) where the first image capturing unit 410 and the second image capturing unit 420 are located relatively raised, and the leg portion (the end portion of the U-shape) where the third image capturing unit 430 is located relatively lowered.

For example, the first imaging line 415 of the first image capturing unit 410 is parallel to the XZ plane and, at the Y-axis coordinate a, forms a 1st angle with the X axis, a 2nd angle with the Y axis, and a 3rd angle with the Z axis.

The second imaging line 425 of the second image capturing unit 420 is parallel to the XZ plane and, at the Y-axis coordinate a, forms a 4th angle with the X axis, a 5th angle with the Y axis, and a 6th angle with the Z axis.

The lower end of fig. 4 discloses an imaging line of the third image capturing section 430.

The third imaging line 435 of the third image capturing unit 430 is not parallel to the XZ plane and, with the Y-axis coordinate b as a starting point, forms a 7th angle with the X axis, an 8th angle with the Y axis, and a 9th angle with the Z axis.

Since the third image capturing unit 430 is located at a distal end portion of the omnidirectional image processing apparatus, its imaging line is not parallel to the XZ plane and forms an angle (e.g., 0 to 30 degrees) with the XZ plane.
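The tilted imaging line can be illustrated numerically. The sketch below (assumed geometry for illustration; not part of the patent) builds a direction vector from an azimuth within the XZ plane and a tilt angle with respect to the XZ plane, then recovers the angle the line forms with the ground plane:

```python
import math

# Sketch under assumed geometry: Y is the vertical axis, X and Z span the
# ground plane (XZ). A direction vector for an imaging line that makes a
# tilt angle `tilt_deg` with the XZ plane and azimuth `azimuth_deg` within
# that plane. The first/second units would use tilt_deg = 0 (parallel to
# the XZ plane); the third unit a tilt of, e.g., 0 to 30 degrees.

def imaging_direction(azimuth_deg, tilt_deg):
    az, tilt = math.radians(azimuth_deg), math.radians(tilt_deg)
    x = math.cos(tilt) * math.cos(az)
    y = math.sin(tilt)                 # vertical component
    z = math.cos(tilt) * math.sin(az)
    return (x, y, z)

def angle_with_xz_plane(v):
    """Angle (deg) between a direction vector and the ground (XZ) plane."""
    x, y, z = v
    return math.degrees(math.asin(y / math.sqrt(x * x + y * y + z * z)))
```

For example, `imaging_direction(45, 20)` yields a unit vector whose angle with the XZ plane is exactly 20 degrees, while a tilt of 0 reproduces the ground-parallel imaging lines of the first and second units.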

That is, the first and second imaging lines 415 and 425 have the same height based on the Y-axis, and the third imaging line 435 is located relatively lower than the first and second imaging lines 415 and 425 based on the Y-axis. Also, the first and second imaging lines 415 and 425 are parallel to the XZ plane, but the third imaging line 435 is not parallel to the XZ plane.

In another embodiment of the present invention, for example, the first imaging line of the first image capturing unit forms a 1'-th angle with the XZ plane and, with the Y-axis coordinate a as a starting point, forms a 1st angle with the X axis, a 2nd angle with the Y axis, and a 3rd angle with the Z axis. The second imaging line of the second image capturing unit forms the 1'-th angle with the XZ plane and, with the Y-axis coordinate a as a starting point, forms a 4th angle with the X axis, a 5th angle with the Y axis, and a 6th angle with the Z axis. The third imaging line of the third image capturing unit forms a 2'-th angle with the XZ plane and, with the Y-axis coordinate b as a starting point, forms a 7th angle with the X axis, an 8th angle with the Y axis, and a 9th angle with the Z axis.

According to yet another embodiment of the present invention, the first imaging line of the first image capturing unit forms a 1'-th angle with the XZ plane and, with the Y-axis coordinate a as a starting point, forms a 1st angle with the X axis, a 2nd angle with the Y axis, and a 3rd angle with the Z axis. The second imaging line of the second image capturing unit forms a 2'-th angle with the XZ plane and, with the Y-axis coordinate a as a starting point, forms a 4th angle with the X axis, a 5th angle with the Y axis, and a 6th angle with the Z axis. The third imaging line of the third image capturing unit forms a 3'-th angle with the XZ plane and, with the Y-axis coordinate b as a starting point, forms a 7th angle with the X axis, an 8th angle with the Y axis, and a 9th angle with the Z axis.

That is, unlike a conventional image processing apparatus in which the imaging lines of the plurality of image capturing units are at the same Y-axis point and form the same angle with the ground, the imaging lines of the plurality of image capturing units of the omnidirectional image processing apparatus according to the embodiment of the present invention are located at different Y-axis points and form different angles with the ground (or the XZ plane).

According to an embodiment of the invention, the following method is disclosed: the omnidirectional image processing apparatus is a wearable apparatus, a user wearing it on the neck provides a user signal (for example, a hand gesture or a voice signal), and the omnidirectional image processing apparatus recognizes the user signal and controls omnidirectional image capturing accordingly.

The identification and processing of the user signal may be performed by the omnidirectional image processing apparatus or a separate external apparatus that receives information about the user signal from the omnidirectional image processing apparatus for processing. Hereinafter, the term "image processing apparatus" may be used to denote an apparatus for recognizing and processing a user signal. The image processing device may include an omnidirectional image processing device and/or an external device.

The image processing apparatus for processing a user signal in an image may include: a communication section for communicating with an external device; and a processor operatively connected to the communication section. The processor may perform the following identification and processing of user signals as will be described later in the present invention.

In addition, for convenience of explanation, the embodiment of the present invention discloses a method for recognizing and processing a user signal in the omnidirectional image processing apparatus, but the present invention may also be applied as a method for recognizing and processing a user signal in a normal image that is not an omnidirectional image, and such an embodiment is also included in the claims of the present invention.

Fig. 5 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

Fig. 5 discloses a method for identifying a user signal and operating according to the user signal by the omnidirectional image processing apparatus.

Referring to fig. 5, the omnidirectional image processing apparatus may recognize a user signal (step S500).

When the user signal is a video signal included in the photographed omnidirectional image, it is possible to confirm whether the user signal is included in the omnidirectional image. Alternatively, when the user signal is a voice signal, it may be determined whether the input voice signal includes the user signal. The user signal may be predefined or may be defined based on user settings or learning of the omnidirectional image processing apparatus.

The omni-directional image processing apparatus may confirm whether the user signal is included in the omni-directional image (or the input signal) to identify the user signal.

For example, when the user signal is a hand signal of the user, the omnidirectional image processing apparatus may determine whether a hand image of the user exists in the omnidirectional image. When a hand image of the user exists in the omnidirectional picture, the hand image can be recognized. In this case, it is possible to recognize only a hand image in which the shooting distance (the distance between the image capturing unit and the subject) is equal to or less than the threshold distance as the user signal in the omnidirectional image, recognize whether the hand is the user's hand based on the stored characteristics of the hand of the user, and recognize the hand as the user signal only in the case of the hand image of the user.
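The distance filter and wearer-hand check described above could be sketched as follows; the detection structure, field names, and thresholds are assumptions for illustration, not the patent's implementation.

```python
def feature_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def recognize_user_hand(detections, user_hand_features,
                        max_distance_m=1.0, similarity_threshold=0.8):
    """Keep only detected hand images that are within the threshold
    shooting distance (image capturing unit to subject) AND similar enough
    to the stored features of the wearer's own hand.

    `detections` is a list of dicts with 'distance' (metres) and
    'features' (a feature vector); both keys are illustrative names."""
    recognized = []
    for det in detections:
        if det["distance"] > max_distance_m:
            continue  # too far away to be the wearer's own hand
        if feature_similarity(det["features"], user_hand_features) >= similarity_threshold:
            recognized.append(det)
    return recognized
```

A hand detected two metres away, or one whose features do not match the stored user-hand characteristics, is thus never treated as a user signal.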

When the user signal is the voice signal of the user, the omnidirectional image processing device can judge whether the voice signal of the user exists in the input voice signal. When the voice signal of the user exists in the sound signal, the voice signal of the user can be recognized.

In order to control the omnidirectional image processing apparatus based on the user signal, a reference user signal matched with a control operation of the omnidirectional image processing apparatus may be defined. The similarity between the identified user signal and the reference user signal matched with the control operation of the omnidirectional image processing apparatus can be determined. The control operation of the omnidirectional image processing apparatus may be performed based on the user signal only when the similarity between the reference user signal and the identified user signal is equal to or greater than a critical percentage.

For example, the similarity between the picture of the reference user signal and the picture of the user signal may be determined. By comparing the characteristics of the picture of the reference user signal with the characteristics of the picture of the user signal, the user signal can be identified as corresponding to the reference user signal when the similarity between the two pictures is equal to or greater than the critical percentage.
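A minimal sketch of this critical-percentage gate against the defined reference user signals might look like the following; the feature-vector representation, the mapping structure, and the 80% threshold are illustrative assumptions.

```python
def feature_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def match_reference_signal(signal_features, reference_signals, critical_pct=80.0):
    """Return the name of the best-matching reference user signal, or None
    when no reference reaches the critical percentage. `reference_signals`
    maps a control-operation name to a feature vector (names illustrative)."""
    best_name, best_score = None, 0.0
    for name, ref_features in reference_signals.items():
        score = 100.0 * feature_similarity(signal_features, ref_features)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= critical_pct else None
```

Only when the gate returns a reference name would the matched control operation be performed; otherwise the candidate signal is ignored.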

Alternatively, according to an embodiment of the present invention, a sensor capable of sensing an object located within a threshold distance of the omnidirectional image processing apparatus can be implemented in the omnidirectional image processing apparatus. When the sensor senses an object within the threshold distance, the omnidirectional image processing apparatus can be switched from a power-saving mode to an active mode to receive the user signal. For example, when a user wearing the omnidirectional image processing apparatus extends his or her hand, the omnidirectional image processing apparatus can recognize the hand and switch from the power-saving mode to the active mode, and the image capturing section, driven in the active mode, can capture an image of the user's hand.
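The mode switch described above reduces to a small state machine; this sketch (class and attribute names are assumptions, as is the 0.5 m threshold) only wakes the device, leaving any return to the power-saving mode out of scope, as the text does.

```python
class OmniCameraPower:
    """Sketch of switching from power-saving mode to active mode when the
    proximity sensor reports an object within the threshold distance."""

    def __init__(self, threshold_m=0.5):
        self.threshold_m = threshold_m
        self.mode = "power_saving"

    def on_sensor_reading(self, distance_m):
        # wake the image capturing units when an object comes close enough
        if self.mode == "power_saving" and distance_m <= self.threshold_m:
            self.mode = "active"
        return self.mode
```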

The omnidirectional image processing apparatus may perform a control operation based on the identified user signal (step S510).

The omnidirectional image processing device can determine which of the plurality of defined user signals the current hand shape corresponds to, and can operate according to the determined user signal. For example, the user signal may be a square formed by the user's hands, and such a user signal may be matched to an operation in which the omnidirectional image processing apparatus tracks and photographs the object at the center of the square.

Specifically, when it is recognized that the user signal corresponds to the reference user signal, the operation of the omnidirectional image processing apparatus corresponding to the reference user signal image may be performed.

In other words, the image processing apparatus can identify the user signal from the plurality of user signals defined, and the image processing apparatus controls the omnidirectional image processing apparatus according to the control signal corresponding to the user signal. Each of the plurality of user signals may correspond to each of a plurality of different hand images.
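The correspondence between the plurality of defined user signals and their control signals can be sketched as a lookup table; the gesture names and control-signal names below are illustrative assumptions, not definitions from the patent.

```python
# Illustrative mapping from a matched hand image (one of the plurality of
# defined user signals) to the control signal for the omnidirectional
# image processing apparatus.
CONTROL_SIGNALS = {
    "square_both_hands": "track_target_object",
    "open_palm": "stop_omnidirectional_capture",
    "thumbs_up": "resume_omnidirectional_capture",
    "pinch_vertical": "adjust_vertical_field_angle",
}

def control_signal_for(matched_hand_image):
    """Look up the control signal for the hand image matched against the
    defined user signals; None when the image matches none of them."""
    return CONTROL_SIGNALS.get(matched_hand_image)
```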

The user signal may correspond to one hand image matching the hand image to be recognized among the plurality of different hand images, and the hand image to be recognized may be included in the omnidirectional image photographed by the omnidirectional image processing apparatus.

The hand image to be recognized may be a hand image for a specific object in the omnidirectional picture, and the control signal corresponding to the hand image to be recognized may indicate tracking shooting of the object or shooting of a still picture of the object.

Alternatively, the hand image to be recognized may be a hand image defined as stopping or resuming the photographing of the omnidirectional image, and the control signal corresponding to the hand image to be recognized instructs the above-mentioned stopping or resuming the photographing of the omnidirectional image.

Alternatively, the hand image to be recognized may be a hand image defined to change a photographing field angle of the omnidirectional picture, and the control signal corresponding to the hand image to be recognized may control the photographing field angle.

Alternatively, the hand image to be recognized may be an image for cropping (crop) a prescribed area in the omnidirectional picture, and the control signal corresponding to the hand image to be recognized may indicate cropping of the prescribed area. In this case, the hand image to be recognized may indicate a cut position (horizontal direction/vertical direction) in the omnidirectional image by the fingers, and the prescribed area in the omnidirectional image is cropped based on the indicated cut position. For example, the thumb and index finger of the right hand may indicate, on a first side, the horizontal and vertical boundaries where image cropping is desired, and the thumb and index finger of the left hand may indicate, on a second side, the horizontal and vertical boundaries where image cropping is desired. The kind of finger can be freely set.

An image within a quadrangle formed by 4 axes (or extension lines of the axes) may be cropped from the omnidirectional image, with the right index finger indicating a first horizontal axis, the left index finger indicating a second horizontal axis, the left thumb indicating a first vertical axis, and the right thumb indicating a second vertical axis. The fingers of the right hand and the fingers of the left hand need not touch, and the size of the cropped area can be adjusted by expanding or contracting the horizontal and vertical regions.

That is, a specific picture among the omnidirectional pictures can be cropped by generating a control signal corresponding to the hand image to be recognized, obtained by capturing the user's hand. The hand image (or the directions of the fingers) is analyzed, and the cropped image area can be determined from the directions of the fingers.
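The four finger-indicated axes define a crop rectangle; a minimal sketch follows. The pixel-coordinate convention (y for horizontal axes, x for vertical axes) and the clamping to the frame are illustrative assumptions.

```python
def crop_region(h1, h2, v1, v2, frame_w, frame_h):
    """Crop rectangle from the four finger-indicated axes: two horizontal
    lines (y = h1 from the right index finger, y = h2 from the left index
    finger) and two vertical lines (x = v1 from the left thumb, x = v2
    from the right thumb), following the finger assignment in the text.
    Returns (left, top, right, bottom) in pixels."""
    left, right = sorted((v1, v2))
    top, bottom = sorted((h1, h2))
    # clamp to the frame so an extended axis cannot leave the image
    left, right = max(0, left), min(frame_w, right)
    top, bottom = max(0, top), min(frame_h, bottom)
    return (left, top, right, bottom)
```

Because the axes are sorted, widening or narrowing the gap between the two hands directly expands or contracts the cropped area, matching the adjustment described above.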

The method for cropping such an image is an example, and various hand shapes of a user may be recognized according to the user's setting, and the image may be cropped. The following specifically discloses a user signal and an operation of the omnidirectional image processing apparatus based on the user signal.

Fig. 6 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

A method of capturing a picture based on the recognition of a user signal is disclosed in fig. 6.

Referring to fig. 6, the omnidirectional image processing apparatus may recognize a user signal (e.g., a specific gesture of a user) of a user wearing the omnidirectional image processing apparatus and photograph an omnidirectional image.

For example, the user can designate an area to be photographed by hand. Hereinafter, the disclosed gesture of the user is one example, and various gestures may be used in addition to the disclosed gesture of the user.

The user may spatially form a square with the thumbs and index fingers of both hands to indicate a particular object to be photographed (hereinafter, target object 600) with photographing settings other than the default photographing settings. The default photographing setting may be the basic setting with which the omnidirectional image photographing apparatus photographs the omnidirectional image.

For example, the omnidirectional image processing apparatus recognizes the square formed by both hands from the photographed image, and focuses on the target object 600 located in the center area of the recognized square to photograph the object. Also, the omnidirectional image processing apparatus may recognize the square formed by both hands from the photographed image and photograph the target object 600 positioned in the central area of the recognized square in more detail based on a zoom function. The degree of zoom may be determined according to whether the square moves toward or away from the camera. Alternatively, the omnidirectional image processing apparatus may provide still image information by taking a still image of the area framed by the square formed by both hands.

That is, the omnidirectional image processing apparatus may provide an omnidirectional image photographed by individually setting the target object 600 or individually generate image information including the target object 600 to provide to the user.

The omnidirectional image processing apparatus can perform photographing by tracking a target object. The indicated object or omnidirectional image processing apparatus may be moved. The omni-directional image processing apparatus may track the target object 600 and photograph the target object 600 using a separate photographing setup.

Also, according to an embodiment of the present invention, the user may also indicate a plurality of target objects 600 through a user signal. When a plurality of target objects 600 are indicated based on the user signal, the plurality of target objects 600 may be photographed based on individual photographing settings, respectively.

When the photographing of the target object 600 is to be stopped, the user may generate a user signal (e.g., a target object photographing stop signal) for instructing to stop photographing the target object 600. The omnidirectional image processing apparatus stops photographing the target object 600 by recognizing the user signal and converts the photographing setting back to the default photographing setting to continue photographing.

Also, according to an embodiment of the present invention, it is possible to instruct to stop photographing an omnidirectional picture and to resume photographing an omnidirectional picture based on a user signal.

Specifically, a user signal (photographing stop signal) instructing to stop photographing the omnidirectional image may be generated, and when the photographing stop signal is generated by the omnidirectional image processing apparatus, the photographing operation for the omnidirectional image may be stopped.

Alternatively, a user signal (recovery shot signal) instructing recovery of shooting of the omnidirectional image may be generated, and when the recovery shot signal is generated by the omnidirectional image processing apparatus, the shooting operation for the omnidirectional image may be resumed.

The user can prevent unnecessary photographing by stopping the photographing signal and resuming the photographing signal.

Fig. 7 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

A method of capturing a picture based on the recognition of a user signal is disclosed in fig. 7.

Referring to fig. 7, the omnidirectional image processing apparatus may recognize a user signal (e.g., a specific gesture of a user) of a user wearing the omnidirectional image processing apparatus and photograph an omnidirectional image.

The user can adjust the angle of view of the omnidirectional image by hand. For example, the user can perform necessary photographing within the vertical direction angle of view 700 based on adjustment of the angle of view in the vertical direction.

Specifically, the user can adjust the vertical field angle 700 of the omnidirectional image by a movement of spreading both arms at a predetermined angle in the vertical direction, a movement of spreading the thumb and index finger of the hand at a predetermined angle in the vertical direction, a movement of drawing a circle in a predetermined direction with the finger, or the like. For example, the relatively larger the angle between the thumb and the index finger of the hand, the larger the photographing angle of view may be, and the relatively smaller the angle between the thumb and the index finger of the hand, the smaller the photographing angle of view may be.

For example, when only a region corresponding to 90 degrees in the vertical direction needs to be photographed, an action of opening the thumb and index finger of the hand at a prescribed angle in the vertical direction, or of drawing only 1/4 of a circle, may be performed. The omnidirectional image processing apparatus recognizes such a user motion as a user signal for adjusting the vertical viewing angle 700, and can adjust the vertical viewing angle 700 of the image captured by the omnidirectional image capturing apparatus.
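The gesture-to-field-angle mapping above could be sketched as follows; the linear scaling of a 0-90 degree finger spread and the treatment of a circle fraction are illustrative assumptions, not the patent's formula.

```python
def vertical_field_angle(finger_angle_deg=None, circle_fraction=None,
                         max_angle_deg=360.0):
    """Map a user gesture to a vertical photographing field angle: either
    the fraction of a circle drawn with a finger (1/4 circle -> 90 degrees)
    or the angle between thumb and index finger, scaled linearly so a
    larger spread gives a larger field angle."""
    if circle_fraction is not None:
        return circle_fraction * max_angle_deg
    # scale a 0-90 degree thumb/index spread linearly onto 0-max_angle_deg
    return (finger_angle_deg / 90.0) * max_angle_deg
```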

Also, according to the embodiment of the present invention, the user can perform only photographing within the necessary horizontal direction angle of view 750 based on the adjustment of the horizontal direction angle of view 750. When the first image capturing unit captures an image based on the first viewing angle, the second image capturing unit captures an image based on the second viewing angle, and the third image capturing unit captures an image based on the third viewing angle, the user can adjust the horizontal viewing angle 750 of the omnidirectional image by a movement of spreading both arms at a predetermined angle in the horizontal direction or a movement of spreading the thumb and the index finger of the hand at a predetermined angle in the horizontal direction. Alternatively, the horizontal field angle 750 of the omnidirectional image may be adjusted by a motion of drawing a part of a circle with a finger.

For example, when only an area corresponding to 180 degrees in the horizontal direction is to be photographed, an action of opening the thumb and the index finger of the hand at a prescribed angle in the horizontal direction or drawing a semicircle may be performed. The omnidirectional image processing apparatus recognizes such a user motion as a user signal for adjusting the horizontal viewing angle 750, and can photograph the horizontal direction by adjusting the horizontal viewing angle 750 of the image photographed by the omnidirectional photographing apparatus. In this case, only a part of the plurality of image capturing units included in the omnidirectional image processing apparatus may be operated. For example, only the first image capturing unit and the second image capturing unit may be operated among the first image capturing unit, the second image capturing unit, and the third image capturing unit to capture an image.
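Operating only a subset of the image capturing units for a reduced horizontal field angle can be sketched as below; the assumption that three units split 360 degrees evenly (120 degrees each) is for illustration only.

```python
import math

def active_capture_units(horizontal_field_angle, units=3, unit_angle=120.0):
    """Choose which image capturing units need to run to cover the
    requested horizontal field angle, assuming each of the `units` units
    covers roughly `unit_angle` degrees. Returns 1-based unit indices."""
    needed = math.ceil(horizontal_field_angle / unit_angle)
    return list(range(1, min(units, max(needed, 1)) + 1))
```

For a 180-degree request this yields two of the three units, matching the example above in which only the first and second image capturing units operate.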

Also, according to an embodiment of the present invention, the vertical direction angle of view 700 may be set differently according to the horizontal direction angle of view 750. For example, the vertical field angle may differ for each image capturing unit: the vertical field angle of the image captured by the first image capturing unit may be a degrees, that of the second image capturing unit may be b degrees, and that of the third image capturing unit may be c degrees.

Furthermore, according to the embodiment of the present invention, the user can adjust the quality of the omnidirectional image by hand. For example, the user may adjust the quality of the omnidirectional image to generate a high-quality omnidirectional image. For example, the user may generate a high-quality omnidirectional image generation signal for generating a high-quality omnidirectional image based on the user signal. Conversely, the user can generate a low-quality omnidirectional image generation signal for generating a low-quality omnidirectional image as the user signal.

The omnidirectional image processing apparatus can adjust the image quality of the omnidirectional image by recognizing the high-quality omnidirectional image generation signal and the low-quality omnidirectional image generation signal.

The user signals disclosed in fig. 6 and 7 are an example, and various other user signals may be used to control the omnidirectional image processing apparatus.

In fig. 6 and 7, the user signal is assumed to be a hand signal of the user, but a user signal of another format such as a voice signal may be used for controlling the omnidirectional image processing apparatus.

Fig. 8 is a conceptual diagram illustrating an image capturing method according to an embodiment of the present invention.

A method of capturing a picture based on the recognition of a user signal is disclosed in fig. 8. In particular, fig. 8 assumes a case where the user signal is a speech signal.

Referring to fig. 8, a voice signal may be recognized (step S800).

The user can control the omnidirectional image processing apparatus by generating a voice signal. The user can transfer information about the object to be photographed to the omnidirectional image processing apparatus as voice information. For example, when a cyclist is to be photographed as a target object, the user may generate a voice signal of "photograph the bicycle", and the user-generated speech may be delivered to the omnidirectional image processing apparatus.

An object corresponding to the voice signal may be searched for from the omni-directional image (step S810).

When the user generates a voice signal of "photographing a bicycle", the omnidirectional image processing apparatus may search for a bicycle from the photographed omnidirectional image based on information related to the voice signal. The bicycle template image information can be obtained, and an object with high similarity to the bicycle template image information can be searched from the omnidirectional image as a bicycle. When the bicycle is searched from the omnidirectional image, the searched bicycle can be set as a target object, and the target object is photographed.
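The template-based search above could be sketched as follows; the data structures (object ids, class labels, per-object template-similarity scores) and the threshold are illustrative assumptions. An empty result would trigger the request for additional information described next, and several results mean the target is ambiguous.

```python
def find_target_object(voice_keyword, detected_objects, template_scores,
                       critical_score=0.8):
    """Search the omnidirectional image for the object named in the voice
    signal. `detected_objects` maps object ids to class labels;
    `template_scores` maps object ids to their similarity with the
    template image for `voice_keyword`. Returns the matching object ids."""
    return [oid for oid, label in detected_objects.items()
            if label == voice_keyword
            and template_scores.get(oid, 0.0) >= critical_score]
```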

When a plurality of bicycles are found in the omnidirectional image, or it is difficult to find a bicycle, the omnidirectional image processing apparatus may request additional information related to the target object. The omnidirectional image processing apparatus may notify the user that the target object is not specified based on a sound signal, a text signal, a vibration signal, or the like, and request additional information for specifying the target object.

For example, in order to specify a specific bicycle as a target object, the user may provide additional voice information such as the direction of the bicycle, the color of the bicycle, and the like as voice information. If the target object is not specified based on the user signal, the target object in the omnidirectional image may be specified based on such additional voice information.

As described above, the user can also specify a plurality of target objects based on the voice information, thereby photographing the plurality of target objects based on the default photographing setting and other photographing setting values.

The omnidirectional image processing apparatus may separate an object in the omnidirectional image from the background, may obtain information on the name of the object by performing image-based learning on the object, and may specify the target object based on the voice information of the user. The image-based learning on the object may be performed not by the omnidirectional video processing apparatus itself but by a video processing server connected to the omnidirectional video processing apparatus.

In the same manner, adjustment of the horizontal angle of view, adjustment of the vertical angle of view, stop/resume shooting of an omnidirectional image, and the like can also be performed based on a voice signal. A voice signal matching such a control operation may be defined, and the omnidirectional image processing apparatus may be controlled by comparing the defined voice signal with the inputted voice signal.

Fig. 9 is a conceptual diagram illustrating a method for confirming an omnidirectional image according to an embodiment of the present invention.

Fig. 9 discloses a method for allowing a user to confirm a change of a photographed omnidirectional image and an omnidirectional image based on a user signal in real time.

Referring to fig. 9, omnidirectional image information photographed by the omnidirectional image processing apparatus may be transmitted to the user apparatus. The user device may be a user's smart phone or the like.

Referring to the upper end of fig. 9, the user device receives the omnidirectional image information and outputs the currently photographed omnidirectional image 900 through a display of the user device. The user device outputs image information for a specific horizontal viewing angle on the screen, and provides the omnidirectional image 900 rotated through 360 degrees according to an input signal from the user. The input signal of the user may be a touch signal for swiping the screen sideways. Alternatively, the omnidirectional image 900 output to the user apparatus may be converted and provided according to the rotation of the user apparatus.

Referring to the lower end of fig. 9, the display screen of the user device is divided into a first screen 910 for the first image capturing unit, a second screen 920 for the second image capturing unit, and a third screen 930 for the third image capturing unit, and the first screen 910, the second screen 920, and the third screen 930 are each rotated according to input signals from the user to provide their respective image information. For example, when a sideways-swipe touch signal is input on the divided first screen 910, the output first screen 910 may be converted according to the input touch signal.

As described above, the user can control the operation of the omnidirectional image processing apparatus based on the user signal, and the photographed omnidirectional image can be changed according to the controlled operation of the omnidirectional image processing apparatus. The user can confirm whether to take a picture based on the user signal through the user device.

For example, when the user generates a user signal by performing an action for instructing a target object, the omnidirectional image processing apparatus can photograph the target object with a setting different from a default photographing setting.

The user device may indicate the target object by outputting a recognition image for recognizing the target object indicated by the user, and the user may confirm the indicated target object through a display of the user device. Also, a photographing screen that changes according to an indication of a target object based on a user signal may be provided through a display of the user device. By this method, the user can directly confirm whether to photograph the target object indicated by the user based on the user signal.

For another example, when the user generates a user signal for changing a horizontal angle of view or a vertical angle of view, the omnidirectional picture processing apparatus may photograph the omnidirectional picture by changing the horizontal angle of view or the vertical angle of view. The user device may transmit information about a changed horizontal or vertical angle of view to the user through its display, and transmit the photographed omnidirectional image to the user device according to the changed horizontal or vertical angle of view, thereby allowing the user to confirm the photographed omnidirectional image according to the changed horizontal or vertical angle of view.

In this way, the user can confirm whether to recognize the user signal through the display of the user device, and can receive the omnidirectional image information changed according to the user signal.

Fig. 10 is a conceptual diagram illustrating control of an omnidirectional video processing apparatus by a user apparatus according to an embodiment of the present invention.

Fig. 10 discloses a method for conversely transferring control information generated by the user apparatus to the omnidirectional video processing apparatus.

Referring to fig. 10, an application for controlling the omnidirectional image processing apparatus may be provided at the user apparatus, and the user may control the omnidirectional image processing apparatus through the user apparatus.

As described above, the user device receives the omnidirectional image information and may output the currently photographed omnidirectional image information through the display of the user device. The user device outputs image information regarding a viewing angle in a specific horizontal direction on a screen, and can provide the image information by rotating 360 degrees in accordance with an input signal input by the user. Alternatively, the display screen of the user device may be divided into a first screen for the first image capturing unit, a second screen for the second image capturing unit, and a third screen for the third image capturing unit, and the first screen, the second screen, and the third screen may be rotated according to an input signal input by the user to provide image information, respectively.

The user may also input user signals through the user device.

For example, the user may indicate the target object 1000 based on a touch through a screen of the user device, and may transfer information about the target object 1000 indicated by the user device to the omnidirectional image processing device. The omnidirectional image processing apparatus can track and photograph the target object 1000 indicated by the user apparatus. When the user indicates the target object 1000 based on the touch, an additional picture (e.g., an arrow for indicating the target object 1000) for indicating the target object 1000 is generated, and it may be confirmed whether the target object 1000 is accurately indicated by the additional picture.

For example, the user can adjust the horizontal viewing angle 1040 and the vertical viewing angle 1020 on the screen of the user apparatus. The screen of the user device is divided, and the image information photographed by each image photographing part can be provided to the user. The user can adjust the horizontal-direction angle of view 1040 by the action of placing 2 fingers in the horizontal direction on each screen and opening or closing the 2 fingers, and can adjust the vertical-direction angle of view 1020 by the action of placing 2 fingers in the vertical direction on each screen and opening or closing the same.

The screen of the user apparatus can be provided with information on the horizontal direction angle of view 1040 and information on the vertical direction angle of view 1020 by an operation of placing 2 fingers and opening or closing them. For example, a numerical value relating to the horizontal viewing angle 1040 and a numerical value relating to the vertical viewing angle 1020 may be output on the screen of the user apparatus.
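The two-finger open/close gesture above amounts to mapping the distance between the touch points to a field angle; in this sketch the pixels-per-degree scale and the clamping range are illustrative assumptions.

```python
def field_angle_from_pinch(touch_a, touch_b, pixels_per_degree=4.0,
                           min_angle=30.0, max_angle=360.0):
    """Translate a two-finger gesture on the user device's screen into a
    field angle: the farther apart the two touch points (in pixels), the
    wider the angle, clamped to a sensible range."""
    ax, ay = touch_a
    bx, by = touch_b
    spread = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
    angle = spread / pixels_per_degree
    return max(min_angle, min(max_angle, angle))
```

Placing the two fingers vertically or horizontally on a divided screen would feed this value to the vertical viewing angle 1020 or the horizontal viewing angle 1040, respectively, and the resulting numerical value could be the one output on the screen.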

The user apparatus may transmit the signal input through the user apparatus to the omnidirectional image processing apparatus, and the omnidirectional image processing apparatus may control the horizontal viewing angle 1040 and the vertical viewing angle 1020 based on the signal received from the user apparatus.

This approach is exemplary and the operation of the omnidirectional image processing apparatus may be controlled in other various ways based on the user signal generated by the user apparatus.

The embodiments according to the present invention described above can be embodied and recorded in a computer-readable recording medium in the form of program instructions executable by various computer-constituting elements. The above-described computer-readable recording medium can include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded in the computer-readable recording medium may be those specially designed and constructed for the present invention, or may be those known to those skilled in the art of computer software. Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and specially constructed hardware devices such as ROMs, RAMs, and flash memories, which store and execute program instructions. Examples of the program instructions include not only machine language codes such as those generated by a compiler but also high-level language codes that can be executed by a computer using an interpreter and the like. A hardware device can be configured as one or more software modules to perform processing according to the present invention, and vice versa.

Although the present invention has been described above with reference to specific matters such as specific constituent elements and limited embodiments and drawings, the present invention is provided only to facilitate a more complete understanding of the present invention, and the present invention is not limited to the above embodiments and various modifications and changes can be made by those skilled in the art in light of the description.

Therefore, the idea of the present invention is not limited to the above-described embodiments, and not only the appended claims but also all the scope equivalent to or modified by the claims are included in the scope of the idea of the present invention.
