Device comprising a plurality of markers

Document No.: 260765 | Publication date: 2021-11-16

Reading note: This technology, "Device comprising a plurality of markers," was designed and created by Kenzo Nishikawa and Takanori Minamino on 2020-03-26. Abstract: According to the present invention, the input device 16 is equipped with a housing, a plurality of markers 30 that emit light to the outside of the housing, and an indicator 32 that indicates the status of the input device 16. The control unit 50 turns on the plurality of markers 30 for a predetermined period of time, and turns on the indicator 32 during a period in which the plurality of markers 30 are not lit.

1. An apparatus, comprising:

a housing; and

a plurality of markers configured to emit light to an outside of the housing,

wherein the apparatus includes a control section configured to light up the plurality of markers at predetermined intervals, and

the control section lights an indicator indicating a state of the apparatus during a period in which the plurality of markers are not lit.

2. The apparatus according to claim 1,

wherein the control section does not light the indicator during a period in which the plurality of markers are lit.

3. The apparatus of claim 1 or 2,

wherein the control section does not light the indicator during an exposure period in which an image sensing apparatus images the apparatus.

4. The apparatus of any one of claims 1 to 3,

wherein the apparatus is an input device including an operation member operated by a user.

5. The apparatus of any one of claims 1 to 4,

wherein the apparatus is imaged by an image sensing apparatus connected to a head mounted display.

Technical Field

The invention relates to a device comprising a plurality of markers.

Background

PTL 1 discloses a game device that acquires a frame image obtained by imaging a space in front of the game device, estimates position information and posture information about a game controller in a real space from a position of a Light Emitting Diode (LED) image of the game controller in the frame image, and reflects the estimated position information and/or posture information in processing of a game application.

[Citation List]

[Patent Document]

[PTL 1]

Japanese patent laid-open No. 2007-296248

Disclosure of Invention

[Technical Problem]

A head mounted display (HMD) is worn on a user's head and provides the user with a virtual reality (VR) video space. A user wearing the HMD operates the operation buttons of an input device to generate various inputs to the video space.

In recent years, techniques that track the position and posture of a device and reflect the obtained information in a three-dimensional (3D) model in a virtual reality space have become widespread. By synchronizing the movement of a player character or game object in the game space with changes in the position and posture of the tracked device, intuitive operation by the user is achieved. To track the device, a plurality of lit markers on the device are imaged, and the captured image is analyzed to identify the positions of the marker images within it; from these positions, the position and posture of the device in real space are estimated.

In order to estimate the position and posture of the device with high accuracy, the positions of the marker images within the image captured by the imaging apparatus must be identified accurately. When light from sources other than the markers is included in the captured image, it may adversely affect the accuracy of the position and posture estimation, so it is preferable to keep such light out of the captured image.

Therefore, an object of the present invention is to prevent light-emitting portions other than the markers of a device including a plurality of markers from being included in a captured image. Note that although the device may be an input device including operation buttons, it may also simply be a tracked target device with no operation members.

[Solution to Problem]

In order to solve the above-described problem, an apparatus according to an aspect of the present invention includes a housing and a plurality of markers configured to emit light to the outside of the housing. The apparatus includes a control section configured to light the plurality of markers at predetermined intervals, and the control section lights an indicator indicating a state of the apparatus during a period in which the plurality of markers are not lit.

Drawings

Fig. 1 is a diagram showing a configuration example of an information processing system in the embodiment.

Fig. 2 is a diagram illustrating an example of an external shape of the HMD.

Fig. 3 is a diagram showing functional blocks of the HMD.

Fig. 4 is a diagram showing an external shape of the input device.

Fig. 5 is a diagram showing an example of a light emission pattern for synchronization processing, used to identify the imaging time of the image sensing apparatus.

Fig. 6 is a diagram showing functional blocks of the input device.

Fig. 7 is a diagram showing an example of a part of an image obtained by imaging an input device.

Fig. 8 is a diagram showing the period, set within the exposure period of the image sensing apparatus, in which the markers are lit.

Fig. 9 is a diagram showing a relationship between a period in which the marker is lit and a period in which the indicator is lit.

Detailed Description

Fig. 1 depicts an example of the configuration of an information processing system 1 in the embodiment. The information processing system 1 includes an information processing apparatus 10, a recording apparatus 11, an HMD 100, an input apparatus 16 operated by the fingers of a user, and an output apparatus 15 that outputs images and sounds. The output apparatus 15 may be a television set. The information processing apparatus 10 is connected to an external network 2, such as the Internet, through an access point (AP) 17. The AP 17 has the functions of a wireless access point and a router, and the information processing apparatus 10 may be connected to the AP 17 with a cable or with a known wireless communication protocol.

The recording device 11 records system software and applications such as game software. The information processing apparatus 10 can download game software from a content server to the recording apparatus 11 via the network 2. The information processing apparatus 10 executes game software and supplies image data and sound data of the game to the HMD 100. The information processing apparatus 10 and the HMD 100 may be connected to each other using a known wireless communication protocol, or may be connected to each other using a cable.

The HMD 100 is a display device worn on the user's head that displays images on display panels located in front of the eyes. The HMD 100 displays an image for the left eye on the left-eye display panel and an image for the right eye on the right-eye display panel. These images form a parallax image pair seen from the left and right viewpoints, realizing a stereoscopic view. Because the user views the display panels through optical lenses, the information processing apparatus 10 supplies the HMD 100 with parallax image data in which the optical distortion caused by the lenses has been corrected.

Although a user wearing the HMD 100 does not need the output device 15, preparing the output device 15 allows another user to view its display image. The information processing apparatus 10 may display on the output device 15 the same image seen by the user wearing the HMD 100, or a different image. For example, when the user wearing the HMD 100 plays a game with another user, the output device 15 may display a game image from the viewpoint of the other user's character.

The information processing apparatus 10 and the input device 16 may be connected to each other using a known wireless communication protocol, or may be connected with a cable. The input device 16 includes a plurality of operation members such as operation buttons, and the user operates the operation members with the fingers while gripping the input device 16. When the information processing apparatus 10 executes a game, the input device 16 is used as a game controller. The input device 16 is equipped with a posture sensor including a three-axis acceleration sensor and a three-axis gyro sensor, and transmits sensor data to the information processing apparatus 10 at a predetermined cycle (for example, 1,600 Hz).

The game of the present embodiment handles not only operation information from the operation members of the input device 16 but also information about the position, posture, motion, and the like of the input device 16, and reflects this information in the motion of a player character within a virtual three-dimensional space. For example, the operation information of the operation members may be used as information for moving the player character, and the information about the position, posture, and motion of the input device 16 may be used as information for moving the player character's arm. In a battle scene within the game, the movement of the input device 16 is reflected in the movement of the player character holding a weapon, enabling intuitive operation by the user and enhancing the user's sense of immersion in the game.

In order to track the position and posture of the input device 16, a plurality of markers (light emitting portions) that can be imaged by the image sensing device 14 mounted on the HMD 100 are provided on the input device 16. The information processing apparatus 10 analyzes an image obtained by imaging the input apparatus 16 to estimate position information and orientation information of the input apparatus 16 in a real space, and provides the estimated position information and orientation information to the game.

A plurality of image sensing apparatuses 14 are mounted on the HMD 100. The image sensing apparatuses 14 are attached at different positions on the front surface of the HMD 100 in different postures so that the overall imaging range, obtained by combining the imaging ranges of the individual image sensing apparatuses 14, includes the entire field of view of the user. Each image sensing apparatus 14 is preferably an image sensor that can acquire images of the plurality of markers of the input device 16. For example, when the markers emit visible light, the image sensing apparatus 14 includes a visible-light sensor used in general digital cameras, such as a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. When the markers emit invisible light, the image sensing apparatus 14 includes an invisible-light sensor. The plurality of image sensing apparatuses 14 image the space in front of the user at synchronized times and at predetermined intervals (for example, 60 frames/second), and transmit the image data obtained by imaging the input device 16 to the information processing apparatus 10.

The information processing apparatus 10 identifies the positions of the plurality of marker images of the input device 16 included in a captured image. Note that one input device 16 may be imaged by a plurality of image sensing apparatuses 14 at the same time; since the mounting positions and postures of the image sensing apparatuses 14 are known, the information processing apparatus 10 combines the captured images to identify the positions of the marker images.

The three-dimensional shape of the input device 16 and the position coordinates of the plurality of markers arranged on its surface are known; therefore, the information processing apparatus 10 estimates the position coordinates and posture of the input device 16 based on the distribution of the marker images in the captured image. The position coordinates of the input device 16 may be coordinates in a three-dimensional space with a reference position as the origin, and the reference position may be the position coordinates set before the game starts.

Note that the information processing apparatus 10 may also estimate the position coordinates and posture of the input device 16 using sensor data detected by the posture sensor in the input device 16. Therefore, the information processing apparatus 10 of the present embodiment uses both the estimation result based on the image captured by the image sensing apparatus 14 and the estimation result based on the sensor data in order to track the input device 16 with high accuracy.

Fig. 2 depicts an example of the external shape of the HMD 100. The HMD 100 includes an output mechanism portion 102 and a mounting mechanism portion 104. The mounting mechanism portion 104 includes a mounting strap 106 to be worn by a user around the head to secure the HMD 100 to the head. The mounting strap 106 has a material or structure that can be adjusted in length to the user's head circumference.

The output mechanism section 102 includes a housing 108 having a shape covering the left and right eyes in a state where the HMD 100 is worn by the user, and further includes therein a display panel directly opposed to the eyes when the HMD 100 is worn. The display panel may be a liquid crystal panel, an organic Electroluminescent (EL) panel, or the like. Within the housing 108, there is also included a pair of left and right optical lenses positioned between the display panel and the user's eyes for magnifying the user's viewing angle. The HMD 100 may also include speakers and headphones at locations corresponding to the user's ears, or may be configured to connect with external headphones.

A plurality of image sensing devices 14a, 14b, 14c, and 14d are disposed on the outer surface of the front side of the housing 108. With respect to the direction of the user's sight line, the image sensing device 14a is connected to the upper right corner of the front side outer surface so that the camera optical axis points diagonally upward to the right, the image sensing device 14b is connected to the upper left corner of the front side outer surface so that the camera optical axis points diagonally upward to the left, the image sensing device 14c is connected to the lower right corner of the front side outer surface so that the camera optical axis points diagonally downward to the right, and the image sensing device 14d is connected to the lower left corner of the front side outer surface so that the camera optical axis points diagonally downward to the left. The plurality of image sensing apparatuses 14 are installed in this manner, and therefore, the overall imaging range obtained by adding the imaging ranges of the image sensing apparatuses 14 includes the entire field of view of the user. The user's field of view may be the user's field of view in a virtual three-dimensional space.

The HMD 100 transmits sensor data detected by the posture sensor and image data obtained by imaging by the image sensing device 14 to the information processing device 10, and receives game image data and game sound data generated in the information processing device 10.

Fig. 3 depicts the functional blocks of the HMD 100. The control section 120 is a main processor that processes and outputs various types of data, such as image data, sound data, and sensor data, as well as commands. The storage section 122 temporarily stores data, commands, and the like processed by the control section 120. The posture sensor 124 acquires sensor data regarding the movement of the HMD 100, and includes at least a three-axis acceleration sensor and a three-axis gyro sensor.

The communication control section 128 transmits data output from the control section 120 to the external information processing apparatus 10 through a network adapter or an antenna by wired or wireless communication. Further, the communication control section 128 receives data from the information processing apparatus 10 and outputs it to the control section 120.

When the control section 120 receives game image data or game sound data from the information processing apparatus 10, the control section 120 supplies the game image data to the display panel 130 and causes the display panel 130 to display the image, or supplies the game sound data to the sound output section 132 and causes the sound output section 132 to output the sound. The display panel 130 includes a display panel 130a for the left eye and a display panel 130b for the right eye, and a parallax image pair is displayed on the display panel. Further, the control section 120 causes the sensor data from the posture sensor 124, the sound data from the microphone 126, and the captured image data from the image sensing device 14 to be transmitted from the communication control section 128 to the information processing device 10.

Fig. 4 depicts the external shape of the input device 16. Fig. 4(a) depicts the front shape of the input device 16, and Fig. 4(b) depicts the back shape of the input device 16. The input device 16 includes a housing 20, a plurality of operation members 22a, 22b, 22c, and 22d (hereinafter referred to as "operation members 22" when they are not particularly distinguished) operated by the user, and a plurality of markers 30a to 30t (hereinafter referred to as "markers 30" when they are not particularly distinguished) that emit light to the outside of the housing 20. The operation members 22 are disposed at the head of the housing 20 and include a stick operated by tilting, buttons, a trigger button for inputting a pull amount, and the like. An indicator 32 indicating the status of the input device 16 is disposed at the head of the housing 20. The indicator 32 may include an LED device that displays the battery charge status of the input device 16.

The housing 20 includes a grip portion 21 and a curved portion 23 connecting the housing head and the housing bottom, and the user passes the fingers from the index finger to the little finger between the grip portion 21 and the curved portion 23 to grasp the grip portion 21. While gripping the grip portion 21, the user operates the operation members 22a, 22b, and 22c with the thumb and the operation member 22d with the index finger. Although the markers 30h, 30i, and 30j are provided on the grip portion 21, they are arranged at positions that are not hidden by the hand even while the user grips the grip portion 21. Providing at least one or more markers 30 on the grip portion 21 improves tracking accuracy.

The marker 30 is a light-emitting portion that emits light to the outside of the housing 20, and includes, on the surface of the housing 20, a resin portion that diffuses light from a light source such as an LED element to the outside. The markers 30 are imaged by the image sensing device 14 and used in the estimation processing of the position information and posture information of the input device 16. Since the image sensing device 14 images the input device 16 at predetermined intervals (e.g., 60 frames/second), the markers 30 preferably emit light in synchronization with the periodic imaging times of the image sensing device 14 and do not emit light in the non-exposure periods of the image sensing device 14, to reduce unnecessary power consumption. The image sensing device 14 and the input device 16 operate on their own respective clocks, and in the present embodiment the exposure period of the image sensing device 14 and the period in which the markers 30 are lit are synchronized by the following processing.

Fig. 5 depicts an example of a light emission pattern of the synchronization process for identifying the imaging time of the image sensing device 14. The horizontal length indicates the imaging interval (16.7 msec) corresponding to one frame, and the lighting control of the markers 30 is performed in units of a time grid obtained by dividing the imaging interval. In this example, the imaging interval is divided into 32 parts, and one time grid is 521 μs. In Fig. 5, a colored time grid indicates a lighting period at the first luminance, and an uncolored time grid indicates a lighting period at the second luminance. The first luminance differs from the second luminance, and the first luminance may be higher than the second luminance. When the marker 30 is imaged while emitting light at the first luminance, a high-luminance marker image is included in the captured image; when the marker 30 is imaged while emitting light at the second luminance, a low-luminance marker image is included in the captured image. The light emission pattern is determined such that, when the marker 30 is imaged successively at imaging intervals corresponding to one frame to obtain six captured images, the order in which it emits light at the first luminance and at the second luminance differs for each time grid.
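As a rough sketch, the time-grid arithmetic described above can be written out as follows. The 60 frames/second rate and the 32-way division come from the embodiment; the function name `grid_start_us` is purely illustrative.

```python
# Time-grid arithmetic for the marker lighting control described above.
# The 16.7 ms imaging interval (60 frames/second) is divided into 32 grids,
# giving a grid length of about 521 microseconds.

FRAME_RATE_HZ = 60
GRIDS_PER_FRAME = 32

frame_interval_us = 1_000_000 / FRAME_RATE_HZ         # ~16,667 us (16.7 ms)
grid_length_us = frame_interval_us / GRIDS_PER_FRAME  # ~521 us

def grid_start_us(grid_number: int) -> float:
    """Offset of a given time grid from the start of a frame, in microseconds."""
    if not 0 <= grid_number < GRIDS_PER_FRAME:
        raise ValueError("grid number must be in 0..31")
    return grid_number * grid_length_us
```

For example, grid number 14 (the grid identified in the worked example below) begins roughly 7.3 ms into each frame.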

In the synchronization process, the lighting of one or more markers 30 imaged by the image sensing device 14 is controlled using the light emission pattern shown in Fig. 5. Suppose the marker 30 is imaged under lighting control such that, after the synchronization process starts, the marker 30 is lit at the first luminance for the first captured image (frame 0), at the second luminance for the second captured image (frame 1), at the first luminance for the third captured image (frame 2), at the second luminance for the fourth captured image (frame 3), at the second luminance for the fifth captured image (frame 4), and at the first luminance for the sixth captured image (frame 5). The time grid whose emission sequence matches this combination of first and second luminance across the six successively captured images is grid number 14. Thereafter, the input device 16 periodically lights the markers 30 at the timing of grid number 14, so that the period in which the markers 30 are lit is synchronized with the exposure period of the image sensing device 14, and the markers 30 are prevented from lighting in the non-exposure period of the image sensing device 14.
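The identification step above can be sketched as a table lookup. The actual per-grid emission pattern is defined only in Fig. 5 and is not reproduced in the text, so the `PATTERN_TABLE` below is a hypothetical assignment used purely to illustrate the lookup; only the entry for grid 14 mirrors the worked example (first, second, first, second, second, first).

```python
# Sketch of the synchronization decoding step: six consecutive captured
# images each show the marker at either the first (high) or second (low)
# luminance, and the observed six-frame sequence identifies which time grid
# falls inside the camera's exposure period.

H, L = "first", "second"  # first (high) / second (low) luminance

# Hypothetical table: grid number -> unique six-frame luminance sequence.
# The default assignment encodes the grid number in binary; the entry for
# grid 14 is overwritten to match the worked example in the text.
PATTERN_TABLE = {g: tuple(H if (g >> b) & 1 else L for b in range(6))
                 for g in range(32)}
PATTERN_TABLE[14] = (H, L, H, L, L, H)

def identify_grid(observed: tuple) -> int:
    """Return the grid number whose emission pattern matches the observation."""
    for grid, pattern in PATTERN_TABLE.items():
        if pattern == observed:
            return grid
    raise LookupError("no matching grid; synchronization must be retried")
```

Any real pattern table would, as the text requires, assign a distinct six-frame sequence to every time grid so that the lookup is unambiguous.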

Fig. 6 depicts the functional blocks of the input device 16. The control section 50 receives operation information input to the operation members 22 and sensor data acquired by the posture sensor 52. The posture sensor 52 acquires sensor data on the movement of the input device 16 and includes at least a three-axis acceleration sensor and a three-axis gyro sensor. The control section 50 supplies the received operation information and sensor data to the communication control section 54. The communication control section 54 transmits the operation information and the sensor data output from the control section 50 to the information processing apparatus 10 through a network adapter or an antenna by wired or wireless communication. Further, the communication control section 54 acquires a light emission pattern and/or a light emission instruction for the synchronization process from the information processing apparatus 10.

The input device 16 includes a light source 56 for illuminating the indicator 32 and light sources 58 for illuminating the markers 30. Each of the light source 56 and the light sources 58 may be an LED element. The marker 30 includes, on the surface of the housing 20, a resin portion that diffuses the emitted light to the outside, and the resin portion of the marker 30 illuminated by the light source 58 may be the resin that seals the LED element. That is, the marker 30 and the light source 58 may take the form of a single LED device.

The indicator 32 serves to inform the user of the charge status of the battery in the input device 16. The light source 56 may emit light of multiple colors, and the indicator 32 may indicate the charge status by its lighting color. For example, when the indicator 32 is lit in green, the state of charge is sufficient, whereas when the indicator 32 is lit in red, the remaining battery power is low. The user can thus recognize the charge state of the battery from the lighting color of the indicator 32.
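A minimal sketch of this color rule follows. The text specifies only green for sufficient charge and red for low charge; the 20% threshold and the function name are assumptions for illustration.

```python
# Minimal sketch of the indicator color rule described above: green means a
# sufficient charge, red means the remaining battery power is low. The 0.2
# threshold is an assumption; the text does not specify one.

def indicator_color(battery_level: float) -> str:
    """Map a battery level in [0.0, 1.0] to the indicator lighting color."""
    if not 0.0 <= battery_level <= 1.0:
        raise ValueError("battery level must be within [0.0, 1.0]")
    return "green" if battery_level >= 0.2 else "red"
```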

Fig. 7 shows an example of a portion of an image obtained by imaging the input device 16. As shown, the captured image includes images of the lit markers 30. In the HMD 100, the communication control section 128 transmits the image data captured by the image sensing apparatus 14 to the information processing apparatus 10, and the information processing apparatus 10 extracts the images of the markers 30 from the image data. In the synchronization process, the information processing apparatus 10 can distinguish between the case where a marker 30 emits light at the first luminance and the case where it emits light at the second luminance.

Although the synchronization processing between the exposure period of the image sensing device 14 and the period in which the markers 30 are lit is performed before the game starts, it may also be performed while the game is being played. When synchronization is lost, the image sensing device 14 cannot capture the images of the markers 30, and therefore the synchronization processing must be performed again immediately.

In the synchronization process, the control section 50 causes the one or more light sources 58 to emit light in the light emission pattern for synchronization provided by the information processing apparatus 10 (see Fig. 5). As shown in Fig. 5, the light emission pattern determines, over a plurality of frame periods, the periods in which the marker 30 is lit at the first luminance and the periods in which it is lit at the second luminance. The information processing apparatus 10 recognizes the change pattern of the luminance value of the marker 30 across a plurality of successively captured images to identify the time grid number contained in the exposure period of the image sensing apparatus 14. The exposure period may be set to about twice the length of one time grid, for example.

Fig. 8 depicts the period, set within the exposure period of the image sensing device 14, in which the markers 30 are lit. When the information processing apparatus 10 identifies the time grid number, it generates an instruction to emit light at the time of that grid number and transmits it to the input device 16. In the input device 16, based on the light emission instruction, the control section 50 periodically causes all the markers 30 to emit light at the time position of grid number 14. After synchronization is established, the control section 50 turns on the light sources for only one time grid (521 μs) per frame period and turns them off for the remainder of the frame, so that wasteful power consumption is reduced.
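The resulting per-frame schedule can be sketched as follows: the marker light sources are on only during one 521 μs grid per frame (grid 14 in the worked example) and off for the rest of the 16.7 ms frame. The function name and the query-by-time formulation are illustrative assumptions.

```python
# Sketch of the per-frame marker schedule after synchronization: the light
# sources are on only during one time grid per frame and off otherwise,
# which is what reduces wasteful power consumption.

GRID_US = 1_000_000 / 60 / 32  # ~521 us per time grid

def marker_is_lit(t_us: float, lit_grid: int = 14) -> bool:
    """True if the markers are lit at time t_us, measured in microseconds
    from the start of any frame (times beyond one frame wrap around)."""
    frame_us = GRID_US * 32
    t = t_us % frame_us
    return lit_grid * GRID_US <= t < (lit_grid + 1) * GRID_US
```

With this schedule the markers are lit for only 1/32 of each frame period.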

Although in the present embodiment the indicator 32 notifies the user of the battery's state of charge by its lighting color, the indicator 32 is not a marker 30, and therefore it is undesirable for the image of the indicator 32 to be included in the captured image. Accordingly, after synchronization is established, the control section 50 lights the plurality of markers 30 at predetermined intervals and lights the indicator 32 only during periods in which the markers 30 are not lit.

Fig. 9 shows the relationship between the period in which the markers are lit and the period in which the indicator is lit. As also shown in Fig. 8, the control section 50 sets the period in which the markers 30 are lit within the exposure period of the image sensing device 14. On the other hand, the control section 50 controls the light emission of the light source 56 so that the period in which the indicator 32 is lit does not overlap the period in which the markers 30 are lit. Specifically, the control section 50 causes the light source 56 to emit light, and thus illuminates the indicator 32, only during periods in which the markers 30 are not lit. In other words, the control section 50 does not light the indicator 32 during the period in which the markers 30 are lit.

Regarding the exposure period of the image sensing device 14, the control section 50 does not light the indicator 32 during the exposure period in which the image sensing device 14 images the input device 16. Thus the image sensing device 14 does not capture the lit indicator 32. Note that because the image sensing device 14 and the input device 16 operate on their own respective clocks, the exact time at which the image sensing device 14 starts exposure is not known to the control section 50. However, since the control section 50 knows the length of the exposure period, it can add a predetermined margin before and after the period in which the markers 30 are lit to set an indicator-prohibition period that is certain to contain the exposure period. The control section 50 then schedules the period in which the indicator 32 is lit to fall outside the indicator-prohibition period. The margin may be defined in units of time grids; for example, the control section 50 may add two or more time grids on each side of the period in which the markers 30 are lit to set the indicator-prohibition period, and determine the period in which the indicator 32 is lit accordingly.
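The scheduling rule above can be sketched as follows, assuming the two-grid margin given as an example in the text, the marker grid number 14 from the worked example, and illustrative names throughout.

```python
# Sketch of the indicator scheduling rule: because the input device does not
# know the exact exposure start time, a prohibition window (the marker grid
# plus a guard margin on each side) is placed around the marker lighting
# period, and the indicator may be lit only outside that window.

GRIDS_PER_FRAME = 32
LIT_GRID = 14    # grid in which the markers are lit (from the worked example)
GUARD_GRIDS = 2  # assumed margin: two time grids on each side

def indicator_allowed(grid: int) -> bool:
    """True if the indicator may be lit during this time grid."""
    # Distance from the marker grid, wrapping around the frame boundary.
    offset = (grid - LIT_GRID) % GRIDS_PER_FRAME
    prohibited = offset <= GUARD_GRIDS or offset >= GRIDS_PER_FRAME - GUARD_GRIDS
    return not prohibited
```

Grids 12 through 16 are prohibited under these assumptions, so the exposure period, wherever it starts within the guard band, never coincides with the lit indicator.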

In the HMD 100, when the communication control section 128 transmits the image data captured by the image sensing apparatus 14 to the information processing apparatus 10, the information processing apparatus 10 extracts the images of the markers 30 from the image data. Since the three-dimensional shape of the input device 16 and the position coordinates of the markers 30 arranged on its surface are known, the information processing apparatus 10 solves a perspective-n-point (PnP) problem from the distribution of the marker images within the captured image to estimate the position and posture of the input device 16 relative to the image sensing apparatus 14.
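A full PnP solver (for example, OpenCV's `solvePnP`) would be used in practice; the sketch below is a heavily simplified, translation-only variant of the estimation step, offered only to make the geometry concrete. The rotation is assumed known (identity here), so the camera-frame translation follows from a small linear least-squares system; all numbers and names are hypothetical.

```python
# Simplified, translation-only sketch of the pose estimation step: given
# known 3D marker coordinates on the device and their 2D image positions,
# recover the translation t assuming identity rotation. Under the pinhole
# model u = fx*(X+tx)/(Z+tz) + cx, v = fy*(Y+ty)/(Z+tz) + cy, each marker
# contributes two equations linear in (tx, ty, tz):
#   -fx*tx + (u-cx)*tz = fx*X - (u-cx)*Z
#   -fy*ty + (v-cy)*tz = fy*Y - (v-cy)*Z

def estimate_translation(markers_3d, points_2d, fx, fy, cx, cy):
    """Least-squares translation from >= 2 marker correspondences."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(markers_3d, points_2d):
        A.append([-fx, 0.0, u - cx]); b.append(fx * X - (u - cx) * Z)
        A.append([0.0, -fy, v - cy]); b.append(fy * Y - (v - cy) * Z)
    # Solve the 3x3 normal equations (A^T A) t = A^T b by Gaussian elimination.
    n = 3
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    rhs = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(n)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
            rhs[r] -= f * rhs[i]
    t = [0.0] * n
    for i in range(n - 1, -1, -1):
        t[i] = (rhs[i] - sum(M[i][j] * t[j] for j in range(i + 1, n))) / M[i][i]
    return tuple(t)
```

The full PnP problem additionally recovers the rotation, which is what lets the posture of the input device 16 be estimated along with its position.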

The present invention has been described above based on an embodiment. The embodiment is illustrative, and those skilled in the art will understand that various modifications of its constituent elements and combinations of processes are possible and that such modifications are also within the scope of the present invention.

Although this embodiment describes the arrangement of the plurality of markers 30 on the input device 16 including the operation members 22, the device to be tracked does not necessarily need to include the operation members 22. Further, although in this embodiment the image sensing device 14 is mounted on the HMD 100, the image sensing device 14 may be mounted elsewhere as long as it can image the markers.

[ Industrial Applicability ]

The invention may be used with devices that include multiple markers.

[ list of reference symbols ]

1: information processing system

10: information processing apparatus

14: image sensing apparatus

16: input device

30: marking

32: indicator device

50: control section

56, 58: Light sources

Claims (amended under Article 19 of the Treaty)

1. An apparatus, comprising:

a housing; and

a plurality of markers configured to emit light to an outside of the housing,

wherein the apparatus comprises:

an indicator configured to indicate a state of the apparatus; and

a control section configured to light up the plurality of markers at predetermined intervals,

the light source of the indicator is different from the light sources of the plurality of markers, and the control section lights the indicator during a period in which the plurality of markers are not lit.

2. The apparatus according to claim 1,

wherein the control section does not light the indicator during a period in which the plurality of markers are lit.

3. The apparatus of claim 1 or 2,

wherein the control section does not light the indicator during an exposure period in which an image sensing apparatus images the apparatus.

4. The apparatus of any one of claims 1 to 3,

wherein the apparatus is an input device including an operation member operated by a user.

5. The apparatus of any one of claims 1 to 4,

wherein the apparatus is imaged by an image sensing apparatus connected to a head mounted display.
