Expansion control device and image control method

Document No.: 593451    Publication date: 2021-05-28

Reading note: This technology, "Expansion control device and image control method" (扩充控制装置及影像控制方法), was designed and created by Li Guoxuan (李国玄) on 2020-07-30. Its main content is as follows: An expansion control device and an image control method are adapted to cooperate with an electronic device that displays a graphical user interface. The expansion control device comprises a communication module, which receives image signals and sends input signals, and a plurality of input display modules. Each input display module comprises an input unit that generates an input signal in response to an input operation and a display unit that displays according to an image signal. The electronic device generates first image signals according to the images in the operation areas of the graphical user interface, so that the images in the operation areas are respectively mapped onto the display units of the input display modules for display. The corresponding operation areas in the graphical user interface execute corresponding operation instructions according to the input signals.

1. An expansion control device, adapted to cooperate with an electronic device, wherein the electronic device displays a graphical user interface, the graphical user interface having a plurality of operation areas, the expansion control device comprising:

a communication module, communicatively connected to the electronic device, for receiving from the electronic device a plurality of first image signals generated according to images in the operation areas of the graphical user interface; and

a plurality of input display modules, each comprising an input unit and a display unit, wherein the input units respectively generate a plurality of first input signals in response to input operations, the communication module sends the plurality of first input signals to the electronic device, the corresponding operation areas of the graphical user interface execute corresponding operation instructions according to the plurality of first input signals, and the display units respectively display the images of the operation areas mapped onto them according to the plurality of first image signals.

2. The expansion control device as claimed in claim 1, wherein the input unit comprises a touch panel disposed corresponding to the display surface of the display unit, the input operation being a touch operation.

3. The expansion control device of claim 1, wherein the input unit comprises a switch, and the input operation is a keystroke operation.

4. The expansion control device as claimed in claim 1, wherein the graphical user interface further comprises a plurality of interactive areas, the electronic device further generates a plurality of second image signals according to images in the interactive areas, the expansion control device further comprises a touch screen, the touch screen is divided into a plurality of mapping areas and generates a plurality of second input signals in response to touch operations respectively corresponding to the mapping areas, the corresponding interactive areas of the graphical user interface execute corresponding interactive commands according to the second input signals, and the expansion control device maps the images in the interactive areas to the mapping areas for display according to the second image signals.

5. The expansion control device according to claim 4, further comprising a processor connected between the communication module and the touch screen.

6. The expansion control device according to claim 1, further comprising a plurality of processors, each having one end connected to the communication module and the other end adapted to be connected one-to-one to the plurality of input display modules, so as to control the input unit and the display unit of the connected input display module.

7. The expansion control device of claim 1, further comprising a three-dimensional motion detection module and a processor, wherein the three-dimensional motion detection module comprises:

a plane sensing unit for sensing a plane coordinate displacement of a dynamic object; and

a distance sensing unit for sensing a vertical distance relative to the dynamic object;

the processor calculates a plane moving distance of the dynamic object according to the vertical distance and the plane coordinate displacement, and obtains three-dimensional movement information of the dynamic object by combining the plane moving distance with the change of the vertical distance of the dynamic object.

8. The expansion control device of claim 7, wherein the plane sensing unit comprises:

an infrared sensor for detecting the existence of the dynamic object; and

an image sensor for acquiring a plurality of time-series images of the dynamic object;

the processor identifies a feature corresponding to the dynamic object in the time-series images, and obtains the plane coordinate displacement according to the displacement of the feature.

9. The expansion control device of claim 7, wherein the distance sensing unit comprises:

a sonar sensor for sensing a separation distance relative to the dynamic object; and

a proximity sensor having a valid detection interval, for determining that the dynamic object exists in the valid detection interval;

the processor obtains the vertical distance according to the separation distance when the dynamic object exists in the valid detection interval.

10. The expansion control device of claim 1, further comprising a peripheral device, wherein the peripheral device is a microphone, a joystick, a button, a touch pad, a vibration motor, or a light.

11. An image control method, adapted to cooperate with an electronic device, characterized by comprising the following steps:

a plurality of operation areas of a graphical user interface of the electronic device respectively display an image;

the electronic device generates a plurality of first image signals according to the plurality of images;

the electronic device outputs the plurality of first image signals to an expansion control device, so that the images in the plurality of operation areas are respectively mapped to a plurality of display units of the expansion control device for display;

an input unit of the expansion control device generates a first input signal in response to an input operation; and

the electronic device receives the first input signal from the expansion control device, so that the corresponding operation area in the graphical user interface executes a corresponding operation instruction.

12. The image control method as claimed in claim 11, wherein the input unit comprises a touch panel disposed corresponding to the display surface of the display unit, the input operation being a touch operation.

13. The image control method as claimed in claim 11, wherein the input unit comprises a switch, and the input operation is a keystroke operation.

14. The image control method as claimed in claim 11, further comprising:

the electronic device generates a plurality of second image signals according to images in a plurality of interaction areas of the graphical user interface;

the electronic device outputs the plurality of second image signals to the expansion control device, so that the expansion control device respectively maps the images in the plurality of interaction areas to a plurality of mapping areas in a touch screen of the expansion control device for display according to the plurality of second image signals;

the expansion control device generates a plurality of second input signals according to touch operations respectively corresponding to the mapping areas; and

the electronic device receives the second input signals, so that the corresponding interaction areas in the graphical user interface execute a corresponding interaction instruction.

Technical Field

The present invention relates to an expansion device, and more particularly, to an expansion control device and an image control method.

Background

Existing electronic games are usually controlled through input interfaces such as joysticks, keys, keyboards, and mice. These input interfaces are not intuitive: the user must practice until familiar with them, and may even have to memorize the function of each key to play normally.

Disclosure of Invention

In view of the above, an embodiment of the present invention provides an expansion control device adapted to cooperate with an electronic device. The electronic device displays a graphical user interface having a plurality of operation areas.

The expansion control device comprises a communication module and a plurality of input display modules. The communication module is communicatively connected to the electronic device to receive from it a plurality of first image signals generated according to the images in the operation areas of the graphical user interface. Each input display module comprises an input unit and a display unit. The input units respectively generate a plurality of first input signals in response to input operations, and the first input signals are sent to the electronic device through the communication module, so that the corresponding operation areas in the graphical user interface execute corresponding operation instructions according to the first input signals. The display units respectively display the images of the operation areas mapped onto them according to the first image signals. Therefore, the user can directly operate and interact with the input display modules on the expansion control device.

An embodiment of the present invention further provides an image control method, including the following steps: a plurality of operation areas of a graphical user interface of the electronic device respectively display an image; the electronic device generates a plurality of first image signals according to the images; the electronic device outputs the first image signals to an expansion control device, so that the images in the operation areas are respectively mapped to a plurality of display units of the expansion control device for display; the expansion control device generates a plurality of first input signals in response to input operations; and the electronic device receives the first input signals from the expansion control device, so that the corresponding operation areas in the graphical user interface execute corresponding operation instructions.

In some embodiments, the input unit includes a touch panel disposed corresponding to the display surface of the display unit, and the input operation is a touch operation.

In some embodiments, the input unit includes a switch, and the input operation is a keystroke operation.

In some embodiments, the graphical user interface further includes a plurality of interaction areas, and the electronic device generates a plurality of second image signals according to the images in the interaction areas. The expansion control device further comprises a touch screen, which is divided into a plurality of mapping areas and generates a plurality of second input signals in response to touch operations respectively corresponding to the mapping areas. The electronic device outputs the second image signals to the expansion control device, so that the expansion control device respectively maps the images in the interaction areas to the mapping areas for display according to the second image signals. The electronic device receives the second input signals, so that the corresponding interaction areas in the graphical user interface execute a corresponding interaction instruction.

In some embodiments, the expansion control device further comprises a processor, and the processor is connected between the communication module and the touch screen.

In some embodiments, the expansion control device further comprises a plurality of processors, each having one end connected to the communication module and the other end adapted to be connected one-to-one to the input display modules, so as to control the input unit and the display unit of the connected input display module.

In some embodiments, the expansion control device further comprises a three-dimensional motion detection module and a processor. The three-dimensional motion detection module comprises a plane sensing unit and a distance sensing unit. The plane sensing unit is used for sensing the plane coordinate displacement of a dynamic object. The distance sensing unit is used for sensing a vertical distance relative to the dynamic object. The processor calculates the plane movement distance of the dynamic object according to the vertical distance and the plane coordinate displacement, and obtains the three-dimensional movement information of the dynamic object by combining the plane movement distance with the change of the vertical distance of the dynamic object.

In some embodiments, the plane sensing unit includes an infrared sensor and an image sensor. The infrared sensor is used for detecting the existence of the dynamic object. The image sensor is used for acquiring a plurality of time-series images of the dynamic object. The processor identifies a feature corresponding to the dynamic object in the time-series images and obtains the plane coordinate displacement according to the displacement of the feature.

In some embodiments, the distance sensing unit includes a sonar sensor and a proximity sensor. The sonar sensor is used for sensing a separation distance relative to the dynamic object. The proximity sensor has a valid detection interval for determining the existence of the dynamic object in that interval. When the dynamic object exists in the valid detection interval, the processor obtains the vertical distance according to the separation distance.

In some embodiments, the expansion control device further comprises a peripheral device, wherein the peripheral device is a microphone, a joystick, a button, a touch pad, a vibration motor or a light.

In summary, compared to operating the electronic device alone, the embodiments of the present invention provide diverse and intuitive operations, improve the user experience, and reduce the user's operating difficulty. Moreover, because several processors each manage part of the hardware, lower-end processors can be selected, saving cost and energy consumption.

Drawings

FIG. 1 is a schematic diagram of an expansion control device according to a first embodiment of the present invention.

Fig. 2 is a circuit block diagram of an expansion control device according to a first embodiment of the invention.

FIG. 3 is a flowchart illustrating an image control method according to a first embodiment of the present invention.

FIG. 4 is a schematic diagram of an expansion control device according to a second embodiment of the present invention.

FIG. 5 is a circuit block diagram of an expansion control device according to a second embodiment of the present invention.

FIG. 6 is a flowchart illustrating an image control method according to a second embodiment of the present invention.

FIG. 7 is a schematic diagram of an expansion control device according to a third embodiment of the present invention.

Fig. 8 is a circuit block diagram of an expansion control device according to a third embodiment of the present invention.

Fig. 9 is a schematic measurement diagram of a three-dimensional motion detection module according to a third embodiment of the invention.

Fig. 10 is a three-dimensional motion detection flowchart according to a third embodiment of the invention.

Description of reference numerals:

electronic device 100

Graphic user interface 110

Operation regions 120, 120a to 120d

Interaction areas 141, 141a, 141b

Expansion control device 300

Communication module 310

Input display modules 320, 320a to 320d

Input unit 321

Switch 3211

Touch panel 3212

Display unit 322

Processors 330, 350, 370

Touch screen 340

Mapping regions 341, 341a, 341b

Three-dimensional motion detection module 360

Planar sensing unit 361

Distance sensing unit 362

Infrared sensor 363

Sonar sensor 364

Image sensor 365

Proximity sensor 366

Peripheral device 380

Microphone 381

Joystick 382

Button 383

Touch pad 384

Vibration motor 385

Light 386

Dynamic object 700

Axes X, Y, Z

Vertical distance H

Plane movement distance D

Plane coordinate displacement d

Focal length l

Steps S401 to S405

Steps S601 to S605

Steps S801 to S807

Detailed Description

Referring to fig. 1, fig. 1 is a schematic diagram illustrating an expansion control device 300 according to a first embodiment of the present invention. The expansion control device 300 is adapted to cooperate with the electronic device 100 to provide an operation interface through which a user controls the electronic device 100. The electronic device 100 may be, for example, a computing device with software execution capability, such as a desktop computer, a notebook computer, a tablet computer, or a mobile phone. It has hardware such as a processor, a memory, and a storage medium, and may further include other required hardware, such as a network interface when network resources are used. The electronic device 100 executes an application program, such as but not limited to game software, and displays a graphical user interface 110 having a plurality of operation areas 120; four operation areas 120a to 120d are taken as an example here.

Referring to fig. 1 and fig. 2 together, fig. 2 is a circuit block diagram of an expansion control device 300 according to a first embodiment of the present invention. The expansion control device 300 includes a communication module 310 and a plurality of input display modules 320, here, four input display modules 320 a-320 d are taken as an example. The communication module 310 communicatively connects the electronic device 100 to communicate signals with the electronic device 100. The communication module 310 supports a wired transmission interface such as Universal Serial Bus (USB), or supports a wireless transmission interface such as Bluetooth (Bluetooth) or wireless hot spot (Wi-Fi).

In some embodiments, the expansion control device 300 further includes a plurality of processors 330 connected between the communication module 310 and the plurality of input display modules 320 to control the input display modules 320. One end of each processor 330 is connected to the communication module 310, and the other end is connected to an input display module 320 in a one-to-one manner. Compared with a single computation unit, the multiple processors 330 thus share the computational load, so hardware with modest computing resources and a simplified connection interface can be adopted.

In some embodiments, the number of processors 330 may be less than the number of input display modules 320. That is, some or all of the processors 330 may each be connected to a plurality of input display modules 320.

A single input display module 320 includes an input unit 321 and a display unit 322. The input unit 321 is used for a user to perform an input operation and generates an input signal (hereinafter referred to as a "first input signal") in response to the input operation. In some embodiments, the input display module 320 takes the form of a button that receives a keystroke operation from the user, and the input unit 321 includes a switch 3211 for detecting the keystroke operation. In some embodiments, the input unit 321 includes a touch panel 3212 that receives a touch operation from the user. Here, the touch panel 3212 is disposed corresponding to the display surface of the display unit 322, i.e., the touch area of the touch panel 3212 substantially overlaps the display surface of the display unit 322.

The display unit 322 receives an image signal (hereinafter referred to as a "first image signal") transmitted by the electronic device 100 via the communication module 310 and displays a picture according to it. The display unit 322 may be a display panel such as an Organic Light-Emitting Diode (OLED) panel or a Liquid-Crystal Display (LCD).

Here, how the first image signal is generated will be described. Referring to fig. 3, a flowchart of an image control method according to a first embodiment of the invention is shown. First, a plurality of operation areas 120a to 120d of a graphical user interface 110 of the electronic device 100 respectively display an image (step S401). Next, in step S402, the electronic apparatus 100 generates a plurality of first image signals according to the images. Then, the electronic device 100 outputs the first image signals to the expansion control device 300, so that the images in the operation areas 120a to 120d are respectively mapped to the plurality of display units 322 of the expansion control device 300 for display (step S403).

In detail, the electronic device 100 allows the user to set, on the graphical user interface 110, the pairing relationship between the operation regions 120 and the input display modules 320. For example, the image in the operation area 120a is mapped to the display unit 322 of the input display module 320a for display, and the image in the operation area 120b is mapped to the display unit 322 of the input display module 320b for display. The electronic device 100 acquires the image in each operation area 120, encodes the acquired images into the first image signals, and transmits the first image signals to the corresponding processors 330 of the expansion control device 300 according to the set pairing relationship. The image acquisition may be performed a single time, multiple times, or continuously. After receiving a first image signal, the processor 330 decodes it and controls the display unit 322 to display the image. Therefore, the images in the operation regions 120a to 120d on the graphical user interface 110 are displayed on the display units 322 of the corresponding input display modules 320a to 320d, respectively.
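The pairing and dispatch just described can be sketched in a few lines of Python. This is a hypothetical illustration only: the identifiers echo the reference numerals of the figures, and the dictionary-based signal format is an assumption, not an API defined by the patent.

```python
# Illustrative pairing set by the user: operation area -> input display module.
PAIRING = {"120a": "320a", "120b": "320b", "120c": "320c", "120d": "320d"}

def route_first_image_signals(area_images):
    """Dispatch each operation area's captured image to the display unit of
    the input display module paired with it (the first image signals)."""
    signals = {}
    for area_id, image in area_images.items():
        signals[PAIRING[area_id]] = image  # one signal per paired module
    return signals
```

With this table, captures of areas 120a and 120d would be routed to modules 320a and 320d respectively; real hardware would additionally encode and decode each signal along the way.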

In some embodiments, since the pixel size and shape of the operation area 120 may be different from the resolution and shape of the display unit 322, image processing, such as zooming in, zooming out, cropping, etc., is required on the image of the operation area 120 to conform to the resolution and shape of the display unit 322. The image processing may be executed by the electronic device 100 or the processor 330, which is not limited in the present invention.
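The fit between an operation area and a display unit can be computed with a simple cover-and-crop rule. The sketch below is one possible policy (uniform scaling followed by a centered crop); the patent does not prescribe a specific algorithm, so treat the formula as an assumption.

```python
def fit_image(src_w, src_h, dst_w, dst_h):
    """Return a uniform scale factor and centered crop offsets so that a
    src_w x src_h operation-area image fills a dst_w x dst_h display unit."""
    scale = max(dst_w / src_w, dst_h / src_h)  # scale just enough to cover
    crop_x = (src_w * scale - dst_w) / 2       # trim the excess equally
    crop_y = (src_h * scale - dst_h) / 2
    return scale, crop_x, crop_y
```

For example, fitting a 200x100 area onto a 100x100 display keeps the scale at 1.0 and crops 50 pixels from each horizontal side.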

In some embodiments, the display unit 322 is connected to the Processor 330 via a Mobile Industry Processor Interface (MIPI).

Next, how the electronic device 100 operates according to the first input signal generated by the input unit 321 will be described. First, the expansion control device 300 generates a first input signal in response to an input operation (step S404). The electronic device 100 receives the first input signal from the expansion control device 300, so that the corresponding operation area 120 in the graphical user interface 110 executes a corresponding operation instruction (step S405). That is, through the aforementioned pairing relationship between the operation regions 120 and the input display modules 320, an input operation on the input unit 321 of the input display module 320a generates a first input signal, and the operation region 120a executes a corresponding operation instruction according to that signal; likewise, an input operation on the input unit 321 of the input display module 320b generates a first input signal, and the operation region 120b executes a corresponding operation instruction according to it. In particular, when the input operation is a keystroke operation of the switch 3211, the processor 330 transmits an input signal indicating that the switch 3211 has been pressed to the electronic device 100 via the communication module 310. According to the pairing relationship between the operation regions 120 and the input display modules 320, the electronic device 100 converts the first input signal into a click operation instruction in the corresponding operation region 120. For example, the graphical user interface 110 has a virtual button located in the operation area 120, and the application program executes the feedback action of clicking that virtual button (e.g., causing a character in the game to jump) according to the click operation instruction.
Thus, a keystroke operation performed by the user on a given input display module 320 is equivalent to a click operation on the corresponding operation region 120 in the graphical user interface 110. Similarly, when the input operation is a touch operation on the touch panel 3212, the processor 330 transmits an input signal containing the touch information to the electronic device 100 via the communication module 310. According to the pairing relationship between the operation regions 120 and the input display modules 320, the electronic device 100 converts the first input signal into a touch operation instruction in the corresponding operation region 120. Therefore, the user's touch trajectory on the touch panel 3212 of an input display module 320 is converted into a touch trajectory in the corresponding operation area 120, and the application program can execute the corresponding feedback action, such as a slider operation for adjusting the volume. In addition, if the touch operation is a tap, the application program can also perform the virtual-button click action described above, depending on the feedback action the application program defines for touch operations in that operation area 120.
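The conversion of a first input signal into an operation instruction can be pictured as a small routing function. The signal tuple and the pairing table below are assumptions made for illustration; the patent does not define a concrete message format.

```python
# Illustrative reverse pairing: input display module -> operation area.
PAIRING = {"320a": "120a", "320b": "120b"}

def to_operation_command(first_input_signal):
    """Translate a keystroke signal from an input display module into a
    click instruction aimed at the paired operation area."""
    module_id, event = first_input_signal
    if event != "keystroke":
        raise ValueError("unsupported event type: " + event)
    return {"area": PAIRING[module_id], "command": "click"}
```

A keystroke on module 320a would thus become a click command for operation area 120a, which the application then interprets as pressing the virtual button there.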

Since the touch coordinates of the touch panel 3212 are not consistent with the touch coordinates mapped to the operation area 120, coordinate conversion of the touch information is required. The coordinate transformation may be performed by the electronic device 100 or the processor 330, but the invention is not limited thereto.
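The coordinate conversion can be a plain linear mapping from touch-panel coordinates to operation-area pixel coordinates, sketched below under the assumption of axis-aligned rectangles (the patent leaves the exact transform open):

```python
def map_touch(x, y, panel_w, panel_h, area_x, area_y, area_w, area_h):
    """Map a touch point (x, y) on a panel_w x panel_h touch panel onto the
    paired operation area whose top-left corner is (area_x, area_y)."""
    u = area_x + x / panel_w * area_w   # horizontal proportion preserved
    v = area_y + y / panel_h * area_h   # vertical proportion preserved
    return u, v
```

Applying this to every sample of a touch trajectory reproduces the trajectory inside the corresponding operation area, whether the conversion runs on the electronic device 100 or on the processor 330.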

In some embodiments, the touch panel 3212 is coupled to the processor 330 via an Inter-Integrated Circuit (I2C) bus.

In some embodiments, the switch 3211 is connected to the processor 330 via a General-Purpose Input/Output (GPIO) interface.

In some embodiments, steps S404 to S405 can be performed before steps S402 to S403, or simultaneously in a multi-threaded manner.

Accordingly, the user can see the image of the corresponding operation region 120 on the display unit 322 of each input display module 320 and perform the input operation directly on that module, so that operation feels intuitive and the user's burden is reduced.

Referring to fig. 4 to fig. 6 together, fig. 4 is a schematic diagram illustrating the architecture of an expansion control device 300 according to a second embodiment of the present invention, fig. 5 is a circuit block diagram of the expansion control device 300 according to the second embodiment, and fig. 6 is a flowchart of an image control method according to the second embodiment. The difference from the first embodiment is that the expansion control device 300 of the second embodiment further includes a touch screen 340 and a processor 350. The processor 350 is connected between the communication module 310 and the touch screen 340. Unlike the aforementioned one-to-one pairing between the operation areas 120 and the input display modules 320, the pairing of the touch screen 340 with the plurality of interaction areas 141 in the graphical user interface 110 can be customized by the user. The touch screen 340 is divided into a plurality of mapping regions 341 (two are taken as an example here, 341a and 341b), and a one-to-one pairing relationship between the mapping regions 341 and a plurality of interaction regions 141 in the graphical user interface 110 (likewise two, 141a and 141b) can be set through user operation. As in the first embodiment, each mapping region 341 corresponds to its paired interaction region 141 according to this pairing relationship. The image control method of the present embodiment further includes steps S601 to S605. First, the plurality of interaction regions 141 of the graphical user interface 110 respectively display an image (step S601). Next, the electronic device 100 generates a plurality of second image signals according to the images in the interaction areas 141 (step S602).
In step S603, the electronic device 100 outputs the second image signals to the expansion control device 300, so that the expansion control device 300 respectively maps the images in the interaction areas 141 to the corresponding mapping areas 341 in the touch screen 340 for display according to the second image signals. In step S604, the expansion control device 300 generates a plurality of second input signals according to touch operations respectively corresponding to the mapping regions 341. The electronic device 100 receives the second input signals, so that the corresponding interaction areas 141 in the graphical user interface 110 execute a corresponding interaction command (step S605). For the remaining details, refer to the description of the first embodiment; they are not repeated here.

In some embodiments, steps S604 to S605 may be performed before steps S602 to S603, or simultaneously in a multi-threaded manner.

In some embodiments, the touch screen 340 is connected to the processor 350 via a Mobile Industry Processor Interface (MIPI). In some embodiments, the touch screen 340 is also connected to the processor 350 via an Inter-Integrated Circuit (I2C) bus.

Referring to fig. 7 and 8 together, fig. 7 is a schematic diagram illustrating the architecture of an expansion control device 300 according to a third embodiment of the present invention, and fig. 8 is a circuit block diagram of the expansion control device 300 according to the third embodiment. The difference from the foregoing embodiments is that the expansion control device 300 of the third embodiment further includes a three-dimensional motion detection module 360 and a processor 370. The processor 370 is connected between the communication module 310 and the three-dimensional motion detection module 360. The three-dimensional motion detection module 360 includes a plane sensing unit 361 and a distance sensing unit 362.

Referring to fig. 9, a measurement diagram of the three-dimensional motion detection module 360 according to the third embodiment of the invention is shown. In the three-dimensional coordinate system, the plane sensing unit 361 senses a plane coordinate displacement of a dynamic object 700 (here, a palm is taken as an example) on the X-Y plane, and the distance sensing unit 362 senses a vertical distance of the dynamic object 700 on the Z-axis. The processor 370 calculates the planar moving distance D of the dynamic object 700 according to the vertical distance H and the planar coordinate displacement d. Specifically, the planar moving distance is calculated according to Equation 1, which by similar triangles is D = H × d / l, where the focal length l is the focal length of the plane sensing unit 361. The processor 370 may further combine the calculated planar moving distance D with the change of the vertical distance H of the dynamic object 700 (i.e., the vertical moving distance) to obtain the three-dimensional movement information of the dynamic object 700. Accordingly, the application program can execute corresponding feedback actions according to the three-dimensional movement information.
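In numbers, the relation just described might look like the sketch below. The similar-triangles form D = H·d/l follows from the quantities defined above; combining D with the vertical change as a Euclidean magnitude is an added assumption, since the text only says the two are matched together.

```python
import math

def plane_move_distance(H, d, l):
    """Image-plane displacement d at focal length l scales to
    D = H * d / l at vertical distance H (similar triangles, d/l = D/H)."""
    return H * d / l

def move_3d(H, d, l, dH):
    """Combine the planar distance D with the vertical change dH into a
    single 3D displacement magnitude (illustrative assumption)."""
    return math.hypot(plane_move_distance(H, d, l), dH)
```

For instance, a 3 mm displacement on the image plane with a 6 mm focal length, seen at a vertical distance of 200 mm, corresponds to a 100 mm planar movement.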

In detail, the plane sensing unit 361 includes an infrared sensor 363 and an image sensor 365. The focal length l is the focal length of the image sensor 365. The infrared sensor 363 is used to detect the presence of the dynamic object 700; it may be a pyroelectric sensor or a quantum sensor, detecting the dynamic object 700 by sensing heat or light. The image sensor 365 acquires a plurality of images of the dynamic object 700 in time sequence (also called time-series images). The processor 370 can identify a feature corresponding to the dynamic object 700 in the time-series images and obtain the plane coordinate displacement d from the displacement of that feature, as described later. The distance sensing unit 362 includes a sonar sensor 364 and a proximity sensor 366. The sonar sensor 364 senses the separation distance to the dynamic object 700. The proximity sensor 366 has a valid detection interval, defined by a minimum value and a maximum value of its detection range on the Z-axis, within which the presence of the dynamic object 700 is determined. When the processor 370 detects, through the proximity sensor 366, that the dynamic object 700 is present within the valid detection interval, the vertical distance H can be obtained from the separation distance measured by the sonar sensor 364. In this way, the detection result is doubly confirmed by the sonar sensor 364 and the proximity sensor 366. In some embodiments, the sonar sensor 364 and the proximity sensor 366 may operate simultaneously. In some embodiments, to save power, the proximity sensor 366 may be used first, and the sonar sensor 364 is enabled only when the dynamic object 700 is detected within the valid detection interval.
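The power-saving gating described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the sensor reads are injected as callables and all names are hypothetical:

```python
class DistanceSensingUnit:
    """Sketch of the power-saving scheme: poll the low-power proximity
    sensor first, and enable the power-hungry sonar sensor only once an
    object is detected inside the valid detection interval [z_min, z_max]."""

    def __init__(self, read_proximity, read_sonar, z_min, z_max):
        self.read_proximity = read_proximity  # cheap presence/range check
        self.read_sonar = read_sonar          # precise but power-hungry
        self.z_min, self.z_max = z_min, z_max # valid detection interval on Z

    def vertical_distance(self):
        """Return H only when both sensors confirm the object, else None."""
        z = self.read_proximity()
        if z is None or not (self.z_min <= z <= self.z_max):
            return None                       # nothing in range: sonar stays off
        h = self.read_sonar()                 # double confirmation via sonar
        if h is None or not (self.z_min <= h <= self.z_max):
            return None
        return h
```

Using both readings, with the sonar value taken as H, mirrors the double confirmation between the sonar sensor 364 and the proximity sensor 366.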

Referring to fig. 10, fig. 10 is a flowchart illustrating a three-dimensional motion detection process according to the third embodiment of the present invention, which is executed by the processor 370. First, the time-series images are acquired (step S801). Then, the time-series images are preprocessed (e.g., divided into a plurality of grids) to facilitate the subsequent feature detection (step S802). In step S803, a feature of the dynamic object 700, for example a corner feature, is identified in the time-series images. After steps S801 to S803 have been performed on each time-series image, the displacements of the corresponding features across the time-series images are compared (step S804) to obtain the plane coordinate displacement d (step S805). Then, the vertical distance H is acquired from the sonar sensor 364 (step S806). Finally, the planar moving distance D of the dynamic object 700 is calculated according to equation 1 (step S807).
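The flow of steps S801 to S807 can be sketched as below. This is a simplified stand-in, not the patent's method: true corner detection is replaced by a brightest-pixel tracker so the example stays self-contained, and the pinhole relation D = (H / l) · d is assumed for equation 1:

```python
import math

def detect_feature(frame):
    """Stand-in for feature detection (step S803): return the (x, y)
    position of the brightest pixel in a 2-D list of intensities.
    A real implementation would detect corner features instead."""
    best, pos = -1, (0, 0)
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > best:
                best, pos = v, (x, y)
    return pos

def plane_displacement(frames):
    """Steps S801-S805: track the feature across time-series frames and
    accumulate the plane coordinate displacement d (in pixels)."""
    feats = [detect_feature(f) for f in frames]
    d = 0.0
    for (x0, y0), (x1, y1) in zip(feats, feats[1:]):
        d += math.hypot(x1 - x0, y1 - y0)  # step S804: compare displacements
    return d

def planar_moving_distance(frames, H, focal_length):
    """Steps S806-S807: scale d by the vertical distance H from the sonar
    sensor, assuming equation 1 is D = (H / l) * d."""
    return (H / focal_length) * plane_displacement(frames)
```

The grid preprocessing of step S802 is omitted here; in practice it limits where features are searched for in each frame.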

In some embodiments, step S806 is not necessarily performed after step S805, but may be performed before step S805.

In some embodiments, the infrared sensor 363 is a thermal imager. The processor 370 may take the acquired thermal images as time-series images and perform the above steps S801 to S805 on them to obtain another plane coordinate displacement, which is then cross-checked against the plane coordinate displacement d obtained from the time-series images of the image sensor 365 for double confirmation.
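One way this double confirmation could work is an agreement test between the two displacement estimates. The acceptance rule and tolerance below are assumptions for illustration, not taken from the source:

```python
def cross_check_displacement(d_image, d_thermal, tolerance=0.2):
    """Accept the plane coordinate displacement only when the estimates
    from the image sensor and the thermal imager agree within a relative
    tolerance (the 20% default is an illustrative assumption).
    Returns the averaged estimate, or None when they disagree."""
    if max(abs(d_image), abs(d_thermal)) == 0:
        return 0.0  # both report no motion
    rel = abs(d_image - d_thermal) / max(abs(d_image), abs(d_thermal))
    if rel > tolerance:
        return None  # estimates disagree: reject this measurement
    return (d_image + d_thermal) / 2.0
```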

As shown in FIG. 8, the expansion control device 300 may further include one or more peripheral devices 380 coupled to the processor 370. The peripheral devices 380 may include a microphone 381, a joystick 382, a button 383, a touch pad 384, a vibration motor 385, and a light 386. The microphone 381 receives the user's voice for voice input. The joystick 382, the button 383, and the touch pad 384 serve as additional input channels. The vibration motor 385 provides vibration-based haptic feedback. The light 386 may be, for example, a light bar whose intensity, on/off state, and color vary in coordination with the application program.

In summary, compared with existing electronic games, the expansion control device and the image control method of the present invention provide diverse and intuitive operations, improve the user experience, and reduce the operation difficulty for the user. Moreover, different portions of the hardware are managed by a plurality of separate processors, so that lower-end processors can be selected, thereby saving cost and energy consumption.
