Object judgment system and electronic device capable of selecting operation mode according to object components


Designed and created by 王国振, 2018-12-28.

Abstract

The invention discloses an object determination system and an electronic device capable of selecting an operation mode according to the composition of an object. The object determination system comprises: an optical sensor comprising a type determination region and a composition analysis region, wherein the optical sensor captures an object image of an object through the type determination region and obtains composition analysis optical data through the composition analysis region; a type determination circuit for determining the object type of the object according to the object image; and a composition analysis circuit for analyzing the composition of the object according to the composition analysis optical data and the object type.

1. An object determination system, comprising:

an optical sensor comprising a type determination region and a composition analysis region, wherein the optical sensor captures an object image of an object through the type determination region and obtains composition analysis optical data through the composition analysis region;

a type determination circuit for determining an object type of the object according to the object image; and

a composition analysis circuit for analyzing a composition of the object according to the composition analysis optical data and the object type.

2. The object determination system of claim 1, wherein the composition analysis region comprises a multispectral color filter.

3. The object determination system of claim 1, wherein the composition analysis region surrounds the type determination region.

4. The object determination system of claim 1, wherein the optical sensor comprises a plurality of mixing regions having the same shape, each of the mixing regions comprising a portion of the composition analysis region and a portion of the type determination region.

5. The object determination system of claim 4, wherein each mixing region is square, the portion of the type determination region occupies 3/4 of the mixing region, and the portion of the composition analysis region occupies 1/4 of the mixing region.

6. The object determination system of claim 1, further comprising:

a main controller serving as the composition analysis circuit; and

a sub-controller serving as the type determination circuit;

wherein the main controller is in a sleep mode before being woken up by the sub-controller.

7. An electronic device capable of selecting an operation mode according to a composition of an object, comprising:

an object determination system comprising:

an optical sensor comprising a type determination region and a composition analysis region, wherein the optical sensor captures an object image of an object through the type determination region and obtains composition analysis optical data through the composition analysis region;

a type determination circuit for determining an object type of the object according to the object image; and

a composition analysis circuit for analyzing a composition of the object according to the composition analysis optical data and the object type to generate a composition analysis result; and

a processing circuit for selecting an operation mode of the electronic device according to the composition analysis result.

8. The electronic device of claim 7, wherein the electronic device is an optical tracking device, and the processing circuit selects a tracking mode of the electronic device according to the composition analysis result.

9. The electronic device of claim 7, wherein the electronic device is a sweeper, and the processing circuit selects a sweeping intensity of the electronic device according to the composition analysis result.

10. The electronic device of claim 7, wherein the electronic device is a smart wearable electronic device, and the processing circuit selects a light emitting mode or a parameter calculation mode of the electronic device according to the composition analysis result.

11. The electronic device of claim 7, wherein the composition analysis region comprises a multispectral color filter.

12. The electronic device of claim 7, wherein the composition analysis region surrounds the type determination region.

13. The electronic device of claim 7, wherein the optical sensor comprises a plurality of mixing regions having the same shape, each of the mixing regions comprising a portion of the composition analysis region and a portion of the type determination region.

14. The electronic device of claim 13, wherein each mixing region is square, the portion of the type determination region occupies 3/4 of the mixing region, and the portion of the composition analysis region occupies 1/4 of the mixing region.

15. The electronic device of claim 7, further comprising:

a main controller serving as the composition analysis circuit; and

a sub-controller serving as the type determination circuit;

wherein the main controller is in a sleep mode before being woken up by the sub-controller.

16. An object determination system, comprising:

a processing circuit;

a first-stage object sensor for generating a first-stage object sensing result; and

a second-stage object sensor for generating a second-stage object sensing result of an object when the processing circuit determines, according to the first-stage object sensing result, that the object is within a predetermined range of the object determination system;

wherein the processing circuit further determines an object type of the object according to the second-stage object sensing result.

17. The object determination system of claim 16, further comprising:

a linear light source for generating linear light; and

an area light source for generating planar light;

wherein the first-stage object sensor generates the first-stage object sensing result according to the linear light, and the second-stage object sensor generates the second-stage object sensing result according to the planar light.

18. The object determination system of claim 16, wherein the first-stage object sensor is a depth sensor or a thermal sensor.

19. The object determination system of claim 16, wherein the second-stage object sensor is an image sensor or a thermal sensor.

20. The object determination system of claim 16, further comprising:

a third-stage object sensor for generating a third-stage object sensing result of the object when the processing circuit determines, according to the first-stage object sensing result, that the object is within the predetermined range of the object determination system;

wherein the processing circuit determines the object type according to the second-stage object sensing result and the third-stage object sensing result.

21. The object determination system of claim 20, wherein the second-stage object sensor is an image sensor and the third-stage object sensor is a thermal sensor.

22. The object determination system of claim 16, wherein the second-stage object sensor is in a sleep mode before the processing circuit determines that the object is within the predetermined range of the object determination system according to the first-stage object sensing result.

23. The object determination system of claim 16, wherein the processing circuit further transmits the second-stage object sensing result to a user for determination.

Technical Field

The present invention relates to an object determination system and an electronic device using the same, and more particularly, to a two-stage object determination system and an electronic device capable of selecting an operation mode according to the composition of an object.

Background

A conventional sweetness meter may be used to measure the sweetness level of an object based on the object's spectrum. For example, a sweetness meter may emit light toward an object and then calculate the sweetness level of the object from the spectrum of the light reflected from the object.

However, a conventional sweetness meter requires the user to set the correct object type, otherwise the calculated sweetness level may be wrong. For example, if the object is an apple but the user sets the object type to guava for the sweetness calculation, the calculated sweetness level may not be correct.

In addition, automatic sweepers are becoming more and more popular in homes. However, a conventional automatic sweeper cannot determine what is in front of it, which can lead to poor results if the sweeper runs over something it should have avoided.

Disclosure of Invention

One object of the present invention is to disclose an object determination system that can determine the type of an object and then determine the composition of the object accordingly.

Another object of the present invention is to disclose an object determination system that can determine the type of an object in two stages.

An embodiment of the present invention discloses an object determination system comprising: an optical sensor comprising a type determination region and a composition analysis region, wherein the optical sensor captures an object image of an object through the type determination region and obtains composition analysis optical data through the composition analysis region; a type determination circuit for determining an object type of the object according to the object image; and a composition analysis circuit for analyzing a composition of the object according to the composition analysis optical data and the object type.

Another embodiment of the present invention discloses an electronic device capable of selecting an operation mode according to a composition of an object. The electronic device comprises an object determination system, which comprises: an optical sensor comprising a type determination region and a composition analysis region, wherein the optical sensor captures an object image of an object through the type determination region and obtains composition analysis optical data through the composition analysis region; a type determination circuit for determining an object type of the object according to the object image; and a composition analysis circuit for analyzing a composition of the object according to the composition analysis optical data and the object type to generate a composition analysis result. The electronic device further comprises a processing circuit for selecting an operation mode of the electronic device according to the composition analysis result.

Another embodiment of the present invention discloses an object determination system comprising: a processing circuit; a first-stage object sensor for generating a first-stage object sensing result; and a second-stage object sensor for generating a second-stage object sensing result of an object when the processing circuit determines, according to the first-stage object sensing result, that the object is within a predetermined range of the object determination system. The processing circuit further determines an object type of the object according to the second-stage object sensing result.

According to the foregoing embodiments, the object type or the object composition can be automatically obtained, and the electronic device can select an appropriate operation according to the object type or the object composition. Accordingly, the problems of the prior art can be solved. It should be noted, however, that the present disclosure is not limited to solving the problems noted in the prior art.

Drawings

Fig. 1 is a block diagram of an object determination system according to an embodiment of the invention.

Figs. 2 and 3 are examples of the optical sensor in fig. 1.

Fig. 4 is a schematic diagram illustrating operations of the type determination circuit and the composition analysis circuit according to an embodiment of the present invention.

Fig. 5 shows an example of the actual operation of the object determination system shown in fig. 1 according to the present invention.

Fig. 6 is a block diagram of an object determination system according to another embodiment of the invention.

Fig. 7 to 11 are schematic views of an object determination system according to an embodiment of the present invention.

Wherein the reference numerals are as follows:

100, 600 object determination system

101 optical sensor

103 type determination circuit

105 composition analysis circuit

107 storage device

501, Ob_1, Ob_2, Ob_3 objects

601 processing circuit

700 automatic sweeper

701 linear light source

901 area light source

M mobile phone

Os_1 first-stage object sensor

Os_2 second-stage object sensor

U user

Detailed Description

The inventive concept is described below by way of several embodiments. Note that the elements in the embodiments may be implemented in hardware (e.g., a circuit or a device) or in firmware or software (e.g., at least one program executed by a processor). In addition, the elements of each embodiment may be divided into more elements or integrated into fewer elements, and the steps in each embodiment may likewise be divided into more steps or integrated into fewer steps. Such variations all fall within the scope of the present invention.

Fig. 1 is a block diagram of an object determination system according to an embodiment of the invention. As shown in fig. 1, the object determination system 100 includes an optical sensor 101, a type determination circuit 103 and a composition analysis circuit 105. The optical sensor 101 is a sensor that can generate an object image OI and composition analysis optical data OD. The optical sensor 101 includes a type determination region and a composition analysis region: it captures at least one object image OI of an object through the type determination region and obtains the composition analysis optical data OD through the composition analysis region. The type determination region and the composition analysis region are described in more detail later. The type determination circuit 103 determines the object type OK of the object based on the object image OI, and the composition analysis circuit 105 analyzes the composition of the object based on the composition analysis optical data OD and the object type OK to generate a composition analysis result Ear.

In one embodiment, the type determination region and the composition analysis region are defined by color filters disposed on the optical sensor 101. In detail, a general color filter such as an RGB color filter or a CMYG color filter is disposed on the type determination region, and a multispectral color filter is disposed on the composition analysis region. The multispectral color filter may comprise an array of color filter portions, each portion having a spectrum different from that of the other portions. Accordingly, the aforementioned composition analysis optical data OD may be a spectrum of the object, and the composition analysis circuit 105 can analyze the composition of the object based on the composition analysis optical data OD. The type determination region and the composition analysis region may, however, also be defined in other ways.

Figs. 2 and 3 are examples of the optical sensor in fig. 1. In the example of fig. 2, the optical sensor 101 comprises a plurality of mixing regions MR_1, MR_2, MR_3 having a square shape. Note that, for ease of understanding, only three mixing regions MR_1, MR_2, MR_3 are labeled. The mixing regions MR_1, MR_2, MR_3 have the same shape, and each of them contains a portion of the type determination region and a portion of the composition analysis region. Taking the mixing region MR_1 as an example, in the embodiment of fig. 2 the type determination region includes a red region Rr, a green region Gr and a blue region Br. The red region Rr, the green region Gr and the blue region Br each occupy one quarter of the mixing region MR_1, so the type determination region occupies three quarters of the mixing region MR_1. The composition analysis region Er occupies the remaining quarter of the mixing region MR_1.
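
For illustration only, the following Python sketch shows one way the fig. 2 tiling could be expressed in software: each square mixing region contributes three type-determination pixels (R, G, B) and one composition-analysis pixel (Er). The 2x2 cell layout and the 3/4 : 1/4 split are taken from the embodiment above, while the function and variable names are hypothetical and are not part of the disclosed design.

    import numpy as np  # assumed available

    def split_raw_frame(raw):
        """Split a raw sensor frame into type-determination (R, G, B) pixels
        and composition-analysis (Er) pixels, following a 2x2 cell layout:
            R  G
            B  Er
        i.e. 3/4 of each mixing region for type, 1/4 for composition."""
        type_pixels = {"R": raw[0::2, 0::2],
                       "G": raw[0::2, 1::2],
                       "B": raw[1::2, 0::2]}
        composition_pixels = raw[1::2, 1::2]
        return type_pixels, composition_pixels

    # Example: a 4x4 raw frame contains four 2x2 mixing regions.
    frame = np.arange(16).reshape(4, 4)
    type_px, comp_px = split_raw_frame(frame)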

Note that the arrangement of the type determination region and the composition analysis region is not limited to the embodiment shown in fig. 2. For example, as shown in fig. 3, the composition analysis region Er surrounds the type determination region Kr, which likewise includes a red region Rr, a green region Gr and a blue region Br. In other words, the composition analysis region Er is a square with a hollow center, and the type determination region Kr is a square located in that hollow center.

Fig. 4 is a schematic diagram illustrating operations of the type determination circuit and the composition analysis circuit according to an embodiment of the present invention. As described above, the type determination circuit 103 first determines the object type OK of the object (e.g., an apple, a plate, a pair of glasses) from the object image OI. The type determination circuit 103 then sends the object type OK to the composition analysis circuit 105, which analyzes the composition of the object (e.g., sweetness, moisture content, metal content) based on the composition analysis optical data OD and a composition analysis database related to that object type. For example, if the optical sensor 101 captures an image of an apple, the type determination circuit 103 may refer to a type determination database stored in the storage device 107 of fig. 1 to determine that the object in the image is an apple; the composition analysis circuit 105 then analyzes the composition of the object based on the composition analysis optical data OD and the composition analysis database related to apples. In one embodiment, the composition analysis database may also be stored in the storage device 107 of fig. 1. Note that the type determination database and the composition analysis database are not limited to being obtained from a storage device inside the object determination system 100.
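
A minimal sketch of the fig. 4 flow is given below, assuming a toy representation of both databases: the type is determined from the image first, and the composition is then analyzed with the database entry for that type. The database contents, feature tags and weighting model are invented for illustration and do not describe the actual circuits.

    # Hypothetical type determination database: image feature tag -> object type.
    TYPE_DATABASE = {"round_red": "apple", "round_green": "guava"}

    # Hypothetical composition analysis database: per-type spectral weights.
    COMPOSITION_DATABASE = {
        "apple": [0.2, 0.5, 0.3],
        "guava": [0.6, 0.3, 0.1],
    }

    def determine_type(image_feature_tag):
        """Stand-in for the type determination circuit (103)."""
        return TYPE_DATABASE.get(image_feature_tag, "unknown")

    def analyze_composition(object_type, spectrum):
        """Stand-in for the composition analysis circuit (105): weight the
        measured spectrum with the type-specific model, e.g. for sweetness."""
        weights = COMPOSITION_DATABASE.get(object_type)
        if weights is None:
            return None
        return sum(w * s for w, s in zip(weights, spectrum))

    object_type = determine_type("round_red")            # -> "apple"
    result = analyze_composition(object_type, [0.7, 0.9, 0.4])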

The aforementioned "object type" may have different levels of detail. For example, the type determination circuit 103 may determine from the object image OI that the object is an "apple". However, the type determination circuit 103 may also determine whether the object is a "Granny Smith apple" or a "Fuji apple" based on the object image OI. The level of detail of the "object type" may be determined by the contents of the type determination database and the settings provided by the user.

In one embodiment, the composition analysis circuit 105 and the type determination circuit 103 may be implemented by separate hardware. For example, the type determination circuit 103 is implemented by a sub-controller (e.g., an IC), and the composition analysis circuit 105 is implemented by a main controller (e.g., a processor independent of the IC). In addition, the main controller stays in a sleep mode until the sub-controller wakes it up. The sub-controller may wake up the main controller when it determines that an object is present, or when the user triggers an "analyze object" function. Power consumption is reduced in this way, because the composition analysis circuit 105 consumes more power to perform the heavier computation needed to analyze the composition of the object.
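
The main/sub controller hand-off described above could be sketched as follows. This is an illustration only: the class names, the trigger arguments and the placeholder analysis result are assumptions, not the actual controller firmware.

    class MainController:                      # composition analysis side
        def __init__(self):
            self.asleep = True                 # sleeps by default to save power

        def wake(self):
            self.asleep = False

        def analyze(self, object_type, spectrum):
            if self.asleep:
                raise RuntimeError("main controller is in sleep mode")
            return {"type": object_type, "spectrum": spectrum}  # placeholder

    class SubController:                       # type determination side
        def __init__(self, main):
            self.main = main

        def on_event(self, object_present, user_triggered, object_type, spectrum):
            # Wake the main controller only when an object is detected or the
            # user explicitly requests an "analyze object" operation.
            if object_present or user_triggered:
                self.main.wake()
                return self.main.analyze(object_type, spectrum)
            return None

    main = MainController()
    sub = SubController(main)
    result = sub.on_event(True, False, "apple", [0.7, 0.9, 0.4])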

Fig. 5 shows an example of the actual operation of the object determination system shown in fig. 1 according to the present invention. Please refer to figs. 1 to 3 together with fig. 5 for a clearer understanding of the concepts of the present invention. In one embodiment, the object determination system 100 is provided in a mobile phone M. The user U uses the mobile phone M to take a picture of the object 501 (i.e., to capture an object image) through the type determination region Kr of the optical sensor 101. The type determination circuit 103 determines from the picture that the object 501 is an apple, and the composition analysis optical data OD is acquired through the composition analysis region Er of the optical sensor 101. The type determination circuit 103 transmits the object type OK to the composition analysis circuit 105, and the composition analysis circuit 105 then analyzes the composition of the object 501 based on the composition analysis optical data OD and the apple-related composition analysis database to generate a composition analysis result Ear.

Please note that the object determination system 100 disclosed in the present invention is not limited to being applied to a mobile phone. In one embodiment, the object determination system 100 is located in an electronic device having a processing circuit for selecting an operation mode of the electronic device according to the composition analysis result from the composition analysis circuit 105. The processing circuit may be integrated with the type determination circuit 103 and/or the composition analysis circuit 105, or may be independent of them.

In one embodiment, the electronic device is an optical tracking device, and the processing circuit selects a tracking mode of the electronic device according to the composition analysis result. For example, the electronic device is an optical mouse that can analyze the composition of the mouse pad under it. In such an example, the processing circuit may select a high tracking mode (high sensitivity) or a low tracking mode (low sensitivity) of the electronic device according to the composition of the mouse pad.

In another embodiment, the electronic device is a sweeper, and the processing circuit selects a sweeping intensity of the electronic device according to the composition analysis result. For example, the electronic device is a vacuum cleaner or an automatic sweeper that can analyze the floor beneath it. In such an example, the processing circuit may select a strong cleaning mode or a weak cleaning mode (i.e., select a cleaning intensity) of the electronic device according to the composition of the floor.

In another embodiment, the electronic device is a smart wearable electronic device, and the processing circuit selects a lighting mode or a parameter calculation mode of the electronic device according to the component analysis result. For example, the electronic device is a smart watch or a smart bracelet that can analyze objects underneath. In such an example, the processing circuitry may select a lighting mode or a parameter calculation mode (i.e., select a calculation algorithm) of the electronic device based on a condition of an object beneath the electronic device. For example, an appropriate lighting pattern or parameter calculation pattern may be selected depending on whether the electronic device is in contact with the skin of the user. Also for example, an appropriate lighting pattern or parameter calculation pattern may be selected depending on whether the object is living or non-living. The parameter calculation may be calculating the heart rate of the user, or calculating the blood pressure of the user.

Briefly, the above-described embodiments apply a two-stage procedure to analyze an object: the first stage determines the object type, and the second stage analyzes the object composition according to the object type. The two-stage concept can also be presented from another aspect, as described in the following embodiments of the present invention.

Fig. 6 is a block diagram of an object determination system according to another embodiment of the invention. As shown in fig. 6, the object determination system 600 includes a processing circuit 601, a first-stage object sensor Os_1 and a second-stage object sensor Os_2. The first-stage object sensor Os_1 is configured to generate a first-stage object sensing result Osr_1. The second-stage object sensor Os_2 generates a second-stage object sensing result Osr_2 of an object when the processing circuit 601 determines, according to the first-stage object sensing result Osr_1, that the object is within a predetermined range of the object determination system 600. The processing circuit 601 then further determines the object type of the object according to the second-stage object sensing result Osr_2.
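
A minimal sketch of the fig. 6 flow, under the assumption that the first-stage result is a single depth reading, might look like the following. The range value, the callable sensor stand-ins and all names are invented for illustration.

    PREDETERMINED_RANGE_M = 0.5   # hypothetical predetermined range, in meters

    def first_stage_senses_object(depth_m):
        # Os_1 result: the object is "within range" when the measured depth
        # is smaller than the predetermined range.
        return depth_m < PREDETERMINED_RANGE_M

    def run_object_determination(depth_m, capture_image, classify):
        if not first_stage_senses_object(depth_m):
            return None                  # Os_2 is not used, nothing in range
        image = capture_image()          # second-stage sensing (Os_2)
        return classify(image)           # processing circuit determines type

    # Usage example with stand-in sensor callables.
    object_type = run_object_determination(
        0.3,
        capture_image=lambda: "image_of_nail",
        classify=lambda img: "nail")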

In one embodiment, the object determination system 600 further comprises a linear light source for generating linear light and an area light source for generating planar light. In such an embodiment, the first-stage object sensor Os_1 generates the first-stage object sensing result Osr_1 according to the linear light, and the second-stage object sensor Os_2 generates the second-stage object sensing result Osr_2 according to the planar light (square light in this example). It should be understood, however, that the first-stage object sensor Os_1 and the second-stage object sensor Os_2 may operate with any kind of light.

Figs. 7 to 11 are schematic views of an object determination system according to an embodiment of the present invention. In these embodiments, the object determination system 600 is located in the automatic sweeper 700. In addition, in these embodiments, the first-stage object sensor Os_1 is a depth sensor, and the second-stage object sensor Os_2 is an image sensor. Therefore, the first-stage object sensing result Osr_1 is depth information, and the second-stage object sensing result Osr_2 is an image. The depth sensor may be a laser depth sensor or any other optical sensor.

Fig. 8 is a schematic view along the X direction in fig. 7. As shown in figs. 7 and 8, in the first-stage object sensing step, the automatic sweeper 700 emits linear light LL with the linear light source 701, and the processing circuit 601 determines, according to the first-stage object sensing result Osr_1, whether any object is present within a predetermined range of the automatic sweeper 700 (i.e., within a predetermined range of the object determination system). If no object is within the predetermined range of the automatic sweeper 700, the depth sensed by the first-stage object sensor Os_1 may be large; in contrast, if an object is within the predetermined range of the automatic sweeper 700, the sensed depth may be small. Thus, the depth sensor can be used to sense whether any object is present within the predetermined range of the automatic sweeper 700. Further, since the first-stage object sensing step only determines whether an object is present, details of the object are not required, so the linear light source 701 already provides sufficient light for this step.
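
The first-stage check described above can be illustrated with a short sketch: along the line illuminated by the linear light LL, any depth sample noticeably shallower than the empty-floor depth suggests an object in front of the sweeper. The baseline depth and margin below are assumed values, not part of the embodiment.

    EMPTY_FLOOR_DEPTH = 1.0    # assumed depth when nothing is in front
    MARGIN = 0.2               # how much shallower counts as "an object"

    def object_in_front(depth_profile):
        """depth_profile: list of depth samples along the linear light LL."""
        return any(d < EMPTY_FLOOR_DEPTH - MARGIN for d in depth_profile)

    # Example: samples over objects Ob_1..Ob_3 appear shallower than the floor.
    print(object_in_front([1.0, 0.6, 0.95]))   # True -> trigger the second stage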

In the embodiment shown in figs. 7 and 8, the objects Ob_1 to Ob_3 are located in front of the automatic sweeper 700 (i.e., within the predetermined range of the automatic sweeper 700), so the linear light LL from the linear light source 701 illuminates the objects Ob_1 to Ob_3, and the processing circuit 601 can determine the presence of the objects Ob_1 to Ob_3 based on the sensing result generated by the first-stage object sensor Os_1 from the linear light LL.

Fig. 10 is a schematic view along the Y direction in fig. 9. In the second-stage object sensing step, as shown in figs. 9 and 10, the automatic sweeper 700 emits square light SL with the area light source 901, and the processing circuit 601 determines the object type of the object from the second-stage object sensing result Osr_2 (an object image in this embodiment) generated under the square light SL. Since the object type is determined in the second-stage object sensing step, more details of the object may be needed, so the area light source 901 is more helpful for determining the object type. In one embodiment, the object type may be determined based on object type data stored in the storage device 107 or in any other storage device.

It should be noted that the first-stage object sensor Os_1 is not limited to a depth sensor and the second-stage object sensor Os_2 is not limited to an image sensor; any sensor that can be used to implement the above functions falls within the scope of the present invention. For example, in one embodiment the first-stage object sensor is a thermal sensor. Likewise, the second-stage object sensor may be a thermal sensor.

Further, in one embodiment, the second-stage object sensor Os_2 is in a sleep mode until the processing circuit 601 determines from the first-stage object sensing result Osr_1 that an object is within the predetermined range of the automatic sweeper 700 and wakes up the second-stage object sensor Os_2. Since the second-stage object sensor Os_2 requires more power than the first-stage object sensor Os_1, this arrangement saves power.

In another embodiment, the object determination system 600 further includes a third-stage object sensor (not shown). If the processing circuit 601 determines, according to the first-stage object sensing result Osr_1, that the object is within the predetermined range of the object determination system 600, the third-stage object sensor generates a third-stage object sensing result of the object. The processing circuit 601 then determines the object type according to the second-stage object sensing result and the third-stage object sensing result. In such an embodiment, the second-stage object sensor Os_2 is an image sensor and the third-stage object sensor is a thermal sensor.
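
One conceivable way to combine the second-stage (image) and third-stage (thermal) results, for example to separate living from non-living objects, is sketched below. The labels and the temperature threshold are assumptions made for illustration; the embodiment does not define a specific fusion rule.

    def determine_type_with_thermal(image_label, surface_temp_c):
        # image_label: coarse type from the image sensor (second stage)
        # surface_temp_c: reading from the thermal sensor (third stage)
        living = surface_temp_c > 30.0        # assumed body-heat threshold
        if image_label == "small_animal" and not living:
            return "toy"                      # a warm reading would confirm a pet
        return image_label

    print(determine_type_with_thermal("small_animal", 36.5))   # "small_animal"
    print(determine_type_with_thermal("small_animal", 22.0))   # "toy"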

After the object type is determined, the automatic sweeper 700 can operate according to the object type. For example, if the object type indicates that the object is something dirty, such as feces, the automatic sweeper 700 may stop or turn in another direction to avoid the object; otherwise, the floor may become quite dirty if the automatic sweeper 700 attempts to sweep the object. Similarly, if the object is a sharp object such as a nail, it may damage the automatic sweeper 700 if it is sucked in. It should be noted that the object determination system 600 is not limited to the automatic sweeper 700; it may be applied to any other electronic device that can operate according to the determined object type.
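
The avoidance behavior described above amounts to a simple decision rule, sketched here for illustration. The avoid-list entries and the action names are examples only, not a fixed specification of the sweeper's behavior.

    TYPES_TO_AVOID = {"feces", "nail", "cable"}   # hypothetical avoid-list

    def decide_action(object_type):
        if object_type in TYPES_TO_AVOID:
            return "stop_or_turn"     # avoid pressing over / sucking in the object
        return "keep_sweeping"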

In an embodiment, the object type database may be further optimized with the second-stage object sensing result Osr_2. As shown in fig. 11, in one embodiment, the user uses the mobile phone M to control the automatic sweeper 700, and the second-stage object sensing result Osr_2 is an object image. In this case, the object determination system in the automatic sweeper 700 may transmit the second-stage object sensing result Osr_2 to the user's mobile phone M, and the user may determine whether the second-stage object sensing result Osr_2 shows an object that needs to be avoided. If the user determines that the object needs to be avoided, the user may send a confirmation instruction to the automatic sweeper 700 via the mobile phone M (e.g., by tapping the "YES" icon in fig. 11), and the automatic sweeper 700 may optimize the object type database based on the confirmation instruction from the user. A related object determination method can be derived from the above embodiments; for brevity, its description is omitted here.
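
The fig. 11 feedback loop could be sketched as follows, assuming a toy in-memory database keyed by object type; the record structure and the confirmation handling are hypothetical, since the embodiment only states that the database is optimized based on the user's confirmation.

    object_type_database = {"nail": {"avoid": True, "confirmations": 3}}

    def handle_user_confirmation(object_type, user_pressed_yes):
        """Update the object type database after the user reviews the
        second-stage image on the mobile phone and taps YES or NO."""
        entry = object_type_database.setdefault(
            object_type, {"avoid": False, "confirmations": 0})
        if user_pressed_yes:
            entry["confirmations"] += 1
            entry["avoid"] = True     # user confirmed this object must be avoided
        return entry

    handle_user_confirmation("sock", True)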

According to the foregoing embodiments, the object type or the object composition can be automatically obtained, and the electronic device can select an appropriate operation according to the object type or the object composition. Accordingly, the problems of the prior art can be solved. It should be noted, however, that the present disclosure is not limited to solving the problems noted in the prior art.

The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
