Imaging device and electronic apparatus

Document No.: 1713395 | Publication date: 2019-12-13

Reading note: The present technology, "Imaging device and electronic apparatus," was created by 木村胜治 on 2018-04-13. Abstract: The present technology relates to an imaging device and an electronic apparatus capable of adjusting a focal position with high accuracy. The technology is provided with: a lens that collects subject light; an image pickup element that photoelectrically converts the subject light from the lens; a circuit board including a circuit that outputs a signal from the image sensor to the outside; an actuator that drives the lens with a pulse width modulation (PWM) waveform; and a detection unit that detects a magnetic field generated by a coil included in the actuator. The detection unit detects an induced electromotive force generated by the magnetic field, and detects the position of the lens based on the induced electromotive force. The present technology is applicable to an image pickup apparatus.

1. An image pickup apparatus comprising:

a lens that collects subject light;

an image sensor that photoelectrically converts the subject light from the lens;

a circuit board including a circuit that outputs a signal from the image sensor to the outside;

an actuator that drives the lens with a pulse width modulation (PWM) waveform; and

a detection unit that detects a magnetic field generated by a coil included in the actuator.

2. The image pickup apparatus according to claim 1, wherein

the detection unit detects an induced electromotive force generated by the magnetic field.

3. The image pickup apparatus according to claim 2, wherein

the detection unit detects the position of the lens based on the induced electromotive force.

4. The image pickup apparatus according to claim 1, wherein

the detection unit is formed on the circuit board.

5. The image pickup apparatus according to claim 1, further comprising a spacer that fixes the image sensor and the circuit board,

wherein the detection unit is formed on the spacer.

6. The image pickup apparatus according to claim 1, wherein

the image pickup apparatus is accommodated in a housing, and

the detection unit is formed on the housing.

7. The image pickup apparatus according to claim 6, further comprising a fixing mechanism that fixes the image pickup apparatus to the housing,

wherein the detection unit is formed on the fixing mechanism.

8. The image pickup apparatus according to claim 1, wherein

the detection unit includes a coil,

the circuit board includes a plurality of layers, and

the coil is formed across the plurality of layers of the circuit board.

9. The image pickup apparatus according to claim 1, wherein

the image sensor has a chip size package (CSP) shape.

10. The image pickup apparatus according to claim 1, wherein

the image sensor has a chip size package (CSP) shape, and

an infrared cut filter and the lowermost lens of the lenses are provided on a glass substrate of the CSP-shaped image sensor.

11. The image pickup apparatus according to claim 1, wherein

the image sensor has a flip-chip structure.

12. The image pickup apparatus according to claim 1, wherein

the image sensor has a flip-chip structure and is mounted on the circuit board, and

an infrared cut filter serving as a base material is bonded to the circuit board.

13. The image pickup apparatus according to claim 1, further comprising a storage unit that stores a correction value for correcting individual differences between image pickup apparatuses.

14. An electronic apparatus comprising an image pickup device, the image pickup device comprising:

a lens that collects subject light;

an image sensor that photoelectrically converts the subject light from the lens;

a circuit board including a circuit that outputs a signal from the image sensor to the outside;

an actuator that drives the lens with a pulse width modulation (PWM) waveform; and

a detection unit that detects a magnetic field generated by a coil included in the actuator.

Technical Field

The present technology relates to an imaging apparatus and an electronic apparatus, and for example, to an imaging apparatus and an electronic apparatus capable of controlling a lens position with high accuracy.

Background

In recent years, image pickup apparatuses have advanced in pixel density, performance, size reduction, and the like. As the pixel density and performance of the image pickup apparatus increase, the power consumption of an image sensor mounted on it, such as a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor, also increases.

In addition, since the power consumption of an actuator or the like that drives the lens for focusing also increases, the power consumption of the image pickup apparatus as a whole tends to increase.

To reduce power consumption, a method has been proposed in which the drive signal of the actuator is converted into a pulse width modulation (PWM) waveform, reducing power consumption to about half. However, it is known that when the actuator is PWM-driven, a magnetic field is generated that acts as a disturbance to the image sensor, and noise is mixed into the image sensor output.

To reduce this noise, it has been proposed to synchronize the drive waveform of the image sensor with the autofocus driver that generates the PWM signal, and to output the PWM waveform in a dead-zone region within the drive time of the image sensor.

In addition, as a method of improving the performance of the image pickup apparatus, it has also been proposed to mount a Hall element on the actuator and output the position of the lens to the outside, so that the focal position of the lens is always detected and the lens is moved to a position where the subject light is quickly brought into focus.

For example, patent document 1 proposes: the driving element (actuator) is controlled by a PWM signal from a focus driving circuit, and the lens is driven to change the focus of the lens and achieve autofocus. In patent document 1, it is also proposed to mount a Hall element for high-performance detection of the lens position.

In patent document 2, it is proposed to reduce noise of the image sensor caused by a magnetic field generated by PWM driving of the actuator by providing a metal plate to block (shield) the magnetic field.

In patent document 3, it is proposed to detect the position of the lens by using a PWM signal (alternating current signal), based on the electromotive force of a detection coil disposed opposite an excitation coil. In this scheme, the detection coil is mounted on one side of the movable lens, and the position is detected from the phase of the electromotive current as the excitation coil and the detection coil move in parallel.

List of cited documents

Patent document

Patent document 1: JP 2011-022563 A

Patent document 2: JP 2014-082682 A

Patent document 3: JP 2000-295832 A

Disclosure of Invention

Technical problem to be solved by the invention

According to patent document 1, it is difficult to reduce the size of the image pickup apparatus because a Hall element needs to be mounted, which increases the size of the actuator. In addition, since a Hall element needs to be mounted, there is a concern that the image pickup apparatus becomes expensive.

According to patent document 2, since gold, silver, copper, aluminum, or the like is used as a metal plate that blocks a magnetic field, there is a concern that the imaging device may become expensive. In addition, providing a metal plate for blocking a magnetic field does not contribute to downsizing of the image pickup apparatus.

Recent actuators have a structure in which the coil is disposed outside the lens and focusing is performed by moving the coil, by means of excitation power, perpendicular to the image sensor. When patent document 3 is applied to such a structure, the excitation coil and the detection coil are disposed opposite to each other, and the position of the lens cannot be detected from parallel movement of these coils. That is, it is difficult to apply patent document 3 to the latest actuators.

The present technology has been made in view of the above circumstances, and can provide an image pickup apparatus capable of improving performance, reducing power consumption, and reducing size.

Technical scheme for solving problems

An image pickup apparatus according to an aspect of the present technology includes: a lens that collects subject light; an image sensor that photoelectrically converts the subject light from the lens; a circuit board including a circuit for outputting a signal from the image sensor to the outside; an actuator to drive the lens with a Pulse Width Modulation (PWM) waveform; and a detection unit that detects a magnetic field generated by a coil included in the actuator.

An electronic apparatus according to an aspect of the present technology includes an image pickup device including: a lens that collects subject light; an image sensor that photoelectrically converts the subject light from the lens; a circuit board including a circuit for outputting a signal from the image sensor to the outside; an actuator to drive the lens with a Pulse Width Modulation (PWM) waveform; and a detection unit that detects a magnetic field generated by a coil included in the actuator.

An image pickup apparatus according to an aspect of the present technology includes: an image sensor that photoelectrically converts subject light from a lens for collecting the subject light; a circuit board including a circuit for outputting a signal from the image sensor to the outside; and an actuator that drives the lens with a Pulse Width Modulation (PWM) waveform. In the image pickup apparatus, a magnetic field generated by a coil included in the actuator is detected.

Note that the image pickup apparatus and the electronic apparatus may each be a separate apparatus, or the image pickup apparatus and the electronic apparatus may be internal blocks constituting one apparatus.

Effects of the invention

According to an aspect of the present technology, an image pickup apparatus capable of improving performance, reducing power consumption, and reducing size can be provided.

Note that the effects described herein are not necessarily restrictive, and any of the effects described in the present disclosure may be obtained.

Drawings

Fig. 1 is a diagram showing the configuration of an embodiment of an image pickup apparatus to which the present technology is applied.

Fig. 2 is a diagram for explaining a magnetic field to be generated.

Fig. 3 is a diagram for explaining a coil to be formed.

Fig. 4 is a diagram showing a configuration example of the detection circuit.

Fig. 5 is a diagram for explaining the position of the lens and the amount of induced electromotive force.

Fig. 6 is a diagram for explaining a case where the coil is formed on the case.

Fig. 7 is a diagram showing another configuration example of the image pickup apparatus.

Fig. 8 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 9 is a diagram for explaining a case where the coil is formed on the case.

Fig. 10 is a diagram for explaining a case where a coil is formed in a spacer (spacer).

Fig. 11 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 12 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 13 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 14 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 15 is a diagram illustrating another configuration example of the image pickup apparatus.

Fig. 16 is a diagram showing an example of a schematic configuration of an endoscopic surgical system.

Fig. 17 is a block diagram showing an example of the functional configurations of the camera and the CCU.

Fig. 18 is a block diagram showing an example of a schematic configuration of a vehicle control system.

Fig. 19 is an explanatory view showing an example of the mounting positions of the vehicle exterior information detection unit and the imaging unit.

Detailed Description

Hereinafter, a mode for carrying out the present technology (hereinafter, referred to as an embodiment) will be described.

< construction of imaging apparatus >

The present technology can be applied to an image pickup apparatus including an image sensor such as a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor. The present technology can also be applied to an apparatus that includes such an image pickup device, for example, a portable terminal apparatus.

Fig. 1 is a diagram showing the configuration of an embodiment of an image pickup apparatus according to an aspect of the present technology. The image pickup apparatus 1 shown in fig. 1 includes an image sensor 11 such as a CCD sensor or a CMOS image sensor that picks up an image by photoelectrically converting subject light from a subject.

In addition, the image pickup apparatus 1 includes: a lens 16 for collecting subject light; and an infrared cut filter 17 for blocking infrared light from the optical signal transmitted through the lens 16. In addition, the image pickup apparatus 1 further includes an actuator 18, and the actuator 18 drives the lens vertically in the direction of the image sensor 11 to focus the lens 16.

In addition, the image pickup apparatus 1 includes an autofocus driver 20 for controlling the actuator 18 from the outside, and further includes a circuit board 13 for outputting an electric signal of the image sensor 11 to the outside. Note that although a plate-like circuit board 13 is described here, a circuit substrate other than a plate-like board may also be used.

In addition, the image pickup device 1 includes a metal wire 12 for electrically connecting the image sensor 11 and the circuit board 13, includes an adhesive material 15 for fixing the image sensor 11 and the circuit board 13, and further includes a spacer 14 for fixing the above-described actuator 18 and the circuit board 13.

The autofocus driver 20 has a function of outputting a Pulse Width Modulation (PWM) waveform to the actuator 18 in order to reduce the power consumed by the image pickup apparatus 1. The actuator 18 has a function of driving the focal point of the lens 16 with the input PWM waveform.

The circuit board 13 has a function of detecting an induced electromotive force generated by a magnetic field generated from the PWM waveform, and a function of detecting the position of the lens 16 from the detected induced electromotive force. In addition, the circuit board 13 also has a function of realizing high-performance focus movement of the lens by outputting the detection result to the outside.

< detection of induced electromotive force >

Fig. 2 is a diagram for explaining a magnetic field generated by a PWM waveform and an induced electromotive force generated by the magnetic field.

The actuator 18 has a voice coil motor structure, and the coil 21 is supported by a spring 23. For example, the coil 21 is disposed on a side surface of the lens holder, and the magnet 22 is disposed on the opposite side of the coil 21.

When a current flows through the coil 21, a force is generated in the vertical direction in the drawing. By the generated force, the lens 16 held by the lens barrel moves upward or downward, and the distance of the lens 16 from the image sensor 11 changes. By this mechanism, Autofocus (AF) is achieved.

Incidentally, driving the coil 21 with a PWM waveform drive signal (a signal that switches between high and low levels at a predetermined cycle) consumes less power than driving it with a signal having a constant voltage value (a signal that always remains at the high level).
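
As a rough illustration of why a PWM drive saves power, the average power delivered to a resistive coil scales with the duty cycle. The sketch below is a minimal model; the supply voltage and coil resistance are illustrative assumptions, not values from this document:

```python
# Average power into a resistive coil: constant drive vs. PWM drive.
# The voltage and resistance values are illustrative assumptions.
V = 2.8   # supply voltage [V]
R = 10.0  # coil resistance [ohm]

def avg_power(duty_cycle):
    """Mean power when the supply is high for `duty_cycle` of each PWM period."""
    return duty_cycle * V * V / R

constant_drive = avg_power(1.0)  # always-high signal, about 0.78 W
pwm_drive = avg_power(0.5)       # 50% duty PWM signal, about half the power
```

With a 50% duty cycle the coil dissipates half the power of a constant high-level drive, which matches the roughly halved consumption described above.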

Therefore, to reduce power consumption, a PWM waveform drive signal is supplied to the coil 21; in that case, a magnetic field is generated in the direction shown in Fig. 2. Referring to Fig. 2, the magnetic field is generated in a direction from the lens 16 side toward the image sensor 11.

Note that although a magnetic field may be generated in a direction different from that shown in Fig. 2 depending on the direction of the current, the description will be continued here taking the case where the magnetic field is generated in the direction shown in Fig. 2 as an example.

The generated magnetic field passes through the image sensor 11. Therefore, the image captured by the image sensor 11 may be affected. For example, noise may be generated under the influence of the magnetic field, and an image (image signal) mixed with the noise may be output from the image sensor 11.

By synchronizing the PWM waveform drive signal with the drive signal of the image sensor 11, no magnetic field is generated during the drive period in which it would appear as noise in the image sensor 11, so the influence of noise from the magnetic field can be reduced. With such synchronization, an image unaffected by the magnetic field can be output from the image pickup apparatus 1.
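
One way to realize this synchronization is to allow PWM level transitions only while the sensor is in its blanking (dead-zone) interval. The sketch below is a hypothetical timing model; the line period and blanking window values are made-up assumptions:

```python
# Defer PWM edges so they land only in the sensor's blanking interval.
# LINE_PERIOD_US and the blanking window are hypothetical example values.
LINE_PERIOD_US = 20.0  # one sensor line readout period [us]
BLANK_START_US = 16.0  # blanking begins here within each line period
BLANK_END_US = 20.0    # blanking ends at the end of the line period

def next_safe_edge(t_us):
    """Return the earliest time >= t_us that falls inside a blanking window."""
    phase = t_us % LINE_PERIOD_US
    if BLANK_START_US <= phase < BLANK_END_US:
        return t_us  # already inside a blanking window
    if phase < BLANK_START_US:
        return t_us - phase + BLANK_START_US  # wait for this line's blanking
    return t_us - phase + LINE_PERIOD_US + BLANK_START_US  # next line's blanking
```

For example, an edge requested at 5.0 us is deferred to 16.0 us, while one requested at 17.0 us, already inside the blanking window, passes through unchanged.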

The magnetic field generated by supplying the PWM waveform driving signal to the coil 21 also reaches the circuit board 13. A function of detecting the position of the lens 16 by detecting the strength of the magnetic field reaching the circuit board 13 will be explained.

As shown in Fig. 2, the circuit board 13 is provided with a coil 32. By providing the coil 32 in the direction perpendicular to the magnetic field generated by the PWM waveform driving, an induced electromotive force is generated in the coil 32, and the position of the lens 16 (lens holder) can be detected based on the magnitude of the induced electromotive force.

In addition, by detecting the position of the lens 16 (lens holder) (in other words, detecting the distance between the lens 16 and the image sensor 11), high-performance driving of the lens, that is, auto-focusing can be achieved.

First, as shown in Fig. 2, an example will be described in which the induced electromotive force is detected by mounting a coil 32, which constitutes a part of the detection circuit 31, on the circuit board 13.

Fig. 3 is a diagram showing an example in which a coil 32 constituting a part of the detection circuit 31 is mounted on the circuit board 13.

The coil 32 has a start point 32a and an end point 32b, both of which are connected to the detection circuit 31 (not shown in Fig. 3). Since the coil 32 has a loop shape, one of the start point 32a and the end point 32b is placed inside the loop and the other outside the loop in order to avoid overlapping of the lines.

Therefore, when it is considered that both the start point 32a and the end point 32b are connected to the detection circuit 31, in other words, when lines are drawn from both the start point 32a and the end point 32b, the coil 32 needs to be formed across a plurality of layers.

Refer to A of Fig. 3. Assume that the circuit board 13 includes only one layer; for example, the start point 32a of the coil 32 is a point on the lower right side in the drawing, and the end point is at the central portion of the coil 32 (indicated by a black dot in A of Fig. 3). When drawing a wire out from the end point at the central portion of the coil 32, it is difficult to route such a wire so that it does not overlap the formed coil 32.

To solve this problem, the circuit board 13 includes two layers as shown in A of Fig. 3. The circuit board 13 shown in A of Fig. 3 includes two layers, i.e., a circuit board 13-1 and a circuit board 13-2. On the circuit board 13-1, the start point 32a of the coil 32 is formed, and the coil is formed in a loop shape from the start point 32a toward the inside.

In addition, an end point of the coil 32 of the first layer is formed at a central portion of the coil 32 formed on the circuit board 13-1, and a start point of the coil 32 of the second layer is connected to the end point. On the circuit board 13-2 of the second layer, the coil 32 is formed from the starting point in a loop shape from the inside to the outside.

The loop-shaped coil 32 is formed from a start point 32a formed on the circuit board 13-1 to an end point 32b formed on the circuit board 13-2. In addition, the coil 32 can be connected to the detection circuit 31, not shown, by using the start point 32a formed on the circuit board 13-1 and the end point 32b formed on the circuit board 13-2.

Note that although not shown in A of Fig. 3, a circuit for outputting an electric signal from the image sensor 11 to the outside is formed in a portion other than the portion where the coil 32 is formed, for example.

Although the case where the circuit board 13 includes two layers has been shown as an example in A of Fig. 3, the circuit board 13 may also include three layers as shown in B of Fig. 3. In the example shown in B of Fig. 3, the circuit board 13 includes three layers, i.e., circuit boards 13-1 to 13-3, a loop-shaped coil 32 is formed on each circuit board 13, and the coils 32 on the respective layers are connected to form one coil.

In addition, as shown in B of Fig. 3, in the case where the circuit board 13 includes three layers, for example, the coil 32 may be formed on the first-layer circuit board 13-1 and the third-layer circuit board 13-3 without forming the coil 32 on the second-layer circuit board 13-2, and the circuit board 13-2 may be used exclusively for a circuit that outputs an electric signal from the image sensor 11 to the outside.

In the case where the circuit board 13 is formed in this manner, wiring for connecting the coil 32 formed on the circuit board 13-1 and the coil 32 formed on the circuit board 13-3 is formed on the circuit board 13-2.

Thus, the circuit board 13 can include multiple layers, and the coil 32 can be formed across the multiple layers. The number of layers and the layer configuration of the circuit board 13 are not limited to those shown here; other numbers of layers and other layer configurations may be used.

The circuit board 13 is, for example, a board including a plurality of layers connected by copper wiring (e.g., an FPC), and has the role of outputting the electric signal of the image sensor 11 (Fig. 1) to the outside. On such a circuit board 13, copper wiring is additionally routed in a coil shape for detecting the magnetic field.

The magnetic field generated when current flows through the coil 21 (Fig. 2) in the actuator 18 passes through this coil 32. As a result, an induced electromotive force is generated in the coil 32. The generated induced electromotive force can be obtained from Faraday's law of electromagnetic induction.

When the magnetic flux passing through an N-turn coil changes by ΔΦ [Wb] within Δt [s], the induced electromotive force V [V] generated in the coil is given by the following formula (1).

V = -N·ΔΦ/Δt ... (1)

As can be seen from equation (1), the induced electromotive force increases as the number of turns N increases. As described above, the number of turns, and hence the induced electromotive force, can be increased by forming the coil 32 across a plurality of layers of the circuit board 13. Therefore, the coil 32 can be configured so that the generated induced electromotive force is easy to detect.
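
Formula (1) can be checked numerically. The sketch below evaluates the induced electromotive force for a hypothetical flux change; the flux and time values are illustrative assumptions, and the point is that the EMF scales with the number of turns N:

```python
# Induced EMF from formula (1): V = -N * dPhi / dt.
# The flux change and time interval are illustrative assumptions.
def induced_emf(n_turns, delta_phi_wb, delta_t_s):
    """Electromotive force [V] induced in an n_turns coil by a flux change."""
    return -n_turns * delta_phi_wb / delta_t_s

v_one_turn = induced_emf(1, 2e-6, 1e-3)    # single-turn coil, about -2 mV
v_four_turns = induced_emf(4, 2e-6, 1e-3)  # four turns across layers, 4x the EMF
```

Quadrupling the turn count quadruples the EMF, which is why routing the coil 32 across several layers of the circuit board 13 makes the signal easier to detect.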

The configuration of the detection circuit 31 connected to such a coil 32 will be explained. Note that, hereinafter, the circuit board 13 is shown as if it includes one layer in the drawings, and the description is continued. However, as described above, the circuit board 13 includes a plurality of layers.

< construction of detection Circuit >

Fig. 4 is a diagram showing a configuration example of the detection circuit 31. The induced electromotive force generated by the coil 32 is input to the amplifying unit 51 of the detection circuit 31 and amplified. The amplified induced electromotive force is input to an analog/digital (a/D) conversion unit 52, and is converted from analog data to digital data.

The AF control unit 53 controls the actuator 18 and recognizes the focal distance of the lens 16 (Fig. 1) using the digital data from the A/D conversion unit 52. When the focal distance needs to be corrected, that is, when it is determined to be out of focus, the AF control unit 53 generates a PWM control signal based on the movement distance required for the correction and supplies it to the actuator 18. Note that the AF control unit 53 also generates a PWM control signal based on a signal from the control unit 54, which controls autofocus (AF), and supplies it to the actuator 18.
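
The signal chain of Fig. 4 (amplification, A/D conversion, then AF control) can be sketched as follows. The gain, ADC reference, and resolution are hypothetical placeholders, not values from this document:

```python
# Minimal model of the detection chain in Fig. 4:
# coil EMF -> amplifying unit 51 -> A/D conversion unit 52 -> AF control unit 53.
# All numeric parameters are hypothetical.
GAIN = 100.0  # amplifier gain
VREF = 3.3    # ADC reference voltage [V]
BITS = 10     # ADC resolution

def amplify(emf_v):
    return emf_v * GAIN

def adc(v):
    """Convert 0..VREF volts to a BITS-wide code, clamped to the valid range."""
    code = int(v / VREF * (2 ** BITS - 1))
    return max(0, min(2 ** BITS - 1, code))

def detect_code(emf_v):
    """Digital data handed to the AF control unit for a given induced EMF."""
    return adc(amplify(emf_v))
```

A larger induced EMF yields a larger digital code, which the AF control unit can then map to a lens position.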

The detection circuit 31 may be mounted in the image pickup apparatus 1 as one integrated circuit, or may be mounted outside the image pickup apparatus 1. Alternatively, the detection circuit 31 may be implemented not as an integrated circuit but as software, for example, software running on a CPU integrated in the camera.

The present technology includes a function of detecting the induced electromotive force and a function of adjusting the focus of the lens with high accuracy using the induced electromotive force. Cases where these functions are realized by an integrated circuit or software as described above are, of course, within the scope of the present technology, as are implementations of these functions by other means.

It has been described that the position of the lens 16 can be detected by detecting the induced electromotive force flowing in the coil 32. This is because the relationship shown in Fig. 5 holds. Fig. 5 is a graph showing the relationship between the position of the lens 16 and the detected induced electromotive force; the vertical axis represents the position of the lens, and the horizontal axis represents the current amount (digital data) of the induced electromotive force.

As described above, autofocus is achieved by adjusting the distance between the image sensor 11 and the lens 16. Therefore, the distance between the lens 16 and the coil 32 also changes with autofocusing. In other words, the coil 21 (Fig. 2) in the actuator 18 moves as the lens 16 moves.

When the lens 16 (coil 21) is near the coil 32, the magnetic field generated by the current flowing through the coil 21 strongly affects the coil 32, so the induced electromotive force is large; when the lens 16 (coil 21) is far from the coil 32, the influence is weak, so the induced electromotive force is small.

This is represented by the graph shown in Fig. 5, which depicts the lens 16 approaching the coil 32 from the top toward the bottom in the figure. In the graph of Fig. 5, the current value increases from the left side toward the right side. In addition, the center position of the movable range of the lens is set to 0, and the current value is positive when the current flows in a predetermined direction and negative when it flows in the opposite direction.

As can be read from the graph shown in Fig. 5, the induced electromotive force changes linearly. It can therefore be understood that the induced electromotive force and the position of the lens 16 are in one-to-one correspondence. Accordingly, by detecting the induced electromotive force flowing in the coil 32, the position of the lens 16 at that time can be detected.
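
Because the relationship in Fig. 5 is linear, the lens position can be recovered from the EMF reading with a simple two-point calibration. The calibration points below are made-up example values:

```python
# Linear map from an induced-EMF reading to a lens position (cf. Fig. 5).
# The two calibration points are made-up example values.
def make_position_lookup(reading_a, pos_a, reading_b, pos_b):
    """Fit position = slope * reading + offset through two calibration points."""
    slope = (pos_b - pos_a) / (reading_b - reading_a)
    offset = pos_a - slope * reading_a
    return lambda reading: slope * reading + offset

# Reading 100 corresponds to -50 um, reading 900 to +50 um (center = 0).
position_of = make_position_lookup(100, -50.0, 900, 50.0)
```

With these example points, a mid-scale reading of 500 maps back to the center position 0, mirroring the one-to-one correspondence read off the graph.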

By utilizing such a relationship, for example, after the AF control unit 53 performs control to move the lens 16 to a desired position A, the detection circuit 31 can detect the actual position B of the lens 16.

In addition, when there is a deviation between the desired position A and the detected position B, the deviation can be corrected and the lens 16 can be moved to the desired position A. Therefore, high-performance lens movement can be achieved.
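
The correction described above amounts to a simple closed loop: command the desired position A, measure the actual position B via the induced EMF, and drive out the residual deviation. The sketch below uses a toy actuator model (a fixed 10% undershoot) purely for illustration:

```python
# Closed-loop lens positioning: move, measure, correct the deviation.
# The 10% undershoot actuator model is a toy assumption.
def actuator_move(current, target):
    """Toy actuator that covers only 90% of each commanded step."""
    return current + 0.9 * (target - current)

def move_with_correction(target, iterations=5):
    """Repeatedly command the remaining move until the deviation shrinks."""
    pos = 0.0
    for _ in range(iterations):
        pos = actuator_move(pos, target)  # measured position feeds the next command
    return pos
```

After a few correction cycles the residual deviation becomes negligible, illustrating how position feedback enables high-performance lens movement.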

< embodiment in which detection circuit is formed on case >

In the above-described embodiment, for example, as described with reference to Fig. 2, the case where the coil 32 connected to the detection circuit 31 is formed on the circuit board 13 on the lower side of the image sensor 11 has been explained as an example. Hereinafter, the description will be continued assuming that the side having the lens 16 is the upper side of the image sensor 11 and the side having the circuit board 13 is the lower side of the image sensor 11.

The magnetic field from the coil 21 included in the actuator 18 reaches not only the circuit board 13 on the lower side of the image sensor 11 but also the upper side (light receiving surface side) of the lens 16. That is, although the above-described embodiment takes as an example a configuration in which the circuit board 13 on the lower side of the image sensor 11 receives the magnetic field from the coil 21 to detect the induced electromotive force, a configuration may also be adopted in which, for example, the housing 101 shown in Fig. 6 receives the magnetic field from the coil 21 on the upper side of the image sensor 11 and thereby detects the induced electromotive force.

Fig. 6 shows a configuration example of the appearance of a smartphone. Recent smartphones are typically equipped with a camera. The lens portion of the camera is located at a lens window 102 provided in part of the housing 101.

The coil 32 may be formed in the area around the lens window 102. Fig. 7 is a diagram illustrating an example of a sectional configuration of the image pickup apparatus 1 when the coil 32 is formed in the region around the lens window 102.

The configuration of the image pickup apparatus 1b shown in fig. 7 is substantially the same as that of the image pickup apparatus 1a shown in fig. 1, the same reference numerals are given to the same parts, and the description thereof will be omitted.

The lens window 102 is located above the lens 16 of the image pickup apparatus 1b. The image pickup apparatus 1b is accommodated in the housing 101 in which the lens window 102 is formed. As described with reference to Fig. 6, the coil 32 is formed around the lens window 102.

When the image pickup apparatus 1b is mounted in the housing of a portable terminal such as a smartphone, it is first attached to a mechanism for fixing it and is then mounted in the housing together with that fixing mechanism. In the image pickup apparatus 1b shown in Fig. 8, such a fixing mechanism 110 is provided. The coil 32 may be formed on the fixing mechanism 110.

Fig. 9 shows a configuration example in the case where the coil 32 is formed on the fixing mechanism 110. The fixing mechanism 110 is also provided with a lens window 102 (the lens window 102 having substantially the same size is provided at a position corresponding to the lens window 102 of the housing 101).

The coil 32 can be formed in the area around the lens window 102. In addition, the start point (start point 32a in fig. 9) and the end point (end point 32b in fig. 9) of the coil 32 are provided at the lower portion of the side wall of the fixing mechanism 110.

The start point 32a and the end point 32b of the coil 32 are each formed so as to contact the circuit board 13' (denoted with a prime to distinguish it from the circuit board 13 shown in fig. 1). The detection circuit 31 (or at least the wiring connected to the detection circuit 31) is formed on the circuit board 13', and both the start point 32a and the end point 32b of the coil 32 can be connected to the detection circuit 31 formed on the circuit board 13'.

The coil 32 may be formed on the housing 101, or may be formed on the fixing mechanism 110. Alternatively, a part of the coil 32 may be formed on the housing 101 and another part on the fixing mechanism 110, with the two parts connected so as to form a single coil 32.

In addition, even in the case where the fixing mechanism 110 is provided, the coil 32 may be formed on the housing 101.

As described above, by forming the coil 32 on the housing 101 and/or the fixing mechanism 110, the magnetic field generated from the coil 21 (fig. 2) constituting the actuator 18 can be captured by the coil 32, the induced electromotive force can be detected, and the position of the lens 16 can be detected, similarly to the case of the above-described image pickup apparatus 1a.
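As a rough illustration of how a detected induced electromotive force could be translated into a lens position, the sketch below interpolates a position from a calibration table. The function name, the units, and the assumption that the coupling between coil 21 and coil 32 varies monotonically with lens displacement are all hypothetical; this is not the detection circuit of the present disclosure.

```python
def lens_position_from_emf(emf_mv, calibration):
    """Interpolate a lens position (um) from an EMF amplitude (mV).

    calibration: list of (emf_mv, position_um) pairs; assumed monotonic.
    """
    points = sorted(calibration)
    # Clamp readings outside the calibrated range to the end points.
    if emf_mv <= points[0][0]:
        return points[0][1]
    if emf_mv >= points[-1][0]:
        return points[-1][1]
    # Linear interpolation between the two surrounding calibration points.
    for (e0, p0), (e1, p1) in zip(points, points[1:]):
        if e0 <= emf_mv <= e1:
            t = (emf_mv - e0) / (e1 - e0)
            return p0 + t * (p1 - p0)
```

For example, with a two-point table `[(10.0, 0.0), (20.0, 100.0)]`, a reading of 15.0 mV maps to the midpoint position.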

For example, by forming the coil 32 on the housing 101, an end product such as a smartphone can have a function of detecting the position of the lens 16 even if the image pickup apparatus 1 itself is not provided with the coil 32 (detection circuit 31).

That is, even with an image pickup apparatus 1 in which, unlike the image pickup apparatus 1a shown in fig. 1, the coil 32 (detection circuit 31) is not mounted on the circuit board 13 (i.e., a conventional image pickup apparatus 1), the coil 32 (detection circuit 31) can be disposed on the housing 101 of a product including the image pickup apparatus 1, or on the fixing mechanism 110 used when mounting the image pickup apparatus 1 in the housing 101. Therefore, the final product can have a high-accuracy lens position detection mechanism.

In addition, even if the coil 32 is formed on the housing 101 or the fixing mechanism 110, the size of the imaging device 1 itself does not increase. Therefore, the performance of the imaging apparatus 1 can be improved at low cost without hindering its size reduction.

< embodiment in which detection circuit is formed in spacer >

In the above-described embodiments, mechanisms for detecting the position of the lens 16 have been described in which the coil 32 is formed, for example, on the lower side of the image sensor 11 as described with reference to fig. 1, or on the upper side of the image sensor 11 as described with reference to fig. 8.

As shown in fig. 10, the coil 32 may be formed in the spacer 14, with the start point 32a and the end point 32b for connection with the detection circuit 31 both formed at a portion of the spacer 14 that contacts the circuit board 13. In the case where the coil 32 is formed in the spacer 14, the configuration of the imaging device 1 can be, for example, the same as that of the imaging device 1a shown in fig. 1, except that the coil 32 is not formed on the circuit board 13. Here, although not shown, the image pickup apparatus 1 including the spacer 14 shown in fig. 10 will be referred to as an image pickup apparatus 1c.

The imaging device 1c can also detect the position of the lens 16 similarly to the case where the coil 32 is provided on the lower side of the image sensor 11 (the imaging device 1a) or the case where the coil 32 is provided on the upper side of the image sensor 11 (the imaging device 1 b).

In the case of the imaging device 1c as well, the size of the imaging device 1 itself does not increase. Therefore, the performance of the imaging apparatus 1 can be improved at low cost without hindering its size reduction.

< other configuration examples of image pickup apparatus >

The basic configurations of the above-described imaging devices 1a to 1c are similar to each other. As described above, the image pickup device 1a shown in fig. 1 and the image pickup device 1c (not shown) have the same configuration, the only difference being the portion where the coil 32 is formed. This difference does not affect the configuration of the image pickup apparatus 1.

In addition, the image pickup apparatus 1b shown in fig. 8 is obtained simply by adding the fixing mechanism 110 to the image pickup apparatus 1a shown in fig. 1, and the fixing mechanism 110 does not affect the configuration of the imaging apparatus 1a itself.

That is, the image pickup apparatus 1 can have the same configuration regardless of where the coil 32 is provided. In other words, the present technology can be applied to any configuration of the image pickup apparatus 1, and is not limited to the configuration of the image pickup apparatuses 1a to 1c described above.

Now, other configurations of the image pickup apparatus 1 will be explained below. Note that each configuration described herein is also merely an example, not a limitation.

Fig. 11 is a diagram illustrating another configuration example of the image pickup apparatus 1. The imaging device 1d shown in fig. 11 has a configuration in which a Chip Size Package (CSP) shaped image sensor 11d is used as the image sensor 11.

Even in the case of using the CSP-shaped image sensor 11d as the image sensor 11, the coil 32 can be formed in or on the circuit board 13, the spacer 14, the housing 101, or the fixing mechanism 110, and the position of the lens 16 can be detected.

Fig. 12 is a diagram illustrating another configuration example of the image pickup apparatus 1. Similar to the image pickup apparatus 1d shown in fig. 11, the image pickup apparatus 1e shown in fig. 12 employs a CSP-shaped image sensor 11e as the image sensor 11.

In the imaging device 1e shown in fig. 12, a function (filter) of cutting infrared rays is provided on the glass plate of the CSP-shaped image sensor 11e, and a lens 201 is formed on the glass plate.

Therefore, by providing the glass plate of the image sensor 11e with a function of cutting off infrared rays, the thickness of the infrared ray cut filter can be reduced. With this arrangement, the height of the image pickup device 1e can be reduced.

In addition, forming the lens 201 on the glass plate means that the lowermost lens of the plurality of lenses constituting the lens 16 is formed on the glass plate of the CSP-shaped image sensor 11e. This configuration makes it possible to further reduce the thickness of the image pickup device 1e.

Similarly, in the thin imaging device 1e, the coil 32 can be formed in or on the circuit board 13, the spacer 14, the housing 101, or the fixing mechanism 110, and the position of the lens 16 can be detected.

Fig. 13 is a diagram illustrating another configuration example of the image pickup apparatus 1. The image pickup device 1f shown in fig. 13 includes an image sensor 11f in which the image sensor 11 (for example, the image sensor 11 of the image pickup device 1a shown in fig. 1) has a flip-chip structure.

In the imaging device 1f shown in fig. 13, an electric signal output from the image sensor 11f is output to the outside through a holder 211 having a circuit function. The holder 211 also serves as the holder of the actuator 18, and the electric signal from the image sensor 11f is output to the outside through the thin circuit board 13 connected to the holder 211.

Also with such an imaging device 1f, the coil 32 can be formed in or on the circuit board 13, the spacer 14 (corresponding to the holder 211 in the imaging device 1f), the housing 101, or the fixing mechanism 110, and the position of the lens 16 can be detected.

Fig. 14 is a diagram illustrating another configuration example of the image pickup apparatus 1. The image pickup device 1g shown in fig. 14 has an image sensor 11g of a flip chip structure, similar to the image sensor 11f of the image pickup device 1f shown in fig. 13.

The imaging device 1g shown in fig. 14 has a configuration in which the infrared cut filter 17 is used as a base material when the image pickup device 1g is mounted, and the circuit board 13 is adhered to the infrared cut filter 17.

In addition, the image pickup apparatus 1g includes a holder 231 having a circuit function, similarly to the image pickup apparatus 1f shown in fig. 13. Further, as shown in fig. 14, since the image sensor 11g is provided on the lower side of the circuit board 13 (the side opposite to the side where the lens 16 is provided), a protective material 232 for protecting the image sensor 11g when the image pickup device 1g is mounted on a terminal is also provided.

Also with such an imaging device 1g, the coil 32 can be formed in or on the circuit board 13, the spacer 14 (corresponding to the holder 231 or the protective material 232 in the imaging device 1g), the housing 101, or the fixing mechanism 110, and the position of the lens 16 can be detected.

Fig. 15 is a diagram illustrating another configuration example of the image pickup apparatus 1. The image pickup apparatus 1h shown in fig. 15 has a configuration similar to that of the image pickup apparatus 1a shown in fig. 1, except that a storage unit 251 is added. The storage unit 251 stores data for correcting the individual differences of each image pickup apparatus 1.

The magnitude of the induced electromotive force used for adjusting the lens position varies depending on the number of turns and the size of the coil 21 (fig. 2) of the actuator 18, and on the formation state of the coil 32 (fig. 3) on the circuit board 13 (its number of turns, the number of layers of the circuit board 13 in which it is formed, etc.). Therefore, at the time of manufacturing the image pickup apparatus 1h, the individual difference in induced electromotive force is measured, and an adjustment value for correcting the difference is stored in the storage unit 251.

Then, during actual control, the adjustment value stored in the storage unit 251 is used to correct the individual difference of each image pickup apparatus 1. With this arrangement, the position of the lens 16 can be detected and adjusted while the variation among individual image pickup apparatuses 1 is reduced.
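A minimal sketch of how such an adjustment value might be produced at manufacturing time and applied during control, assuming a simple multiplicative correction; the function names and the linear model are illustrative assumptions, not the actual correction scheme of the apparatus.

```python
def compute_adjustment(measured_emf_mv, reference_emf_mv):
    # Hypothetical per-device gain: measured once at manufacturing time
    # against a reference device and written to the storage unit.
    return reference_emf_mv / measured_emf_mv

def apply_adjustment(raw_emf_mv, gain):
    # At control time, scale the raw reading so that all devices report
    # comparable EMF values for the same lens position.
    return raw_emf_mv * gain
```

A device reading 8.0 mV where the reference reads 10.0 mV would store a gain of 1.25 and thereafter report corrected values.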

Note that the mounting position of the storage unit 251 may be on the circuit board 13 shown in fig. 15, or the storage unit 251 may be mounted outside the image pickup apparatus 1 h. In addition, an image pickup apparatus 1h obtained by mounting the storage unit 251 on the image pickup apparatus 1a has been described here as an example. However, it is needless to say that the storage unit 251 may be attached to the imaging devices 1b to 1 g.

Also with this image pickup apparatus 1h, the coil 32 can be formed in or on the circuit board 13, the spacer 14, the housing 101, or the fixing mechanism 110, and the position of the lens 16 can be detected.

According to the present technology, power consumption can be reduced by driving the lens by PWM. In addition, when PWM driving is performed, induced electromotive force generated by a magnetic field generated by an actuator (coil in the actuator) that drives the lens can be detected.

In addition, the position of the lens can be detected by detecting such induced electromotive force. Further, by detecting the position of the lens, the position can be corrected in the case where a positional deviation occurs.
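The correction described above can be sketched as a feedback loop: a proportional controller nudges the PWM duty cycle until the detected lens position matches the target. The gain, the units, and the clamping to a [0, 1] duty range are illustrative assumptions, not the disclosed control method.

```python
def update_duty(duty, target_um, measured_um, kp=0.001):
    # One control step: adjust the PWM duty cycle in proportion to the
    # position error (target minus detected position), then clamp the
    # result to the valid duty range [0, 1].
    duty += kp * (target_um - measured_um)
    return min(1.0, max(0.0, duty))
```

Repeating this step while re-measuring the induced EMF would drive the lens toward the target focal position.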

According to the present technology, by controlling the focal position of the lens of the imaging device, it is possible to achieve an improvement in performance and a reduction in size of the imaging device.

The imaging device 1 can be used for a digital video camera, a digital still camera, and the like. In addition, the above-described image pickup apparatus 1 can also be used for an image input camera such as a monitoring camera and an in-vehicle camera. In addition, the imaging apparatus 1 can be used for the following electronic devices: such as scanner devices, facsimile devices, television telephone devices, and mobile terminal devices having cameras.

< example of application of endoscopic surgery System >

The techniques according to the present disclosure are applicable to a variety of products. For example, techniques according to the present disclosure may be applied to endoscopic surgical systems.

Fig. 16 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (present technique) can be applied.

Fig. 16 shows an operator (doctor) 11131 operating on a patient 11132 on a patient bed 11133 using an endoscopic surgical system 11000. As shown, the endoscopic surgical system 11000 includes: an endoscope 11100; other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment device 11112; a support arm device 11120 for supporting the endoscope 11100; and a cart 11200 on which various devices for endoscopic surgery are mounted.

The endoscope 11100 includes: a lens barrel 11101, of which a region of a predetermined length from the tip is inserted into a body cavity of the patient 11132; and a camera 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101. However, the endoscope 11100 may also be configured as a so-called flexible endoscope having a flexible lens barrel.

At the tip of the lens barrel 11101, an opening is provided in which an objective lens is fitted. The light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and the light is irradiated to an observation object in the body cavity of the patient 11132 through the above-described objective lens. Note that the endoscope 11100 may be a forward-looking endoscope (forward-viewing endoscope), an oblique-viewing endoscope (oblique-viewing endoscope), or a side-viewing endoscope (side-viewing endoscope).

An optical system and an image sensor are provided inside the camera 11102, and reflected light (observation light) from an observation target is collected by the optical system onto the image sensor. Observation light is photoelectrically converted by the image sensor, thereby generating an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image. The image signal is transmitted as RAW data to a Camera Control Unit (CCU) 11201.

The CCU 11201 includes a Central Processing Unit (CPU) and a Graphics Processing Unit (GPU), and centrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera 11102, and performs various image processes, such as development processing (demosaicing), on the image signal in order to display an image based on the image signal.

Under the control of the CCU 11201, the display device 11202 displays an image based on the image signal subjected to the image processing by the CCU 11201.

The light source device 11203 includes a light source such as a Light Emitting Diode (LED), for example, and supplies irradiation light to the endoscope 11100 when capturing an image of a surgical site or the like.

The input device 11204 is an input interface of the endoscopic surgical system 11000. The user can input various information and instructions to the endoscopic surgery system 11000 through the input device 11204. For example, the user inputs an instruction for changing the imaging conditions of the endoscope 11100 (the type of irradiation light, the magnification, the focal length, and the like).

The treatment tool control device 11205 controls the driving of the energy treatment device 11112 for ablation, cutting, sealing of blood vessels, and the like of tissue. The pneumoperitoneum device 11206 delivers gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111, thereby inflating the body cavity to ensure the field of view of the endoscope 11100 and the working space of the operator. The recorder 11207 is a device capable of recording various information relating to the operation. The printer 11208 is a device capable of printing various information related to the operation in various formats (e.g., text, images, charts, or the like).

Note that the light source device 11203 that supplies irradiation light to the endoscope 11100 when taking an image of the surgical site includes a white light source including, for example, an LED, a laser light source, or a combination of an LED and a laser light source. In the case where the white light source includes a combination of R, G, B laser light sources, the output intensity and the output timing for each color (each wavelength) can be controlled with high accuracy, and therefore, the light source device 11203 can adjust the white balance of the captured image. Further, in this case, by irradiating the observation target with the laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the image sensor of the camera 11102 in synchronization with the irradiation timing, images respectively corresponding to R, G and B can also be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter for the image sensor.
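The time-division capture described above can be illustrated schematically: three monochrome frames taken under R, G, and B laser illumination in sequence are merged per pixel into one color frame. This is a conceptual sketch only, not the camera's actual processing pipeline.

```python
def merge_time_division_frames(r_frame, g_frame, b_frame):
    # Each input is a list of pixel intensities captured while only the
    # corresponding laser source was lit; the output is one (R, G, B)
    # tuple per pixel, i.e. a color frame without any color filter.
    return list(zip(r_frame, g_frame, b_frame))
```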

Further, the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera 11102 in synchronization with the timing of the light intensity changes to acquire images in a time-division manner, and then combining these images, a high dynamic range image free from underexposure and overexposure can be produced.
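As a hedged illustration of this high dynamic range combination, the sketch below merges a short and a long exposure captured in time division: long-exposure pixels are used where they are not saturated, and saturated pixels are replaced by the short exposure scaled by the exposure ratio. The saturation threshold and ratio are assumed values, not parameters of the system described here.

```python
def merge_hdr(short_px, long_px, exposure_ratio, saturation=255):
    # Per-pixel merge: prefer the long exposure (better signal in the
    # shadows); where the long exposure clipped, substitute the short
    # exposure scaled up by the exposure ratio to restore highlights.
    return [l if l < saturation else s * exposure_ratio
            for s, l in zip(short_px, long_px)]
```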

Further, the light source device 11203 is capable of providing light in a predetermined wavelength band compatible with special light observation. In special light observation, for example, so-called Narrow Band Imaging (NBI) is performed in which narrow band light is emitted compared with irradiation light (i.e., white light) during ordinary observation by utilizing wavelength dependence of light absorption of human tissue, thereby taking an image of a predetermined tissue such as blood vessels in a surface portion of a mucous membrane with high contrast. Alternatively, in the special light observation, fluorescence observation in which an image is obtained by fluorescence generated by irradiation with excitation light may be performed. In the fluorescence observation, for example, fluorescence from human tissue may be observed by irradiating the human tissue with excitation light (autofluorescence observation), or a fluorescence image may be obtained by locally injecting a reagent such as indocyanine green (ICG) into the human tissue while also irradiating the human tissue with excitation light corresponding to the fluorescence wavelength of the reagent. The light source device 11203 is capable of providing narrow-band light and/or excitation light corresponding to such special light observations.

Fig. 17 is a block diagram showing an example of the functional configuration of the camera 11102 and the CCU 11201 shown in fig. 16.

The camera 11102 includes a lens unit 11401, an image pickup unit 11402, a drive unit 11403, a communication unit 11404, and a camera control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.

The lens unit 11401 is an optical system, which is provided at a connection portion with the lens barrel 11101. Observation light introduced from the tip of the lens barrel 11101 is guided to the camera 11102, and enters the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses (including a zoom lens and a focus lens).

The image pickup unit 11402 includes an image sensor. The number of image sensors constituting the image pickup unit 11402 may be one (so-called single plate type) or plural (so-called multi-plate type). For example, in the case where the image pickup unit 11402 is configured as a multi-plate type image pickup unit, each image sensor may generate image signals corresponding to each of R, G and B, and a color image may be obtained by combining these image signals. Alternatively, the image pickup unit 11402 may include a pair of image sensors for acquiring a right eye image signal and a left eye image signal compatible with three-dimensional (3D) display. The 3D display enables the operator 11131 to grasp the depth of the biological tissue in the surgical site more accurately. Note that in the case where the image pickup unit 11402 is configured as a multi-plate type image pickup unit, a plurality of systems of the lens unit 11401 may be provided corresponding to the respective image sensors.

Further, the image pickup unit 11402 may not necessarily be provided in the camera 11102. For example, the image pickup unit 11402 may be disposed inside the lens barrel 11101 and immediately behind the objective lens.

The driving unit 11403 includes an actuator, and the driving unit 11403 moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera control unit 11405. With this arrangement, the magnification and focus of the captured image of the image pickup unit 11402 can be appropriately adjusted.

The communication unit 11404 includes a communication device for transmitting and receiving various information to and from the CCU 11201. The communication unit 11404 transmits the image signal acquired from the image pickup unit 11402 to the CCU 11201 as RAW data through the transmission cable 11400.

Further, the communication unit 11404 receives a control signal for controlling driving of the camera 11102 from the CCU 11201, and supplies the control signal to the camera control unit 11405. For example, the control signal includes information related to imaging conditions such as: information for specifying the frame rate of a captured image, information for specifying the exposure value at the time of capturing an image, and/or information for specifying the magnification and focus of a captured image, and the like.

Note that the above-described image capturing conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, an Auto Exposure (AE) function, an Auto Focus (AF) function, and an Auto White Balance (AWB) function are installed in the endoscope 11100.

The camera control unit 11405 controls driving of the camera 11102 based on a control signal from the CCU 11201 received through the communication unit 11404.

The communication unit 11411 includes a communication device for transmitting and receiving various information to and from the camera 11102. The communication unit 11411 receives an image signal transmitted from the camera 11102 through the transmission cable 11400.

Further, the communication unit 11411 transmits a control signal for controlling driving of the camera 11102 to the camera 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.

The image processing unit 11412 performs various image processes on the image signal as RAW data transmitted from the camera 11102.

The control unit 11413 executes various controls related to capturing an image of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by capturing an image of the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera 11102.

Further, based on the image signal that has been image-processed by the image processing unit 11412, the control unit 11413 causes the display device 11202 to display a captured image in which the surgical site and the like are captured. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edge of the object included in the captured image, the control unit 11413 can recognize a surgical tool such as forceps, a specific biological site, bleeding, mist when the energy treatment device 11112 is used, and the like. When the control unit 11413 causes the display device 11202 to display the photographed image, the control unit 11413 may superimpose and display various kinds of operation assistance information on the image of the operation site using the recognition result. The operation assistance information is superimposed and displayed and then presented to the operator 11131, so that the burden on the operator 11131 can be reduced and the operator 11131 can perform an operation surely.

The transmission cable 11400 connecting the camera 11102 and the CCU 11201 together is an electric signal cable compatible with communication of electric signals, an optical fiber compatible with optical communication, or a composite cable configured by combining an electric signal cable and an optical fiber.

Here, in the illustrated example, communication is performed by wire using the transmission cable 11400; however, communication between the camera 11102 and the CCU 11201 may also be performed wirelessly.

Note that although the endoscopic surgical system has been described here as an example, the technique according to the present disclosure can also be applied to other systems, such as a microsurgical system or the like.

< application example of Mobile body >

The techniques according to the present disclosure are applicable to a variety of products. For example, the techniques according to the present disclosure may be implemented as an apparatus mounted on any type of moving body, such as an automobile, an electric automobile, a hybrid automobile, a motorcycle, a bicycle, a personal mobile device, an airplane, a drone, a ship, or a robot.

Fig. 18 is a block diagram showing a schematic configuration example of a vehicle control system as an example of a mobile body control system to which the technique according to the present disclosure can be applied.

The vehicle control system 12000 includes a plurality of electronic control units connected to each other through a communication network 12001. In the example shown in fig. 18, the vehicle control system 12000 includes: a drive system control unit 12010, a vehicle body system control unit 12020, a vehicle exterior information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050. Further, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, a sound/image output unit 12052, and an in-vehicle network interface (I/F) 12053 are shown.

The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device of each of the following devices: a driving force generation apparatus such as an internal combustion engine or a driving motor for generating a driving force of the vehicle; a driving force transmission mechanism for transmitting a driving force to a wheel; a steering mechanism for adjusting a steering angle of the vehicle; and a brake apparatus for generating a braking force of the vehicle, and the like.

The vehicle body system control unit 12020 controls the operations of various devices mounted on the vehicle body according to various programs. For example, the vehicle body system control unit 12020 functions as a control device for each of the following devices: a keyless entry system; a smart key system; a power window device; or various lamps such as headlights, tail lights, brake lights, blinkers, or fog lights. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the vehicle body system control unit 12020. The vehicle body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, or the like of the vehicle.

The vehicle exterior information detection unit 12030 detects information about the exterior of the vehicle on which the vehicle control system 12000 is mounted. For example, the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle, and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing that detects an object such as a pedestrian, a vehicle, an obstacle, a sign, or a character on a road surface.

The image pickup unit 12031 is an optical sensor for receiving light and outputting an electric signal according to the amount of received light. The imaging unit 12031 can output the electrical signal as an image or can output the electrical signal as distance measurement information. Further, the light received by the image pickup unit 12031 may be visible light, or may be non-visible light such as infrared light.

The in-vehicle information detection unit 12040 detects information inside the vehicle. For example, the in-vehicle information detection unit 12040 is connected to a driver state detection unit 12041 for detecting the state of the driver. For example, the driver state detection unit 12041 includes a camera for taking an image of the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is asleep.

Based on the vehicle exterior and interior information acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 calculates a control target value of the driving force generation device, the steering mechanism, or the brake device, and is able to output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can execute cooperative control for realizing Advanced Driver Assistance System (ADAS) functions including: collision avoidance or collision mitigation of the vehicle, follow-up running based on the inter-vehicle distance, vehicle speed maintenance running, vehicle collision warning, lane departure warning of the vehicle, or the like.

Further, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, the microcomputer 12051 can perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.

Further, based on the information about the outside of the vehicle acquired by the vehicle exterior information detection unit 12030, the microcomputer 12051 can output a control command to the vehicle body system control unit 12020. For example, the microcomputer 12051 can perform cooperative control for preventing glare, such as controlling the headlights to switch from high beam to low beam according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.

The sound/image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or aurally notifying an occupant of the vehicle or persons outside the vehicle of information. In the example of Fig. 18, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.

Fig. 19 is a diagram illustrating an example of the mounting positions of the imaging unit 12031.

In Fig. 19, the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.

For example, the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions of the vehicle 12100 such as the front nose, the side mirrors, the rear bumper, the trunk door, and the upper portion of the windshield inside the vehicle compartment. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper portion of the windshield inside the vehicle compartment mainly acquire images ahead of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the trunk door mainly acquires images behind the vehicle 12100. The images ahead of the vehicle 12100 acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.

Note that Fig. 19 shows an example of the imaging ranges of the imaging units 12101 to 12104. An imaging range 12111 represents the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 represent the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 represents the imaging range of the imaging unit 12104 provided on the rear bumper or the trunk door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's eye view image of the vehicle 12100 viewed from above can be obtained.
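For illustration, the superimposition described above can be sketched as pasting ground-plane-projected tiles from the four imaging units onto a single top-view canvas. The function below is a minimal sketch under that assumption; the tile placement scheme and pixel format are hypothetical and not part of the disclosure.

```python
def compose_birds_eye_view(canvas_h, canvas_w, tiles):
    """Superimpose already ground-plane-projected tiles from the imaging
    units onto one top-view canvas. Each tile is (top, left, image), where
    image is a 2D list of pixel values; later tiles overwrite earlier ones
    where they overlap, and pixels outside the canvas are clipped."""
    canvas = [[0] * canvas_w for _ in range(canvas_h)]
    for top, left, image in tiles:
        for dy, row in enumerate(image):
            for dx, px in enumerate(row):
                y, x = top + dy, left + dx
                if 0 <= y < canvas_h and 0 <= x < canvas_w:
                    canvas[y][x] = px
    return canvas
```

In a real system each tile would come from a perspective (homography) projection of the corresponding camera image onto the ground plane; here the tiles are assumed to be projected already.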

At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.

For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of that distance (the relative speed with respect to the vehicle 12100). The microcomputer 12051 can thereby extract, as a preceding vehicle, the three-dimensional object that is nearest on the traveling path of the vehicle 12100 and that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, greater than or equal to 0 km/h). Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be maintained from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
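For illustration, the preceding-vehicle extraction described above can be sketched as a simple selection rule over tracked objects: keep only on-path objects moving in substantially the same direction at or above a predetermined speed, then take the nearest. The data structure, field names, and thresholds below are assumptions made for the sketch, not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrackedObject:
    distance_m: float     # current distance, from the ranging data
    speed_kmh: float      # estimated speed, from the change of distance over time
    heading_deg: float    # travel direction relative to the own vehicle (0 = same)
    on_travel_path: bool  # whether the object lies on the own traveling road

def extract_preceding_vehicle(objects: List[TrackedObject],
                              min_speed_kmh: float = 0.0,
                              max_heading_deg: float = 10.0) -> Optional[TrackedObject]:
    """Return the nearest on-path object moving in substantially the same
    direction at or above a predetermined speed, or None if there is none."""
    candidates = [o for o in objects
                  if o.on_travel_path
                  and o.speed_kmh >= min_speed_kmh
                  and abs(o.heading_deg) <= max_heading_deg]
    return min(candidates, key=lambda o: o.distance_m, default=None)
```

The extracted object would then feed the follow-up control loop (maintaining the preset inter-vehicle distance via brake and acceleration commands).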

For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can extract three-dimensional object data by classifying three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, and can use the extracted data for automatic avoidance of obstacles. For example, the microcomputer 12051 classifies obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can visually recognize and obstacles that are difficult to visually recognize. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is greater than or equal to a set value and there is a possibility of collision, the microcomputer 12051 can output a warning to the driver through the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering through the drive system control unit 12010, thereby providing driving assistance for collision avoidance.
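For illustration, the collision-risk determination and the resulting assistance action can be sketched with a time-to-collision (TTC) based risk index. The reciprocal-of-TTC risk measure and the threshold values below are assumptions made for the sketch, not the patent's definition of the collision risk.

```python
def time_to_collision_s(distance_m: float, closing_speed_mps: float) -> float:
    """Time until collision assuming a constant closing speed; infinite
    if the obstacle is not closing in."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def collision_risk(distance_m: float, closing_speed_mps: float) -> float:
    """A simple risk index: the reciprocal of time-to-collision (1/s).
    Larger values indicate a more imminent collision."""
    ttc = time_to_collision_s(distance_m, closing_speed_mps)
    return 0.0 if ttc == float("inf") else 1.0 / ttc

def driving_assistance_action(risk: float, set_value: float = 0.5) -> str:
    """Map the risk to an action, mirroring the text: below the set value
    do nothing, then warn the driver (audio speaker 12061 / display unit
    12062), then force deceleration via the drive system control unit 12010."""
    if risk < set_value:
        return "none"
    if risk < 2 * set_value:
        return "warn_driver"
    return "forced_deceleration"
```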

At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points from the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching on a series of feature points representing the contour of an object to determine whether the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the sound/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. Further, the sound/image output unit 12052 may also control the display unit 12062 so that an icon or the like representing a pedestrian is displayed at a desired position.
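For illustration, the two-step procedure described above (feature point extraction, then pattern matching on a series of feature points representing an object's contour) can be sketched as follows. The binary-image representation and the overlap-ratio matching criterion are assumptions made for the sketch, not the disclosed algorithm.

```python
def extract_feature_points(image):
    """Extract contour points from a binary image (1 = bright region in the
    infrared image): set pixels having at least one unset 4-neighbour
    (or lying on the image border) are taken as feature points."""
    h, w = len(image), len(image[0])
    points = set()
    for y in range(h):
        for x in range(w):
            if not image[y][x]:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if ny < 0 or ny >= h or nx < 0 or nx >= w or not image[ny][nx]:
                    points.add((y, x))
                    break
    return points

def matches_pedestrian(points, template, threshold=0.8):
    """Crude pattern matching: the fraction of template contour points that
    coincide with the extracted feature points must reach the threshold."""
    if not template:
        return False
    return len(points & template) / len(template) >= threshold
```

A practical matcher would also search over position and scale of the template; the sketch only shows the matching criterion itself.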

In the present specification, the term "system" refers to an entire apparatus including a plurality of devices.

Note that the effects described in this specification are merely exemplary effects, not restrictive, and other effects may be achieved.

Note that the embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.

Note that the present technology can also include the following configurations.

(1)

An image pickup apparatus comprising:

A lens that collects subject light;

An image sensor that photoelectrically converts the subject light from the lens;

A circuit board including a circuit that outputs a signal from the image sensor to the outside;

An actuator that drives the lens with a pulse width modulation (PWM) waveform; and

A detection unit that detects a magnetic field generated by a coil included in the actuator.

(2)

The image pickup apparatus according to (1), wherein

The detection unit detects an induced electromotive force generated by the magnetic field.

(3)

The image pickup apparatus according to (2), wherein

The detection unit detects the position of the lens according to the induced electromotive force.

(4)

The image pickup apparatus according to any one of (1) to (3), wherein

The detection unit is formed on the circuit board.

(5)

The image pickup apparatus according to any one of (1) to (3), further comprising a spacer that fixes the image sensor and the circuit board,

Wherein the detection unit is formed on the spacer.

(6)

The image pickup apparatus according to any one of (1) to (3), wherein

The image pickup apparatus is accommodated in a housing, and

The detection unit is formed on the housing.

(7)

The image pickup apparatus according to (6), further comprising a fixing mechanism that fixes the image pickup apparatus to the housing,

Wherein the detection unit is formed on the fixing mechanism.

(8)

The image pickup apparatus according to (1), wherein

The detection unit includes a coil,

The circuit board includes a plurality of layers, and

The coil is formed across the plurality of layers of the circuit board.

(9)

The image pickup apparatus according to any one of (1) to (8), wherein

The image sensor has a Chip Size Package (CSP) shape.

(10)

The image pickup apparatus according to any one of (1) to (8), wherein

The image sensor has a Chip Size Package (CSP) shape, and

An infrared cut filter and the lowermost lens of the lens group are provided on a glass substrate of the CSP-shaped image sensor.

(11)

The image pickup apparatus according to any one of (1) to (8), wherein

The image sensor has a flip-chip structure.

(12)

The image pickup apparatus according to any one of (1) to (8), wherein

The image sensor has a flip-chip structure and is mounted on the circuit board, and

An infrared cut filter serving as a base material is adhered to the circuit board.

(13)

The image pickup apparatus according to any one of (1) to (8), further comprising a storage unit that stores a correction value for correcting variations among individual image pickup apparatuses.

(14)

An electronic apparatus comprising an image pickup device, the image pickup device including:

A lens that collects subject light;

An image sensor that photoelectrically converts the subject light from the lens;

A circuit board including a circuit that outputs a signal from the image sensor to the outside;

An actuator that drives the lens with a pulse width modulation (PWM) waveform; and

A detection unit that detects a magnetic field generated by a coil included in the actuator.

List of reference numerals

1 image pickup device

11 image sensor

12 metal wire

13 Circuit board

14 spacer

15 adhesive

16 lens

17 infrared ray cut-off filter

18 actuator

19 connector

20 auto focus driver

31 detection circuit

32 coil

51 amplifying unit

52A/D conversion unit

53 AF control unit

54 control unit

101 casing

102 camera window

110 fixing mechanism