Imaging element, imaging device, method for operating imaging element, and program

Document No.: 91112  Publication date: 2021-10-08

Reading note: This technology, "Imaging element, imaging device, method for operating imaging element, and program," was designed and created by 小林诚, 河合智行, 樱武仁史, 长谷川亮, and 菅原一文 on 2020-01-28. Its main content is as follows: The imaging element is provided with: a 1st communication interface that outputs 1st image data, based on image data obtained by photographing a subject, to an external processor, and that is built into the imaging element; a memory that stores the image data and is built into the imaging element; and a 2nd communication interface that outputs 2nd image data, based on the image data stored in the memory, to the external processor, and that is built into the imaging element, wherein an output mode of the 1st communication interface is different from an output mode of the 2nd communication interface.

1. An imaging element, comprising:

a processor; and

a memory built into or connected to the processor,

wherein the processor and the memory are built into the imaging element,

the processor has a 1st communication interface and a 2nd communication interface,

the 1st communication interface outputs 1st image data, based on image data obtained by photographing a subject, to an external processor disposed outside the imaging element,

the memory stores the image data,

the 2nd communication interface outputs 2nd image data, based on the image data stored in the memory, to the external processor, and

an output mode of the 1st communication interface is different from an output mode of the 2nd communication interface.

2. The imaging element according to claim 1, wherein

the output of the 1st image data by the 1st communication interface and the output of the 2nd image data by the 2nd communication interface are performed independently of each other.

3. The imaging element according to claim 1, wherein

the 1st communication interface outputs the 1st image data in a period different from an output period of the 2nd image data by the 2nd communication interface.

4. The imaging element according to claim 3, wherein

the 2nd communication interface outputs the 2nd image data in response to a request from the external processor.

5. The imaging element according to claim 3 or 4, wherein

the output period is a vertical blanking period after 1 frame of the 1st image data is output from the 1st communication interface.

6. The imaging element according to claim 3 or 4, wherein

the output period is a vertical blanking period before 1 frame of the 1st image data is output from the 1st communication interface.

7. The imaging element according to claim 3 or 4, wherein

the output period is a horizontal blanking period after 1 line of the 1st image data is output from the 1st communication interface.

8. The imaging element according to claim 3 or 4, wherein

the output period is a horizontal blanking period before 1 line of the 1st image data is output from the 1st communication interface.

9. The imaging element according to claim 3 or 4, comprising:

a 1st A/D converter that A/D-converts analog image data,

wherein the output period is an A/D conversion period of the 1st A/D converter before 1 line of the 1st image data is output from the 1st communication interface.

10. The imaging element according to any one of claims 1 to 9, comprising:

a 2nd A/D converter that A/D-converts analog image data,

wherein the processor has a memory controller that stores, in the memory, digital image data obtained by digitizing the analog image data with the 2nd A/D converter,

the 1st communication interface outputs the digital image data obtained from the 2nd A/D converter as the 1st image data without storing the digital image data in the memory, and

the 2nd communication interface outputs the digital image data read out from the memory by the memory controller as the 2nd image data.

11. The imaging element according to any one of claims 1 to 10, wherein

the memory is a memory whose write timing and read timing are different from each other.

12. The imaging element according to claim 11,

the memory is a DRAM.

13. The imaging element according to any one of claims 1 to 12, wherein at least a photoelectric conversion element and the memory are formed into one chip.

14. The imaging element according to claim 13,

the imaging element is a stacked imaging element in which the memory is stacked on the photoelectric conversion element.

15. An image pickup apparatus, comprising:

the imaging element according to any one of claims 1 to 14; and

a display processor that performs control for displaying, on a display, at least one of a 1st image based on the 1st image data output from the 1st communication interface and a 2nd image based on the 2nd image data output from the 2nd communication interface.

16. An image pickup apparatus, comprising:

the imaging element according to any one of claims 1 to 14; and

a storage processor that performs control for storing, in a storage device, at least one of the 1st image data output from the 1st communication interface and the 2nd image data output from the 2nd communication interface.

17. A method of operating an imaging element, the imaging element having built therein a processor and a memory that is built into or connected to the processor, the processor having a 1st communication interface and a 2nd communication interface, the method comprising:

outputting, by the 1st communication interface, 1st image data, based on image data obtained by photographing a subject, to an external processor disposed outside the imaging element;

storing the image data in the memory; and

outputting, by the 2nd communication interface, 2nd image data based on the image data stored in the memory to the external processor,

wherein an output mode of the 1st communication interface is different from an output mode of the 2nd communication interface.

18. A program for causing a computer to function as the 1st communication interface and the 2nd communication interface included in an imaging element, the imaging element having built therein a processor and a memory that is built into or connected to the processor, the processor having the 1st communication interface and the 2nd communication interface, wherein

the 1st communication interface outputs 1st image data, based on image data obtained by photographing a subject, to an external processor disposed outside the imaging element,

the memory stores the image data,

the 2nd communication interface outputs 2nd image data, based on the image data stored in the memory, to the external processor, and

an output mode of the 1st communication interface is different from an output mode of the 2nd communication interface.

Technical Field

The present invention relates to an imaging element, an imaging apparatus, a method of operating the imaging element, and a program.

Background

Japanese Patent Application Laid-Open No. 2018-6806 discloses an image pickup apparatus including: a stacked image sensor having a sensor section, a 1st logic section, and a 1st memory section; and a 2nd logic section.

The sensor section is a so-called CMOS (Complementary Metal Oxide Semiconductor) image sensor. The sensor section converts received light into an electric signal, digitizes the electric signal, and transmits the digitized RAW data to the 1st logic section.

The 1st logic section includes a 1st memory control section, a 1st inter-chip communication I/F (Interface), a simple developing section, and a 1st display control section. The 1st memory control section is a so-called memory controller and writes the RAW data from the sensor section into the 1st memory section. The 1st inter-chip communication I/F accesses the 1st memory section via the 1st memory control section and transfers the RAW data read from the 1st memory section to the 2nd logic section. The simple developing section accesses the 1st memory section via the 1st memory control section and performs developing processing on the RAW data read from the 1st memory section, thereby generating display data that can be displayed on a display section. The simple developing section writes the display data back to the 1st memory section via the 1st memory control section. The 1st display control section reads the display data from the 1st memory section via the 1st memory control section and outputs it to the 2nd logic section.

In this manner, in the image pickup apparatus described in Japanese Patent Application Laid-Open No. 2018-6806, image data is output from the stacked image sensor to the 2nd logic section via each of two output paths.

Disclosure of Invention

One embodiment according to the technique of the present invention provides an imaging element, an imaging device, a method of operating the imaging element, and a program that can suppress delays in the output of image data as compared with a case where image data is output to a processing unit (an external processor disposed outside the imaging element) through only a single communication I/F.

Means for solving the technical problem

A 1st aspect according to the technique of the present invention is an imaging element comprising: a 1st output unit that outputs 1st image data, based on image data obtained by photographing a subject, to a processing unit disposed outside the imaging element, and that is built into the imaging element; a storage unit that stores the image data and is built into the imaging element; and a 2nd output unit that outputs 2nd image data, based on the image data stored in the storage unit, to the external processing unit, and that is built into the imaging element, wherein an output mode of the 1st output unit is different from an output mode of the 2nd output unit. Thus, even while the image data is being stored in the storage unit, the image data can be output without delay.
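As a concrete illustration of this division of labor, the two output paths can be sketched as a toy Python model. Everything here (the class name, method names, the deque-backed memory) is an assumption made for illustration; the patent itself specifies no implementation:

```python
from collections import deque

class ImagingElement:
    """Toy model of an imaging element with two built-in output paths.

    The 1st output unit streams freshly captured image data; the 2nd
    output unit serves memory-resident data on request, i.e. the two
    paths have different output modes.
    """

    def __init__(self):
        self.memory = deque()   # built-in storage unit
        self.live_out = []      # data sent via the 1st output unit
        self.stored_out = []    # data sent via the 2nd output unit

    def capture(self, frame):
        # 1st output unit: stream the frame as soon as it is captured.
        self.live_out.append(("out1", frame))
        # Also write the frame into the built-in memory.
        self.memory.append(frame)

    def request_stored_frame(self):
        # 2nd output unit: output a stored frame only when the external
        # processing unit asks for one (a request-driven output mode).
        if not self.memory:
            return None
        frame = self.memory.popleft()
        self.stored_out.append(("out2", frame))
        return frame

sensor = ImagingElement()
sensor.capture("frame0")
sensor.capture("frame1")
served = sensor.request_stored_frame()
```

The point of the sketch is that `capture` never waits on `request_stored_frame`: the streaming path and the request-driven path operate independently.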

A 2nd aspect according to the technique of the present invention is the imaging element according to the 1st aspect, wherein the output of the 1st image data by the 1st output unit and the output of the 2nd image data by the 2nd output unit are performed independently of each other. Thus, the timing at which the 1st image data is output to the processing unit and the timing at which the 2nd image data is output to the processing unit can be changed freely.

A 3rd aspect according to the technique of the present invention is the imaging element according to the 1st aspect, wherein the 1st output unit outputs the 1st image data in a period different from an output period of the 2nd image data by the 2nd output unit. This enables the image data to be output to the processing unit without delay.

A 4th aspect according to the technique of the present invention is the imaging element according to the 3rd aspect, wherein the 2nd output unit outputs the 2nd image data in response to a request from the external processing unit. Thus, the 2nd image data can be prevented from being output to the processing unit when the processing unit is not ready to receive it.

A 5th aspect according to the technique of the present invention is the imaging element according to the 3rd or 4th aspect, wherein the output period is a vertical blanking period after 1 frame of the 1st image data is output from the 1st output unit. This makes it possible to prevent the output from the imaging element to the processing unit from being suspended by the operation of writing the image data to the storage unit.

A 6th aspect according to the technique of the present invention is the imaging element according to the 3rd or 4th aspect, wherein the output period is a vertical blanking period before 1 frame of the 1st image data is output from the 1st output unit. This makes it possible to prevent the output from the imaging element to the processing unit from being suspended by the operation of writing the image data to the storage unit.

A 7th aspect according to the technique of the present invention is the imaging element according to the 3rd or 4th aspect, wherein the output period is a horizontal blanking period after 1 line of the 1st image data is output from the 1st output unit. This makes it possible to prevent the output from the imaging element to the processing unit from being suspended by the operation of writing the image data to the storage unit.

An 8th aspect according to the technique of the present invention is the imaging element according to the 3rd or 4th aspect, wherein the output period is a horizontal blanking period before 1 line of the 1st image data is output from the 1st output unit. This makes it possible to prevent the output from the imaging element to the processing unit from being suspended by the operation of writing the image data to the storage unit.

A 9th aspect according to the technique of the present invention is the imaging element according to the 3rd or 4th aspect, comprising a 1st A/D converter that A/D-converts analog image data, wherein the output period is an A/D conversion period of the 1st A/D converter before 1 line of the 1st image data is output from the 1st output unit. This makes it possible to prevent the output from the imaging element to the processing unit from being suspended by the operation of writing the image data to the storage unit.
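These blanking-period aspects amount to a scheduling rule: the 2nd output borrows the output timeline only while the 1st output is idle. A minimal sketch, assuming one 2nd-output slot per vertical blanking interval (the function name and event tuples are illustrative, not from the patent):

```python
def schedule_outputs(num_lines, lines_per_frame):
    """Interleave the two outputs on one timeline: the 1st output
    emits each line as it is read out, and the 2nd output runs only
    in the vertical blanking period that follows each full frame."""
    timeline = []
    for line in range(num_lines):
        timeline.append(("out1_line", line))
        if (line + 1) % lines_per_frame == 0:
            # Vertical blanking after a complete frame: the 2nd output
            # can transmit memory-resident data without stalling out1.
            timeline.append(("out2_vblank", line // lines_per_frame))
    return timeline

events = schedule_outputs(num_lines=6, lines_per_frame=3)
```

The same pattern, with `lines_per_frame` replaced by a per-line condition, would model the horizontal-blanking aspects instead.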

A 10th aspect according to the technique of the present invention is the imaging element according to any one of the 1st to 9th aspects, comprising: a 2nd A/D converter that A/D-converts analog image data; and a memory controller that stores, in the storage unit, digital image data obtained by digitizing the analog image data with the 2nd A/D converter, wherein the 1st output unit outputs the digital image data obtained from the 2nd A/D converter as the 1st image data without storing the digital image data in the storage unit, and the 2nd output unit outputs the digital image data read out from the storage unit by the memory controller as the 2nd image data. Thus, even while the image data is being written to the storage unit, the output from the imaging element to the processing unit can continue.

An 11th aspect according to the technique of the present invention is the imaging element according to any one of the 1st to 10th aspects, wherein the storage unit is a memory whose write timing and read timing are different from each other. Thus, even when the storage unit is such a memory, the output from the imaging element to the processing unit can continue.

A 12th aspect according to the technique of the present invention is the imaging element according to the 11th aspect, wherein the storage unit is a DRAM. Thus, even when the storage unit is a DRAM, the output from the imaging element to the processing unit can continue.

A 13th aspect according to the technique of the present invention is the imaging element according to any one of the 1st to 12th aspects, wherein at least the photoelectric conversion element and the storage unit are formed into one chip. This improves the portability of the imaging element as compared with an imaging element in which the photoelectric conversion element and the storage unit are not formed into one chip.

A 14th aspect according to the technique of the present invention is the imaging element according to the 13th aspect, wherein the imaging element is a stacked imaging element in which the storage unit is stacked on the photoelectric conversion element. This makes it possible to increase the transfer rate of the image data from the photoelectric conversion element to the storage unit as compared with a case where the photoelectric conversion element and the storage unit are not stacked.

A 15th aspect according to the technique of the present invention is an imaging apparatus comprising: the imaging element according to any one of the 1st to 14th aspects; and a display control unit that performs control for displaying, on a display unit, at least one of a 1st image based on the 1st image data output by the 1st output unit and a 2nd image based on the 2nd image data output by the 2nd output unit. Thus, even while the image data is being stored in the storage unit, the image data can be output without delay.

A 16th aspect according to the technique of the present invention is an imaging apparatus comprising: the imaging element according to any one of the 1st to 14th aspects; and a storage control unit that performs control for storing, in a storage device, at least one of the 1st image data output by the 1st output unit and the 2nd image data output by the 2nd output unit. Thus, even while the image data is being stored in the storage unit, the image data can be output without delay.

A 17th aspect according to the technique of the present invention is a method of operating an imaging element having a 1st output unit, a storage unit, and a 2nd output unit built therein, the method comprising: outputting, by the 1st output unit, 1st image data, based on image data obtained by photographing a subject, to a processing unit disposed outside the imaging element; storing the image data in the storage unit; and outputting, by the 2nd output unit, 2nd image data based on the image data stored in the storage unit to the external processing unit, wherein an output mode of the 1st output unit is different from an output mode of the 2nd output unit. Thus, even while the image data is being stored in the storage unit, the image data can be output without delay.

An 18th aspect according to the technique of the present invention is a program for causing a computer to function as a 1st output unit and a 2nd output unit included in an imaging element having the 1st output unit, a storage unit, and the 2nd output unit built therein, wherein the 1st output unit outputs 1st image data, based on image data obtained by photographing a subject, to a processing unit disposed outside the imaging element, the storage unit stores the image data, the 2nd output unit outputs 2nd image data, based on the image data stored in the storage unit, to the external processing unit, and an output mode of the 1st output unit is different from an output mode of the 2nd output unit. Thus, even while the image data is being stored in the storage unit, the image data can be output without delay.

A 19th aspect according to the technique of the present invention is an imaging element comprising a 1st processor, a memory, and a 2nd processor, wherein the 1st processor outputs 1st image data, based on image data obtained by photographing a subject, to a processing unit disposed outside the imaging element, the memory stores the image data, the 2nd processor outputs 2nd image data, based on the image data stored in the memory, to the external processing unit, and an output mode of the 1st processor is different from an output mode of the 2nd processor. Thus, even while the image data is being stored in the memory, the image data can be output without delay.

Drawings

Fig. 1 is a perspective view showing an example of an external appearance of the imaging device according to embodiments 1 to 3.

Fig. 2 is a rear view showing an example of the appearance of the rear surface side of the imaging apparatus shown in fig. 1.

Fig. 3 is a block diagram showing an example of the configuration of the imaging apparatus according to embodiments 1 to 3.

Fig. 4 is a conceptual diagram for explaining the frame rate of the imaging element included in the imaging apparatus according to embodiments 1 to 3.

Fig. 5 is a block diagram showing an example of the configuration of an electric system of the imaging device main body according to embodiments 1 to 3.

Fig. 6 is a schematic configuration diagram showing a configuration of a hybrid finder included in the imaging apparatus according to embodiments 1 to 3.

Fig. 7 is a block diagram showing an example of a stacked structure of the imaging element included in the imaging devices according to embodiments 1 to 3 and an example of the connection relationship among the imaging element, a signal processing circuit, and a controller.

Fig. 8 is a block diagram showing an example of the configuration of an electrical system of an imaging element included in the imaging apparatus according to embodiment 1.

Fig. 9 is a state transition diagram showing an example of processing contents in a time series of image pickup processing and output processing performed by an imaging element included in the image pickup apparatus according to embodiment 1.

Fig. 10 is a timing chart showing an example of a mode in which the 1st output and the 2nd output are performed in parallel.

Fig. 11 is a flowchart showing an example of the flow of the control processing according to embodiment 1.

Fig. 12 is a block diagram showing an example of the configuration of an electrical system of an imaging element included in the imaging apparatus according to embodiments 2 and 3.

Fig. 13 is a state transition diagram showing an example of the processing contents in the time series of the image pickup processing according to embodiment 2.

Fig. 14 is a state transition diagram showing an example of the processing contents in the time series of the output processing according to embodiment 2.

Fig. 15 is a timing chart showing an example of a mode in which the 2nd output is performed in the vertical blanking period and the 1st output is performed before and after the vertical blanking period.

Fig. 16 is a flowchart showing an example of the flow of the image pickup processing according to embodiment 2.

Fig. 17 is a flowchart showing an example of the output processing flow according to embodiment 2.

Fig. 18 is a state transition diagram showing an example of the processing contents in the time series of the image pickup processing according to embodiment 3.

Fig. 19 is a state transition diagram showing an example of the processing contents in the time series of the output processing according to embodiment 3.

Fig. 20 is a timing chart showing an example of a mode in which the 2nd output is performed in the horizontal blanking period and the 1st output is performed before and after the horizontal blanking period.

Fig. 21 is a flowchart showing an example of the flow of the image pickup processing according to embodiment 3.

Fig. 22 is a flowchart showing an example of the output processing flow according to embodiment 3.

Fig. 23A is a timing chart showing an example of a mode in which the 2nd output is performed during the digital signal processing period and the 1st output is performed during the writing period.

Fig. 23B is a timing chart showing an example of a mode in which the 2nd output is performed during the A/D conversion period and the 1st output is performed during the writing period.

Fig. 24 is a conceptual diagram illustrating an example of a mode in which various programs are installed in a computer in an imaging device from a storage medium in which the various programs are stored.

Fig. 25 is a block diagram showing an example of a schematic configuration of a smart device incorporating the imaging element according to embodiments 1 to 3.

Detailed Description

Hereinafter, an example of an embodiment of an imaging device according to the technique of the present invention will be described with reference to the drawings.

First, terms used in the following description will be described.

CPU is an abbreviation of "Central Processing Unit." RAM is an abbreviation of "Random Access Memory." ROM is an abbreviation of "Read Only Memory." DRAM is an abbreviation of "Dynamic Random Access Memory." SRAM is an abbreviation of "Static Random Access Memory."

LSI is an abbreviation of "Large-Scale Integration." ASIC is an abbreviation of "Application Specific Integrated Circuit." PLD is an abbreviation of "Programmable Logic Device." FPGA is an abbreviation of "Field-Programmable Gate Array."

SSD is an abbreviation of "Solid State Drive." DVD-ROM is an abbreviation of "Digital Versatile Disc Read Only Memory." USB is an abbreviation of "Universal Serial Bus." HDD is an abbreviation of "Hard Disk Drive." EEPROM is an abbreviation of "Electrically Erasable and Programmable Read Only Memory."

CCD is an abbreviation of "Charge Coupled Device." CMOS is an abbreviation of "Complementary Metal Oxide Semiconductor." EL is an abbreviation of "Electro-Luminescence." A/D is an abbreviation of "Analog/Digital." I/F is an abbreviation of "Interface." UI is an abbreviation of "User Interface."

LVDS is an abbreviation of "Low Voltage Differential Signaling." PCI-e is an abbreviation of "Peripheral Component Interconnect Express." SATA is an abbreviation of "Serial Advanced Technology Attachment." SLVS-EC is an abbreviation of "Scalable Low Voltage Signaling with Embedded Clock." MIPI is an abbreviation of "Mobile Industry Processor Interface."

[ embodiment 1 ]

As an example, as shown in Fig. 1, the imaging device 10 is a lens-interchangeable camera. The image pickup apparatus 10 is a digital camera that includes an image pickup apparatus main body 12 and an interchangeable lens 14 interchangeably mounted to the image pickup apparatus main body 12, and that omits a reflex mirror.

An imaging element 44 is provided on the image pickup apparatus main body 12. When the interchangeable lens 14 is attached to the image pickup apparatus main body 12, subject light representing a subject is transmitted through the interchangeable lens 14 to be imaged on the imaging element 44, and image data 69 (see fig. 3 and 4) representing an image of the subject is generated by the imaging element 44.

A hybrid viewfinder (registered trademark) 16 is provided in the image pickup apparatus main body 12. The hybrid finder 16 is a finder in which, for example, an optical viewfinder (hereinafter, "OVF") and an electronic viewfinder (hereinafter, "EVF") are selectively used. OVF is an abbreviation of "optical viewfinder," and EVF is an abbreviation of "electronic viewfinder."

A viewfinder switching lever 18 is provided on the front surface of the imaging apparatus main body 12. By rotating the finder switching lever 18 in the direction of arrow SW, the display is switched between an optical image visually recognizable through the OVF and an electronic image visually recognizable through the EVF, that is, a through image. The "through image" referred to here is a moving image for display based on the image data 69 (see Figs. 3 and 4) obtained by the imaging element 44. The through image is also commonly referred to as a live view image. A release button 20 and a dial 22 are provided on the upper surface of the imaging device main body 12. The dial 22 is operated when setting an operation mode of the imaging system, an operation mode of the playback system, and the like.

The release button 20 functions as an imaging preparation instructing unit and an imaging instructing unit, and can detect a two-stage pressing operation consisting of an imaging preparation instruction state and an imaging instruction state. The imaging preparation instruction state is, for example, a state in which the release button is pressed from a standby position to an intermediate position (half-pressed position), and the imaging instruction state is a state in which the release button is pressed to a final pressed position (fully pressed position) beyond the intermediate position.
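The two-stage press can be modeled as a simple threshold classifier over the button's travel; the numeric thresholds and state names below are assumptions for illustration only:

```python
def release_button_state(travel, half=0.5, full=1.0):
    """Classify the release button position (0.0 = standby position).

    Crossing the intermediate (half-pressed) position yields the
    imaging preparation instruction state; crossing the final
    (fully pressed) position yields the imaging instruction state.
    """
    if travel >= full:
        return "imaging instruction"
    if travel >= half:
        return "imaging preparation instruction"
    return "standby"
```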

In the image pickup apparatus 10, a shooting mode and a playback mode are selectively set as the operation mode in accordance with a user's instruction. The shooting modes are roughly classified into a shooting mode for displaying moving images and a shooting mode for recording.

As an example, as shown in fig. 2, a touch panel display 26, an instruction key 28, and a viewfinder eyepiece portion 30 are provided on the back surface of the imaging apparatus main body 12.

The touch panel display 26 includes a 1 st display 32 and a touch panel 34 (see also fig. 5). As an example of the 1 st display 32, a liquid crystal display may be mentioned. The 1 st display 32 may be another display such as an organic EL display, instead of the liquid crystal display.

The 1 st display 32 displays image and character information and the like. The 1 st display 32 is used to display a through image obtained by continuous shooting when the image pickup apparatus 10 is in the shooting mode. The 1 st display 32 is also used to display a still image obtained by shooting when an instruction for still image shooting is given. The 1 st display 32 is also used to display a playback image, a menu screen, and the like when the image pickup apparatus 10 is in the playback mode.

The touch panel 34 is a transmission type touch panel, and overlaps with the surface of the display area of the 1 st display 32. The touch panel 34 detects a contact by a pointer such as a finger or a stylus pen.

The instruction key 28 receives various instructions such as selection of one or more menus, determination of selected contents, deletion of selected contents, zooming, and frame transfer.

As an example, as shown in fig. 3, the interchangeable lens 14 has an imaging lens 40. The imaging lens 40 includes an objective lens 40A, a focusing lens 40B, and a diaphragm 40C. The objective lens 40A, the focus lens 40B, and the diaphragm 40C are arranged in this order from the object side to the image pickup apparatus main body 12 side along the optical axis L1. The focus lens 40B and the diaphragm 40C are operated by power from a drive source (not shown) such as a motor. That is, the focus lens 40B and the diaphragm 40C move along the optical axis L1 in accordance with the applied power. Further, the diaphragm 40C adjusts exposure by operating in accordance with the applied power.

The imaging apparatus main body 12 includes a mechanical shutter 42, an imaging element 44, and a processing unit 45. The mechanical shutter 42 is operated by receiving power from a drive source (not shown) such as a motor. The imaging element 44 includes a photoelectric conversion element 61 having a light receiving surface 61A. When the interchangeable lens 14 is attached to the image pickup apparatus body 12, subject light representing a subject passes through the imaging lens 40 and forms an image on the light receiving surface 61A of the imaging element 44 via the mechanical shutter 42. The photoelectric conversion element 61 generates image data 69 representing an image of an object by photoelectrically converting object light imaged on the light receiving surface 61A. The imaging element 44 digitizes the image data 69 generated by the photoelectric conversion element 61, and outputs the digitized image data to the processing unit 45 via each of the communication lines 53 and 55.
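The digitization performed by the imaging element 44 can be illustrated with a toy A/D conversion routine; the 12-bit depth, the full-scale value, and the function name are assumptions for illustration, not values from the patent:

```python
def a_d_convert(analog_samples, bits=12, full_scale=1.0):
    """Quantize analog pixel values (0..full_scale) to digital codes,
    mimicking the digitization the imaging element performs before
    outputting image data to the processing unit."""
    levels = (1 << bits) - 1
    codes = []
    for v in analog_samples:
        v = min(max(v, 0.0), full_scale)  # clamp to the input range
        codes.append(round(v / full_scale * levels))
    return codes

digital_line = a_d_convert([0.0, 0.25, 1.0, 2.0])
```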

The imaging apparatus main body 12 includes the processing unit 45 and a UI system device 48. The processing unit 45 is an external processor disposed outside the imaging element 44. The processing unit 45 is an example of the "external processing unit of an imaging element" according to the technique of the present invention. The processing unit 45 is a circuit located downstream of the imaging element 44 and includes a controller 46 and a signal processing circuit 50.

The controller 46 controls the entire image pickup apparatus 10. The UI system component 48 is a component that prompts information to the user or receives instructions from the user. The UI system device 48 is connected to the controller 46, and the controller 46 acquires various information from the UI system device 48 and controls the UI system device 48.

The imaging element 44 is connected to the controller 46 via a communication line 57, and generates image data 69 by photographing a subject under the control of the controller 46.

The imaging element 44 is connected to the signal processing circuit 50 via a communication line 53 and a communication line 55. Specifically, the imaging element 44 and the signal processing circuit 50 are connected in parallel via a communication line 53 and a communication line 55. The imaging element 44 and the signal processing circuit 50 are connected by the PCI-e connection standard via a communication line 53, and are connected by the LVDS connection standard via a communication line 55.

Here, PCI-e and LVDS are exemplified as connection standards, but the technique of the present invention is not limited thereto, and other connection standards may be used. Examples of other connection standards include SATA, SLVS-EC, and MIPI. However, these connection standards are merely examples, and any connection standard may be used as long as communication via the communication line 53 and communication via the communication line 55 can be performed between the imaging element 44 and the signal processing circuit 50 independently of each other.

Here, an example of a mode in which communication is performed between the imaging element 44 and the signal processing circuit 50 in a wired form using the communication line 53 and the communication line 55 is given, but the technique of the present invention is not limited to this. For example, instead of wired communication between the imaging element 44 and the signal processing circuit 50 via each of the communication line 53 and the communication line 55, wireless communication may be performed between the imaging element 44 and the signal processing circuit 50. In this case, it is sufficient to secure a 1 st communication path of a wireless format corresponding to a wired communication path via the communication line 53 and a 2 nd communication path of a wireless format corresponding to a wired communication path via the communication line 55. The 1 st communication path and the 2 nd communication path are communication paths capable of performing wireless communication in a frequency band that does not interfere with each other in a wireless format in which communication standards are different from each other. Further, communication may be performed between the imaging element 44 and the signal processing circuit 50 independently of each other through two communication paths, i.e., a wired communication path and a wireless communication path.

The signal processing circuit 50 is an LSI, specifically, a device including an ASIC. The signal processing circuit 50 is connected to the controller 46 via a communication line 60, and the controller 46 acquires various information from the signal processing circuit 50 and controls the signal processing circuit 50.

The image data 69 is input from the imaging element 44 to the signal processing circuit 50 via the communication lines 53, 55. As will be described in detail later, the signal processing circuit 50 performs various signal processes on the image data 69 input via the communication lines 53 and 55.

In the present embodiment, a device including an ASIC is used as the signal processing circuit 50. However, this is merely an example, and the signal processing circuit 50 may also be a device including an ASIC, an FPGA, and/or a PLD. The signal processing circuit 50 may be a computer including a CPU, a ROM, and a RAM. The number of the CPUs may be one or plural. The signal processing circuit 50 may be implemented by a combination of a hardware configuration and a software configuration.

The imaging element 44 is an example of a "laminated imaging element" according to the technique of the present invention. In the present embodiment, the imaging element 44 is a CMOS image sensor. Further, although a CMOS image sensor is exemplified as the imaging element 44 here, the technique of the present invention is not limited to this, and the technique of the present invention is also applicable to, for example, a CCD image sensor as the imaging element 44.

As an example, as shown in fig. 4, a read synchronization signal is input from the controller 46 to the imaging element 44 via the communication line 57. The read synchronization signal includes a vertical synchronization signal and a horizontal synchronization signal. The vertical synchronization signal is a synchronization signal that defines the read start timing of each frame of the image data 69 from the photoelectric conversion element 61. The horizontal synchronization signal is a synchronization signal that defines the read start timing of the image data 69 from each horizontal line of the photoelectric conversion element 61. In the imaging element 44, the image data 69 is read out from the photoelectric conversion element 61 at a frame rate determined in accordance with the vertical synchronization signal input from the controller 46 via the communication line 57.

In the example shown in fig. 4, the frame rate of the imaging element 44 is a frame rate at which 8 frames are read out from the photoelectric conversion element 61 in the period T. An example of a specific frame rate is 120 fps (frames per second).
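
As a quick check of the figures above (this sketch is illustrative only and not part of the embodiment), the period T for reading out 8 frames at 120 fps works out to roughly 66.7 ms:

```python
# Illustrative arithmetic only: values are taken from the example above.
frame_rate = 120                 # frames per second
frames_in_period = 8             # frames read out in the period T
period_t = frames_in_period / frame_rate   # seconds
print(f"T = {period_t * 1000:.1f} ms")     # T = 66.7 ms
```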

For example, as shown in fig. 5, the controller 46 includes a CPU46A, a ROM46B, a RAM46C, a connection I/F46D, and an input I/F46E. The CPU46A, ROM46B, RAM46C, connection I/F46D, and input I/F46E are connected to each other via a bus 88.

Various programs are stored in the ROM 46B. The CPU46A reads out the various programs from the ROM46B and loads them into the RAM 46C. The CPU46A controls the entire image pickup apparatus 10 in accordance with the various programs loaded into the RAM 46C.

The connection I/F46D is a communication device having an FPGA, and is connected to the imaging element 44 via a communication line 57. The CPU46A controls the imaging element 44 via the connection I/F46D.

The input I/F46E is a communication device having an FPGA, and is connected to the signal processing circuit 50 via a communication line 60. The image data 69 (see fig. 3 and 4) subjected to various signal processing by the signal processing circuit 50 is input to the input I/F46E via the communication line 60. The input I/F46E transfers the image data 69 input from the signal processing circuit 50 to the CPU 46A.

The auxiliary storage device 80 and the external I/F82 are connected to the bus 88. The auxiliary storage device 80 is a nonvolatile memory such as an SSD, an HDD, or an EEPROM. The CPU46A reads and writes various information from and to the auxiliary storage device 80. The auxiliary storage device 80 is an example of the "storage device" according to the technique of the present invention.

The external I/F82 is a communication device with an FPGA. External devices (not shown) such as USB memories and memory cards are connected to the external I/F82. The external I/F82 controls the transfer of various information between the CPU46A and external devices. In addition, an external device such as a USB memory or a memory card is an example of the "storage device" according to the technology of the present invention.

The UI system device 48 includes the hybrid viewfinder 16, the touch panel display 26, and the receiving device 84. The 1 st display 32 and the touch panel 34 are connected to the bus 88. Therefore, the CPU46A causes the 1 st display 32 to display various information and operates in accordance with various instructions received by the touch panel 34.

The receiving device 84 includes the touch panel 34 and the hard key portion 25. The hard key portion 25 is a plurality of hard keys, and has a release button 20, a dial 22, and an indication key 28. The hard key unit 25 is connected to the bus 88, and the CPU46A operates in accordance with various instructions received by the hard key unit 25.

The hybrid viewfinder 16 includes a 2 nd display 86, and the CPU46A causes the 2 nd display 86 to display various information. As an example of the 2 nd display 86, a liquid crystal display can be mentioned. The 2 nd display 86 may be another display such as an organic EL display instead of the liquid crystal display.

For example, as shown in fig. 6, the hybrid viewfinder 16 includes an OVF90 and an EVF92. The OVF90 is a reverse Galilean viewfinder and has an eyepiece lens 94, a prism 96, and an objective lens 98. The EVF92 has the 2 nd display 86, the prism 96, and the eyepiece lens 94.

A liquid crystal shutter 100 is disposed on the object side of the objective lens 98 along the optical axis L2 of the objective lens 98, and when the EVF92 is used, the liquid crystal shutter 100 blocks light so that an optical image does not enter the objective lens 98.

The prism 96 reflects the electronic image or various information displayed on the 2 nd display 86 and guides it to the eyepiece lens 94, synthesizing the optical image with the electronic image and/or various information displayed on the 2 nd display 86. An example of the electronic image displayed on the 2 nd display 86 is a through image 102 based on the image data 69.

In the OVF mode, the CPU46A controls the liquid crystal shutter 100 to be in the non-light-shielding state so that the optical image can be visually recognized from the eyepiece lens 94. In the EVF mode, the CPU46A controls the liquid crystal shutter 100 to be in the light-shielded state so that only the electronic image displayed on the 2 nd display 86 can be visually recognized from the eyepiece lens 94.
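
The mode-dependent control of the liquid crystal shutter 100 described above can be summarized in a short sketch (the function name and state strings are assumptions for illustration, not part of the embodiment):

```python
# Illustrative sketch: the liquid crystal shutter state follows the
# viewfinder mode. In OVF mode the shutter does not shield light, so the
# optical image reaches the eyepiece lens; in EVF mode it shields light,
# so only the electronic image on the 2nd display is visible.
def liquid_crystal_shutter_state(mode: str) -> str:
    if mode == "OVF":
        return "non-light-shielding"
    if mode == "EVF":
        return "light-shielding"
    raise ValueError(f"unknown viewfinder mode: {mode}")

print(liquid_crystal_shutter_state("OVF"))  # non-light-shielding
print(liquid_crystal_shutter_state("EVF"))  # light-shielding
```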

In the following description, for convenience of explanation, the display is referred to as "display" without reference numerals when it is not necessary to distinguish between the 1 st display 32 (see fig. 2 and 5) and the 2 nd display 86. The display is an example of the "display unit (display)" according to the technology of the present invention. The CPU46A is an example of the "display control unit (display processor)" and the "storage control unit (storage processor)" according to the technique of the present invention.

As an example, as shown in fig. 7, the imaging element 44 incorporates a photoelectric conversion element 61, a processing circuit 62, and a memory 64. The imaging element 44 is an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are integrated into a single chip. That is, the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are packaged together. In the imaging element 44, the processing circuit 62 and the memory 64 are stacked on the photoelectric conversion element 61. Specifically, the photoelectric conversion element 61 and the processing circuit 62 are electrically connected to each other by conductive bumps (not shown) of copper or the like, and the processing circuit 62 and the memory 64 are likewise electrically connected to each other by conductive bumps (not shown) of copper or the like. Here, a three-layer structure of the photoelectric conversion element 61, the processing circuit 62, and the memory 64 is exemplified, but the technique of the present invention is not limited to this; a two-layer structure may be employed in which the photoelectric conversion element 61 forms one layer and the processing circuit 62 and the memory 64 together form the other layer. The memory 64 is an example of the "storage unit (memory)" according to the technique of the present invention.

The processing circuit 62 is, for example, an LSI. The memory 64 is a memory whose write timing and read timing are different. Here, as an example of the memory 64, a DRAM is used. However, the technique of the present invention is not limited to this, and SRAM may be employed as the memory 64 instead of DRAM.

The processing circuit 62 is a device including an ASIC and an FPGA, and controls the entire imaging element 44 according to the instruction of the controller 46. Note that, although the processing circuit 62 is realized by a device including an ASIC and an FPGA here, the technique of the present invention is not limited to this, and may be a device including an ASIC, an FPGA, and/or a PLD, for example. Further, as the processing circuit 62, a computer including a CPU, a ROM, and a RAM can be used. The number of the CPUs may be one or plural. The processing circuit 62 may be implemented by a combination of a hardware configuration and a software configuration.

The photoelectric conversion element 61 includes a plurality of photodiodes arranged in a matrix. As an example of the plurality of photodiodes, photodiodes corresponding to "4896 × 3265" pixels can be given.

A color filter is disposed on each photodiode included in the photoelectric conversion element 61. The color filters include a G filter corresponding to G (green), which contributes most to obtaining a luminance signal, an R filter corresponding to R (red), and a B filter corresponding to B (blue). The photoelectric conversion element 61 has R pixels, G pixels, and B pixels.

The R pixel is a pixel corresponding to a photodiode in which an R color filter is arranged, the G pixel is a pixel corresponding to a photodiode in which a G color filter is arranged, and the B pixel is a pixel corresponding to a photodiode in which a B color filter is arranged. The R pixels, G pixels, and B pixels are arranged with a predetermined periodicity in the row direction (horizontal direction) and the column direction (vertical direction). In the present embodiment, the R pixels, G pixels, and B pixels are arranged with a periodicity corresponding to the X-Trans (registered trademark) arrangement. In addition, although the X-Trans arrangement is exemplified here, the technique of the present invention is not limited to this, and the arrangement of the R pixels, G pixels, and B pixels may be a Bayer arrangement, a honeycomb arrangement, or the like.
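
The idea of a periodically repeating color-filter arrangement can be illustrated with a small sketch. The X-Trans arrangement has a 6 × 6 unit cell; for brevity, the sketch below uses the simpler Bayer arrangement mentioned as an alternative (the function name is a hypothetical choice, not from the embodiment):

```python
# Illustrative sketch: a Bayer-type color-filter map with an RGGB unit
# cell repeated with period 2 in the row and column directions.
def bayer_filter(row: int, col: int) -> str:
    cell = [["R", "G"],
            ["G", "B"]]
    return cell[row % 2][col % 2]

# Print the filter letters for a 4 x 4 block of pixels.
for r in range(4):
    print(" ".join(bayer_filter(r, c) for c in range(4)))
```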

The imaging element 44 has a so-called electronic shutter function, and controls the charge accumulation time of each photodiode in the photoelectric conversion element 61 by activating the electronic shutter function under the control of the controller 46. The charge accumulation time is a so-called shutter speed.

In the imaging apparatus 10, still image shooting and moving image shooting are performed in a rolling shutter method. The still image shooting is realized by activating the electronic shutter function and operating the mechanical shutter 42 (refer to fig. 3), and the through image shooting is realized by activating the electronic shutter function without operating the mechanical shutter 42. Note that, although the rolling shutter method is illustrated here, the technique of the present invention is not limited to this, and a global shutter method may be applied instead of the rolling shutter method.

The processing circuit 62 reads out the image data 69 (refer to fig. 3 and 4) obtained by imaging a subject with the photoelectric conversion element 61. The image data 69 is the signal charge accumulated in the photoelectric conversion element 61. The processing circuit 62 A/D converts the analog image data 69 read out from the photoelectric conversion element 61. The processing circuit 62 stores the digital image data 69 obtained by A/D converting the analog image data 69 in the memory 64.

The processing circuit 62 is connected to the signal processing circuit 50 via the communication line 53 and the communication line 55. The processing circuit 62 is connected to the controller 46 via a communication line 57.

The processing circuit 62 and the signal processing circuit 50 communicate with each other via the communication line 53 in accordance with the PCI-e connection standard and communicate with each other via the communication line 55 in accordance with the LVDS connection standard.

As an example, as shown in fig. 8, the processing circuit 62 is an example of a "processor" according to the technique of the present invention and includes a readout circuit 62A, a digital processing circuit 62B, a selector 62C, a control circuit 62D, and communication I/Fs 62E1, 62E2, and 62E3. The communication I/F62E2 is an example of the "2 nd output unit (2 nd communication interface)" according to the technique of the present invention, and the communication I/F62E3 is an example of the "1 st output unit (1 st communication interface)" according to the technique of the present invention. The control circuit 62D is an example of the "memory controller" according to the technique of the present invention.

The readout circuit 62A is connected to each of the photoelectric conversion element 61, the digital processing circuit 62B, and the control circuit 62D. The memory 64 is connected to the control circuit 62D. The selector 62C is connected to each of the digital processing circuit 62B, the control circuit 62D, and the communication I/F62E 3. Each of the communication I/fs 62E1, 62E2, 62E3 is connected to the control circuit 62D.

As an example, as shown in fig. 8, the image data 69 is roughly divided into analog image data 69A and digital image data 69B. Hereinafter, for convenience of explanation, the analog image data 69A and the digital image data 69B are referred to as "image data 69" when it is not necessary to distinguish between them.

The communication I/F62E1 is a communication device having an FPGA, and is connected to the controller 46 via a communication line 57. The controller 46 outputs the readout synchronization signal to the communication I/F62E1 via the communication line 57. The communication I/F62E1 receives the read synchronization signal from the controller 46 via the communication line 57, and outputs the received read synchronization signal to the control circuit 62D.

The communication I/F62E2 is a communication device having an FPGA, and is connected to the signal processing circuit 50 via the communication line 53 in accordance with the PCI-E connection standard. The communication I/F62E2 controls communication between the signal processing circuit 50 and the control circuit 62D. Here, a communication device having an FPGA is used as the communication I/F62E2, but this is merely an example, and the communication I/F62E2 may be a device including an ASIC, an FPGA, and/or a PLD. The communication I/F62E2 may be a computer including a CPU, a ROM, and a RAM. The number of the CPUs may be one or plural. Also, the communication I/F62E2 may be implemented by a combination of a hardware configuration and a software configuration.

The communication I/F62E3 is a communication device having an FPGA, and is connected to the signal processing circuit 50 via the communication line 55 in accordance with the LVDS connection standard. The communication I/F62E3 controls communication between the signal processing circuit 50 and the selector 62C and communication between the signal processing circuit 50 and the control circuit 62D. Here, a communication device having an FPGA is used as the communication I/F62E3, but this is merely an example, and the communication I/F62E3 may be a device including an ASIC, an FPGA, and/or a PLD. The communication I/F62E3 may be a computer including a CPU, a ROM, and a RAM. The number of CPUs may be one or plural. Also, the communication I/F62E3 may be implemented by a combination of a hardware configuration and a software configuration.

The readout circuit 62A controls the photoelectric conversion element 61 under the control of the control circuit 62D, and reads out analog image data 69A from the photoelectric conversion element 61. The analog image data 69A is read from the photoelectric conversion element 61 in accordance with the read synchronization signal input from the controller 46 to the processing circuit 62.

Specifically, first, the communication I/F62E1 receives the read synchronization signal from the controller 46, and outputs the received read synchronization signal to the control circuit 62D. Next, the control circuit 62D transmits the readout synchronization signal input from the communication I/F62E1 to the readout circuit 62A. That is, the vertical synchronization signal and the horizontal synchronization signal are transmitted to the readout circuit 62A. Further, the readout circuit 62A starts reading out the analog image data 69A from the photoelectric conversion element 61 in frame units in accordance with the vertical synchronization signal transmitted from the control circuit 62D. Further, the readout circuit 62A starts readout of the analog image data 69A in units of horizontal lines in accordance with the horizontal synchronization signal transmitted from the control circuit 62D.
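
The way the two synchronization signals pace the readout described above can be sketched as follows (a simulation with hypothetical names, not the actual circuit):

```python
# Illustrative sketch: a vertical sync event starts the readout of one
# frame, and each horizontal sync event starts the readout of one
# horizontal line, mirroring the behavior of the readout circuit 62A.
def read_frame(sensor_rows, sync_log):
    sync_log.append("vsync")        # frame readout starts on vertical sync
    frame = []
    for row in sensor_rows:
        sync_log.append("hsync")    # line readout starts on horizontal sync
        frame.append(list(row))     # analog values of this horizontal line
    return frame

log = []
frame = read_frame([[0.1, 0.2], [0.3, 0.4]], log)
print(log)     # ['vsync', 'hsync', 'hsync']
print(frame)   # [[0.1, 0.2], [0.3, 0.4]]
```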

The readout circuit 62A performs analog signal processing on the analog image data 69A read out from the photoelectric conversion element 61. The analog signal processing includes known processing such as noise removal processing and analog gain processing. The noise elimination processing is processing of eliminating noise caused by a variation in characteristics between pixels included in the photoelectric conversion element 61. The analog gain processing is processing of applying a gain to the analog image data 69A. The analog image data 69A thus subjected to the analog signal processing is output to the digital processing circuit 62B through the readout circuit 62A.

The digital processing circuit 62B includes an A/D converter 62B1. The A/D converter 62B1 A/D converts the analog image data 69A. The A/D converter 62B1 is an example of the "1 st A/D converter" and the "2 nd A/D converter" according to the technique of the present invention.

The digital processing circuit 62B performs digital signal processing on the analog image data 69A input from the readout circuit 62A. The digital signal processing includes, for example, correlated double sampling, A/D conversion by the A/D converter 62B1, and digital gain processing.

The digital processing circuit 62B performs correlated double sampling on the analog image data 69A. The analog image data 69A subjected to correlated double sampling is A/D converted by the A/D converter 62B1, whereby the analog image data 69A is digitized and digital image data 69B is obtained as RAW data. The digital processing circuit 62B then performs digital gain processing on the digital image data 69B. The digital gain processing refers to processing of applying a gain to the digital image data 69B. The digital image data 69B obtained by performing the digital signal processing in this manner is output to the selector 62C by the digital processing circuit 62B.
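
The order of operations just described, correlated double sampling, then A/D conversion, then digital gain, can be sketched for a single sample (all numeric values and the 12-bit depth are assumptions for illustration, not from the embodiment):

```python
# Illustrative sketch (made-up values, assumed 12-bit depth) of the
# digital signal processing chain applied to one analog sample.
def correlated_double_sample(reset_level: float, signal_level: float) -> float:
    # CDS subtracts the reset level from the signal level to cancel
    # per-pixel offset noise.
    return signal_level - reset_level

def a_d_convert(analog: float, full_scale: float = 1.0, bits: int = 12) -> int:
    # Uniform quantization of the analog value to a digital code.
    max_code = (1 << bits) - 1
    code = int(round(analog / full_scale * max_code))
    return max(0, min(code, max_code))

def digital_gain(code: int, gain: float) -> int:
    # Digital gain processing: a gain applied to the digital code.
    return int(code * gain)

sample = correlated_double_sample(reset_level=0.25, signal_level=0.75)
raw_code = a_d_convert(sample)
print(digital_gain(raw_code, 1.5))  # 3072
```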

The selector 62C selectively transfers the digital image data 69B input from the digital processing circuit 62B to two transfer destinations. That is, the selector 62C selectively transfers the digital image data 69B input from the digital processing circuit 62B to the control circuit 62D and the communication I/F62E3 in accordance with an instruction from the control circuit 62D.

The control circuit 62D stores the digital image data 69B input from the selector 62C in the memory 64. The memory 64 is a memory capable of storing digital image data 69B of a plurality of frames. The memory 64 has a storage area (not shown) in pixel units, and the digital image data 69B is stored in the corresponding storage area in the memory 64 in pixel units by the control circuit 62D.

The control circuit 62D can make random access to the memory 64, and acquire the digital image data 69B from the memory 64 in accordance with a request from the signal processing circuit 50 via the communication I/F62E 2. The control circuit 62D outputs the digital image data 69B acquired from the memory 64 to the communication I/F62E 2.

Under the control of the control circuit 62D, the processing circuit 62 outputs the digital image data 69B to the signal processing circuit 50 by two outputs performed independently of each other: the 1 st output via the communication line 55 and the 2 nd output via the communication line 53. The 1 st output and the 2 nd output are outputs of different output modes. That is, between the 1 st output and the 2 nd output, the transmission path through which the digital image data 69B passes before being output to the signal processing circuit 50 is different, and the connection standard between the imaging element 44 and the signal processing circuit 50 is also different.

The 1 st output refers to output of the digital image data 69B to the signal processing circuit 50 via the 1 st transmission path. The 1 st transmission path is a path that transfers the digital image data 69B in the order of the selector 62C, the communication I/F62E3, and the signal processing circuit 50, without passing through the control circuit 62D. That is, the output mode of the 1 st output is an output mode in which the digital image data 69B obtained from the A/D converter 62B1 is output without being stored in the memory 64. The digital image data 69B transmitted through the 1 st transmission path is an example of the "1 st image data obtained by imaging the subject" according to the technique of the present invention.

The 2 nd output refers to output of the digital image data 69B to the signal processing circuit 50 via the 2 nd transmission path. The 2 nd transmission path is a path that transfers the digital image data 69B in the order of the memory 64, the control circuit 62D, the communication I/F62E2, and the signal processing circuit 50. That is, the output mode of the 2 nd output is an output mode in which the digital image data 69B read out from the memory 64 by the control circuit 62D is output. The digital image data 69B transmitted through the 2 nd transmission path is an example of the "2 nd image data based on the image data stored in the storage unit" according to the technique of the present invention.

The 1 st output is realized by using the communication I/F62E3 and the communication line 55. That is, when the digital image data 69B is input from the selector 62C, the communication I/F62E3 outputs the input digital image data 69B to the signal processing circuit 50 via the communication line 55.

The 2 nd output is realized by using the communication I/F62E2 and the communication line 53. That is, when the digital image data 69B is input from the control circuit 62D, the communication I/F62E2 outputs the input digital image data 69B to the signal processing circuit 50 via the communication line 53.
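
The difference between the two output modes can be modeled in a few lines (the class and method names are hypothetical, not from the embodiment): the 1 st output carries the frame arriving from the selector, while the 2 nd output carries whatever was previously stored in the built-in memory, so the two interfaces can carry different frames at the same time.

```python
# Illustrative model of the two transmission paths: the 1st output
# bypasses the memory 64, while the 2nd output reads from it.
class ImagingElementModel:
    def __init__(self):
        self.memory = None              # models the built-in memory 64

    def capture(self, frame):
        first_output = frame            # 1st path: selector -> I/F 62E3 (line 55)
        second_output = self.memory     # 2nd path: memory -> I/F 62E2 (line 53)
        self.memory = frame             # the new frame is stored for later
        return first_output, second_output

element = ImagingElementModel()
print(element.capture("frame N"))      # ('frame N', None)
print(element.capture("frame N+1"))    # ('frame N+1', 'frame N')
```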

The signal processing circuit 50 performs the above-described various signal processes on the digital image data 69B input from the processing circuit 62 via the communication lines 53 and 55. The various signal processing includes known signal processing such as demosaicing, digital thinning, and digital addition.

The demosaicing process is a process of calculating all color information for each pixel from a mosaic image corresponding to the arrangement of color filters. For example, in the case of an imaging element configured by color filters of three colors of RGB, color information of all RGB is calculated for each pixel from a mosaic image configured by RGB. The digital thinning-out processing is processing of thinning out pixels included in the digital image data 69B in units of lines. The line unit refers to, for example, a horizontal line unit and/or a vertical line unit. The digital addition processing is, for example, processing of performing addition averaging on pixel values of a plurality of pixels included in the digital image data 69B.
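
Two of the processes named above, line-unit thinning and digital addition, can be sketched as follows (the function names and the thinning step are assumptions for illustration):

```python
# Illustrative sketch: digital thinning keeps every step-th horizontal
# line; digital addition performs addition-averaging of pixel values.
def thin_lines(image, step=2):
    return image[::step]

def digital_addition(pixels):
    return sum(pixels) / len(pixels)

img = [[1, 2], [3, 4], [5, 6], [7, 8]]
print(thin_lines(img))                     # [[1, 2], [5, 6]]
print(digital_addition([10, 20, 30, 40]))  # 25.0
```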

In addition, the various signal processing also includes other well-known signal processing. Examples of other known signal processing include white balance adjustment, sharpness adjustment, gamma correction, color space conversion processing, and color difference correction.

As an example, as shown in fig. 9, processing including image pickup processing and output processing is performed in the imaging element 44. In the image pickup processing, after the Nth (N: natural number) exposure, Nth readout, Nth reset, Nth digital signal processing, and Nth storage are performed, the (N+1)th exposure, (N+1)th readout, (N+1)th reset, and (N+1)th digital signal processing are performed. After the output processing is performed, N is incremented by 1, and the image pickup processing and the output processing are repeated.

At the start of the image pickup processing, the photoelectric conversion element 61 is reset by the readout circuit 62A, and the residual charge of each pixel in the photoelectric conversion element 61 is eliminated. The photoelectric conversion element 61 performs the Nth exposure during the period from the previous reset of the photoelectric conversion element 61 by the readout circuit 62A to the Nth readout.

When the Nth vertical synchronization signal is input to the readout circuit 62A, the readout circuit 62A performs the Nth readout. The Nth readout is readout of the analog image data 69A by the readout circuit 62A in accordance with the input of the Nth vertical synchronization signal to the readout circuit 62A.

The Nth reset refers to resetting of the photoelectric conversion element 61 by the readout circuit 62A in correspondence with the Nth readout. The Nth digital signal processing refers to digital signal processing performed by the digital processing circuit 62B on the analog image data 69A obtained by the Nth readout.

The Nth storage is storage of the digital image data 69B obtained by the Nth digital signal processing into the memory 64. The Nth storage is realized by using the selector 62C, the control circuit 62D, and the memory 64. That is, the digital image data 69B obtained by the Nth digital signal processing is input to the control circuit 62D via the selector 62C and is stored in the memory 64 by the control circuit 62D.

The photoelectric conversion element 61 performs the (N+1)th exposure during the period from the Nth reset to the (N+1)th readout.

When the (N+1)th vertical synchronization signal is input to the readout circuit 62A, the readout circuit 62A performs the (N+1)th readout. The (N+1)th readout is readout of the analog image data 69A by the readout circuit 62A in accordance with the input of the (N+1)th vertical synchronization signal to the readout circuit 62A.

The (N+1)th reset refers to resetting of the photoelectric conversion element 61 by the readout circuit 62A in correspondence with the (N+1)th readout. The (N+1)th digital signal processing is digital signal processing performed by the digital processing circuit 62B on the analog image data 69A obtained by the (N+1)th readout.

In the output processing, the 1 st output and the 2 nd output are performed in parallel. That is, the newest digital image data 69B is output to the signal processing circuit 50 via the 1 st transmission path, and the digital image data 69B of one frame before is output to the signal processing circuit 50 via the 2 nd transmission path.

Here, the newest digital image data 69B is the digital image data 69B obtained by the (N+1)th digital signal processing. The digital image data 69B of one frame before is the digital image data 69B stored in the memory 64 at the present time, that is, the digital image data 69B obtained by the Nth digital signal processing, which has been input to the control circuit 62D via the selector 62C and stored in the memory 64 by the control circuit 62D.

In the imaging element 44, since the memory 64 is a DRAM, writing to and reading from the memory 64 cannot be performed simultaneously. Therefore, as an example, as shown in fig. 10, the 1 st output and the 2 nd output are performed during a period in which writing to the memory 64 cannot be performed (the "write impossible period" shown in fig. 10). In other words, the imaging element 44 uses the period during which writing is impossible to output the digital image data of 2 consecutive frames in parallel to the signal processing circuit 50.

In the example shown in fig. 10, the 1st output is performed in accordance with the horizontal synchronization signal input from the controller 46 via the communication I/F62E1. That is, the communication I/F62E3 outputs the digital image data 69B input from the selector 62C (see fig. 8) to the signal processing circuit 50 one horizontal line at a time, in accordance with the horizontal synchronization signal input from the controller 46 via the communication I/F62E1 and the control circuit 62D.

On the other hand, the 2nd output is performed in parallel with the 1st output. That is, while the 1st output is in progress, the control circuit 62D acquires from the memory 64 the digital image data 69B of one frame earlier than the digital image data 69B being output from the communication I/F62E3, and transfers the acquired digital image data 69B to the communication I/F62E2. The communication I/F62E2 outputs this one frame's worth of digital image data 69B to the signal processing circuit 50.
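As an illustrative aid only (not part of the disclosed configuration), the parallel output scheme described above can be modeled as a minimal sketch: the newest frame bypasses the memory on the 1st transfer path while the frame stored one vertical synchronization period earlier is read out on the 2nd transfer path, after which the newest frame is written to the memory. The class and method names here are assumptions for illustration.

```python
class DualOutputElement:
    def __init__(self):
        self.memory = None  # models the single stored frame in memory 64

    def on_vsync(self, new_frame):
        first_out = new_frame     # 1st output: newest frame, memory bypassed
        second_out = self.memory  # 2nd output: frame of one vsync earlier
        self.memory = new_frame   # write occurs outside the output period
        return first_out, second_out

element = DualOutputElement()
element.on_vsync("frame_N")                # ('frame_N', None)
print(element.on_vsync("frame_N_plus_1"))  # ('frame_N_plus_1', 'frame_N')
```

In this model, the two return values correspond to the data placed on the 1st and 2nd transfer paths during a single "write impossible period".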

Next, the operation of the imaging apparatus 10 will be described.

First, with reference to fig. 11, a flow of control processing performed by the processing circuit 62 of the imaging element 44 will be described.

In the control processing shown in fig. 11, first, in step ST10, the control circuit 62D determines whether or not the memory 64 holds no digital image data 69B. If the digital image data 69B is stored in the memory 64 in step ST10, the determination is no, and the control process proceeds to step ST22. If the digital image data 69B is not stored in the memory 64 in step ST10, the determination is yes, and the control process proceeds to step ST12.

In step ST12, the control circuit 62D determines whether a vertical synchronization signal from the controller 46 is received by the communication I/F62E1. If the vertical synchronization signal is not received in step ST12, the determination is no, and the control process proceeds to step ST20. If the vertical synchronization signal is received in step ST12, the determination is yes, and the control process proceeds to step ST14.

In step ST14, the readout circuit 62A performs readout of the analog image data 69A and reset of the photoelectric conversion element 61, and the control process then shifts to step ST16.

In step ST16, the digital processing circuit 62B performs digital signal processing on the analog image data 69A, and the control process then shifts to step ST18.

The digital image data 69B obtained by subjecting the analog image data 69A to digital signal processing in step ST16 is output to the selector 62C, and the selector 62C transfers the digital image data 69B to the control circuit 62D.

In step ST18, the control circuit 62D stores the digital image data 69B in the memory 64, and the control process then shifts to step ST20.

In step ST20, the control circuit 62D determines whether or not a condition for ending the control process (hereinafter referred to as the "control process termination condition") is satisfied. An example of the control process termination condition is that the reception device 84 (see fig. 5) receives an instruction to terminate the control process. If the control process termination condition is not satisfied in step ST20, the determination is no, and the control process proceeds to step ST10. If the control process termination condition is satisfied in step ST20, the determination is yes, and the control process is terminated.

In step ST22, the control circuit 62D determines whether a vertical synchronization signal from the controller 46 is received by the communication I/F62E1. If the vertical synchronization signal is not received in step ST22, the determination is no, and the control process proceeds to step ST30. If the vertical synchronization signal is received in step ST22, the determination is yes, and the control process proceeds to step ST24.

In step ST24, the readout circuit 62A performs readout of the analog image data 69A and reset of the photoelectric conversion element 61, and the control process then shifts to step ST26.

In step ST26, the digital processing circuit 62B performs digital signal processing on the analog image data 69A, and the control process then shifts to step ST28.

The digital image data 69B obtained by subjecting the analog image data 69A to digital signal processing in step ST26 is output to the selector 62C, and the selector 62C transfers the digital image data 69B to the communication I/F62E3.

In step ST28, the processing circuit 62 performs the 1st output and the 2nd output, and the control process then shifts to step ST30. The 1st output and the 2nd output have different output modes. That is, the 1st output is an output in accordance with the LVDS connection standard using the 1st transfer path (refer to fig. 9), and the 2nd output is an output in accordance with the PCI-e connection standard using the 2nd transfer path (refer to fig. 9).

In step ST28, the communication I/F62E3 outputs the digital image data 69B transferred from the selector 62C to the signal processing circuit 50 via the communication line 55 (1st output). In parallel, the control circuit 62D acquires the digital image data 69B of one frame earlier from the memory 64 in accordance with a request from the controller 46, and outputs it from the communication I/F62E2 to the signal processing circuit 50 via the communication line 53 (2nd output).

In step ST30, the control circuit 62D determines whether or not the control process termination condition is satisfied. If the control process termination condition is not satisfied in step ST30, the determination is no, and the control process proceeds to step ST10. If the control process termination condition is satisfied in step ST30, the determination is yes, and the control process is terminated.
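As an illustrative aid only, the branching of steps ST10 through ST30 in fig. 11 can be sketched as a simple event loop. This is a non-authoritative model; the event encoding (pairs of vsync/end flags) and the log strings are assumptions for illustration.

```python
def control_process(events, memory):
    """events: iterable of (vsync_received, end_condition) pairs.
    memory: dict with key 'frame' (None when nothing is stored), modeling memory 64."""
    log = []
    for vsync, end in events:
        if memory["frame"] is None:                      # ST10: nothing stored yet
            if vsync:                                    # ST12
                log.append("readout+reset")              # ST14
                log.append("digital signal processing")  # ST16
                memory["frame"] = "69B"                  # ST18: store via control circuit
            if end:                                      # ST20
                break
        else:
            if vsync:                                    # ST22
                log.append("readout+reset")              # ST24
                log.append("digital signal processing")  # ST26
                log.append("1st and 2nd outputs")        # ST28: to I/F62E3 and I/F62E2
            if end:                                      # ST30
                break
    return log

print(control_process([(True, False), (True, False), (False, True)], {"frame": None}))
# ['readout+reset', 'digital signal processing', 'readout+reset',
#  'digital signal processing', '1st and 2nd outputs']
```

Note that, as in the flowchart, the first frame is only stored (ST18), and the dual output (ST28) begins from the second vertical synchronization signal onward.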

The digital image data 69B output from the communication I/F62E3 to the signal processing circuit 50 via the communication line 55 by executing the present control processing is transferred to the controller 46 upon being input to the signal processing circuit 50. Likewise, the digital image data 69B output from the communication I/F62E2 to the signal processing circuit 50 via the communication line 53 is transferred to the controller 46 upon being input to the signal processing circuit 50. In the controller 46, the digital image data 69B is input to the input I/F46E, and an image based on the digital image data 69B is displayed on the display by the CPU 46A. The image based on the digital image data 69B input to the input I/F46E is an example of the "1st image based on the 1st image data" and the "2nd image based on the 2nd image data" according to the technique of the present invention.

The digital image data 69B input to the input I/F46E is stored in the auxiliary storage device 80 by the CPU46A, or stored in an external device such as a USB memory (not shown) or a memory card (not shown) via the external I/F82.

As described above, in the image pickup apparatus 10, the digital image data 69B obtained by photographing the subject is output to the signal processing circuit 50 by the communication I/F62E3, and the digital image data 69B stored in the memory 64 is output to the signal processing circuit 50 by the communication I/F62E2. The output mode of the communication I/F62E3 is different from that of the communication I/F62E2. That is, the newest digital image data 69B is output to the signal processing circuit 50 through the communication line 55 by the 1st transfer path (refer to fig. 9), and the digital image data 69B of one frame earlier is output to the signal processing circuit 50 through the communication line 53 by the 2nd transfer path (refer to fig. 9). The communication I/F62E3 and the signal processing circuit 50 are connected in accordance with the LVDS connection standard, and the communication I/F62E2 and the signal processing circuit 50 are connected in accordance with the PCI-e connection standard. Therefore, according to the imaging apparatus 10, output stagnation of the digital image data 69B can be suppressed compared to a case where the digital image data 69B is output to the processing section 45 from only a single communication I/F.

In the imaging apparatus 10, the 1 st output and the 2 nd output (see fig. 8 to 10) are independently performed under the control of the control circuit 62D. Therefore, according to the imaging device 10, the timing of performing the 1 st output and the timing of performing the 2 nd output can be freely changed.

Further, in the imaging apparatus 10, the 2nd output is performed in accordance with a request from the controller 46. Therefore, according to the imaging apparatus 10, performing the 2nd output can be avoided when the processing unit 45 is not in a state of being able to receive the 2nd output.

In the imaging apparatus 10, as the output method of the 1st output, an output method is adopted in which the digital image data 69B obtained from the A/D converter 62B1 is output without being stored in the memory 64. As the output method of the 2nd output, an output method is adopted in which the digital image data 69B read out from the memory 64 by the control circuit 62D is output. That is, the 1st output continues even when the 2nd output cannot be performed. Therefore, according to the imaging apparatus 10, even while the digital image data 69B is being written into the memory 64, the output from the imaging element 44 to the signal processing circuit 50 can be continued.

In the imaging apparatus 10, a memory having a write time different from a read time is used as the memory 64. In the imaging apparatus 10, the 1 st output and the 2 nd output are performed at timings at which the write timing to the memory 64 is avoided. Therefore, according to the imaging apparatus 10, even if the memory 64 is a memory whose write timing and read timing are different, the digital image data 69B can be continuously output from the imaging element 44 to the processing unit 45.

In the imaging apparatus 10, a DRAM is used as the memory 64. In the imaging device 10, the 1 st output and the 2 nd output are performed at timings avoiding the write timing to the DRAM. Therefore, according to the image pickup apparatus 10, even if the memory 64 is a DRAM, the digital image data 69B can be continuously output from the imaging element 44 to the processing section 45.

In the imaging apparatus 10, an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are formed into a single chip is used as the imaging element 44. This improves the portability of the imaging element 44 compared to an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are not formed in one chip. In addition, the degree of freedom in design can be improved as compared with an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are not formed in one chip. Further, it is also possible to contribute to downsizing of the imaging device main body 12, compared to an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are not formed into one chip.

As shown in fig. 7, a multilayer imaging element in which the memory 64 is laminated on the photoelectric conversion element 61 is used as the imaging element 44. Since this shortens the wiring connecting the photoelectric conversion element 61 and the memory 64, the wiring delay can be reduced, and as a result, the transfer speed of the image data 69 from the photoelectric conversion element 61 to the memory 64 can be increased compared to the case where the photoelectric conversion element 61 and the memory 64 are not stacked. The increased transfer speed also contributes to speeding up the processing in the entire processing circuit 62. Further, the degree of freedom in design can be improved compared to the case where the photoelectric conversion element 61 and the memory 64 are not stacked. The stacked structure can also contribute to downsizing of the imaging device main body 12.

Then, in the imaging apparatus 10, a through image based on the digital image data 69B is displayed on the 2 nd display 86. This enables the user to visually recognize the image represented by the digital image data 69B.

Further, in the image pickup apparatus 10, the latest digital image data 69B output from the communication I/F62E3 to the signal processing circuit 50 is stored in the auxiliary storage device 80, the USB memory, and/or the memory card or the like by the CPU 46A. Likewise, the digital image data 69B of one frame earlier output from the communication I/F62E2 to the signal processing circuit 50 is stored in the auxiliary storage device 80, the USB memory, and/or the memory card or the like by the CPU 46A. This allows all frames of the digital image data 69B obtained by imaging the subject to be managed without excess or deficiency.

In addition, in embodiment 1 described above, the case where the 2nd output is performed in response to a request from the controller 46 has been described, but the technique of the present invention is not limited to this. For example, the 2nd output may be started on the condition that the selector 62C starts transferring the digital image data 69B obtained by the above-described "(N+1)th digital signal processing" to the communication I/F62E3. Alternatively, the 2nd output may be started on the condition that the digital processing circuit 62B starts outputting the digital image data 69B obtained by the above-described "(N+1)th digital signal processing" to the selector 62C. In short, the 2nd output may be started at any time while writing to the memory 64 is not being performed.

In the above embodiment 1, the digital image data 69B transferred from the selector 62C to the communication I/F62E3 is output to the signal processing circuit 50 via the communication line 55, but the technique of the present invention is not limited to this. For example, image data obtained by subjecting the digital image data 69B to some image processing by an image processing circuit (not shown) between the selector 62C and the communication I/F62E3 may be output to the signal processing circuit 50 via the communication I/F62E3. Here, the image processing includes known image processing such as thinning-out processing and addition processing. Image data obtained by performing such image processing on the digital image data 69B between the selector 62C and the communication I/F62E3 is an example of the "1st image data" according to the technique of the present invention.

In addition, in embodiment 1 described above, an example of a mode in which the digital image data 69B stored in the memory 64 is output to the signal processing circuit 50 via the communication I/F62E2 has been described, but the technique of the present invention is not limited to this. For example, image data obtained by performing the above-described image processing on the digital image data 69B stored in the memory 64 by the control circuit 62D may be output to the signal processing circuit 50 via the communication I/F62E2. The image data obtained by the control circuit 62D performing the above-described image processing on the digital image data 69B stored in the memory 64 is an example of the "2nd image data" according to the technique of the present invention.

In addition, in embodiment 1 described above, an example is given in which each image based on the digital image data 69B output from each of the communication I/F62E2 and the communication I/F62E3 is displayed on the display by the CPU46A, but the technique of the present invention is not limited to this. For example, an image based on the digital image data 69B output from the communication I/F62E2 or the communication I/F62E3 to the signal processing circuit 50 may be displayed on the display by the CPU 46A.

In the above embodiment 1, an example of a mode is given in which the digital image data 69B output from each of the communication I/F62E2 and the communication I/F62E3 is stored in the auxiliary storage device 80 or the like by the CPU46A, but the technique of the present invention is not limited to this. For example, the digital image data 69B output from the communication I/F62E2 or the communication I/F62E3 to the signal processing circuit 50 may be stored in the auxiliary storage device 80 or the like by the CPU 46A.

In addition, in embodiment 1, as the imaging element 44, an imaging element in which the photoelectric conversion element 61, the processing circuit 62, and the memory 64 are formed in one chip is exemplified, but the technique of the present invention is not limited thereto. For example, at least the photoelectric conversion element 61 and the memory 64 of the photoelectric conversion element 61, the processing circuit 62, and the memory 64 may be formed in one chip.

[ 2 nd embodiment ]

In embodiment 1 described above, an example of a mode in which the 1 st output and the 2 nd output are performed in parallel is described, but in embodiment 2, an example of a mode in which the 1 st output and the 2 nd output are performed alternately is described. In embodiment 2, the same components as those in embodiment 1 are denoted by the same reference numerals, and descriptions thereof are omitted. Hereinafter, a description will be given of a portion different from the above embodiment 1.

As an example, as shown in fig. 12, the imaging apparatus 10 according to embodiment 2 is different from embodiment 1 in that a communication line 59 branched from the communication line 57 is connected to the signal processing circuit 50. In the description of embodiment 2, the imaging apparatus 10 according to embodiment 2 will be simply referred to as "imaging apparatus 10" for convenience of description.

Since the communication line 59 branched from the communication line 57 is connected to the signal processing circuit 50, the read synchronization signal output from the controller 46 is input to the signal processing circuit 50 via the communication line 59. Therefore, the signal processing circuit 50 can perform an operation corresponding to the read synchronization signal input from the controller 46 via the communication line 59.

A vertical synchronization signal is input from the controller 46 to the signal processing circuit 50 via the communication line 59. The signal processing circuit 50 determines a vertical blanking period in accordance with the input timing of the vertical synchronization signal. When entering the vertical blanking period, the signal processing circuit 50 generates an output request signal requesting the processing circuit 62 to start the 2 nd output, and outputs the generated output request signal to the communication I/F62E2 via the communication line 53. The output request signal is transmitted from the communication I/F62E2 to the control circuit 62D. When the output request signal is transmitted from the communication I/F62E2 to the control circuit 62D, the 2 nd output is started. As explained in the above-described embodiment 1, the digital image data 69B is transferred using the 2 nd transfer path in the 2 nd output.
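The blanking-period handshake described above can be sketched as follows. This is an illustrative model only: the class names, attributes, and the simplification that the blanking period is entered immediately on the vertical synchronization signal are all assumptions, not part of the disclosure.

```python
class ControlCircuit:
    def __init__(self):
        self.memory_frame = None  # frame stored in memory 64 one frame earlier
        self.outputs = []

    def on_output_request(self):
        # 2nd output: read the stored frame and send it via communication I/F62E2
        if self.memory_frame is not None:
            self.outputs.append(("2nd", self.memory_frame))

class SignalProcessingCircuit:
    def __init__(self, control_circuit):
        self.control_circuit = control_circuit

    def on_vsync(self):
        # the blanking period is determined from the vsync input timing;
        # here the output request (over communication line 53) is issued
        # as soon as the blanking period is entered
        self.control_circuit.on_output_request()

ctl = ControlCircuit()
ctl.memory_frame = "frame_N"
spc = SignalProcessingCircuit(ctl)
spc.on_vsync()
print(ctl.outputs)  # [('2nd', 'frame_N')]
```

The point of the model is the direction of control: it is the signal processing circuit, not the imaging element, that decides when the 2nd output starts.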

That is, when the output request signal is transmitted from the communication I/F62E2, the control circuit 62D acquires the digital image data 69B from the memory 64, and outputs the acquired digital image data 69B to the communication I/F62E2. The communication I/F62E2 outputs the digital image data 69B input from the control circuit 62D to the signal processing circuit 50 via the communication line 53.

As an example, as shown in fig. 13 and 14, the imaging element 44 performs imaging processing and output processing. In the imaging process shown in fig. 13, exposure by the photoelectric conversion element 61, reading of analog image data 69A, resetting of the photoelectric conversion element 61, digital signal processing, and storage of digital image data 69B in the memory 64 are performed in the same manner as in embodiment 1.

As an example, as shown in fig. 14, in the output processing, the 1 st output and the 2 nd output are alternately performed. When the vertical synchronization signal is input from the controller 46 to the processing circuit 62, the 1 st output described in the above embodiment 1 is performed. In the vertical blanking period, when the output request signal is input from the signal processing circuit 50 to the processing circuit 62, the 2 nd output described in the above-described embodiment 1 is performed.

When a vertical synchronization signal is input from the controller 46 to the imaging element 44, readout of one frame's worth of analog image data 69A from the photoelectric conversion element 61 is started. Digital image data 69B obtained by performing digital signal processing on the analog image data 69A is then transferred to the control circuit 62D by the selector 62C, and stored in the memory 64 by the control circuit 62D. Since the memory 64 is a DRAM, the control circuit 62D cannot read from the memory 64 while it is writing to the memory 64.

Therefore, as an example, as shown in fig. 15, during the writing period into the memory 64, the 1st output, which does not depend on the control circuit 62D, is performed. That is, the latest digital image data 69B of one frame is output from the digital processing circuit 62B to the selector 62C, and is transferred from the selector 62C to the communication I/F62E3. The latest digital image data 69B of one frame is then output to the signal processing circuit 50 via the communication line 55 by the communication I/F62E3.

When the 1 st output is completed, a vertical blanking period is entered. Since writing into the memory 64 is not performed during the vertical blanking period, reading from the memory 64 by the control circuit 62D can be performed.

Therefore, as an example, as shown in fig. 15, during the vertical blanking period, that is, during the reading period from the memory 64, the 2nd output is performed in association with the reading of the digital image data 69B from the memory 64 by the control circuit 62D. That is, one frame's worth of digital image data 69B obtained one frame earlier is read out from the memory 64 by the control circuit 62D and transferred to the communication I/F62E2. The digital image data 69B read out from the memory 64 is output to the signal processing circuit 50 via the communication line 53 by the communication I/F62E2.

When the vertical synchronization signal is input to the imaging element 44, the 1 st output and the 2 nd output are sequentially performed until the next vertical synchronization signal is input to the imaging element 44, and as a result, the 1 st output and the 2 nd output are alternately performed as shown in fig. 15, for example. This means that the 1 st output is performed in a period different from the period in which the 2 nd output is performed. That is, the 2 nd output is performed in the vertical blanking period before the 1 st output is performed and in the vertical blanking period after the 1 st output is performed. The period different from the period during which the 2 nd output is performed is an example of the "period different from the output period of the 2 nd image data by the 2 nd output unit" according to the technique of the present invention.

Next, the operation of the imaging apparatus 10 will be described.

First, with reference to fig. 16, a flow of image pickup processing performed by the processing circuit 62 of the imaging element 44 will be described.

In the image pickup processing shown in fig. 16, first, in step ST50, the control circuit 62D determines whether or not a vertical synchronization signal from the controller 46 is received by the communication I/F62E1. If the vertical synchronization signal is not received in step ST50, the determination is no, and the image pickup processing proceeds to step ST58. If the vertical synchronization signal is received in step ST50, the determination is yes, and the image pickup processing proceeds to step ST52.

In step ST52, the readout circuit 62A reads out the analog image data 69A and resets the photoelectric conversion element 61, and the image pickup processing then shifts to step ST54.

In step ST54, the digital processing circuit 62B performs digital signal processing on the analog image data 69A, and the image pickup processing then shifts to step ST56.

The digital image data 69B obtained by subjecting the analog image data 69A to digital signal processing in step ST54 is output to the selector 62C, and the selector 62C transfers the digital image data 69B to the control circuit 62D.

In step ST56, the control circuit 62D stores the digital image data 69B in the memory 64, and the image pickup processing then shifts to step ST58.

In step ST58, the control circuit 62D determines whether or not a condition for ending the image pickup processing (hereinafter referred to as the "image pickup processing end condition") is satisfied. An example of the image pickup processing end condition is that an instruction to end the image pickup processing is received by the receiving device 84 (see fig. 5). If the image pickup processing end condition is not satisfied in step ST58, the determination is no, and the image pickup processing proceeds to step ST50. If the image pickup processing end condition is satisfied in step ST58, the determination is yes, and the image pickup processing ends.

Next, with reference to fig. 17, a flow of output processing performed by the processing circuit 62 of the imaging element 44 will be described.

In the output processing shown in fig. 17, in step ST100, the control circuit 62D determines whether or not the vertical synchronization signal from the controller 46 is received by the communication I/F62E1. If the vertical synchronization signal is not received in step ST100, the determination is no, and the output process proceeds to step ST106. If the vertical synchronization signal is received in step ST100, the determination is yes, and the output process proceeds to step ST102.

In step ST102, the control circuit 62D starts the 1st output by controlling the selector 62C and the communication I/F62E3, and the output process then shifts to step ST104. While the 1st output is being performed, the image pickup processing shown in fig. 16 is executed, and the digital image data 69B is written into the memory 64.

In step ST104, the control circuit 62D determines whether the 1st output has ended, that is, whether the output of the latest digital image data 69B of one frame has been completed. If the 1st output has not ended in step ST104, the determination is no, and the determination of step ST104 is performed again. If the 1st output has ended in step ST104, the determination is yes, and the output process proceeds to step ST106.

In step ST106, the control circuit 62D determines whether or not the vertical blanking period has been entered. If the vertical blanking period has not been entered in step ST106, the determination is no, and the output process proceeds to step ST114. If the vertical blanking period has been entered in step ST106, the determination is yes, and the output process proceeds to step ST108.

When entering the vertical blanking period, an output request signal is output from the signal processing circuit 50 to the communication I/F62E2 via the communication line 53.

Therefore, in step ST108, the control circuit 62D determines whether an output request signal is received by the communication I/F62E2. If the output request signal is not received in step ST108, the determination is no, and the output process proceeds to step ST114. If the output request signal is received in step ST108, the determination is yes, and the output process proceeds to step ST110.

In step ST110, the control circuit 62D starts the 2nd output, and the output process then shifts to step ST112. When the 2nd output is started, the digital image data 69B of one frame stored in the memory 64 is read out and output to the signal processing circuit 50 via the communication line 53 by the communication I/F62E2.

In step ST112, the control circuit 62D determines whether the 2nd output has ended, that is, whether the output of the one frame's worth of digital image data 69B stored in the memory 64 (the digital image data 69B obtained one frame earlier) has been completed. If the 2nd output has not ended in step ST112, the determination is no, and the determination of step ST112 is performed again. If the 2nd output has ended in step ST112, the determination is yes, and the output process proceeds to step ST114.

In step ST114, the control circuit 62D determines whether or not a condition for ending the output processing (hereinafter referred to as the "output processing end condition") is satisfied. An example of the output processing end condition is that the reception device 84 (see fig. 5) receives an instruction to end the output processing. If the output processing end condition is not satisfied in step ST114, the determination is no, and the output process proceeds to step ST100. If the output processing end condition is satisfied in step ST114, the determination is yes, and the output processing ends.
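As an illustrative aid only, the output-processing flow of steps ST100 through ST114 in fig. 17 can also be modeled as an event loop. The event encoding (dicts of vsync/vblank/request/end flags) is an assumption for illustration, not part of the disclosure.

```python
def output_process(events):
    """events: sequence of dicts with optional boolean keys
    'vsync', 'vblank', 'request', 'end' (absent keys default to False)."""
    log = []
    for e in events:
        if e.get("vsync"):                        # ST100: vsync received
            log.append("1st output")              # ST102-ST104: runs to completion
        if e.get("vblank") and e.get("request"):  # ST106, ST108: blanking + request
            log.append("2nd output")              # ST110-ST112: frame read from memory
        if e.get("end"):                          # ST114: end condition satisfied
            break
    return log

print(output_process([{"vsync": True},
                      {"vblank": True, "request": True},
                      {"end": True}]))
# ['1st output', '2nd output']
```

This makes the alternation explicit: the 1st output is driven by the vertical synchronization signal, whereas the 2nd output occurs only inside the blanking period and only when an output request has arrived.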

As described above, in the imaging apparatus 10, the 1 st output is performed in a period different from the period in which the 2 nd output is performed. Thereby, the digital image data 69B can be output to the signal processing circuit 50 without delay.

In the imaging apparatus 10, the 2 nd output is performed during the vertical blanking period before the 1 st output is performed and during the vertical blanking period after the 1 st output is performed. This can prevent the output from the imaging element 44 to the signal processing circuit 50 from being stopped due to the writing operation of the digital image data 69B to the memory 64.

In embodiment 2, an example in which the 2 nd output is performed during two periods, i.e., the vertical blanking period before the 1 st output is performed and the vertical blanking period after the 1 st output is performed, has been described, but the technique of the present invention is not limited to this. The 2 nd output may be performed during the vertical blanking period before the 1 st output is performed or during the vertical blanking period after the 1 st output is performed.

[ embodiment 3 ]

In embodiment 2 described above, an example of a mode in which the 1st output and the 2nd output are alternately performed in accordance with the input of the vertical synchronization signal was described; in embodiment 3, a case in which the 1st output and the 2nd output are alternately performed in accordance with the input of the horizontal synchronization signal will be described. In embodiment 3, the same components as those in embodiment 2 are denoted by the same reference numerals, and descriptions thereof are omitted. Hereinafter, portions different from embodiment 2 will be described.

In the imaging apparatus 10 according to embodiment 3, as an example, imaging processing and output processing are performed as shown in fig. 18 and 19. In the image pickup processing shown in fig. 18, exposure by the photoelectric conversion element 61, reading of the analog image data 69A, resetting of the photoelectric conversion element 61, digital signal processing, and storage of the digital image data 69B in the memory 64 are performed in the same manner as in embodiment 1.

As an example, as shown in fig. 12, a horizontal synchronization signal is input from the controller 46 to the signal processing circuit 50 via the communication line 59. The signal processing circuit 50 determines a horizontal blanking period in accordance with the input timing of the horizontal synchronization signal. When entering the horizontal blanking period, the signal processing circuit 50 generates an output request signal requesting the processing circuit 62 to start the 2 nd output, and outputs the generated output request signal to the communication I/F62E2 via the communication line 53. The output request signal is transmitted from the communication I/F62E2 to the control circuit 62D. When the output request signal is transmitted from the communication I/F62E2 to the control circuit 62D, the 2 nd output is started. As described in the above-described embodiments 1 and 2, the digital image data 69B is transferred using the 2 nd transmission channel in the 2 nd output.
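The request handshake described above can be illustrated with a minimal sketch. The class and method names below are assumptions introduced for illustration and are not defined by the present disclosure; the sketch only shows the flow in which the signal processing circuit 50 side issues an output request upon entering the horizontal blanking period and the processing circuit 62 side starts the 2 nd output upon receiving it.

```python
# Hypothetical model of the output-request handshake: the signal processing
# circuit issues a request only during the horizontal blanking period, and the
# processing circuit starts the 2nd output when the request arrives.
class ProcessingCircuit:
    def __init__(self):
        self.second_output_started = False

    def receive_output_request(self):
        # Corresponds to the output request signal reaching the control circuit 62D.
        self.second_output_started = True

class SignalProcessingCircuit:
    def __init__(self, processing_circuit):
        self.processing_circuit = processing_circuit

    def on_horizontal_sync(self, in_blanking):
        # Upon entering the horizontal blanking period, request the 2nd output.
        if in_blanking:
            self.processing_circuit.receive_output_request()

pc = ProcessingCircuit()
spc = SignalProcessingCircuit(pc)
spc.on_horizontal_sync(in_blanking=False)
print(pc.second_output_started)   # no request is issued outside blanking
spc.on_horizontal_sync(in_blanking=True)
print(pc.second_output_started)   # the request in blanking starts the 2nd output
```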

When the horizontal synchronization signal is input to the imaging element 44 from the controller 46, as shown in fig. 18, for example, the analog image data 69A of 1 horizontal line amount is read and the photoelectric conversion element 61 is reset, and the read analog image data 69A of 1 horizontal line amount is subjected to digital signal processing. Digital image data 69B for 1 horizontal line obtained by performing digital signal processing on analog image data 69A for 1 horizontal line is output from the digital processing circuit 62B to the selector 62C. The digital image data 69B of 1 horizontal line is transferred from the selector 62C to the control circuit 62D, and stored in the memory 64 by the control circuit 62D.

As an example, as shown in fig. 19, in the output processing, the 1 st output and the 2 nd output are alternately performed. When the horizontal synchronization signal is input from the controller 46 to the processing circuit 62, the 1 st output is performed. In the horizontal blanking period, when an output request signal is input from the signal processing circuit 50 to the processing circuit 62, the 2 nd output is performed.

Incidentally, when a horizontal synchronization signal is input from the controller 46 to the imaging element 44, readout of the analog image data 69A of 1 horizontal line amount from the photoelectric conversion element 61 is started. Then, the digital image data 69B of 1 horizontal line amount obtained by performing digital signal processing on the analog image data 69A of 1 horizontal line amount is transferred from the selector 62C to the control circuit 62D, and is stored in the memory 64 by the control circuit 62D. Since the memory 64 is a DRAM, the control circuit 62D cannot read from the memory 64 while it is writing to the memory 64.
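The single-port behavior described above can be sketched as follows. The class below is an illustrative assumption, not a circuit defined by the present disclosure; it only models the constraint that a read of the memory 64 cannot be performed during a write.

```python
# Minimal sketch of the DRAM constraint: while a line is being written into
# the memory, the same memory cannot be read; reads succeed only in a
# non-writing period (such as the horizontal blanking period).
class SinglePortDram:
    def __init__(self):
        self._lines = {}        # stored digital image data, keyed by line number
        self._writing = False   # True during the "writing period"

    def begin_write(self, line_no, data):
        self._writing = True
        self._lines[line_no] = data

    def end_write(self):
        self._writing = False   # entering a non-writing period

    def read(self, line_no):
        if self._writing:
            raise RuntimeError("DRAM busy: cannot read during a write")
        return self._lines[line_no]

dram = SinglePortDram()
dram.begin_write(0, b"line0")
try:
    dram.read(0)                # attempted read during the writing period
except RuntimeError as e:
    print(e)                    # the read is rejected
dram.end_write()
print(dram.read(0))             # the read succeeds in the non-writing period
```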

Therefore, as an example, as shown in fig. 20, during the writing period into the memory 64 (the "writing period" shown in fig. 20), the 1 st output, which does not go through the control circuit 62D, is performed. That is, the latest digital image data 69B of 1 horizontal line amount is output from the digital processing circuit 62B to the selector 62C, and is transferred from the selector 62C to the communication I/F62E3. The latest digital image data 69B of 1 horizontal line amount is output to the signal processing circuit 50 via the communication line 55 by the communication I/F62E3.

When the 1 st output is completed, a horizontal blanking period is entered. The horizontal blanking period is a non-writing period. The non-writing period is a period in which writing into the memory 64 is not performed. In this manner, since writing into the memory 64 is not performed in the horizontal blanking period, reading from the memory 64 by the control circuit 62D can be performed.

Therefore, as an example, as shown in fig. 20, during the horizontal blanking period, that is, during the reading period from the memory 64, the 2 nd output is performed in association with the reading of the digital image data 69B from the memory 64 by the control circuit 62D. That is, the digital image data 69B of 1 horizontal line amount obtained 1 line earlier is read out from the memory 64 by the control circuit 62D and transferred to the communication I/F62E2. The digital image data 69B of 1 horizontal line amount read out from the memory 64 is output to the signal processing circuit 50 via the communication line 53 by the communication I/F62E2.

When the horizontal synchronization signal is input to the imaging element 44, the 1 st output and the 2 nd output are sequentially performed until the next horizontal synchronization signal is input to the imaging element 44, and as a result, the 1 st output and the 2 nd output are alternately performed as shown in fig. 20, for example. This means that the 1 st output is performed in a period different from the period in which the 2 nd output is performed. That is, the 2 nd output is performed in the horizontal blanking period before the 1 st output is performed and in the horizontal blanking period after the 1 st output is performed. In addition, "DT" shown in fig. 20 denotes the digital image data 69B of 1 line.
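The alternation described above can be simulated with a rough sketch. The function and data names below are illustrative assumptions: each horizontal synchronization triggers a writing period in which the latest line is both stored in memory and streamed as the 1 st output, followed by a blanking period in which the line obtained 1 line earlier is read from memory as the 2 nd output.

```python
# Hypothetical simulation of the alternating output schedule of fig. 20.
def simulate(num_lines):
    memory = {}           # stands in for the memory 64 (DRAM)
    events = []           # ordered log of outputs
    for n in range(num_lines):
        # writing period: store line n and perform the 1st output in parallel,
        # the 1st output bypassing the memory
        memory[n] = f"DT{n}"
        events.append(("1st", f"DT{n}"))
        # horizontal blanking (non-writing period): 2nd output of the line
        # obtained 1 line earlier, if one has been stored
        if n - 1 in memory:
            events.append(("2nd", memory[n - 1]))
    return events

print(simulate(3))
```

After the first line, each cycle emits one 1 st output followed by one 2 nd output that lags it by exactly one line, matching the alternation in fig. 20.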

Next, the operation of the imaging apparatus 10 will be described.

First, with reference to fig. 21, a flow of image pickup processing performed by the processing circuit 62 of the imaging element 44 will be described.

In the image pickup processing shown in fig. 21, first, in step ST200, the control circuit 62D determines whether or not the horizontal synchronization signal from the controller 46 is received by the communication I/F62E1. In step ST200, if the horizontal synchronization signal from the controller 46 is not received by the communication I/F62E1, the determination is no, and the image pickup processing proceeds to step ST208. In step ST200, if the horizontal synchronization signal from the controller 46 is received by the communication I/F62E1, the determination is yes, and the image pickup processing proceeds to step ST202.

In step ST202, the readout circuit 62A reads out the analog image data 69A and resets the photoelectric conversion element 61, and the image pickup processing proceeds to step ST204.

In step ST204, the digital processing circuit 62B performs digital signal processing on the analog image data 69A, and the image pickup processing proceeds to step ST206.

The digital image data 69B obtained by performing digital signal processing on the analog image data 69A in step ST204 is output to the selector 62C, and the selector 62C transfers the digital image data 69B to the control circuit 62D.

In step ST206, the control circuit 62D stores the digital image data 69B in the memory 64, and the image pickup processing proceeds to step ST208.

In step ST208, the control circuit 62D determines whether or not the above-described image pickup processing end condition is satisfied. In step ST208, if the image pickup processing end condition is not satisfied, the determination is no, and the image pickup processing proceeds to step ST200. In step ST208, if the image pickup processing end condition is satisfied, the determination is yes, and the image pickup processing ends.
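The flow of steps ST200 to ST208 above can be sketched as a single loop. The callables passed in below stand for the hardware operations; they are assumptions introduced for illustration, not APIs defined by the present disclosure.

```python
# Illustrative sketch of the image pickup processing flow of fig. 21.
def imaging_process(sync_received, end_condition, read_and_reset,
                    digital_signal_processing, store_to_memory):
    while True:
        if sync_received():                                 # ST200: sync received?
            analog = read_and_reset()                       # ST202: read 69A, reset element
            digital = digital_signal_processing(analog)     # ST204: digital signal processing
            store_to_memory(digital)                        # ST206: store 69B in memory 64
        if end_condition():                                 # ST208: end condition satisfied?
            break

# Demonstration with stand-in operations: three syncs store three processed lines.
syncs = iter([True, True, True])
mem = []
imaging_process(
    sync_received=lambda: next(syncs, False),
    end_condition=lambda: len(mem) >= 3,       # stop after 3 lines, for the demo
    read_and_reset=lambda: "analog",
    digital_signal_processing=lambda a: a.upper(),
    store_to_memory=mem.append,
)
print(mem)   # three processed lines stored in the stand-in memory
```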

Next, with reference to fig. 22, the flow of output processing performed by the processing circuit 62 of the imaging element 44 will be described.

In the output processing shown in fig. 22, in step ST250, the control circuit 62D determines whether or not the horizontal synchronization signal from the controller 46 is received by the communication I/F62E1. In step ST250, if the horizontal synchronization signal from the controller 46 is not received by the communication I/F62E1, the determination is no, and the output processing proceeds to step ST256. In step ST250, if the horizontal synchronization signal from the controller 46 is received by the communication I/F62E1, the determination is yes, and the output processing proceeds to step ST252.

In step ST252, the control circuit 62D starts the 1 st output by controlling the selector 62C and the communication I/F62E3, and the output processing proceeds to step ST254. While the 1 st output is being performed, the image pickup processing shown in fig. 18 is executed, and the digital image data 69B is written into the memory 64.

In step ST254, the control circuit 62D determines whether or not the 1 st output has ended. The end of the 1 st output refers to the end of the output of the latest digital image data 69B of 1 horizontal line amount. In step ST254, if the 1 st output has not ended, the determination is no, and the determination in step ST254 is performed again. In step ST254, if the 1 st output has ended, the determination is yes, and the output processing proceeds to step ST256.

In step ST256, the control circuit 62D determines whether or not the horizontal blanking period has been entered. In step ST256, if the horizontal blanking period has not been entered, the determination is no, and the output processing proceeds to step ST264. In step ST256, if the horizontal blanking period has been entered, the determination is yes, and the output processing proceeds to step ST258.

When entering the horizontal blanking period, an output request signal is output from the signal processing circuit 50 to the communication I/F62E2 via the communication line 53.

Therefore, in step ST258, the control circuit 62D determines whether or not an output request signal is received by the communication I/F62E2. In step ST258, if the output request signal is not received by the communication I/F62E2, the determination is no, and the output processing proceeds to step ST264. In step ST258, if the output request signal is received by the communication I/F62E2, the determination is yes, and the output processing proceeds to step ST260.

In step ST260, the control circuit 62D starts the 2 nd output, and the output processing proceeds to step ST262. When the 2 nd output is started, the digital image data 69B of 1 horizontal line amount stored in the memory 64 is read out and output to the signal processing circuit 50 via the communication line 53 by the communication I/F62E2.

In step ST262, the control circuit 62D determines whether or not the 2 nd output has ended. The end of the 2 nd output refers to the end of the output of the digital image data 69B of 1 horizontal line amount stored in the memory 64, that is, the digital image data 69B obtained 1 line earlier. In step ST262, if the 2 nd output has not ended, the determination is no, and the determination in step ST262 is performed again. In step ST262, if the 2 nd output has ended, the determination is yes, and the output processing proceeds to step ST264.

In step ST264, the control circuit 62D determines whether or not the output process end condition is satisfied. In step ST264, if the output process end condition is not satisfied, the determination is no, and the output processing proceeds to step ST250. In step ST264, if the output process end condition is satisfied, the determination is yes, and the output processing ends.
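The flow of steps ST250 to ST264 above can likewise be sketched as a loop. The predicates and output functions below are illustrative assumptions standing in for the hardware and the signal processing circuit 50, not APIs defined by the present disclosure.

```python
# Illustrative sketch of the output processing flow of fig. 22.
def output_process(sync_received, in_blanking, request_received,
                   do_first_output, do_second_output, end_condition):
    log = []
    while True:
        if sync_received():                          # ST250: horizontal sync received?
            log.append(do_first_output())            # ST252-ST254: 1st output to completion
        if in_blanking() and request_received():     # ST256, ST258: blanking and request?
            log.append(do_second_output())           # ST260-ST262: 2nd output to completion
        if end_condition():                          # ST264: end condition satisfied?
            return log

# Demonstration: two horizontal syncs, each followed by a blanking-period request.
state = {"syncs": 2, "pending": False}
def sync_received():
    if state["syncs"] > 0:
        state["syncs"] -= 1
        state["pending"] = True      # a blanking period follows each 1st output
        return True
    return False
def request_received():
    if state["pending"]:
        state["pending"] = False
        return True
    return False

log = output_process(sync_received, lambda: True, request_received,
                     lambda: "1st", lambda: "2nd",
                     lambda: state["syncs"] == 0 and not state["pending"])
print(log)   # the 1st and 2nd outputs alternate, as in fig. 20
```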

As described above, in the imaging apparatus 10, the 2 nd output is performed in the horizontal blanking period before the 1 st output is performed and in the horizontal blanking period after the 1 st output is performed. This can prevent the output from the imaging element 44 to the signal processing circuit 50 from being stopped due to the writing operation of the digital image data 69B to the memory 64.

In embodiment 3, the 2 nd output is performed in two periods, i.e., a horizontal blanking period before the 1 st output is performed and a horizontal blanking period after the 1 st output is performed, but the technique of the present invention is not limited to this. The 2 nd output may be performed during a horizontal blanking period before the 1 st output is performed or during a horizontal blanking period after the 1 st output is performed.

In addition, although the case where the 2 nd output is performed in the horizontal blanking period has been described in embodiment 3 above, the technique of the present invention is not limited to this. For example, as shown in fig. 23A, the 2 nd output may be performed during the digital signal processing period before the 1 st output is performed. Here, the digital signal processing period refers to a period during which digital signal processing is performed by the digital processing circuit 62B. The digital signal processing period is included in the non-writing period. Since writing into the memory 64 is not performed during the non-writing period, the digital image data 69B can be read from the memory 64. That is, the digital image data 69B can be transferred from the memory 64 to the signal processing circuit 50 using the 2 nd transfer path described above. Accordingly, as shown in fig. 23A, for example, by performing the 2 nd output during the digital signal processing period, it is possible to prevent the output from the imaging element 44 to the signal processing circuit 50 from being stopped due to the writing operation of the digital image data 69B to the memory 64.

Also, the digital signal processing period includes an A/D conversion period. The A/D conversion period is a period during which A/D conversion is performed by the A/D converter 62B1 (see fig. 12). Since the digital signal processing period includes the A/D conversion period in this manner, the 2 nd output may be performed during the A/D conversion period as shown in fig. 23B, for example. This can also prevent the output from the imaging element 44 to the signal processing circuit 50 from being stopped due to the writing operation of the digital image data 69B to the memory 64.

In each of the above embodiments, the processing circuit 62 realized by a device including an ASIC and an FPGA has been exemplified, but the technique of the present invention is not limited to this. For example, the above-described processing may be realized by a software configuration using a computer.

In this case, for example, as shown in fig. 24, various programs for causing the computer 852 built in the imaging element 44 to execute the above-described control processing, imaging processing, and output processing are stored in the storage medium 900.

The various programs are a control program 902, an imaging program 904, and an output program 906. The control program 902 is a program for causing the computer 852 to execute the control processing described above. The imaging program 904 is a program for causing the computer 852 to execute the imaging processing described above. The output program 906 is a program for causing the computer 852 to execute the above-described output processing.

For example, as shown in fig. 24, the computer 852 includes a CPU852A, a ROM852B, and a RAM 852C. Various programs stored in the storage medium 900 are installed in the computer 852. The CPU852A executes the control processing described above in accordance with the control program 902. The CPU852A executes the image pickup processing described above in accordance with the image pickup program 904. Further, the CPU852A executes the above-described output processing in accordance with the output program 906.

Here, although one CPU is illustrated as the CPU852A, the technique of the present invention is not limited to this, and a plurality of CPUs may be used instead of the CPU 852A. In addition, the storage medium 900 is a non-transitory storage medium. As an example of the storage medium 900, an arbitrary portable storage medium such as an SSD or a USB memory can be given.

In the example shown in fig. 24, various programs are stored in the storage medium 900, but the technique of the present invention is not limited to this. For example, various programs may be stored in advance in the ROM852B, and the CPU852A may read out the various programs from the ROM852B, expand the programs to the RAM852C, and execute the expanded programs.

Further, various programs may be stored in a storage unit such as another computer or a server device connected to the computer 852 via a communication network (not shown), and downloaded to the computer 852 in response to a request from the imaging device 10. In this case, the various downloaded programs are executed by the CPU852A of the computer 852.

Also, the computer 852 may be provided outside the imaging element 44. In this case, the computer 852 may control the processing circuit 62 according to a program.

As hardware resources for executing the control processing, the imaging processing, and the output processing (hereinafter referred to as "various processing") described in the above embodiments, the various processors shown below can be used. The processor may be, for example, a CPU, which is a general-purpose processor that functions as a hardware resource for executing the various processing by executing software, that is, a program, as described above. The processor may also be, for example, a dedicated circuit, which is a processor having a circuit configuration designed exclusively for executing specific processing, such as an FPGA, a PLD, or an ASIC.

The hardware resource for executing the various processing may be constituted by one of these various processors, or may be constituted by a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). The hardware resource for executing the various processing may also be one processor.

As examples of configuring the hardware resource with one processor, first, there is a mode in which, as typified by a computer such as a client or a server, one processor is constituted by a combination of one or more CPUs and software, and this processor functions as the hardware resource for executing the processing in the imaging element. Second, there is a mode in which, as typified by an SoC (System-on-a-chip) or the like, a processor is used that realizes, with one IC chip, the functions of the entire system including the plurality of hardware resources for executing the various processing. In this manner, the processing in the imaging element is realized by using one or more of the above-described various processors as hardware resources.

As the hardware configuration of these various processors, more specifically, a circuit in which circuit elements such as semiconductor elements are combined can be used.

In each of the above embodiments, an interchangeable-lens camera has been exemplified as the imaging apparatus 10, but the technique of the present invention is not limited to this. For example, the technique of the present invention may be applied to the smart device 950 shown in fig. 25. The smart device 950 shown in fig. 25 is an example of the imaging apparatus according to the technique of the present invention. The imaging element 44 described in the above embodiments is mounted on the smart device 950. Even with the smart device 950 configured in this manner, the same operations and effects as those of the imaging apparatus 10 described in each of the above embodiments can be obtained. The technique of the present invention is applicable not only to the smart device 950 but also to a personal computer, a wearable terminal device, and the like.

In the above embodiments, the 1 st display 32 and the 2 nd display 86 are illustrated, but the technique of the present invention is not limited to this. For example, a separate display attached to the image pickup apparatus main body 12 may be used as the "display portion (display)" according to the technique of the present invention.

The various processes described above are merely examples. Therefore, needless to say, unnecessary steps may be deleted, new steps may be added, or the processing order may be switched without departing from the scope of the invention.

The above descriptions and drawings are detailed descriptions of the technical aspects of the present invention, and are only examples of the technical aspects of the present invention. For example, the description about the above-described structure, function, operation, and effect is a description about an example of the structure, function, operation, and effect of the portion relating to the technology of the present invention. Therefore, needless to say, unnecessary portions may be deleted, new elements may be added, or replacement may be made to the above-described description and the illustrated contents without departing from the scope of the present invention. In order to avoid complication and to facilitate understanding of the portions relating to the technology of the present invention, descriptions related to technical common knowledge and the like, which do not require any particular description in terms of enabling implementation of the technology of the present invention, are omitted from the above descriptions and drawings.

In the present specification, "A and/or B" has the same meaning as "at least one of A and B". That is, "A and/or B" may mean only A, only B, or a combination of A and B. In the present specification, the same concept as "A and/or B" also applies when 3 or more items are connected and expressed by "and/or".

All documents, patent applications, and technical standards described in the present specification are incorporated by reference into the present specification to the same extent as if each document, patent application, and technical standard was specifically and individually indicated to be incorporated by reference.
