Optical component inspection system and method for inspecting at least one component
Abstract (created 2018-03-13 by 乌韦·弗朗茨·奥格斯特): An optical component inspection system for inspecting at least one surface of at least one component is disclosed, wherein a receptacle is configured to position the component in front of a camera device so that a first surface of the component can be inspected by means of the camera device. The camera device includes an image sensor configured to receive light reflected from the first surface of the component. The optical component inspection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor, and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed on the inside of a hollow-cylindrical barrel, and the longitudinal center line of the barrel is coaxial with the optical axis of the first optically active element. The holder is elastically bendable at least in the longitudinal direction of the barrel. Furthermore, the adjustment device comprises a first actuator configured to adjust the relative distance between the optically active element and the image sensor, so as to displace the optically active element along the optical axis relative to the image sensor.
1. An optical component detection system (200) for detecting at least one surface of at least one component (B), wherein a receptacle (150) is configured to position the component (B) in front of a camera device (220) for detecting a first surface (O) of the component (B) by means of the camera device (220),
wherein the camera device (220) comprises an image sensor (230) configured to receive reflected light (L) from the first surface (O) of the component (B);
wherein the optical component detection system (200) comprises a first optically active element (130) arranged in the optical path of the reflected light (L) to the image sensor (230) and an adjustment device (100) for the first optically active element (130), the adjustment device (100) comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the holder (140) is elastically bendable at least in the barrel longitudinal direction; and
a first actuator (120) for adjusting a relative distance between the optically active element (130) and the image sensor (230) so as to displace the optically active element (130) along the Optical Axis (OA) relative to the image sensor (230).
2. The optical component detection system (200) of claim 1,
wherein the first actuator (120) comprises a coil (121) at least partially surrounding the holder (140);
wherein a control device (ECU) is configured to control the current delivered to the coil (121) for generating a magnetic field;
wherein the holder (140) has a soft-iron or permanent magnet component (141, 142) configured to displace the holder (140) along a longitudinal centerline of the lens barrel (110) in dependence on the current delivered to the coil (121).
3. The optical component detection system (200) according to any one of the preceding claims, further comprising:
a second actuator (240), controlled by the control device (ECU), for adjusting the image sensor (230) so as to displace the image sensor (230) along the Optical Axis (OA) relative to the first optically active element (130); and/or
at least one light source configured to direct light onto the first surface (O) of the component (B);
wherein, as an option, the control device (ECU) is configured to adjust the relative distance between the first optically active element (130) and the image sensor (230) by controlling the first and/or second actuator (120, 240), so as to project the sharpness plane (SE) of the reflected light (L) onto an image sensor surface (231) of the image sensor facing the reflected light (L).
4. The optical component detection system (200) according to any one of the preceding claims, wherein an image processing device (BV) is configured to:
take a first image (BA) by means of the image sensor (230), the first image being provided to the image processing device (BV);
determine, based on the first image (BA), whether a sharpness plane (SE) of the reflected light (L) is substantially completely projected onto the image sensor surface (231) of the image sensor (230); and,
if the sharpness plane (SE) of the reflected light (L) is not substantially completely projected onto the image sensor surface (231) of the image sensor (230):
determine a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230);
determine a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230);
provide control commands to the control device (ECU) to control the first and/or second actuator (120, 240) in order to adjust the relative distance between the first optically active element (130) and the image sensor (230) and to project the second partial region of the sharpness plane (SE) of the reflected light (L) onto the image sensor surface (231) of the image sensor (230);
take a second image (BB) by means of the image sensor (230), the second image being provided to the image processing device (BV);
determine a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) the first partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and
determine a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) the second partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230).
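For illustration only, the region classification of claim 4 can be sketched in code. The sketch below (Python with NumPy) scores each image tile by the variance of a discrete Laplacian and treats high-scoring tiles as lying in the projected sharpness plane; the metric, tile size, and threshold are illustrative assumptions and are not specified by the claims.

```python
import numpy as np

def local_sharpness(img, tile=8):
    """Per-tile focus score: variance of a 4-neighbour discrete Laplacian.

    Illustrative metric only; the claims do not prescribe how the
    in-focus image regions are determined."""
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    h, w = lap.shape
    h, w = h - h % tile, w - w % tile            # crop to whole tiles
    tiles = lap[:h, :w].reshape(h // tile, tile, w // tile, tile)
    return tiles.var(axis=(1, 3))                # one score per tile

def split_focus_regions(img, tile=8, thresh=10.0):
    """Boolean tile masks: (in sharpness plane, e.g. B1; not, e.g. B2)."""
    score = local_sharpness(img, tile)
    in_focus = score > thresh
    return in_focus, ~in_focus
```

Running such a classifier on the first image (BA) would yield candidate regions B1/B2; repeating it on the second image (BB), after the actuator has moved, would yield B3/B4.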
5. The optical component detection system (200) according to any one of claims 1 to 3, wherein an image processing device (BV) is configured to:
take a first image (BA) by means of the image sensor (230), the first image being provided to the image processing device (BV);
provide control commands to the control device (ECU) to control the first and/or second actuator (120, 240) to adjust the relative distance between the first optically active element (130) and the image sensor (230) by a predetermined path length; and
take a second image (BB) by means of the image sensor (230), the second image being provided to the image processing device (BV).
6. The optical component detection system (200) according to any one of claims 1 to 3, wherein an image processing device (BV) is configured to:
take a first image (BA) by means of the image sensor (230), the first image being provided to the image processing device (BV);
provide control commands to the control device (ECU) to control the first and/or second actuator (120, 240) to adjust the relative distance between the first optically active element (130) and the image sensor (230) at a predetermined speed while the first image (BA) is being captured; and
adjust the relative distance between the first optically active element (130) and the image sensor (230) after the first image (BA) has been captured, or for a predetermined period of time thereafter, during which a second image (BB) is simultaneously captured by means of the image sensor (230), the second image being provided to the image processing device (BV).
7. The optical component detection system (200) according to claim 5 or 6, wherein the image processing device (BV) is configured to, after capturing the first image (BA) or after capturing the second image (BB):
determine a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of a sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230); and/or
determine a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and/or
determine a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) the first partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and/or
determine a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) the second partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230).
8. The optical component detection system (200) according to claim 4 or 7, wherein the first and second image regions (B1, B2) of the first image (BA) each image a portion of the component (B) which substantially corresponds to the portion of the component (B) imaged in the third and fourth image regions (B3, B4) of the second image (BB); and/or
wherein the first partial region of the sharpness plane (SE) in the first image (BA) and the second partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) within a predetermined depth of field (ST); and
wherein the second partial region of the sharpness plane (SE) in the first image (BA) and the first partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) outside the predetermined depth of field (ST).
9. The optical component detection system (200) according to claim 4 or 7, and/or claim 8, wherein the image processing device (BV) is further configured to:
cut out the first image region (B1) from the first image (BA) and the fourth image region (B4) from the second image (BB); and
combine the cut-out first and fourth image regions (B1, B4) to produce a third image (BC); and/or
determine whether the component (B) has at least one defect based on the first image region (B1) of the first image (BA) and/or the fourth image region (B4) of the second image (BB) and/or the third image (BC); and
provide defect information on the component (B) if the image processing device (BV) determines that at least one defect is present.
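The cut-and-combine step of claim 9 amounts to what is commonly called focus stacking. A minimal sketch (Python with NumPy, assuming both exposures show substantially the same part of the component, as claim 8 requires) picks each pixel from whichever image has the higher smoothed gradient energy; the metric and window size are illustrative assumptions, not part of the claims.

```python
import numpy as np

def focus_stack(img_a, img_b, k=5):
    """Combine two images taken at different lens positions into one
    image (such as the third image BC) by choosing, per pixel, the
    exposure with the higher smoothed gradient energy. Sketch only;
    the claims do not fix a particular sharpness measure."""
    def grad_energy(img):
        gy, gx = np.gradient(np.asarray(img, dtype=float))
        e = gx ** 2 + gy ** 2
        pad = np.pad(e, k // 2, mode="edge")     # box-filter smoothing
        out = np.zeros_like(e)
        for dy in range(k):
            for dx in range(k):
                out += pad[dy:dy + e.shape[0], dx:dx + e.shape[1]]
        return out
    choose_a = grad_energy(img_a) >= grad_energy(img_b)
    return np.where(choose_a, img_a, img_b)
```

The smoothing makes the selection regionwise rather than pixelwise, which keeps the composite free of isolated mis-chosen pixels near region borders.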
10. The optical component detection system (200) according to any one of the preceding claims, further comprising:
a position detection sensor (250) configured to determine a position and/or orientation of the first optically active element (130) and/or of the image sensor surface (231) of the image sensor (230), and to provide information about that position and/or orientation to the control device (ECU), which controls the first and/or second actuator (120, 240) based on the provided information,
wherein, as an option, the position detection sensor is an optical or (electro-) mechanical position detection sensor.
11. The optical component detection system (200) according to any one of the preceding claims, wherein the camera device (220) comprises a second optically active element (160) in the optical path between the first optically active element (130) and the image sensor (230), wherein an optical axis of the second optically active element (160) is coaxial with an optical axis of the first optically active element (130); and/or
wherein the first optically active element (130) is an achromatic lens; and/or
wherein the second optically active element (160) is a condenser lens.
12. An optical component detection system (200) for detecting at least one surface of at least one component (B), wherein a receptacle (150) is configured to position the component (B) in front of a camera device (220) for detecting a first surface (O) of the component (B) by means of the camera device (220),
wherein the camera device (220) comprises an image sensor (230) configured to receive reflected light (L) from the first surface (O) of the component (B);
wherein the optical component detection system (200) comprises a first optically active element (130) arranged in the optical path of the reflected light (L) to the image sensor (230) and an adjustment device (100) for the first optically active element (130), the adjustment device (100) comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) by means of a linear guide and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the linear guide is configured to guide the holder (140) parallel to the Optical Axis (OA);
a first actuator (120) for adjusting the relative distance between the image sensor (230) and the first optically active element (130) in order to displace the first optically active element (130) relative to the image sensor (230).
13. Adjustment device (100) for an optically active element (130), comprising:
a holder (140) for the optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the optically active element (130), and wherein the holder (140) is elastically bendable at least in the barrel longitudinal direction; and
a first actuator (120) for adjusting the holder (140) in order to displace the optically active element (130) along the Optical Axis (OA) relative to the barrel (110).
14. Adjustment device (100) for an optically active element (130), comprising:
a holder (140) for the optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) by means of a linear guide and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the optically active element (130), and wherein the linear guide is configured to guide the holder (140) parallel to the Optical Axis (OA); and
a first actuator (120) for adjusting the holder (140) in order to displace the optically active element (130) along the Optical Axis (OA) relative to the barrel (110).
15. Method for inspecting at least one surface of at least one component (B), comprising the steps of:
aligning the component (B) with a camera device (220);
detecting a first surface (O) of the component (B) by means of the camera device (220);
receiving reflected light (L) from the first surface (O) of the component (B) by means of an image sensor (230) of the camera device (220);
holding a first optically active element (130) in the optical path of the reflected light (L) by means of a holder (140) that is elastically bendable in a longitudinal direction, wherein the longitudinal direction is parallel to an Optical Axis (OA) of the optically active element (130); and
adjusting a relative distance between the image sensor (230) and the first optically active element (130) so as to displace the first optically active element (130) along the Optical Axis (OA) relative to the image sensor (230).
16. The method of claim 15, further comprising the steps of:
adjusting a relative distance between the image sensor (230) and the first optically active element (130) so as to displace the image sensor (230) along the Optical Axis (OA) relative to the first optically active element (130); and, as an option,
adjusting the relative distance between the image sensor (230) and the first optically active element (130) so as to project a sharpness plane (SE) of the reflected light (L) onto an image sensor surface (231) of the image sensor (230) facing the reflected light (L).
17. The method according to any one of claims 15 or 16, comprising the steps of:
taking a first image (BA);
determining, based on the first image (BA), whether a sharpness plane (SE) of the reflected light (L) is substantially completely projected onto the image sensor surface (231) of the image sensor (230); and,
if the sharpness plane (SE) of the reflected light (L) is not substantially completely projected onto the image sensor surface (231) of the image sensor (230):
determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230);
determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230);
adjusting the relative distance between the first optically active element (130) and the image sensor (230) so as to project the second partial region of the sharpness plane (SE) of the reflected light (L) onto the image sensor surface (231) of the image sensor (230);
taking a second image (BB);
determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) the first partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and
determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) the second partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230).
18. The method according to claim 15 or 16, comprising the steps of:
taking a first image (BA);
adjusting the relative distance between the first optically active element (130) and the image sensor (230) by a predetermined path length; and
taking a second image (BB).
19. The method according to claim 15 or 16, comprising the steps of:
taking a first image (BA);
adjusting the relative distance between the first optically active element (130) and the image sensor (230) at a predetermined speed while the first image (BA) is being taken; and
adjusting the relative distance between the first optically active element (130) and the image sensor (230) after the first image (BA) has been taken, or for a predetermined period of time thereafter, during which a second image (BB) is simultaneously taken.
20. The method according to any one of claims 18 or 19, comprising the steps of:
determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230); and/or
determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and/or
determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) the first partial region of the sharpness plane (SE) of the reflected light (L) is not projected onto the image sensor surface (231) of the image sensor (230); and/or
determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) the second partial region of the sharpness plane (SE) of the reflected light (L) is projected onto the image sensor surface (231) of the image sensor (230).
21. The method of any one of claims 17 or 20,
wherein the first and second image regions (B1, B2) of the first image (BA) each image a portion of the component (B) which substantially corresponds to the portion of the component (B) imaged in the third and fourth image regions (B3, B4) of the second image (BB); and/or
wherein the first partial region of the sharpness plane (SE) in the first image (BA) and the second partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) within a predetermined depth of field (ST); and
wherein the second partial region of the sharpness plane (SE) in the first image (BA) and the first partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) outside the predetermined depth of field (ST).
22. The method according to any one of claims 17 or 20 and/or 21, comprising the steps of:
cutting out the first image region (B1) from the first image (BA) and the fourth image region (B4) from the second image (BB); and
combining the cut-out first and fourth image regions (B1, B4) to produce a third image (BC); and/or
determining whether the component (B) has at least one defect based on the first image region (B1) of the first image (BA) and/or the fourth image region (B4) of the second image (BB) and/or the third image (BC); and
providing defect information on the component (B) if an image processing device (BV) determines that at least one defect is present.
23. The method of any one of claims 15 to 22,
wherein, if the holder (140) has a component comprising soft iron, the holder (140) is deformed by the generated magnetic field in a first direction along the Optical Axis (OA) relative to the image sensor (230).
24. The method according to any of claims 15 to 22, wherein, if the holder (140) has a permanent magnetic component, the holder (140) is deformed by the generated magnetic field in a first or a second direction with respect to the image sensor (230) depending on the flow direction of the current in the coil (121), wherein the second direction is opposite to the first direction.
Technical Field
Semiconductor components find application in a variety of technical fields, such as semiconductor electronics, photovoltaics, optical detectors, and radiation sources (e.g., light-emitting diodes). The widespread use of semiconductor components places ever greater demands on their manufacturers, in particular with regard to quality. Defects or damage in or on a semiconductor component are undesirable because they can lead to a malfunctioning component. Semiconductor components are therefore inspected for defects and damage during manufacture. One inspection method is to image the semiconductor component with the aid of an optical system with micrometer resolution.
Conventional optical systems of this kind, however, have a low depth of field. If the surface of the semiconductor component to be imaged is inclined relative to the image sensor surface of the optical system's image sensor, the component cannot be imaged completely sharply in a single image. Furthermore, the optical components built into such a system generally have a relatively large mass. The adjustment process may therefore be slow, or the optical system may vibrate while the focus is being changed by an adjustment member, significantly degrading image quality. One must then wait until the vibrations have decayed before an image can be taken. In the continuous manufacture and inspection of semiconductor components, this increased image acquisition time reduces throughput.
Background
DE 102008018586 A1 relates to an optical inspection device for inspecting at least one surface of a component. The component is oriented toward a camera device by means of a fastening element, and a first surface of the component is illuminated by a light source with a first light beam in the short-wave range. Furthermore, the inspection device comprises a second light source which directs a second light beam in the long-wave range onto a second surface of the component, the second surface being opposite the first surface. The light beams reflected at the respective surfaces are received by means of the camera device.
JP 2016128781 discloses a device for inspecting electronic components. The apparatus comprises first and second image pickup devices which take images of different regions of the electronic component, wherein the regions are two different regions of a cuboid.
Disclosure of Invention
It is an object of the present application to provide an efficient optical component detection system for detecting components.
An optical component detection system for detecting at least one surface of at least one component is proposed. A receptacle is configured to position the component in front of a camera device so that a first surface of the component can be inspected by means of the camera device. The camera device includes an image sensor configured to receive light reflected from the first surface of the component. The optical component detection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor, and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed on the inside of a hollow-cylindrical barrel, and the longitudinal center line of the barrel is coaxial with the optical axis of the first optically active element. The holder is elastically bendable at least in the longitudinal direction of the barrel. Furthermore, the adjustment device comprises a first actuator for adjusting the relative distance between the optically active element and the image sensor in order to displace the optically active element along the optical axis relative to the image sensor.
By means of the elastically bendable holder, the first optically active element can be held without a thread or a longitudinally adjustable mount for the optically active element. If the optically active element is an achromatic lens, the lens pair of the achromat (for example a flint lens and a crown lens) is held directly by the holder. A separate mount for the lens pair is therefore unnecessary, and both the moving mass of the optically active element and the overall mass of the optical component detection system are reduced. Because of the low mass of the optically active element and the holder, the position of the optically active element can be adjusted with relatively little force. The low mass also permits low-inertia actuation of the optically active element, so that the sharpness plane on the component can be readjusted quickly.
The elastically bendable holder may comprise a rubber-containing material, may comprise a helical spring, or may be constructed as a bellows. Alternatively, the holder may be configured as a coil spring with a central opening, on its center line, that accommodates the first optically active element.
Disturbances applied to the optically active element, such as vibrations from external sources, can thereby be significantly reduced or avoided entirely. Measurements performed by means of the optical component detection system are therefore less prone to interference.
Furthermore, the elastically bendable holder can be continuously stretched or contracted due to its elastic properties, so that the position of the first optically effective element can be precisely adjusted.
In a practical production environment, the components may not be accurately aligned with the optical device. In this case, the component cannot be imaged completely clearly with sufficient depth of field. The optical component detection system proposed here makes it possible to rapidly carry out a series of multiple image recordings.
The good and fast adjustability of the optical component detection system allows component measurements to be made faster and more accurately, thereby achieving high throughput.
The image sensor may be a CCD chip. In other variations, the image sensor may be a CMOS chip or an image sensor sensitive to certain wavelength ranges, such as a microbolometer array or a pyroelectric array.
The lens barrel may have a low magnetic permeability. In one variant, the barrel is made of a paramagnetic material. Furthermore, the barrel may consist at least partly of aluminum or plastic.
The coil may abut against the lens barrel or be spaced apart from it. The position of the coil is offset along the optical axis relative to the position of the holder.
The holder has a first end region in which it is fixed to the lens barrel, and a second end region on which the first optically active element is held. In the rest state, the position of the first end region of the holder coincides, with respect to the optical axis, with the position of one end of the coil. In another variant, the first end region of the holder is located inside or outside the coil with respect to the optical axis. Alternatively or additionally, the second end region of the holder is located inside or outside the coil with respect to the optical axis.
In the first actuator, the holder may be at least partially surrounded by a coil. The control device is configured to control the current delivered to the coil for generating the magnetic field. The holder also has a soft-iron or permanent-magnet yoke as the soft-iron or permanent-magnet component, configured to displace the holder along the longitudinal centerline of the lens barrel in dependence on the current supplied to the coil.
Wear on the means for adjusting the relative distance is reduced by this structure. Furthermore, the force generated by the magnetic field and applied to the holder permits precise adjustment of the relative distance.
If the holder with a soft-iron yoke is arranged at one end of the coil, this force acts on the holder in the direction of the coil center. If the magnetic field is reduced and then switched off, no force acts on the holder any longer, and the holder returns to its initial position.
If the holder has a permanent-magnet yoke, the holder can be deflected in either of two directions, depending on the direction of current flow through the coil and on the orientation of the north and south poles of the permanent-magnet component. The relative distance can accordingly be set in a targeted manner in both directions.
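The direction behaviour described in the two preceding paragraphs can be summarised in a toy model. In the sketch below (Python), the linear gain is a made-up illustrative constant and the linearity itself is an assumption; the application only states that a soft-iron yoke is pulled toward the coil regardless of the current direction, while a permanent-magnet yoke follows the current's sign.

```python
def holder_displacement(current_a, yoke="permanent", gain_m_per_a=1e-4):
    """Toy model of the coil actuator's direction behaviour.

    A soft-iron yoke is always attracted toward the coil centre, so
    the displacement direction is independent of the current's sign.
    A permanent-magnet yoke reverses direction with the current.
    `gain_m_per_a` is an illustrative constant, not a device
    parameter from the application."""
    if yoke == "soft-iron":
        # force toward the coil centre regardless of current direction
        return gain_m_per_a * abs(current_a)
    if yoke == "permanent":
        # direction follows the flow direction of the current
        return gain_m_per_a * current_a
    raise ValueError("yoke must be 'soft-iron' or 'permanent'")
```

This is why a soft-iron variant can only be driven away from its rest position in one direction (the spring restoring force provides the return), whereas the permanent-magnet variant can be driven in both.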
The receptacle is configured to position the component in front of the camera device such that a sharpness plane of the component is at least partially projected on an image sensor surface of the image sensor.
The optical component detection system may further comprise second actuator means controlled by the control means for adjusting the image sensor so as to displace the image sensor along the optical axis relative to the first optically active element.
The second actuator may comprise at least one piezoelectric actuator or a micro-linear drive shaft system. The micro-linear drive shaft system can be a system composed of push rods.
The image sensor can be displaced along the optical axis by means of the second actuator so that the sharpness plane is projected onto the image sensor surface of the image sensor. This allows the optical device to focus quickly on the component.
Thus, even when the component is first photographed by means of the image sensor, a specific portion of the component can be clearly imaged in the image.
Further, the optical component detection system may include a light source configured to emit light onto a surface of the component. The light source is configured to emit light of a predetermined wavelength or a predetermined wavelength range onto the surface of the component. The light emitted by the light source may be visible light with a wavelength in the range of about 380 nm to 780 nm, infrared light and/or polarized light. Furthermore, the image sensor may be sensitive in the infrared or to polarized light.
As an option, the control means may be configured to adjust the relative distance between the first optically active element and the image sensor by controlling the first and/or second actuator means in order to project the sharpness plane of the reflected light onto the image sensor surface of the image sensor facing the reflected light.
The relative distance between the first optically active element and the image sensor can be adjusted quickly and precisely by controlling the first and/or second actuator, so that a sharpness plane is projected onto the image sensor surface of the image sensor.
The image processing device of the optical component detection system may be configured to take a first image of the component by means of the image sensor. The image sensor provides the first image to the image processing device. The image processing device determines whether the sharpness plane of the reflected light is substantially completely projected onto the image sensor surface of the image sensor based on the generated first image. In case the sharpness plane of the reflected light is not substantially fully projected onto the image sensor surface of the image sensor, the image processing device is configured to determine a first image area of the first plurality of image areas of the first image in which a first partial area of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor.
Furthermore, the image processing device is configured to determine a second image region of the first plurality of image regions of the first image in which a second partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor. The image processing apparatus is further configured to provide the control command to the control apparatus. The control means are configured to control the first and/or second actuator means based on the control command in order to adjust the relative distance between the optically active element and the image sensor. In this case, the second subregion of the sharpness plane of the reflected light is adjusted in such a way that the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor.
The image processing device is also configured to take a second image of the component by means of the image sensor. In the second image, the image processing device determines a third image region of the second plurality of image regions of the second image in which the first partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor. Furthermore, the image processing device determines a fourth image region of the second plurality of image regions of the second image, in which a second partial region of the sharpness plane of the reflected light is projected on the image sensor surface of the image sensor.
If the component is not sharply imaged in a single image, at least two images of the component can be generated by the process described above, wherein in each image a different image area sharply images a portion of the component. Image regions detected as unsharp in one image are imaged sharply in the next image, so that the entire component can be inspected.
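How an image processing device might decide which image regions are sharp is not specified in the source; one common approach is a tile-wise focus measure. The following sketch scores each tile with the variance of a second-difference (Laplacian-like) response; the tile size and threshold are illustrative assumptions.

```python
# Illustrative region classification: split the image into tiles and mark a
# tile as "sharp" (sharpness plane projected onto the sensor there) when a
# simple focus measure exceeds a threshold. Tile size and threshold are
# assumed values, not taken from the source.
import numpy as np

def sharp_tile_mask(image: np.ndarray, tile: int = 8, threshold: float = 10.0) -> np.ndarray:
    """Boolean mask, True where a tile of `image` is considered sharp."""
    h, w = image.shape
    rows, cols = h // tile, w // tile
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            t = image[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile].astype(float)
            # variance of second differences along both axes as a crude Laplacian
            lap = np.diff(t, n=2, axis=0).var() + np.diff(t, n=2, axis=1).var()
            mask[r, c] = lap > threshold
    return mask
```

Sharp regions carry high-frequency detail and therefore high second-difference variance, while defocused regions are smoothed and score low.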
In a first alternative, the image processing device may be configured to take the first image by means of an image sensor, which provides the first image to the image processing device. The image processing means is further configured to provide control commands to the control means to control the first and/or second actuator means to adjust the relative distance between the first optically active element and the image sensor with a predetermined path length. The image processing device is furthermore configured to capture a second image by means of an image sensor, wherein the image sensor supplies the second image to the image processing device.
The image processing means may provide the control command to the first and/or second actuator means a predetermined period of time after the first image is captured.
The predetermined path length may be related to the optical properties of the first and/or second optically active element and/or the dimensions of the surface of the component. The predetermined path length is preferably set in accordance with the depth of field, such that it is less than, equal to or greater than the depth of field.
If the predetermined path length is less than or equal to the depth of field, the entire surface of the component is inspected for defects via the sharply imaged image areas of the captured images. If the predetermined path length is greater than the depth of field, only predetermined surface areas of the component are inspected for defects, while the remaining surface areas are not. This allows the relevant surface regions to be inspected for defects with less computational cost and/or in a shorter time.
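The coverage relation stated above can be made concrete with a small calculation. This is an illustrative simplification assuming each image is sharp over exactly one depth-of-field interval centered on its focus position; the numeric values are assumptions.

```python
# Illustrative check of the relation above: if images are taken every
# `step_um` of focus travel and each image is sharp within a depth of field
# `dof_um`, the focus range is fully covered exactly when step <= DOF.

def covered_fraction(total_range_um: float, step_um: float, dof_um: float) -> float:
    """Fraction of the focus range that appears sharp in at least one image."""
    if step_um <= dof_um:
        return 1.0                       # consecutive sharp zones overlap or abut
    n_images = int(total_range_um // step_um) + 1
    return min(1.0, n_images * dof_um / total_range_um)
```

With a step equal to the depth of field the whole surface is covered; with a larger step, gaps remain between the sharp zones, which corresponds to the "only predetermined surface areas are inspected" case above.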
In a second alternative, the image processing device may be configured to take the first image by means of the image sensor, which provides the first image to the image processing device. The image processing means are further configured to provide control commands to the control means to control the first and/or second actuator means so as to adjust the relative distance between the first optically active element and the image sensor at a predetermined speed during capture of the first image. Furthermore, the image processing device is configured to take a second image by means of the image sensor, which provides the second image to the image processing device. The relative distance between the first optically active element and the image sensor continues to be adjusted after the first image, or after a predetermined period of time after the first image is captured, and the second image is taken simultaneously during this adjustment. In a further variant, the second image is taken after a predetermined change in the relative distance between the first optically active element and the image sensor.
After taking the second image, the image processing device may be configured to provide a further control command to the first and/or second actuator device in order to stop the adjustment of the relative distance between the first optically active element and the image sensor.
The predetermined speed may be related to the optical properties of the first and/or second optically active element, the dimensions of the surface of the component, the quality of the first and/or second optically active element and/or the image sensor and its response to the first and/or second actuator.
The predetermined speed is selected such that the adjustment of the relative distance between the image sensor and the first optically active element during the taking of the image is not greater than the depth of field. Therefore, although the relative distance between the image sensor and the first optically effective element is adjusted simultaneously, the image is captured sufficiently clearly. By continuing to adjust the relative distance between the image sensor and the first optically active element, there is no need to stop and re-accelerate the image sensor and/or the first optically active element. This results in the advantage that vibrations caused by acceleration and stopping of the image sensor and/or the first optically active element are avoided. Another advantage is that the time for checking whether the component is defective is minimized, since the image sensor and/or the first optically active element only have to be accelerated and stopped once.
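The speed condition above can be expressed as a single formula: the focus travel during one exposure must not exceed the depth of field, so the maximum permissible adjustment speed is the depth of field divided by the exposure time. The example values are assumptions for illustration.

```python
# Sketch of the speed bound described above: during an exposure of duration
# `exposure_s`, the relative distance changes by speed * exposure_s, and this
# change must stay within the depth of field for the image to remain sharp.

def max_adjustment_speed_um_per_s(depth_of_field_um: float, exposure_s: float) -> float:
    """Largest speed for which the focus travel per exposure stays <= DOF."""
    return depth_of_field_um / exposure_s

def travel_during_exposure_um(speed_um_per_s: float, exposure_s: float) -> float:
    """Focus travel accumulated while one image is being exposed."""
    return speed_um_per_s * exposure_s
```

For example, an assumed depth of field of 10 µm and a 2 ms exposure would permit speeds up to 5000 µm/s while keeping every image sufficiently sharp.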
The image processing apparatus of the first and second alternatives may be configured to determine a first image area of the first plurality of image areas of the first image after the first image is captured or after the second image is captured. In the first image region, a first partial region of the sharpness plane of the reflected light is projected on an image sensor surface of the image sensor. In addition or alternatively, the image processing device determines a second image region of the first plurality of image regions of the first image in which a second partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor.
Furthermore, the image processing device additionally or alternatively determines a third image area of the second plurality of image areas of the second image after capturing the second image. In the third image region, the first partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor. Additionally or alternatively, the image processing device also determines a fourth image region of the second plurality of image regions of the second image, in which a second partial region of the sharpness plane of the reflected light is projected on the image sensor surface of the image sensor.
The image processing means may be further configured to provide control commands to the control means to control the first and/or second actuator means after the second image has been taken, in order to return the relative distance between the first optically active element and the image sensor to its initial value.
The first and second image areas of the first image may each image a portion of the component that substantially corresponds to the portions of the component in the third and fourth image areas of the second image. Together, the first and second image areas of the first image image the complete component.
The first plurality of image regions of the first image may correspond to a second plurality of image regions of the second image.
The first and second image areas of the first image and/or the third and fourth image areas of the second image may be contiguous image areas. Alternatively, the first and second image regions of the first image and the third and fourth image regions of the second image may at least partially overlap.
A first partial region of the sharpness plane in the first image and a second partial region of the sharpness plane in the second image are projected onto the image sensor surface of the image sensor within a predetermined depth of field. At the same time, the second partial region of the sharpness plane in the first image and the first partial region of the sharpness plane in the second image are projected outside the predetermined depth of field onto the image sensor surface of the image sensor.
The image processing device may be configured to cut out a first image region from the first image and a fourth image region from the second image. Further, the image processing device is configured to combine the cut-out first and fourth image regions to generate a third image.
Thanks to the generated third image, the image processing apparatus needs to inspect only one image instead of two when checking whether the component is damaged or defective, which reduces the time and computational cost of inspecting the images.
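The compositing step described above amounts to selecting, pixel by pixel, the capture in which a region was sharp. A minimal numpy sketch, assuming the sharp region of the first image is already available as a boolean mask:

```python
# Minimal sketch of generating the third image: keep the first image where it
# is sharp, and fill the remaining (unsharp) regions from the second image.
# The mask is an assumed input, e.g. from a tile-wise focus measure.
import numpy as np

def composite(first: np.ndarray, second: np.ndarray, first_sharp: np.ndarray) -> np.ndarray:
    """Third image: first image where `first_sharp` is True, second elsewhere."""
    return np.where(first_sharp, first, second)
```

The result contains only sharply imaged pixels, so defect inspection runs on a single image.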
The image processing device may be further configured to determine whether the component has at least one defect based on the first image region of the first image and/or the fourth image region of the second image and/or the third image. If the image processing device determines that there is at least one defect, the image processing device is configured to provide defect information about the component.
The optical component detection system may comprise a position detection sensor configured to determine a position and/or orientation of the first optically active element and/or of the image sensor surface of the image sensor. The position detection sensor is further configured to provide information about the position and/or orientation of the first optically active element and/or the image sensor surface of the image sensor to the control device, which controls the first and/or second actuator device based on the provided information. The position detection sensor may be an optical or (electro-)mechanical position detection sensor.
The position detection sensor may be mounted on the inner side of the lens barrel. In a further variant, the position detection sensor can be integrated in the image sensor. In this case, the holder has a pattern on a surface area facing the image sensor. The image sensor is configured to identify the pattern on the surface area and to determine the position and/or orientation of the optically active element based on the identified pattern.
The information provided enables the control device to precisely position the first optically active element and/or the image sensor in order to project the sharpness plane of the reflected light onto the image sensor surface of the image sensor.
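The source does not specify the control law; one simple realization of "controls the actuator based on the provided position information" is a proportional feedback loop. The gain, tolerance and iteration limit below are illustrative assumptions.

```python
# Hypothetical sketch of closed-loop positioning with the position sensor:
# a proportional controller drives the actuator until the measured element
# position matches the target. All tuning values are assumptions.

def settle_position(read_position, apply_correction, target: float,
                    gain: float = 0.5, tol: float = 1e-3, max_iter: int = 100) -> float:
    """Iteratively correct the element position toward `target`.

    read_position()       -> current position reported by the sensor
    apply_correction(d)   -> command the actuator to move by `d`
    """
    for _ in range(max_iter):
        error = target - read_position()
        if abs(error) < tol:
            break
        apply_correction(gain * error)  # move a fraction of the remaining error
    return read_position()
```

Feeding back the measured position compensates for actuator hysteresis and drift that an open-loop command could not.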
The camera arrangement may include a second optically active element in the optical path between the first optically active element and the image sensor. The optical axis of the second optically active element is coaxial with the optical axis of the first optically active element.
The first optically effective element can be an achromatic lens or apochromatic lens and/or the second optically effective element can be a condenser lens.
The refractive index of the lens material increases continuously from red to blue light, and the focal length of the lens decreases accordingly. Achromatic lenses are used to compensate for or correct such imaging errors.
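For background, an achromatic doublet corrects this red-to-blue focal shift by combining a crown and a flint element whose powers φ₁, φ₂ and Abbe numbers V₁, V₂ satisfy φ₁/V₁ + φ₂/V₂ = 0. The sketch below derives the element powers for a given total power under the thin-lens approximation; the Abbe numbers are typical catalog values, not taken from the source.

```python
# Thin-lens achromat sketch: solve phi1 + phi2 = phi_total together with the
# achromatic condition phi1/V1 + phi2/V2 = 0. Abbe numbers are illustrative.

def achromat_powers(total_power: float, v_crown: float, v_flint: float):
    """Element powers (1/focal length) of a thin achromatic doublet."""
    phi_crown = total_power * v_crown / (v_crown - v_flint)
    phi_flint = -total_power * v_flint / (v_crown - v_flint)
    return phi_crown, phi_flint

# 100 mm doublet from an assumed crown (V=60) and flint (V=36) pairing
p1, p2 = achromat_powers(1 / 100.0, v_crown=60.0, v_flint=36.0)
```

The positive crown element is paired with a weaker negative flint element, so the dispersions cancel to first order while a net positive power remains.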
Furthermore, an optical component detection system for detecting at least one surface of at least one component is proposed. The receptacle is configured to position the component in front of the camera device for inspection of the first surface of the component by the camera device. The camera device includes an image sensor configured to receive reflected light from the first surface of the component. The optical component detection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed on the inside of a hollow cylindrical lens barrel by means of a linear guide, and the longitudinal center line of the lens barrel is coaxial with the optical axis of the first optically active element. The linear guide is configured to guide the holder parallel to the optical axis. The adjustment device further comprises a first actuator device for adjusting the relative distance between the optically active element and the image sensor in order to displace the first optically active element relative to the image sensor.
Furthermore, an adjustment device for an optically effective element is proposed, which comprises a holder for the first optically effective element. The holder is fixed on the inside of a hollow cylindrical barrel and the longitudinal center line of the barrel is coaxial with the optical axis of the first optically effective element. The holder is elastically bendable at least in the lens barrel longitudinal direction. Furthermore, the adjustment device comprises a first actuator for adjusting the holder in order to displace the first optically active element along the optical axis relative to the barrel.
Furthermore, an adjustment device for an optically effective element is proposed, which comprises a holder for the first optically effective element. The holder is fixed on the inside of a hollow cylindrical lens barrel by means of a linear guide, and the longitudinal center line of the lens barrel is coaxial with the optical axis of the first optically effective element. The linear guide is configured to guide the holder parallel to the optical axis. Furthermore, the adjustment device comprises a first actuator for adjusting the holder in order to displace the first optically effective element along the optical axis relative to the lens barrel.
Furthermore, a method for inspecting at least one surface of at least one component is proposed, which has the following steps: aligning the component with a camera device; detecting a first surface of the component by means of a camera device; receiving reflected light on a first surface of a component with an image sensor of a camera device; holding the first optically active element in the optical path of the reflected light by means of a holder which is elastically bendable in a longitudinal direction, wherein the longitudinal direction is parallel to the optical axis of the optically active element; the relative distance between the image sensor and the first optically active element is adjusted so as to displace the first optically active element along the optical axis relative to the image sensor.
The method may further comprise the steps of: adjusting a relative distance between the image sensor and the first optically active element so as to displace the image sensor along the optical axis relative to the first optically active element; and as an option, adjusting a relative distance between the image sensor and the first optically active element so as to project a sharpness plane of the reflected light onto an image sensor surface of the image sensor facing the reflected light.
The method may further comprise the steps of: a first surface of the assembly is illuminated with light. The light may be light of a particular wavelength or a particular range of wavelengths.
The method may further comprise the steps of: capturing a first image; determining, based on the first image, whether the sharpness plane of the reflected light is substantially completely projected onto the image sensor surface of the image sensor; and, if the sharpness plane of the reflected light is not substantially completely projected onto the image sensor surface of the image sensor, determining a first image region of a first plurality of image regions of the first image, in which a first partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor; determining a second image region of the first plurality of image regions of the first image, in which a second partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; adjusting the relative distance between the first optically active element and the image sensor so as to project the second partial region of the sharpness plane of the reflected light onto the image sensor surface of the image sensor; capturing a second image; determining a third image region of a second plurality of image regions of the second image, in which the first partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; and determining a fourth image region of the second plurality of image regions of the second image, in which the second partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor.
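The capture-analyze-refocus-capture sequence above can be sketched as a small control loop. Camera, actuator and sharpness analysis are abstracted behind hypothetical callables; only the control flow follows the described method, and the two-region split is an illustrative assumption.

```python
# Sketch of the method steps above as a loop: capture, classify which regions
# are sharp, refocus, and capture again until every region of the component
# has been imaged sharply at least once.

def inspect(capture, adjust_distance, sharp_regions):
    """Capture images until every region has appeared sharp in some image.

    capture()            -> image
    adjust_distance(d)   -> move the optically active element by `d`
    sharp_regions(img)   -> set of region ids that are sharp in `img`
    """
    all_regions = {0, 1}          # illustrative: component split into two regions
    first = capture()
    sharp = sharp_regions(first)
    images = [first]
    while sharp != all_regions:
        adjust_distance(1)        # bring the still-unsharp regions into focus
        next_image = capture()
        images.append(next_image)
        sharp |= sharp_regions(next_image)
    return images
```

With a tilted surface, the first image covers one partial region of the sharpness plane and the second image the other, matching the first/fourth image-region pairing described in the method.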
In a first alternative, the method may comprise the steps of: capturing a first image; adjusting the relative distance between the image sensor and the first optically active element by a predetermined path length; and capturing a second image.
In a second alternative, the method may comprise the steps of: capturing a first image; adjusting the relative distance between the image sensor and the first optically active element at a predetermined speed during capture of the first image; and continuing to adjust the relative distance after the first image, or after a predetermined period of time after the first image is captured, while simultaneously capturing a second image.
The method according to the first and second alternative may further comprise the steps of: determining a first image region of a first plurality of image regions of the first image, in which a first partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor; and/or determining a second image region of the first plurality of image regions of the first image, in which a second partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; and/or determining a third image region of a second plurality of image regions of the second image, in which the first partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; and/or determining a fourth image region of the second plurality of image regions of the second image, in which the second partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor.
The method may further comprise the steps of: cutting out a first image area from the first image and a fourth image area from the second image; the cut-out first and fourth image regions are combined to generate a third image.
The method may further comprise the steps of: determining whether the component has at least one defect based on the first image region of the first image and/or the fourth image region of the second image and/or the third image; and if the component has at least one defect, providing defect information about the component.
Although some of the foregoing aspects relate to methods, these aspects may also be applied to devices. Likewise, the above aspects relating to the apparatus may be applied to the method accordingly.
Drawings
Further objects, features, advantages and possible applications emerge from the following description of non-limiting embodiments, which is to be read in conjunction with the accompanying drawings. All described and/or illustrated features, by themselves or in any combination, form the subject matter disclosed herein, irrespective of their grouping in the respective claims or their relationships to one another. The dimensions and proportions of the parts shown in the figures are not necessarily to scale and may differ in the embodiments to be realized.
Fig. 1 shows a schematic side view of an adjusting device for an optically effective element;
figures 2 and 3 show top views of embodiments of a holder of the adjustment device;
FIG. 4 shows a schematic side view of an optical component detection system for detecting at least one surface of at least one component;
fig. 5 and 6 schematically show different image shots with image areas of different sharpness;
FIG. 7 schematically shows an image composed of image areas of the previously captured images;
figures 8 to 10 show side views of the sharpness plane of the component to be imaged relative to the image sensor of the optical component detection system;
FIG. 11 shows a time-velocity diagram according to which the relative distance between the optically active element and the image sensor of the optical assembly detection system is varied;
FIG. 12 shows a time-focus path diagram according to which an embodiment for taking an image of a component is explained;
fig. 13 shows another time-focus path diagram according to which another embodiment for capturing component images can be explained.
The device variants described herein, together with their functions and modes of operation, are intended only to aid understanding of their structure, operation and properties; they do not limit the disclosure to the examples. The drawings are partly schematic, with essential features and effects shown greatly exaggerated in order to clarify functions, operating principles, technical solutions and features. Each mode of operation, principle, technical solution and feature disclosed in the drawings or the text can be combined freely with every feature in the claims, the description and the other drawings, and with other modes of operation, principles, technical solutions and features contained in or derivable from this disclosure, so that all conceivable combinations are associated with the described device. Combinations between all individual statements in the text, that is in every section of the description, in the claims, and between different variants in the text, the claims and the drawings are likewise comprised herein and can form the subject matter of further claims. The claims do not limit the disclosure or the possible combinations of all disclosed features with one another. All disclosed features are explicitly disclosed both individually and in combination with all other features.
Detailed Description
Corresponding or functionally similar components in the figures are provided with the same reference numerals. Apparatus and methods will now be described in accordance with embodiments.
Fig. 1 shows an
The
As shown in fig. 1, the
In fig. 1,
In fig. 1, the first end region of the
In another variant, the position of the first end region of the
The
When a current flows through the
In another variant, the
Because the
A lightweight construction for the optically effective element is thus achieved by the construction shown in fig. 1. Further, the
In another embodiment, the
A possible embodiment of the
In fig. 2, the
According to the variant of the
Fig. 3 shows another variation of
According to the variant shown in fig. 4, the
In fig. 4, an optical
The
In the optical
Alternatively, the optical
In fig. 4, the first surface O of the semiconductor chip B is not arranged parallel to the
The optical
In fig. 4, the control device ECU includes an image processing device BV. In another variant, the control device ECU and the image processing device BV may be two independent units configured to communicate with each other.
Furthermore, the optical
Optical
In another variation, the
The
In fig. 4, the control device ECU is configured to adjust the relative distance between the
In one possible case, the surface O of the semiconductor chip B is substantially parallel to the
If the
The first and/or
Since the semiconductor chip B in fig. 4 is inclined with respect to the
The image processing apparatus BV is configured to capture a first image BA by the
The first captured image BA is schematically shown in fig. 5, in which the image areas with solid black lines are sharply imaged image areas. In other words, the image areas with solid black lines are areas in which partial areas of the sharpness plane SE of the reflected light L are projected onto the
The image processing device BV determines, based on the first image BA, a second image region B2 of the first plurality of image regions B1, B2 in which second image region B2 the second partial region of the sharpness plane SE of the reflected light L is not projected onto the
Next, the image processing apparatus BV supplies a control signal to the control apparatus ECU. The control device ECU controls the first and/or
As schematically shown in fig. 6, the image processing apparatus BV determines, based on the second image BB, a third image region B3 of the second plurality of image regions B3, B4 of the second image BB. In the third image region B3, the first partial region of the sharpness plane SE of the reflected light L is not projected onto the
In a preferred embodiment, the first and second image regions B1, B2 of the first image BA image a portion of the semiconductor chip B, respectively, which substantially corresponds to the portion of the semiconductor chip B in the third and fourth image regions B3, B4 of the second image BB. It is thus ensured that the same portion of the semiconductor chip B is imaged in the image areas B1, B2, B3, B4 of the first and second images BA, BB.
The first and second image regions B1, B2 of the first image BA and the third and fourth image regions B3, B4 of the second image BB are shown in fig. 5 to 7 as connected image regions. Alternatively, the first and second image regions B1, B2 of the first image BA and the third and fourth image regions B3, B4 of the second image BB may at least partially overlap.
The image processing apparatus BV is configured to cut out the first image area B1 from the first image BA and the fourth image area B4 from the second image BB. From the cut-out image regions B1, B4, the image processing apparatus BV generates a third image BC. The third image BC is a completely sharp image of the semiconductor chip B, because the semiconductor chip B is sharply imaged both in the first image region B1 of the first image BA and in the fourth image region B4 of the second image BB.
The image processing means BV are also configured to determine, based on the first image region B1 of the first image BA and/or the fourth image region B4 of the second image BB and/or the third image BC, whether the semiconductor chip B has at least one defect or damage. If the semiconductor chip B has a defect or damage, the image processing apparatus BV provides defect information.
The sharpness plane SE with the corresponding depth of field ST is shown in fig. 8 to 10. In order to image the semiconductor chip B sharply, the sharpness plane SE must be substantially projected on the
As schematically shown in fig. 8, the sharpness plane SE is projected parallel to the
In fig. 9 and 10, the case of the semiconductor chip B being inclined is schematically shown. Since the surface O of the semiconductor chip B is inclined with respect to the
In fig. 9, a first subregion of the sharpness plane SE is projected onto the
The control device ECU controls the first and/or
As schematically shown in fig. 10, in the case of the fourth image region B4 of the captured second image BB, the second partial region of the sharpness plane SE of the reflected light L with the depth of field ST is projected onto the
In light of the foregoing description of the optical
In one step, the semiconductor chip B is aligned to the
In the next step, the light L reflected on the surface O of the semiconductor chip B is received by means of the
A time-velocity (t, v) diagram is shown in fig. 11. According to the figure, the control device ECU is configured to adjust the relative distance between the
At time t0, the relative distance between
After adjusting the relative distance, the first image BA is captured by means of the camera device 220.
Subsequently, the first and second image regions B1, B2 are determined by the image processing device BV based on the first image BA. In the case of the first image BA, in the first image region B1, a first partial region of the sharpness plane SE is projected onto the surface O of the semiconductor chip B.
Then, at the beginning of a third time period (t2-t3), the control device ECU controls the first and/or second actuating device 120, 240 in order to further adjust the relative distance.
Accordingly, until half of the third time period (t2-t3), the relative distance continues to be adjusted.
The image processing device BV is further configured to provide a further control command to the first and/or second actuating device 120, 240 in order to adjust the relative distance between the first optically active element and the image sensor.
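One plausible reading of the time-velocity diagram of fig. 11 is a trapezoidal profile: the actuator ramps up to a predetermined speed, holds it, and ramps back down. A sketch under that assumption; the piecewise-linear shape and all parameter names are illustrative, not taken from the patent:

```python
def velocity_profile(t, t0, t1, t2, t3, v_max):
    """Trapezoidal actuator velocity at time t, as one reading of the
    time-velocity diagram of fig. 11: ramp up during (t0-t1), hold
    v_max during (t1-t2), ramp down during (t2-t3).

    The piecewise-linear shape is an assumption.
    """
    if t < t0 or t > t3:
        return 0.0                              # actuator at rest
    if t < t1:                                  # acceleration phase
        return v_max * (t - t0) / (t1 - t0)
    if t <= t2:                                 # constant-speed phase
        return v_max
    return v_max * (t3 - t) / (t3 - t2)         # deceleration phase
```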
In a further variant, three images, each having three image regions, are captured by means of the camera device 220.
In a first variant, the
Figs. 12 and 13 each show a time-focus-path diagram, in which the position of the sharpness plane SE is plotted against time.
In fig. 12, before time point t0, the semiconductor chip B is positioned in front of the camera device 220.
In
The predetermined path length depends on the optical characteristics of the first optically active element.
After the adjustment of the relative distance in
After completing the capturing of the second image in
After the adjustment of the relative distance in
After capturing the third image in
After
In another variant, after one of the
In another variant, the image processing means BV carry out
The image processing and the inspection of the images of the semiconductor chip B can be decoupled from the image-capturing process. By making the capture of the images of the semiconductor chip B independent of their processing and inspection, the capture sequence is reduced to fewer steps. Accordingly, the capture duration per component is reduced, and a larger number of components can be captured and inspected in the same time.
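The decoupling described above can be sketched as a producer/consumer pipeline: the capture loop pushes frames into a queue while a worker thread inspects them, so the camera never waits for an inspection result. The queue-based design and all function names are illustrative assumptions, not the patent's implementation:

```python
import queue
import threading

def run_pipeline(capture_next, process, n_images):
    """Decouple image capture from processing: the capture loop pushes
    frames into a queue while a worker thread inspects them, so the next
    component can be captured without waiting for the inspection result.

    capture_next and process are hypothetical callables standing in for
    the camera device and the image processing device BV.
    """
    frames = queue.Queue()
    results = []

    def worker():
        while True:
            frame = frames.get()
            if frame is None:            # sentinel: capture finished
                break
            results.append(process(frame))

    t = threading.Thread(target=worker)
    t.start()
    for _ in range(n_images):
        frames.put(capture_next())       # capture never blocks on processing
    frames.put(None)
    t.join()
    return results
```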
In fig. 13, before time point t0, the semiconductor chip B is positioned in front of the camera device 220.
The timing of adjusting the relative distance between the first optically active element and the image sensor differs from that of fig. 12.
In fig. 13, the predetermined speed is kept constant over the entire travel path during image generation. In another variant, the speed for adjusting the relative distance between the first optically active element and the image sensor varies along the travel path.
After the sharpness plane SE has been adjusted by a predetermined path length and/or for a predetermined time, a second image is captured by means of the camera device 220.
After a further adjustment of the sharpness plane SE by the predetermined path length and/or the predetermined time, a third image is captured in step 450 by means of the camera device 220.
According to fig. 13, the offset
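Under the constant-speed reading of fig. 13, the capture instants follow directly from the predetermined path length between successive images. A minimal sketch; the uniform spacing and the function name are assumptions drawn from the constant-speed description:

```python
def capture_times(t_start, v, step, n_images):
    """Times at which images are captured while the sharpness plane SE
    moves at constant speed v, one capture every `step` of focus travel
    (as in the constant-speed variant of fig. 13).

    Uniform spacing along the travel path is an assumption.
    """
    return [t_start + i * step / v for i in range(n_images)]
```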