Three-dimensional measurement device, control device, and robot system

Document No.: 1685400 | Publication date: 2020-01-03

Description: This invention, "Three-dimensional measurement device, control device, and robot system", was created by 若林修一, 日野真希子, 堀口宏贞, and 石田大辅 on 2019-06-24. Abstract: The invention relates to a three-dimensional measurement device, a control device, and a robot system. The three-dimensional measurement device includes: a human presence information receiving unit that receives information from a human presence sensor that detects a person within a detection range; a laser irradiation unit that irradiates a region including an object with laser light; an irradiation control unit that controls the output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output; an imaging unit that images the object irradiated with the laser light and acquires image data; and a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data. When the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range, the irradiation control unit sets the output of the laser light to the first output.

1. A three-dimensional measurement device for performing three-dimensional measurement of an object using laser light, the three-dimensional measurement device comprising:

a human presence information receiving unit that receives information from a human presence sensor that detects a person within a detection range;

a laser irradiation unit configured to irradiate a region including the object with the laser light;

an irradiation control unit that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output;

an imaging unit that images the object irradiated with the laser light and acquires image data; and

a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data,

wherein, when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range, the irradiation control unit sets the output of the laser light to the first output.

2. The three-dimensional measurement device according to claim 1, wherein,

when the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range, the irradiation control unit sets the output of the laser light to the second output.

3. The three-dimensional measurement device according to claim 1, wherein,

when irradiation of the laser light is started at the second output and the human presence information receiving unit receives the first information before the irradiation of the laser light is completed, the irradiation control unit maintains the irradiation of the laser light at the second output.

4. The three-dimensional measurement device according to claim 1, wherein,

when irradiation of the laser light is started at the first output and the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range before the irradiation of the laser light is completed, the irradiation control unit switches the output of the laser light to the second output or stops the irradiation of the laser light.

5. The three-dimensional measurement device according to claim 1, wherein

the three-dimensional measurement device further comprises an imaging control unit that controls an imaging mode of the imaging unit,

the imaging control unit has a first imaging mode and a second imaging mode as the imaging mode,

in the first imaging mode, an exposure time of the imaging unit is a first exposure time,

in the second imaging mode, the exposure time of the imaging unit is longer than the first exposure time, and

when the output of the laser light is the first output, the imaging control unit sets the imaging mode to the first imaging mode.

6. The three-dimensional measurement device according to claim 5, wherein,

when the output of the laser light is the second output, the imaging control unit sets the imaging mode to the second imaging mode.

7. The three-dimensional measurement device according to claim 1, wherein

the laser irradiation unit has a mirror that diffuses the laser light.

8. The three-dimensional measurement device according to claim 1, wherein

the laser irradiation unit has a lens that diffuses the laser light.

9. The three-dimensional measurement device according to claim 1, wherein

the laser irradiation unit includes a diffractive optical element that diffuses the laser light.

10. A control device connected to a laser irradiation unit that irradiates a region including an object with laser light and to an imaging unit that images the object irradiated with the laser light and acquires image data, the control device comprising:

a human presence information receiving unit that receives information from a human presence sensor that detects a person within a detection range; and

a processor that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output,

wherein, when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range, the processor sets the output of the laser light to the first output.

11. The control device according to claim 10, wherein,

when the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range, the processor sets the output of the laser light to the second output.

12. The control device according to claim 10, wherein,

when irradiation of the laser light is started at the second output and the human presence information receiving unit receives the first information before the irradiation of the laser light is completed, the processor maintains the irradiation of the laser light at the second output.

13. The control device according to claim 10, wherein,

when irradiation of the laser light is started at the first output and the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range before the irradiation of the laser light is completed, the processor switches the output of the laser light to the second output or stops the irradiation of the laser light.

14. A robot system is characterized by comprising:

a robot;

a human presence sensor that detects a person within a detection range;

a three-dimensional measurement device that performs three-dimensional measurement of an object using laser light; and

a robot control device that controls an operation of the robot based on a measurement result of the three-dimensional measurement device,

wherein the three-dimensional measurement device includes:

a human presence information receiving unit that receives information from the human presence sensor that detects a person within the detection range;

a laser irradiation unit configured to irradiate a region including the object with the laser light;

a processor that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output; and

an imaging unit that images the object irradiated with the laser light and acquires image data,

and wherein, when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range, the processor sets the output of the laser light to the first output.

15. The robot system according to claim 14, wherein,

when the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range, the processor sets the output of the laser light to the second output.

16. The robot system according to claim 14, wherein,

when irradiation of the laser light is started at the second output and the human presence information receiving unit receives the first information before the irradiation of the laser light is completed, the processor maintains the irradiation of the laser light at the second output.

17. The robot system according to claim 14, wherein,

when irradiation of the laser light is started at the first output and the human presence information receiving unit receives, from the human presence sensor, second information indicating that a person is present within the detection range before the irradiation of the laser light is completed, the processor switches the output of the laser light to the second output or stops the irradiation of the laser light.

18. The robot system according to claim 14, wherein

the three-dimensional measurement device has an imaging control unit that controls an imaging mode of the imaging unit,

the imaging control unit has a first imaging mode and a second imaging mode as the imaging mode,

in the first imaging mode, an exposure time of the imaging unit is a first exposure time,

in the second imaging mode, the exposure time of the imaging unit is longer than the first exposure time, and

when the output of the laser light is the first output, the imaging control unit sets the imaging mode to the first imaging mode.

19. The robot system according to claim 18, wherein,

when the output of the laser light is the second output, the imaging control unit sets the imaging mode to the second imaging mode.

Technical Field

The invention relates to a three-dimensional measurement device, a control device, and a robot system.

Background

Patent document 1 describes an articulated robot equipped with a three-dimensional shape measuring device. The three-dimensional shape measuring device mounted on the articulated robot includes a measuring laser irradiator that projects a pattern onto an object by scanning measuring laser light over the object, and a light receiver that acquires an image of the object onto which the pattern is projected, and is configured to perform three-dimensional measurement of the object based on the image acquired by the light receiver.

Patent document 1: Japanese Patent Laid-Open Publication No. 2004-333369

Here, when three-dimensional measurement of an object is performed with a three-dimensional shape measuring apparatus that uses a measuring laser, measures such as reducing the output (intensity) of the measuring laser need to be taken in case a person is present around the robot, for example. However, if the output of the measuring laser is reduced, the contrast of the pattern projected onto the object decreases, and the three-dimensional measurement accuracy for the object deteriorates.

Disclosure of Invention

A three-dimensional measurement device according to the present invention is a three-dimensional measurement device that performs three-dimensional measurement of an object using laser light, the three-dimensional measurement device including: a human presence information receiving unit that receives information from a human presence sensor that detects a person within a detection range; a laser irradiation unit configured to irradiate a region including the object with the laser light; an irradiation control unit that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output; an imaging unit that images the object irradiated with the laser light and acquires image data; and a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data, wherein the irradiation control unit sets the output of the laser light to the first output when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range.

The control device according to the present invention is a control device connected to a laser irradiation unit that irradiates a region including an object with laser light and to an imaging unit that images the object irradiated with the laser light and acquires image data, the control device including: a human presence information receiving unit that receives information from a human presence sensor that detects a person within a detection range; and a processor that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output, wherein the processor sets the output of the laser light to the first output when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range.

The robot system of the present invention is characterized by comprising: a robot; a human presence sensor that detects a person within a detection range; a three-dimensional measurement device that performs three-dimensional measurement of an object using laser light; and a robot control device that controls an operation of the robot based on a measurement result of the three-dimensional measurement device, the three-dimensional measurement device including: a human presence information receiving unit that receives information from the human presence sensor; a laser irradiation unit configured to irradiate a region including the object with the laser light; a processor that controls an output of the laser light irradiated from the laser irradiation unit to a first output and to a second output lower than the first output; and an imaging unit that images the object irradiated with the laser light to acquire image data, wherein the processor sets the output of the laser light to the first output when the human presence information receiving unit receives, from the human presence sensor, first information indicating that no person is present within the detection range.

Drawings

Fig. 1 is a diagram showing an overall configuration of a robot system according to a first embodiment of the present invention.

Fig. 2 is a diagram showing the overall configuration of the three-dimensional measurement device.

Fig. 3 is a plan view showing an optical scanning unit included in the three-dimensional measurement device shown in fig. 2.

Fig. 4 is a plan view showing a projection pattern projected by the laser irradiation section.

Fig. 5 is a plan view showing a detection range set in the robot system.

Fig. 6 is a plan view showing a detection range set in the robot system according to the second embodiment of the present invention.

Fig. 7 is a plan view showing a detection range set in the robot system according to the third embodiment of the present invention.

Fig. 8 is a diagram showing an overall configuration of a laser irradiation unit included in a robot system according to a fourth embodiment of the present invention.

Fig. 9 is a plan view showing a diffractive optical element included in the laser irradiation unit shown in fig. 8.

Description of the reference numerals

1 … robot system; 2 … robot; 21 … base; 221 to 226 … first to sixth arms; 24 … end effector; 251 to 256 … first to sixth drive devices; 3 … human presence sensor; 31 … camera; 4 … three-dimensional measurement device; 41 … laser irradiation unit; 42 … laser light source; 43 … diffractive optical element; 431 to 438 … diffraction gratings; 44 … optical system; 441 … collimating lens; 442 … rod lens; 445 … projection lens; 45 … optical scanning unit; 451 … movable portion; 452 … support portion; 453 … beam portion; 454 … mirror; 455 … permanent magnet; 456 … coil; 457 … piezoresistive portion; 47 … imaging unit; 471 … camera; 472 … imaging element; 473 … condenser lens; 48 … control device; 481 … human presence information receiving unit; 482 … irradiation control unit; 483 … optical scanning control unit; 484 … imaging control unit; 485 … point cloud data generation unit; 49 … actuator; 5 … robot control device; 6 … host computer; 61 … calculation unit; A … central axis; J … rotation axis; L … laser light; O1 to O6 … first to sixth axes; P … projection pattern; S1 … maximum movable region; S2, S2', S2'' … high-intensity laser irradiation regions; S3, S3', S3'' … laser output control regions; S4 … robot drive control region; W … object.

Detailed Description

Hereinafter, the three-dimensional measurement device, the control device, and the robot system according to the present invention will be described in detail based on the embodiments shown in the drawings.

< first embodiment >

Fig. 1 is a diagram showing an overall configuration of a robot system according to a first embodiment of the present invention. Fig. 2 is a diagram showing the overall configuration of the three-dimensional measurement device. Fig. 3 is a plan view showing an optical scanning unit included in the three-dimensional measurement device shown in fig. 2. Fig. 4 is a plan view showing a projection pattern projected by the laser irradiation section. Fig. 5 is a plan view showing a detection range set in the robot system.

The robot system 1 shown in Fig. 1 is a human-coexistence robot system, that is, a system designed on the premise that people work around it. Accordingly, the robot system 1 is configured to detect the presence of a person within the detection range and to respond appropriately.

The robot system 1 includes a robot 2, a human presence sensor 3 that detects a person within a detection range set around the robot 2, a three-dimensional measurement device 4 that three-dimensionally measures an object W using laser light L, a robot control device 5 that controls the driving of the robot 2 based on the measurement result of the three-dimensional measurement device 4, and a host computer 6 that can communicate with the robot control device 5. These units may communicate with one another by wire or wirelessly, and the communication may also take place via a network such as the Internet.

In the robot system 1, when the human presence sensor 3 detects a person within the detection range, the output of the laser light L is reduced to a level that is safe even for the eyes of a person who has entered the detection range. The robot system 1 is thereby safe for a person within the detection range. Such a robot system 1 will be described in detail below.

< robot >

The robot 2 is a robot that performs operations such as feeding, discharging, transporting, and assembling of precision equipment and its constituent components, for example. However, the use of the robot 2 is not particularly limited.

The robot 2 is a six-axis robot, and as shown in Fig. 1, includes a base 21 fixed to a floor or a ceiling, a first arm 221 connected to the base 21 so as to be rotatable about a first axis O1, a second arm 222 connected to the first arm 221 so as to be rotatable about a second axis O2, a third arm 223 connected to the second arm 222 so as to be rotatable about a third axis O3, a fourth arm 224 connected to the third arm 223 so as to be rotatable about a fourth axis O4, a fifth arm 225 connected to the fourth arm 224 so as to be rotatable about a fifth axis O5, and a sixth arm 226 connected to the fifth arm 225 so as to be rotatable about a sixth axis O6. Further, a mounting portion is provided on the sixth arm 226, and the end effector 24 corresponding to the work performed by the robot 2 is attached to this mounting portion.

The robot 2 further includes a first drive device 251 that rotates the first arm 221 relative to the base 21, a second drive device 252 that rotates the second arm 222 relative to the first arm 221, a third drive device 253 that rotates the third arm 223 relative to the second arm 222, a fourth drive device 254 that rotates the fourth arm 224 relative to the third arm 223, a fifth drive device 255 that rotates the fifth arm 225 relative to the fourth arm 224, and a sixth drive device 256 that rotates the sixth arm 226 relative to the fifth arm 225. The first to sixth drive devices 251 to 256 each include, for example, a motor as a drive source, a controller that controls the driving of the motor, and an encoder that detects the amount of rotation of the motor. The first to sixth drive devices 251 to 256 are independently controlled by the robot control device 5.

Note that the robot 2 is not limited to the configuration of the present embodiment; the number of arms may be, for example, one to five, or seven or more. The robot 2 may also be, for example, a SCARA robot or a dual-arm robot.

< robot control device >

The robot control device 5 receives position commands for the robot 2 from the host computer 6 and independently controls the driving of the first to sixth drive devices 251 to 256 so that the arms 221 to 226 move to the commanded positions. The robot control device 5 is constituted by, for example, a computer, and includes a processor (CPU) that processes information, a memory communicably connected to the processor, and an external interface. Various programs executable by the processor are stored in the memory, and the processor can read and execute them.

< human presence sensor >

The human presence sensor 3 detects the presence or absence of a person in a laser output control area S3 (detection range) set around the robot 2, and transmits the detection result to the three-dimensional measurement device 4. Note that, hereinafter, the first information output from the human presence sensor 3 when no person is present in the laser output control area S3 is also referred to as the "undetected signal", and the second information output from the human presence sensor 3 when a person is present in the laser output control area S3 is also referred to as the "detected signal".

The structure of the human presence sensor 3 is not particularly limited as long as it can achieve this purpose. As shown in Fig. 1, the human presence sensor 3 of the present embodiment includes a camera 31 that is provided above the robot 2 and can capture the entire laser output control area S3, and is configured to detect a person in the laser output control area S3 based on the image data captured by the camera 31. However, the placement of the camera 31 is not limited to the ceiling; it may be provided, for example, on a wall, on a movable or fixed stand, or on the floor. The number of cameras 31 is not particularly limited and may be two or more. The camera 31 may also be provided on the robot 2; in this case, a camera 471 described later may double as the camera 31. In addition, a weight sensor, a laser sensor, an infrared sensor, a capacitance sensor, or the like may be used as the human presence sensor 3.
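For illustration only (the patent itself contains no code), the signalling just described can be sketched in Python as follows; the pixel-count rule, its threshold, and all names here are assumptions, not part of the disclosed device.

```python
from enum import Enum

class PresenceInfo(Enum):
    """The two signals the human presence sensor 3 can send."""
    UNDETECTED = "first information: no person in the detection range"
    DETECTED = "second information: a person is in the detection range"

def classify_frame(person_pixels: int, threshold: int = 500) -> PresenceInfo:
    # Hypothetical rule: if an upstream person detector attributes at least
    # `threshold` pixels of the camera 31 frame to a person, report DETECTED.
    if person_pixels >= threshold:
        return PresenceInfo.DETECTED
    return PresenceInfo.UNDETECTED
```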

< three-dimensional measurement device >

The three-dimensional measurement device 4 detects the posture, position, and the like of the object W by a phase shift method. As shown in Fig. 2, the three-dimensional measurement device 4 includes: a laser irradiation unit 41 that irradiates a region including the object W with laser light L; an imaging unit 47 that images the object W irradiated with the laser light L and acquires image data; and a control device 48 that controls the driving of the laser irradiation unit 41 and the imaging unit 47 and generates three-dimensional point cloud data of the object W from the image data acquired by the imaging unit 47.

Of these components, the laser irradiation unit 41 and the imaging unit 47 are fixed to the fifth arm 225 of the robot 2. The laser irradiation unit 41 is arranged to irradiate the laser light L toward the distal end side (end effector 24 side) of the fifth arm 225, and the imaging unit 47 is arranged to image a region on the distal end side (end effector 24 side) of the fifth arm 225 that includes the irradiation range of the laser light L.

Here, no matter how the arms 221 to 224 and 226 other than the fifth arm 225 move, the end effector 24 remains positioned on the distal end side of the fifth arm 225. Therefore, by fixing the laser irradiation unit 41 and the imaging unit 47 to the fifth arm 225, the three-dimensional measurement device 4 can always emit the laser light L toward the distal end side of the end effector 24 and image the distal end side of the end effector 24. Consequently, whatever posture the end effector 24 takes when it is about to grip the object W, that is, whatever posture it faces the object W in, the laser light L can be irradiated toward the object W and the object W can be imaged in that posture. The three-dimensional measurement of the object W can therefore be performed more reliably.

However, the arrangement of the laser irradiation unit 41 and the imaging unit 47 is not particularly limited; they may be fixed to any of the first to fourth arms 221 to 224 or the sixth arm 226. The laser irradiation unit 41 and the imaging unit 47 may also be fixed to different arms, and at least one of them may be fixed to an immovable part such as the base 21, the floor, the ceiling, or a wall.

The laser irradiation unit 41 has the function of projecting a predetermined projection pattern P (see Fig. 4) onto the object W by irradiating the object W with the laser light L. The laser irradiation unit 41 includes a laser light source 42 that emits the laser light L, an optical system 44 including a plurality of lenses through which the laser light L passes, and an optical scanning unit 45 that scans the laser light L passing through the optical system 44 toward the object W.

The laser light source 42 is not particularly limited; for example, a semiconductor laser such as a vertical cavity surface emitting laser (VCSEL) or a vertical external cavity surface emitting laser (VECSEL) can be used. The wavelength of the laser light L is likewise not particularly limited and may be in the visible region (400 nm to 700 nm), the near-infrared region (700 nm to 1400 nm), or elsewhere in the invisible region (400 nm or less, or 1400 nm to 1 mm). However, the wavelength of the laser light L is preferably in the visible region (400 nm to 700 nm). In the visible region, even if the laser light L enters the eyes of a person who coexists with the robot 2, the person immediately senses the glare and reflexively blinks in defense. Setting the wavelength of the laser light L in the visible region therefore makes the robot system 1 safer.

The optical system 44 includes: a collimator lens 441 that collimates the laser light L emitted from the laser light source 42; and a rod lens 442 (lens) that shapes the laser light L collimated by the collimator lens 441 into a line extending in a direction parallel to a rotation axis J described later (the depth direction of the page in Fig. 2).

The optical scanning unit 45 has the function of scanning the laser light L shaped into a line by the rod lens 442. This allows the laser light L to be diffused and irradiated two-dimensionally (in a planar manner). By diffusing the laser light L two-dimensionally in this way, the output per unit area decreases as the optical path of the laser light L becomes longer. The optical scanning unit 45 is not particularly limited; for example, MEMS (Micro-Electro-Mechanical Systems), galvanometer mirrors, polygon mirrors, and the like can be used.

The optical scanning unit 45 of the present embodiment is a MEMS device. As shown in Fig. 3, the optical scanning unit 45 includes a movable portion 451, a support portion 452 that supports the movable portion 451, a beam portion 453 that connects the movable portion 451 and the support portion 452 and allows the movable portion 451 to rotate relative to the support portion 452 about a rotation axis J, a mirror 454 disposed on the front surface (the near side of the page in Fig. 3) of the movable portion 451 to reflect the laser light L, a permanent magnet 455 disposed on the back surface (the far side of the page in Fig. 3) of the movable portion 451, and a coil 456 disposed to face the permanent magnet 455. Note that the movable portion 451, the support portion 452, and the beam portion 453 are integrally formed from, for example, a silicon substrate.

The optical scanning unit 45 is disposed so that the rotation axis J substantially coincides with the extending direction of the line-shaped laser light L. When a drive signal (an alternating voltage) is applied to the coil 456, the movable portion 451 rotates back and forth about the rotation axis J, thereby scanning the line-shaped laser light L.

The optical scanning unit 45 also includes a piezoresistive portion 457 provided on the support portion 452. The resistance value of the piezoresistive portion 457 changes according to the stress generated in the support portion 452 as the movable portion 451 rotates about the rotation axis J. The rotation angle of the movable portion 451 can therefore be detected from the change in the resistance value of the piezoresistive portion 457. The piezoresistive portion 457 can be formed by doping (diffusing or implanting) the silicon substrate with an impurity such as phosphorus or boron.

The laser irradiation unit 41 has been described above. As explained, in the laser irradiation unit 41 the laser light L is diffused two-dimensionally by the optical system 44 and the optical scanning unit 45. Therefore, as the distance from the laser irradiation unit 41 increases, in other words, as the optical path length of the laser light L increases, the intensity of the laser light L, that is, the energy per unit time per unit area in each region that the laser light L can irradiate, decreases. This configuration more effectively prevents high-intensity laser light L from entering the eyes of a person coexisting with the robot 2, so the robot system 1 is safe for such a person.

The configuration of the laser irradiation unit 41 is not particularly limited as long as the predetermined projection pattern P can be projected onto the object W. For example, in the present embodiment the laser light L is spread into a line by the optical system 44, but the present invention is not limited thereto; the line may instead be produced using a MEMS scanner or a galvanometer mirror. That is, the laser light L may be scanned two-dimensionally by two optical scanning units 45. Alternatively, the laser light L may be scanned two-dimensionally using a gimbal-type MEMS device with two axes of rotational freedom.

The imaging unit 47 images the state in which the projection pattern P is projected onto at least one object W. That is, the imaging unit 47 images at least one object W together with the projection pattern P. As shown in Fig. 2, the imaging unit 47 is constituted by, for example, a camera 471, and the camera 471 includes an imaging element 472, such as a CMOS image sensor or a CCD image sensor, and a condenser lens 473.

As shown in Fig. 2, the control device 48 includes a human presence information receiving unit 481 that receives information from the human presence sensor 3, an irradiation control unit 482 that controls the driving of the laser light source 42, an optical scanning control unit 483 that controls the driving of the optical scanning unit 45, an imaging control unit 484 that controls the driving of the imaging unit 47, and a point cloud data generation unit 485 that generates three-dimensional point cloud data of a region including the object W based on the image data acquired by the imaging unit 47.

The control device 48 is constituted by, for example, a computer, and has a processor (CPU) that processes information, a memory communicably connected to the processor, and an external interface. Various programs executable by the processor are stored in the memory, and the processor can read and execute them.

The optical scanning control unit 483 controls the driving of the optical scanning unit 45 by applying a drive signal to the coil 456, and detects the rotation angle of the movable portion 451 from the change in the resistance value of the piezoresistive portion 457. The irradiation control unit 482 controls the driving of the laser light source 42 by applying a drive signal to the laser light source 42. The irradiation control unit 482 emits the laser light L from the laser light source 42 in synchronization with the rotation of the movable portion 451 detected by the optical scanning control unit 483, thereby forming a striped projection pattern P of light and dark luminance values on the object W, as shown in Fig. 4, for example.
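To make the synchronization concrete, here is a minimal sketch (assumed, not taken from the patent) of how the laser power could be modulated against the mirror angle so that each scan draws one sinusoidal stripe pattern; the normalization and parameter names are illustrative.

```python
import math

def stripe_power(scan_angle_rad: float, max_angle_rad: float,
                 n_periods: int, phase_rad: float, peak_power: float) -> float:
    """Laser power for the current mirror angle so the scanned line draws a
    light/dark sinusoidal stripe pattern.  phase_rad is the phase-shift step
    (0, pi/2, pi, 3*pi/2 for the four projections described below)."""
    # Normalize the mirror angle to a 0..1 position across the scan field.
    x = (scan_angle_rad + max_angle_rad) / (2.0 * max_angle_rad)
    # Sinusoidal brightness profile, kept non-negative in [0, peak_power].
    return peak_power * 0.5 * (1.0 + math.cos(2.0 * math.pi * n_periods * x + phase_rad))
```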

The irradiation control unit 482 can select, as the output of the laser light L, a first output and a second output lower than the first output. The irradiation control unit 482 may additionally be configured so that a third output lower than the second output, a fourth output lower than the third output, and so on can be selected. That is, the number of selectable output levels provided by the irradiation control unit 482 is not particularly limited.

Here, the irradiation control unit 482 causes the laser light L to be emitted from the laser light source 42 only after the movable portion 451 has started rotating. That is, the timing of emitting the laser light L from the laser light source 42 is later than the timing at which the movable portion 451 starts rotating. If the laser light L were emitted while the movable portion 451 was not rotating and its posture was fixed, the laser light L would be irradiated continuously onto the same position. If the eye of a person coexisting with the robot 2 happened to lie on the optical path of the laser light L, the laser light L would keep entering that eye and, depending on the intensity of the laser light L, could adversely affect it. In contrast, if the movable portion 451 starts rotating before the laser light L is emitted, the laser light L is scanned and is not irradiated continuously onto the same position. The above problem is therefore unlikely to occur, and the robot system 1 is safer for a person coexisting with the robot 2.

The irradiation control unit 482 could determine whether the movable portion 451 is rotating from whether the optical scanning control unit 483 is applying a drive signal to the coil 456, but it preferably determines this instead from the change in the resistance value of the piezoresistive portion 457. For example, the movable portion 451 might fail to start rotating despite the drive signal being applied to the coil 456 because of a failure or disconnection of the optical scanning unit 45. In contrast, since the resistance value of the piezoresistive portion 457 does not change unless the movable portion 451 actually rotates, a change in that resistance value reliably confirms that the movable portion 451 has started rotating. This makes the robot system 1 safer for a person coexisting with the robot 2.
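This rotate-first interlock can be sketched as follows (illustrative Python; the resistance-reading callback, tolerance, and timeout values are assumptions):

```python
import time

def confirm_rotation_before_emission(read_piezo_ohms, idle_ohms: float,
                                     rel_tol: float = 0.01,
                                     timeout_s: float = 1.0) -> bool:
    """Return True only after the piezoresistive portion 457 confirms that
    the movable portion 451 is actually oscillating; the laser light source
    42 should be enabled only on a True result."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        # A resistance deviation from the idle value means real rotation,
        # which a merely-applied drive signal cannot guarantee.
        if abs(read_piezo_ohms() - idle_ohms) / idle_ohms > rel_tol:
            return True
        time.sleep(0.001)
    return False  # possible failure or disconnection: keep the laser off
```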

The imaging control unit 484 controls the driving of the imaging unit 47 (camera 471). Here, the projection pattern P is projected four times, shifted by π/2 each time, and the imaging control unit 484 causes the imaging unit 47 to image the object W with the projection pattern P projected on it each time. However, the number of projections of the projection pattern P is not particularly limited as long as the phase can be calculated from the imaging results. Similar projection and imaging may also be performed with patterns of larger or smaller pitch, followed by phase unwrapping (phase connection). Increasing the number of pitch types increases the measurement range and resolution, but the time required to acquire the image data grows with the number of images, lowering the work efficiency of the robot 2. The number of projections of the projection pattern P may therefore be set appropriately, balancing the accuracy and measurement range of the three-dimensional measurement against the work efficiency of the robot 2.
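The four π/2-shifted projections permit the standard four-step phase-shift calculation. The sketch below uses the textbook formula (it is not lifted from the patent) to recover the wrapped phase at each pixel:

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four images captured with the stripe pattern shifted by 0, pi/2, pi,
    and 3*pi/2.  With I_k = A + B*cos(phi + k*pi/2) per pixel:
        I3 - I1 = 2B*sin(phi),   I0 - I2 = 2B*cos(phi),
    so the wrapped phase is atan2(I3 - I1, I0 - I2), in (-pi, pi]."""
    i0, i1, i2, i3 = (np.asarray(a, dtype=float) for a in (i0, i1, i2, i3))
    return np.arctan2(i3 - i1, i0 - i2)
```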

The imaging control unit 484 has, as imaging modes of the imaging unit 47, a first imaging mode in which the object W is imaged with a first exposure time and a second imaging mode in which the object W is imaged with a second exposure time longer than the first exposure time, and can select between them. Note that the imaging control unit 484 may additionally have a third imaging mode in which the object W is imaged with a third exposure time longer than the second exposure time, a fourth imaging mode with a fourth exposure time longer than the third, and so on. That is, the number of imaging modes provided by the imaging control unit 484 is not particularly limited.

As described above, the irradiation control unit 482 can select, as the output of the laser light L, the first output and the second output, whose laser intensity is lower than the first output. When the irradiation control unit 482 selects the first output, the imaging control unit 484 selects the first imaging mode. At the first output, the projection pattern P is projected with ample brightness, so giving the first imaging mode a short exposure time makes it possible to acquire image data with appropriate brightness and contrast while suppressing blown-out highlights (overexposure) in the image acquired by the camera 471. Conversely, when the irradiation control unit 482 selects the second output, the imaging control unit 484 selects the second imaging mode. At the second output, the projection pattern P is darker than at the first output, so giving the second imaging mode a longer exposure time makes it possible to acquire image data with appropriate brightness and contrast while suppressing crushed shadows (underexposure).

By switching between the first imaging mode and the second imaging mode in accordance with the output of the laser light L in this way, image data with the brightness and contrast needed for three-dimensional measurement of the object W can be acquired by the camera 471 regardless of whether the projection pattern P is projected at the first output or the second output.

The first exposure time and the second exposure time are not particularly limited and may be set appropriately according to the brightness of the projection pattern P projected at the first output and at the second output. For example, it is preferable to set the exposure amounts (intensity of light incident on the imaging element × exposure time) to be equal in the first imaging mode and the second imaging mode. The image data acquired in the first imaging mode and the image data acquired in the second imaging mode can thereby be made more homogeneous.
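Equal exposure amounts pin down the second exposure time once the two output powers are known. A one-line sketch with illustrative numbers (the values are assumptions, not from the patent):

```python
def second_exposure_time(t1_s: float, p1: float, p2: float) -> float:
    """Choose t2 so that exposure amount (incident intensity x exposure time)
    matches the first mode: t2 = t1 * (P1 / P2).
    Example: t1 = 0.010 s, P1 = 1.0, P2 = 0.25 (arbitrary units) -> t2 = 0.040 s."""
    return t1_s * (p1 / p2)
```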

In the above configuration the exposure time is changed according to the output of the laser light L, but the present invention is not limited thereto; for example, the aperture value may be changed according to the output of the laser light L instead. For example, when the irradiation control unit 482 selects the first output, the imaging control unit 484 may select a first aperture value, and when the irradiation control unit 482 selects the second output, the imaging control unit 484 may select a second aperture value that is more open (a smaller f-number) than the first aperture value. However, changing the aperture value changes the depth of field of the camera 471 and may defocus the image, so it is preferable to change the exposure time as in the present embodiment.

The point cloud data generation unit 485 generates three-dimensional point cloud data of a region including the object W from the plurality of image data acquired by the imaging unit 47, using the phase shift method. The three-dimensional point cloud data generated by the point cloud data generation unit 485 is then transmitted to the host computer 6. The three-dimensional point cloud data is, for example, data recording the three-dimensional coordinates of each point in the image data.
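How phase becomes a point cloud depends on the full projector/camera calibration, which the patent does not spell out. The sketch below uses a deliberately simplified reference-plane model (depth proportional to phase deviation) purely to show the data flow; every constant in it is an assumption:

```python
import numpy as np

def point_cloud_from_phase(phase, phase_ref, k_mm_per_rad: float, pixel_mm: float):
    """Toy reference-plane model: depth is taken proportional to the deviation
    of the unwrapped phase from that of a flat reference plane; x and y come
    straight from the pixel grid.  Returns an (N, 3) array of XYZ points."""
    phase = np.asarray(phase, dtype=float)
    z = k_mm_per_rad * (phase - phase_ref)            # per-pixel depth [mm]
    ys, xs = np.mgrid[0:phase.shape[0], 0:phase.shape[1]]
    points = np.stack([xs * pixel_mm, ys * pixel_mm, z], axis=-1)
    return points.reshape(-1, 3)
```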

The host computer 6 includes a calculation unit 61, which calculates three-dimensional information including the posture and position (spatial coordinates) of the object W based on the three-dimensional point cloud data received from the point cloud data generation unit 485. For example, the posture and position of the object W can be calculated by storing shape information of the object W in the calculation unit 61 and matching the three-dimensional point cloud data against the shape of the object W. However, the present invention is not limited to this; the shape of the object W may instead be obtained from the three-dimensional point cloud data itself.
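The patent does not detail the matching step either. Once point correspondences between the stored shape and the measured cloud are available (e.g., from an ICP-style loop), the rigid pose can be recovered with the standard Kabsch/SVD alignment sketched below; this is a generic technique offered for illustration, not the claimed method:

```python
import numpy as np

def rigid_align(model_pts, scene_pts):
    """Kabsch alignment for matched point pairs: returns R (3x3) and t (3,)
    such that scene_pts ~= (R @ model_pts.T).T + t."""
    model_pts, scene_pts = np.asarray(model_pts), np.asarray(scene_pts)
    m, s = model_pts.mean(axis=0), scene_pts.mean(axis=0)
    h = (model_pts - m).T @ (scene_pts - s)           # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))            # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, s - r @ m
```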

The host computer 6 generates a position command for the robot 2 based on the calculated three-dimensional information of the object W, and transmits the generated position command to the robot control device 5. The robot control device 5 independently drives the first to sixth drive devices 251 to 256 based on the position command received from the host computer 6, and moves the first to sixth arms 221 to 226 to the commanded positions.

Note that, in the present embodiment, the host computer 6 includes the calculation unit 61, but the present invention is not limited thereto; for example, the three-dimensional measurement device 4 or the robot control device 5 may include the calculation unit 61, or a separate device may include it.

The configuration of the robot system 1 has been described above. Next, a method of controlling the robot system 1 will be described. As shown in Fig. 5, the robot system 1 has a maximum movable region S1 and a high-intensity laser irradiation region S2: the maximum movable region S1 is the region in which the tip of the end effector 24 can be located through driving of the robot 2, and the high-intensity laser irradiation region S2 is a region that is larger than the maximum movable region S1 and in which the intensity of the laser light L emitted at the first output can exceed a prescribed intensity. In the robot system 1, a laser output control area S3 (detection range) is set that is larger than the high-intensity laser irradiation region S2 and contains the whole of it. Note that, in Fig. 5, the regions S1 to S3 are each drawn as circles for convenience of explanation, but the shape of each region is not particularly limited.

The "prescribed intensity" is not particularly limited and may be, for example, an intensity of the laser light L low enough not to affect the human body, or the intensity of the laser light L at the second output. The laser output control area S3 (detection range) is a range within a predetermined distance of the laser irradiation unit 41 and includes the irradiation range of the laser light L.

First, the case where no person is present in the laser output control area S3, that is, the case where the human presence information receiving unit 481 receives the undetected signal from the human presence sensor 3, will be described. Here, "no person" means that neither the whole nor any part of a person is inside the laser output control area S3. In this case, the high-intensity laser light L is unlikely to enter anyone's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the first output, and the imaging control unit 484 images the object W in the first imaging mode. Likewise, since there is no person for the robot 2 to collide with, the robot control device 5 drives the first to sixth arms 221 to 226 at the normal movement speed.

Next, the case where a person is present in the laser output control area S3, that is, the case where the human presence information receiving unit 481 receives the detected signal from the human presence sensor 3, will be described. Here, "a person is present" means that the whole or a part of a person is inside the laser output control area S3. In this case, the laser light L might enter the person's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the second output, whose intensity is lower than the first output, and the imaging control unit 484 images the object W in the second imaging mode, whose exposure time is longer than that of the first imaging mode. In addition, when a person is present in the laser output control area S3, the robot 2 could collide with the person. Therefore, the robot control device 5 drives the first to sixth arms 221 to 226 at a movement speed (angular velocity) lower than the normal movement speed.

In this way, when no person is present in the laser output control area S3, the object W is imaged with a shorter exposure time using laser light L of higher intensity than when a person is present. This shortens the time required for three-dimensional measurement of the object W and improves the work efficiency (processing speed) of the robot 2. Conversely, when a person is present in the laser output control area S3, the object W is imaged with a longer exposure time using laser light L of lower intensity than when no person is present. Compared with the case where no person is present, the three-dimensional measurement of the object W takes longer and the work efficiency of the robot 2 drops, but since the output of the laser light L is reduced to a level that is safe even if it enters the eyes of a person in the laser output control area S3, the person's safety is ensured.

In addition, when a person is present in the laser output control area S3, the movement speeds of the first to sixth arms 221 to 226 are made lower than when no person is present. Thus, even if the robot 2 contacts a person, the impact can be kept small. The robot system 1 is therefore safe for a person within the laser output control area S3.

Note that the present invention is not limited to this; the irradiation control unit 482 may instead stop the emission of the laser light L when a person is present in the laser output control area S3. Alternatively, when a person is present in the laser output control area S3, the imaging control unit 484 may image the object W in the first imaging mode. Although the image data is then of lower quality and the three-dimensional measurement accuracy of the object W is lower than when imaging in the second imaging mode, the shorter exposure time improves the work efficiency (processing speed) of the robot 2. Furthermore, when a person is present in the laser output control area S3, the robot control device 5 may stop driving the first to sixth arms 221 to 226 altogether, making the robot system 1 even safer for a person within the laser output control area S3.
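Pulling the above together, the presence-dependent policy of this embodiment reduces to a small dispatch. In the sketch below, the `ctrl` facade and its method names are hypothetical stand-ins for the irradiation control unit 482, the imaging control unit 484, and the robot control device 5:

```python
def apply_presence_policy(person_present: bool, ctrl) -> None:
    """No person in S3: first (high) output, first (short-exposure) imaging
    mode, normal arm speed.  Person in S3: eye-safe second output, second
    (long-exposure) imaging mode, reduced arm speed."""
    if person_present:
        ctrl.set_laser_output("second")
        ctrl.set_imaging_mode("second")
        ctrl.set_arm_speed("reduced")   # or stop the arms entirely
    else:
        ctrl.set_laser_output("first")
        ctrl.set_imaging_mode("first")
        ctrl.set_arm_speed("normal")
```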

Next, the case where the signal received by the human presence information receiving unit 481 changes during the three-dimensional measurement of the object W will be described. Note that "during the three-dimensional measurement" may mean, for example, the period from when emission of the laser light L is started in order to acquire the first image data until acquisition of the last (in the present embodiment, the fourth) image data is completed and emission of the laser light L is stopped.

First, the case where a person leaves the laser output control area S3 during the three-dimensional measurement of the object W, that is, the case where the signal received by the human presence information receiving unit 481 changes from the detected signal to the undetected signal during the measurement, will be described. In this case, the irradiation control unit 482 maintains the output of the laser light L at the second output until the three-dimensional measurement of the object W is completed, and the imaging control unit 484 keeps the imaging mode of the camera 471 in the second imaging mode. That is, even though the person has left the laser output control area S3 mid-measurement, the output of the laser light L is not raised to the first output and the imaging mode of the camera 471 is not changed to the first imaging mode accordingly; instead, the current output and exposure time are maintained until the three-dimensional measurement of the object W is completed.

If the output of the laser light L and the exposure time of the camera 471 were changed during the three-dimensional measurement of the object W, differences in brightness or contrast would arise between the image data acquired before and after the change, or the projection pattern P could be distorted or partially missing at the moment the output of the laser light L is switched, either of which could reduce the accuracy of the three-dimensional measurement of the object W. By maintaining the output of the laser light L and the exposure time of the camera 471, such problems are unlikely to occur, and the three-dimensional measurement of the object W can be performed with higher accuracy.

Of course, when the above problems are tolerable, or when the configuration prevents them from occurring at all (for example, when the exposure amounts in the first imaging mode and the second imaging mode are equal), the output of the laser light L may be changed from the second output to the first output, and the imaging mode of the camera 471 from the second imaging mode to the first imaging mode, as soon as the signal received by the human presence information receiving unit 481 changes from the detected signal to the undetected signal, and the three-dimensional measurement of the object W may be continued under the changed conditions. This shortens the time required for the three-dimensional measurement of the object W compared with maintaining the output of the laser light L and the exposure time of the camera 471, improving the work efficiency of the robot 2.

As another control, when a person leaves the laser output control area S3, the three-dimensional measurement of the object W may be stopped, the output of the laser light L changed from the second output to the first output, the imaging mode of the camera 471 changed from the second imaging mode to the first imaging mode, and the three-dimensional measurement of the object W then restarted from the beginning. Depending on how far the measurement had progressed, this can shorten the time required for the three-dimensional measurement of the object W compared with maintaining the output of the laser light L and the exposure time of the camera 471 as described above, improving the work efficiency of the robot 2.

Next, the case where a person enters the laser output control area S3 during the three-dimensional measurement of the object W, that is, the case where the signal received by the human presence information receiving unit 481 changes from the undetected signal to the detected signal during the measurement, will be described. In this case, when a person enters the laser output control area S3, the output of the laser light L is changed from the first output to the second output, the imaging mode of the camera 471 is changed from the first imaging mode to the second imaging mode, and the three-dimensional measurement of the object W is continued under the changed conditions. This prevents high-intensity laser light L from entering the eyes of the person in the laser output control area S3, making the robot system 1 safer. Moreover, since the three-dimensional measurement of the object W can be continued, the time required for it is shorter than if the measurement were restarted from the beginning, for example, improving the work efficiency of the robot 2.

As another control, the irradiation control unit 482 may stop the emission of the laser light L when a person enters the laser output control area S3. This likewise prevents high-intensity laser light L from entering the eyes of the person in the laser output control area S3, making the robot system 1 safer. In addition, when the irradiation control unit 482 also provides, for example, a third output and a fourth output, it may switch the output of the laser light L to the third output or the fourth output while a person is present in the laser output control area S3.
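The two mid-measurement cases are thus asymmetric: leaving S3 changes nothing until the measurement ends, while entering S3 forces an immediate downgrade. A compact sketch (with the same hypothetical `ctrl` facade as before):

```python
def on_presence_change(person_present: bool, measuring: bool,
                       current_output: str, ctrl) -> None:
    """Asymmetric mid-measurement policy of this embodiment."""
    if person_present:
        # Person entered S3: switch down immediately (or stop the laser).
        ctrl.set_laser_output("second")
        ctrl.set_imaging_mode("second")
    elif measuring and current_output == "second":
        # Person left S3 mid-measurement: hold conditions until it finishes
        # to avoid brightness/contrast jumps between the four images.
        pass
    else:
        ctrl.set_laser_output("first")
        ctrl.set_imaging_mode("first")
```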

Here, the size of the laser output control area S3 is not particularly limited, but it is set so that the time T1 required for a person who has entered the laser output control area S3 to reach the high-intensity laser irradiation region S2 is longer than the time T2 from when the person enters the laser output control area S3 until the output of the laser light L is switched from the first output to the second output or the emission of the laser light L is stopped. This effectively prevents a person from reaching the high-intensity laser irradiation region S2, with high-intensity laser light L entering their eyes, while the output of the laser light L is still being changed. Note that the time T1 can be calculated appropriately based on, for example, the movement speeds and movement paths of people during work.
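The condition T1 > T2 translates directly into a minimum radial margin between the S3 and S2 boundaries. A worked example with assumed numbers (the walking speed and switching latency are illustrative, not from the patent):

```python
def min_s3_margin_m(walk_speed_m_s: float, switch_time_t2_s: float) -> float:
    """T1 = margin / walk_speed must exceed T2, so the S3 boundary must lie
    at least walk_speed * T2 beyond the S2 boundary.
    Example: 2.0 m/s x 0.1 s = 0.2 m of margin."""
    return walk_speed_m_s * switch_time_t2_s
```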

The robot system 1 has been explained in detail above. As described, the robot system 1 includes the robot 2, the human presence sensor 3 that detects a person in the laser output control area S3 (detection range), the three-dimensional measurement device 4 that performs three-dimensional measurement of the object W using the laser light L, and the robot control device 5 that controls the operation of the robot 2 based on the measurement result of the three-dimensional measurement device 4. The three-dimensional measurement device 4 includes the laser irradiation unit 41 that irradiates a region including the object W with the laser light L, the irradiation control unit 482 that controls the output of the laser light L irradiated from the laser irradiation unit 41 to the first output and to the second output lower than the first output, and the imaging unit 47 that images the object W irradiated with the laser light L to acquire image data. When the human presence sensor 3 outputs information indicating that no person is present in the laser output control area S3, the irradiation control unit 482 sets the output of the laser light L to the first output. When no person is present in the laser output control area S3, the laser light L is unlikely to enter anyone's eyes. Therefore, by emitting the laser light L at the first output, a brighter projection pattern P can be projected onto the object W than with the second output, shortening the time required for three-dimensional measurement and increasing the processing speed of the robot 2. Conversely, when the human presence sensor 3 outputs information indicating that a person is present in the laser output control area S3, the output is set to the second output. The output of the laser light L is thereby reduced to a level that is safe for the eyes of the person in the laser output control area S3, so the robot system 1 is safe for that person.

As described above, the three-dimensional measurement device 4 includes: a human body sensing information receiving unit 481 that receives information from the human body sensor 3, which detects a person within the laser output control region S3 (detection range); a laser irradiation unit 41 that irradiates a region including the object W with the laser light L; an irradiation control unit 482 that controls the output of the laser light L irradiated from the laser irradiation unit 41 to a first output and to a second output lower than the first output; an imaging unit 47 that images the object W irradiated with the laser light L and acquires image data; and a point cloud data generation unit 485 that generates three-dimensional point cloud data of a region including the object W based on the image data. When the human body sensing information receiving unit 481 receives from the human body sensor 3 the first information, indicating that no person is present in the laser output control region S3, the irradiation control unit 482 sets the output of the laser light L to the first output. As noted above, this allows a brighter projection pattern P to be projected onto the object W than at the second output, shortening the time required for three-dimensional measurement and increasing the processing speed of the robot 2.

As described above, when the human body sensing information receiving unit 481 receives from the human body sensor 3 the second information, indicating that a person is present in the laser output control region S3, the irradiation control unit 482 sets the output of the laser light L to the second output. The output of the laser light L is thereby reduced to a level that is safe for the eyes, so the robot system 1 is safe for a person within the laser output control area S3.

As described above, when the irradiation of the laser light L is started at the second output and the human body sensing information receiving unit 481 receives the first information before the irradiation is completed, the irradiation control unit 482 maintains the irradiation of the laser light L at the second output. That is, even if the signal received by the human body sensing information receiving unit 481 changes from the detected signal to the undetected signal during the three-dimensional measurement, the output of the laser light L and the exposure time of the camera 471 are maintained until the three-dimensional measurement of the object W is completed. If the output of the laser light L and the exposure time of the camera 471 were changed mid-measurement, differences in brightness or contrast would arise between the image data captured before and after the change, and the projection pattern P could be distorted or partially lost at the moment of switching, which could reduce the accuracy of the three-dimensional measurement of the object W. By holding the output of the laser light L and the exposure time of the camera 471 constant, these problems are avoided and the three-dimensional measurement of the object W can be performed with higher accuracy.
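This asymmetric rule can be restated in event-driven form. The following is a hypothetical sketch (class and method names are assumptions): the output chosen at scan start is latched for the whole scan, except that a person entering still forces an immediate downgrade.

```python
# Asymmetric latching: downgrade immediately on person-entered,
# ignore person-left until the current scan has finished.
FIRST_OUTPUT_W, SECOND_OUTPUT_W = 1.0, 0.1  # assumed laser powers

class IrradiationController:
    def __init__(self):
        self.scanning = False
        self.power = SECOND_OUTPUT_W  # idle at the safe output

    def start_scan(self, person_present):
        # Latch the output (and hence the exposure time) for the scan.
        self.scanning = True
        self.power = SECOND_OUTPUT_W if person_present else FIRST_OUTPUT_W

    def end_scan(self):
        self.scanning = False

    def on_sensor_edge(self, person_present):
        if self.scanning and person_present:
            self.power = SECOND_OUTPUT_W  # downgrade immediately
        # person_present == False mid-scan: deliberately no change, so
        # brightness and contrast stay uniform across the frame set
```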

As described above, when the irradiation of the laser beam L is started at the first output and the second information is received by the human body sensing information receiving unit 481 before the irradiation of the laser beam L is completed, the irradiation control unit 482 switches the output of the laser beam L to the second output or stops the irradiation of the laser beam L. Thus, the robot system 1 is safe for a person who is in the laser output control area S3.

Further, as described above, the imaging control unit 484 is provided to control the imaging mode of the imaging unit 47. The imaging control unit 484 has, as its imaging modes, a first imaging mode in which the exposure time of the imaging unit 47 is a first exposure time and a second imaging mode in which the exposure time is longer than the first exposure time, and it selects the first imaging mode when the output of the laser light L is the first output. Using a short exposure time while the high-intensity laser light L is irradiated yields image data with appropriate brightness and contrast, so the three-dimensional measurement of the object W can be performed with higher accuracy.

As described above, when the output of the laser light L is the second output, the imaging control unit 484 sets the imaging mode to the second imaging mode. Using a long exposure time while the low-intensity laser light L is irradiated likewise yields image data with appropriate brightness and contrast, so the three-dimensional measurement of the object W can be performed with higher accuracy.
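One plausible way to pair the exposure times with the outputs is to hold the integrated light per frame roughly constant. This reciprocity rule is an assumption, since the patent only requires the second exposure time to be longer than the first:

```python
# Exposure chosen so that power x exposure stays constant (assumed rule).
first_output_w, second_output_w = 1.0, 0.1  # assumed laser powers
first_exposure_s = 0.005                    # assumed first exposure time
second_exposure_s = first_exposure_s * first_output_w / second_output_w
print(round(second_exposure_s, 3))  # 0.05 s: a 10x power drop needs 10x exposure
```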

As described above, the laser irradiation unit 41 includes the mirror 454, which diffuses the laser light L. Because the laser light L is diffused, its intensity decreases with distance from the laser irradiation unit 41, making the three-dimensional measurement device 4 safer.

As described above, the laser irradiation unit 41 also includes the rod lens 442 as a lens that diffuses the laser light L. Here too, because the laser light L is diffused, its intensity decreases with distance from the laser irradiation unit 41, making the three-dimensional measurement device 4 safer.
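A small numeric illustration of why diffusion helps, under an idealized fan-beam geometry in which the rod lens spreads the beam into a line; all values are assumptions, not figures from the patent.

```python
import math

POWER_W = 1.0                 # assumed laser power
FAN_ANGLE = math.radians(30)  # assumed full fan angle of the rod lens
LINE_WIDTH_M = 0.001          # assumed thickness of the projected line

for d in (0.1, 0.5, 2.0):     # distance from the laser irradiation unit
    line_length = 2 * d * math.tan(FAN_ANGLE / 2)
    irradiance = POWER_W / (line_length * LINE_WIDTH_M)  # W/m^2
    print(f"d = {d} m -> irradiance ~ {irradiance:,.0f} W/m^2")
```

The irradiance falls roughly as 1/d in this idealized geometry, so the same source power is far less hazardous at the edge of the work cell than at the aperture.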

Further, the control device 48 includes: a human body sensing information receiving unit 481 that receives information from the human body sensor 3, which detects a person within the laser output control area S3; and an irradiation control unit 482 that controls the output of the laser light L irradiated from the laser irradiation unit 41 to a first output and to a second output lower than the first output. When the human body sensing information receiving unit 481 receives from the human body sensor 3 the first information, indicating that no person is present in the laser output control region S3, the irradiation control unit 482 sets the output of the laser light L to the first output. As noted above, this allows a brighter projection pattern P to be projected onto the object W than at the second output, shortening the time required for three-dimensional measurement and improving the work efficiency of the robot 2.

< second embodiment >

Fig. 6 is a plan view showing a detection range set in the robot system according to the second embodiment of the present invention.

In the following description, the robot system according to the second embodiment is described focusing on its differences from the embodiment described above; descriptions of common matters are omitted. The robot system 1 of the second embodiment is basically the same as the robot system 1 of the first embodiment, except that both a laser output control region S3 and a robot drive control region S4 wider than the laser output control region S3 are set as detection ranges. In Fig. 6, the same components as those in the above embodiment are denoted by the same reference numerals.

As shown in Fig. 6, the robot system 1 has a maximum movable region S1 and a high-intensity laser irradiation region S2: the maximum movable region S1 is the region in which the tip of the end effector 24 can be located by driving the robot 2, and the high-intensity laser irradiation region S2 is the region in which the intensity of the laser light L emitted at the first output can exceed a predetermined intensity. In the robot system 1, a laser output control area S3 for controlling the output of the laser light L and a robot drive control area S4 for controlling the driving of the robot 2 are set as detection ranges. The high-intensity laser irradiation region S2 is larger than the maximum movable region S1 and encloses it entirely.

Next, a method of driving the robot system 1 having this configuration will be described. First, consider the case in which the human body sensing information receiving unit 481 receives from the human body sensor 3 a signal indicating that no person is present in the robot drive control area S4. In this case, since no person is in the robot drive control area S4, the robot control device 5 drives the first arm 221 to the sixth arm 226 at the normal moving speed.

Next, consider the case in which the human body sensing information receiving unit 481 receives from the human body sensor 3 a signal indicating that a person is present in the robot drive control area S4. In this case, a person is in the vicinity of the robot 2 and could come into contact with the moving robot 2. The robot control device 5 therefore drives the first arm 221 to the sixth arm 226 at a moving speed slower than the normal moving speed.

In this way, when a person is present in the robot drive control area S4, the first arm 221 to the sixth arm 226 move more slowly than when no person is present there, so even if the robot 2 does come into contact with a person, the impact is kept small. The robot system 1 is therefore safe for a person within the robot drive control area S4.

The control performed when a person is or is not present in the laser output control area S3 is the same as in the first embodiment. The second embodiment can thus provide the same effects as the first embodiment.
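The resulting two-zone policy can be summarized in the following hypothetical sketch (all names and values assumed): the outer zone S4 governs arm speed, while the inner zone S3 governs laser output as in the first embodiment.

```python
# Hypothetical two-zone policy for the second embodiment,
# assuming S4 encloses S3 (so person_in_s3 implies person_in_s4).
FIRST_OUTPUT_W, SECOND_OUTPUT_W = 1.0, 0.1  # assumed laser powers
NORMAL_SPEED = 1.0                          # assumed normalized arm speed
REDUCED_SPEED = 0.25                        # assumed reduced-speed factor

def apply_policy(person_in_s3, person_in_s4, laser, robot):
    robot.set_speed(REDUCED_SPEED if person_in_s4 else NORMAL_SPEED)
    laser.set_power(SECOND_OUTPUT_W if person_in_s3 else FIRST_OUTPUT_W)
```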

< third embodiment >

Fig. 7 is a plan view showing a detection range set in the robot system according to the third embodiment of the present invention.

In the following description, the robot system according to the third embodiment is described focusing on its differences from the embodiments described above; descriptions of common matters are omitted. The robot system 1 of the third embodiment is basically the same as the robot system 1 of the first embodiment, except that the irradiation control unit 482 has first to third outputs for the laser light L and the laser output control region is divided into two regions. In Fig. 7, the same components as those in the above embodiments are denoted by the same reference numerals.

In the robot system 1 of the present embodiment, the irradiation control unit 482 has, as selectable outputs of the laser light L, a first output, a second output lower than the first output, and a third output lower than the second output. Correspondingly, the imaging control unit 484 has, as selectable imaging modes of the imaging unit 47, a first imaging mode that images the object W with a first exposure time, a second imaging mode that images the object W with a second exposure time longer than the first exposure time, and a third imaging mode that images the object W with a third exposure time longer than the second exposure time.

As shown in Fig. 7, the robot system 1 has: the maximum movable region S1, in which the tip of the end effector 24 can be located by driving the robot 2; a high-intensity laser irradiation region S2', in which the intensity of the laser light L emitted at the first output can exceed a predetermined intensity; and a high-intensity laser irradiation region S2'', in which the intensity of the laser light L emitted at the second output can exceed the predetermined intensity. In the robot system 1, laser output control regions S3' and S3'' for controlling the output of the laser light L are set as the detection ranges. The laser output control region S3' is larger than the high-intensity laser irradiation region S2' and encloses it entirely; likewise, the laser output control region S3'' is larger than the high-intensity laser irradiation region S2'' and encloses it entirely.

Next, a method of driving the robot system 1 having this configuration will be described. First, consider the case in which the human body sensing information receiving unit 481 receives from the human body sensor 3 a signal indicating that no person is present in the laser output control region S3'. In this case, since there is no person in the laser output control region S3', the high-intensity laser light L is unlikely to enter anyone's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the first output and the imaging control unit 484 images the object W in the first imaging mode.

Next, consider the case in which the human body sensing information receiving unit 481 receives from the human body sensor 3 a signal indicating that a person is present in the laser output control region S3' (but not in S3''). In this case, if the laser light L were emitted at the first output, the high-intensity laser light L might enter the person's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the second output, which is lower in intensity than the first output, and the imaging control unit 484 images the object W in the second imaging mode, whose exposure time is longer than that of the first imaging mode.

Next, consider the case in which the human body sensing information receiving unit 481 receives from the human body sensor 3 a signal indicating that a person is present in the laser output control area S3''. In this case, the high-intensity laser light L might enter the person's eyes even if the laser light L were emitted at the second output. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the third output, which is lower in intensity than the second output, and the imaging control unit 484 images the object W in the third imaging mode, whose exposure time is longer than that of the second imaging mode.
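The three-level selection can be summarized by the following hypothetical sketch, assuming from Fig. 7 that S3'' lies inside S3', so occupancy of the inner zone takes precedence; the output and mode names are placeholders.

```python
# Hypothetical selector for the third embodiment: the innermost
# occupied zone decides the laser output and the imaging mode.
def select_output_and_mode(person_in_s3_outer, person_in_s3_inner):
    if person_in_s3_inner:        # person within S3''
        return "third_output", "third_imaging_mode"
    if person_in_s3_outer:        # person within S3' but outside S3''
        return "second_output", "second_imaging_mode"
    return "first_output", "first_imaging_mode"

print(select_output_and_mode(True, False))
# ('second_output', 'second_imaging_mode')
```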

The third embodiment can also exhibit the same effects as those of the first embodiment.

< fourth embodiment >

Fig. 8 is a diagram showing an overall configuration of a laser irradiation unit included in a robot system according to a fourth embodiment of the present invention. Fig. 9 is a plan view showing a diffractive optical element included in the laser irradiation unit shown in fig. 8.

In the following description, the robot system according to the fourth embodiment is described focusing on its differences from the embodiments described above; descriptions of common matters are omitted. The robot system 1 of the fourth embodiment is basically the same as the robot system 1 of the first embodiment, except for the configuration of the laser irradiation unit 41. In Fig. 8, the same components as those in the above embodiments are denoted by the same reference numerals.

As shown in Fig. 8, the laser irradiation unit 41 of the present embodiment includes a laser light source 42 that emits the laser light L, an optical system 44 including a projection lens 445, a diffractive optical element 43 located between the laser light source 42 and the projection lens 445, and an actuator 49 that rotates the diffractive optical element 43 about a central axis A. As shown in Fig. 9, the diffractive optical element 43 has eight diffraction gratings 431 to 438 arranged around the central axis A, and rotating the diffractive optical element 43 about the central axis A positions a selected one of the diffraction gratings 431 to 438 on the optical path of the laser light L. Although not illustrated, the diffraction gratings 431 to 434 produce stripe patterns whose phases are shifted from one another by π/2. The diffraction gratings 435 to 438 likewise produce stripe patterns phase-shifted from one another by π/2, but the pitch of their projection patterns on the projection surface is twice the pitch of the projection patterns of the diffraction gratings 431 to 434.

In the laser irradiation unit 41 having this configuration, the diffraction gratings 431 to 438 are positioned on the optical path of the laser light L in sequence, so that the eight projection patterns P are projected onto the object W one after another.
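The stripe description above matches a standard two-pitch, four-step phase-shift scheme; the following numpy sketch shows how the eight captured frames could be reduced to wrapped phase under that assumption (the patent does not give the reconstruction math, so this is illustrative only).

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four frames whose stripe phases are shifted
    by 0, pi/2, pi, 3*pi/2: phi = atan2(i3 - i1, i0 - i2)."""
    return np.arctan2(i3 - i1, i0 - i2)

# frames[0:4]: patterns from gratings 431-434 (fine pitch)
# frames[4:8]: patterns from gratings 435-438 (double pitch)
# phi_fine = four_step_phase(*frames[0:4])
# phi_coarse = four_step_phase(*frames[4:8])
# The double-pitch phase wraps half as often, so it can be used to
# resolve the 2*pi ambiguities of the fine phase (temporal unwrapping).
```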

In the fourth embodiment, the laser irradiation unit 41 thus includes the diffractive optical element 43, which diffuses the laser light L. Here too, because the laser light L is diffused, its intensity decreases with distance from the laser irradiation unit 41, making the three-dimensional measurement device 4 safer.

The three-dimensional measurement device, the control device, and the robot system according to the present invention have been described above based on the illustrated embodiments, but the present invention is not limited to them: the configuration of each part may be replaced with any configuration having an equivalent function, and any other structures may be added to the present invention.
