Robot system

Document No.: 1820807    Publication date: 2021-11-09

Abstract: The invention "Robot system" was created by 长谷川省吾, 吉田哲也, 扫部雅幸, 杉山裕和, and 冈元知文 on 2020-03-13. The robot system according to the present invention includes: a robot (101) provided in a work area (201); a manipulator (102) configured to be held by an operator to operate the robot (101); a sensor (103) that is disposed in the operation area (202) and that wirelessly detects position information and posture information of the manipulator (102); and a control device (110) configured to calculate the trajectory of the manipulator (102) based on the position information and the posture information of the manipulator (102) detected by the sensor (103), and to operate the robot (101) in real time.

1. A robot system is characterized by comprising:

a robot provided in a work area and configured to spray a liquid or a gas onto a workpiece and/or to cut or polish the workpiece;

a manipulator configured to be held by an operator to operate the robot;

a sensor that is disposed in an operation area and that wirelessly detects position information and posture information of the manipulator; and

a control device,

wherein the control device is configured to calculate a trajectory of the manipulator based on the position information and the posture information of the manipulator detected by the sensor, and to operate the robot in real time.

2. The robotic system of claim 1,

the control device is configured to calculate the trajectory of the manipulator based on the position information and the posture information of the manipulator detected by the sensor, and to cause the robot to perform, in real time and based on the calculated trajectory, any one of a spraying operation of spraying a liquid or a gas onto the workpiece, a cutting operation of cutting the workpiece, and a polishing operation of polishing the workpiece.

3. The robotic system of claim 1 or 2,

the work area is divided into a plurality of work sections,

the operation area is divided into a plurality of operation sections,

the robot is arranged for each of the work sections,

the sensor is arranged for each of the operation sections, and

the control device is configured to operate the robot arranged in an Nth work section based on position information and posture information of the manipulator detected by the sensor arranged in an Nth operation section, where N is a natural number.

4. The robotic system of claim 3,

the manipulator further has a switch that switches on and off the output of the position information and the posture information of the manipulator detected by the sensor.

5. The robotic system of claim 1 or 2,

the work area is divided into a plurality of work sections,

the robot is arranged for each of the work sections,

the manipulator further has a designator that designates which robot, among the plurality of robots, is to be operated, and

the control device is configured to operate the robot designated by the designator in real time based on the position information and the posture information of the manipulator detected by the sensor.

6. The robotic system of any one of claims 1 to 5,

a camera that images equipment disposed in the work area is disposed in the work area, and

a display device that displays image information captured by the camera is disposed in the operation area.

7. The robotic system of claim 2,

a 1st device configured to give a tactile sensation to an operator is provided at a grip portion of the manipulator,

the control device includes a memory,

the memory stores 1st information, the 1st information being trajectory information of the manipulator generated by the operation of a skilled operator in any one of a spraying operation of spraying a liquid or a gas onto the workpiece, a cutting operation of cutting the workpiece, and a polishing operation of polishing the workpiece, and

the control device operates the 1st device so as to guide the operator based on the 1st information stored in the memory.

8. The robotic system of claim 7,

the control device is configured to control the 1st device to give a tactile sensation as a warning to the operator when the robot is likely to move outside a preset operation range, when the robot approaches the boundary of the operation range, or when the robot is likely to move into an area where movement is prohibited even though the robot is within the operation range.

Technical Field

The present invention relates to a robot system.

Background

There is a known method of generating operation control data for a robot in which the motion of a hand-held teaching gun is detected and taught to a coating robot (see, for example, patent document 1). In the method for generating operation control data for a robot disclosed in patent document 1, the operation control data is generated such that the movement path of the robot's gun follows a straight line or a curved line in the spraying section.

Patent document 1: Japanese Patent Laid-Open Publication No. 2018-1381

However, in the method for generating operation control data for a robot disclosed in patent document 1, a program for teaching the operation of the robot is prepared first, and the robot is then operated according to the program. Therefore, when the robot cannot perform the coating accurately, the program must be created again or the created program must be corrected, which makes teaching time-consuming.

Therefore, the method for generating operation control data of a robot disclosed in patent document 1 has room for improvement from the viewpoint of improving work efficiency.

Disclosure of Invention

The present invention has been made to solve the above conventional problems, and an object thereof is to provide a robot system capable of reducing the burden on an operator and improving work efficiency.

In order to solve the above conventional problems, a robot system according to the present invention includes: a robot provided in a work area and configured to spray or eject a liquid onto a workpiece and/or to cut or polish the workpiece; a manipulator configured to be held by an operator to operate the robot; a sensor that is disposed in an operation area and that wirelessly detects position information and posture information of the manipulator; and a control device configured to calculate a trajectory of the manipulator based on the position information and the posture information of the manipulator detected by the sensor, and to operate the robot in real time.

Thus, the operator can operate the robot in real time, and can therefore immediately determine whether the work performed by the robot on the workpiece is being carried out correctly. Compared with the method for generating operation control data for a robot disclosed in patent document 1, the time required for the teaching task can therefore be reduced. As a result, the burden on the operator can be reduced and work efficiency can be improved.

According to the robot system of the present invention, the burden on the operator can be reduced, and the work efficiency can be improved.

Drawings

Fig. 1 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 1.

Fig. 2 is a schematic diagram showing a schematic configuration of a robot system according to modification 1 of embodiment 1.

Fig. 3 is a schematic diagram showing a schematic configuration of a robot system according to modification 2 of embodiment 1.

Fig. 4 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 2.

Fig. 5 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 3.

Fig. 6 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 4.

Detailed Description

Hereinafter, embodiments of the present invention will be described with reference to the drawings. In all the drawings, the same or corresponding portions are denoted by the same reference numerals, and redundant description thereof is omitted. In all the drawings, only the components necessary for explaining the present invention are shown, and other components may be omitted. The present invention is not limited to the following embodiments.

(Embodiment 1)

The robot system according to embodiment 1 includes: a robot provided in a work area and configured to spray or eject a liquid onto a workpiece; a manipulator configured to be held by an operator to operate the robot; a sensor that is disposed in the operation area and that wirelessly detects position information and posture information of the manipulator; and a control device configured to calculate a trajectory of the manipulator based on the position information and the posture information of the manipulator detected by the sensor, and to operate the robot in real time.

In the robot system according to embodiment 1, the control device may be configured to calculate the trajectory of the manipulator based on the position information and the posture information of the manipulator detected by the sensor, and to cause the robot to perform, in real time and based on the calculated trajectory, any one of a spraying operation of spraying a liquid or a gas onto the workpiece, a cutting operation of cutting the workpiece, and a polishing operation of polishing the workpiece.

In the robot system according to embodiment 1, a 1st device configured to give a tactile sensation to an operator may be provided at a grip portion of the manipulator, the control device may include a memory, the memory may store 1st information, the 1st information being trajectory information of the manipulator generated by the operation of a skilled operator in any one of a spraying operation of spraying a liquid or a gas onto the workpiece, a cutting operation of cutting the workpiece, and a polishing operation of polishing the workpiece, and the control device may operate the 1st device so as to guide the operator based on the 1st information stored in the memory.

In the robot system according to embodiment 1, the control device may be configured to control the 1st device to give a tactile sensation as a warning to the operator when the robot is likely to move outside a preset operation range, when the robot approaches the boundary of the operation range, or when the robot is likely to move into an area where movement is prohibited even though the robot is within the operation range.

An example of the robot system according to embodiment 1 will be described below with reference to fig. 1.

[Structure of the robot system]

Fig. 1 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 1.

As shown in fig. 1, the robot system 100 according to embodiment 1 includes a robot 101 installed in the work area 201, a manipulator 102 disposed in the operation area 202, a sensor 103, and a control device 110, and the control device 110 operates the robot 101 in real time based on the position information and the posture information of the manipulator 102 in three-dimensional space detected by the sensor 103. The robot 101 is configured to spray or eject a liquid onto the workpiece 104, or to cut or polish the workpiece 104.

A wall member 203 is disposed between the work area 201 and the operation area 202. A window 204 is provided in the wall member 203 so that the robot 101 disposed in the work area 201 can be visually recognized. In embodiment 1, the wall member 203 is disposed between the work area 201 and the operation area 202, but the present invention is not limited to this, and a configuration in which the wall member 203 is not disposed may also be employed.

The sensor 103 is configured to wirelessly detect position information and posture information of the distal end portion of the manipulator 102 and to output them to the control device 110. The sensor 103 may output the information to the control device 110 either wirelessly or by wire.

The sensor 103 may be, for example, an infrared sensor or a camera. When the sensor 103 is a camera, the camera does not have to be disposed in the operation area 202; for example, it may be a camera provided in a mobile terminal or a head-mounted display carried by the operator.

The manipulator 102 is configured so that the operator operates the robot 101 by gripping the grip portion 102A. Specifically, the robot 101 moves so as to follow the trajectory of the tip end portion of the body portion 102E of the gripped manipulator 102, so the operator can intuitively operate the robot 101 with the manipulator 102 in the operation area 202.

A device configured to convey to the operator force information or sound information detected by a force sensor provided in the end effector 20 (described later) of the robot 101 may be disposed in the grip portion 102A. Examples of such a device include a vibration motor, a speaker, and a mechanism that extends and contracts the housing constituting the grip portion 102A.

The manipulator 102 may be provided with a switch 102B that starts and stops the spraying of a liquid or a gas onto the workpiece 104, or the cutting or polishing of the workpiece 104. The manipulator 102 may be portable so that the operator can carry it. The body portion 102E of the manipulator 102 may be formed in the same shape as the end effector 20 of the robot 101.

The robot 101 is a vertical articulated robot arm including a plurality of links connected in series (here, the 1st link 11a to the 6th link 11f), a plurality of joints (here, the 1st joint JT1 to the 6th joint JT6), and a base 15 supporting them. In embodiment 1, a vertical articulated robot is used as the robot 101, but the present invention is not limited to this, and a horizontal articulated robot may be used.

In the 1st joint JT1, the base 15 and the base end portion of the 1st link 11a are coupled so as to be rotatable about an axis extending in the vertical direction. In the 2nd joint JT2, the distal end portion of the 1st link 11a and the base end portion of the 2nd link 11b are coupled so as to be rotatable about an axis extending in the horizontal direction. In the 3rd joint JT3, the distal end portion of the 2nd link 11b and the base end portion of the 3rd link 11c are coupled so as to be rotatable about an axis extending in the horizontal direction.

In the 4th joint JT4, the distal end portion of the 3rd link 11c and the base end portion of the 4th link 11d are coupled so as to be rotatable about an axis extending in the longitudinal direction of the 4th link 11d. In the 5th joint JT5, the distal end portion of the 4th link 11d and the base end portion of the 5th link 11e are coupled so as to be rotatable about an axis perpendicular to the longitudinal direction of the 4th link 11d. In the 6th joint JT6, the distal end portion of the 5th link 11e and the base end portion of the 6th link 11f are coupled so as to be rotatable in a twisting manner.

Further, a mechanical interface is provided at the distal end portion of the 6th link 11f. An end effector 20 corresponding to the work content is detachably attached to the mechanical interface.

The end effector 20 is configured to spray or jet a liquid (e.g., paint) onto the workpiece 104. Further, a pipe 21 for supplying a liquid to the end effector 20 is connected to the end effector 20.

Further, drive motors (not shown), as an example of actuators that rotate the two members coupled at each joint relative to each other, are provided in the 1st joint JT1 to the 6th joint JT6, respectively. Each drive motor may be, for example, a servo motor servo-controlled by the control device 110. Each of the 1st joint JT1 to the 6th joint JT6 is also provided with a rotation sensor that detects the rotational position of the drive motor and a current sensor (not shown) that detects the current controlling the rotation of the drive motor. The rotation sensor may be, for example, an encoder.

The control device 110 includes an arithmetic unit 110a such as a microprocessor or CPU, and a memory 110b such as ROM or RAM. The memory 110b stores information such as a basic program and various fixed data. The arithmetic unit 110a reads out and executes software such as a basic program stored in the memory 110b, thereby controlling various operations of the robot 101.

The control device 110 is configured to cause the robot 101 (end effector 20) to move so as to follow the movement of the distal end portion of the manipulator 102, based on the position information and the posture information of the manipulator 102 input from the sensor 103.

That is, the control device 110 is configured to calculate the trajectory of the manipulator 102 based on the position information and the posture information of the manipulator 102 detected by the sensor 103, and to operate the robot 101 in real time.

Specifically, the control device 110 is configured to calculate the trajectory of the manipulator 102 based on the position information and the posture information of the manipulator 102 detected by the sensor 103, and to cause the robot 101 to perform, in real time and based on the calculated trajectory, any one of a spraying operation of spraying a liquid or a gas onto the workpiece 104, a cutting operation of cutting the workpiece 104, and a polishing operation of polishing the workpiece 104.
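As an illustration only (not part of the patent disclosure), the following Python sketch shows one way such a real-time follow-the-manipulator control loop could be organized. The sensor interface (`get_pose`), the robot command interface (`move_to`), and the 100 Hz loop rate are assumptions introduced for the sketch.

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """Position and posture of the manipulator tip (assumed representation)."""
    x: float; y: float; z: float
    roll: float; pitch: float; yaw: float

def teleoperation_loop(sensor, robot, period_s: float = 0.01) -> None:
    """Make the robot follow the manipulator trajectory in real time (sketch only)."""
    trajectory = []                       # trajectory calculated by the control device
    while True:
        pose = sensor.get_pose()          # position/posture info from sensor 103 (assumed API)
        trajectory.append(pose)           # accumulate the manipulator trajectory
        robot.move_to(pose)               # command robot 101 to follow (assumed API)
        time.sleep(period_s)              # fixed control period (assumed 100 Hz)
```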

Here, "work" of the spray work, the cutting work, and the polishing work refers to a series of operations performed by the robot 101 on the workpiece 104, and is a concept including a plurality of operations. The job includes, for example: the robot 101 performs an operation of approaching the workpiece 104, an operation of starting ejection of liquid to the workpiece 104, an operation of stopping ejection of liquid, and an operation of separating from the workpiece 104.

The control device 110 may be a single control device that performs centralized control, or may be a plurality of control devices that perform distributed control in cooperation with each other. The control device 110 may be constituted by a microcomputer, an MPU, a PLC (programmable logic controller), a logic circuit, or the like.

In the robot system 100 according to embodiment 1 configured as described above, the control device 110 is configured to calculate the trajectory of the manipulator 102 based on the position information and the posture information of the manipulator 102 detected by the sensor 103, and to operate the robot 101 in real time.

This enables the operator to operate the robot 101 in real time, so the operator can operate the robot 101 intuitively. In addition, the operator can immediately determine whether the work performed by the robot 101 on the workpiece 104 is being carried out correctly. Compared with the method for generating operation control data for a robot disclosed in patent document 1, the time required for the teaching task can therefore be reduced, so the burden on the operator can be reduced and work efficiency can be improved.

A device (actuator) such as a vibration motor for imparting a tactile sensation may be provided in the grip portion 102A as the 1st device, and the control device 110 may control this 1st device to impart a tactile sensation such as vibration to the operator.

In this case, the control device 110 may calculate the trajectory of the manipulator 102 generated when a skilled operator performs work such as a spraying operation, a cutting operation, or a polishing operation by moving the manipulator 102, and may store in the memory 110b the work to be executed by the robot 101 based on the calculated trajectory (1st information, which is trajectory information of the manipulator 102).

The control device 110 may operate the robot 101 based on the trajectory information of the manipulator 102, generated by the skilled operator's operation, that is stored in the memory 110b.

The control device 110 may also control the 1st device, such as a vibration motor provided in the grip portion 102A, to give the operator a tactile sensation such as vibration based on the 1st information, so that the operator is guided to follow the skilled operator's trajectory of the manipulator 102 stored in the memory 110b. In this way, the skilled operator's technique can be taught to an unskilled operator.
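One possible (purely illustrative) way to drive such guidance is to vibrate the 1st device in proportion to the deviation between the operator's current manipulator position and the corresponding point of the stored skilled-operator trajectory. The pose fields, the 10 cm scaling, and the `vibrate` actuator interface below are assumptions.

```python
import math

def guide_operator(current_pose, expert_trajectory, step: int, vibrate) -> None:
    """Vibrate more strongly the further the operator strays from the stored path."""
    reference = expert_trajectory[min(step, len(expert_trajectory) - 1)]
    error = math.dist((current_pose.x, current_pose.y, current_pose.z),
                      (reference.x, reference.y, reference.z))
    level = min(1.0, error / 0.10)   # full intensity at 10 cm of deviation (assumed)
    vibrate(level)                   # drive the 1st device in the grip portion 102A
```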

Further, the control device 110 controls the 1st device, such as a vibration motor provided in the grip portion 102A, to give the operator a tactile sensation such as vibration as a warning when there is a possibility that the robot 101 will move outside a preset operation range, when the robot 101 approaches the boundary of the operation range, or when the robot 101 is likely to move into an area where movement is prohibited even though it is within the operation range.

Here, "tactile sensation as a warning" means a tactile sensation that the acceleration of vibration or the like is larger than a predetermined value set in advance, and for example, 55dB or more vibration or 65dB or more vibration may be given to the operator. Further, "tactile sensation as a warning" may be given a tactile sensation (vibration) larger than the tactile sensation given to the operator by vibration or the like based on the 1 st information stored in the memory 110 b.

[Modification 1]

Next, a modified example of the robot system according to embodiment 1 will be described.

The robot system according to modification 1 of embodiment 1 is configured such that the robot cuts or grinds a workpiece.

An example of a robot system according to modification 1 of embodiment 1 will be described below with reference to fig. 2.

Fig. 2 is a schematic diagram showing a schematic configuration of a robot system according to modification 1 of embodiment 1.

As shown in fig. 2, the robot system 100 according to modification 1 has the same basic configuration as the robot system 100 according to embodiment 1, but differs in that the end effector 20 of the robot 101 is configured to cut or grind a workpiece 104. Specifically, the end effector 20 may be configured as a cutting tool including, for example, a drill, an end mill, a reamer, or the like, and may cut the workpiece 104. The end effector 20 may be configured to grind the workpiece 104 by a grinding tool such as a grinding wheel.

The robot system 100 of modification example 1 configured as described above also has the same operational advantages as the robot system 100 according to embodiment 1.

[Modification 2]

In the robot system according to modification 2 of embodiment 1, a detector that wirelessly detects position information and posture information of the manipulator is provided in the manipulator, and a transmitter that transmits the position information and the posture information of the manipulator detected by the detector to the control device is disposed in the operation area.

An example of a robot system according to modification 2 of embodiment 1 will be described below with reference to fig. 3.

Fig. 3 is a schematic diagram showing a schematic configuration of a robot system according to modification 2 of embodiment 1.

As shown in fig. 3, the robot system 100 according to modification 2 has the same basic configuration as the robot system 100 according to embodiment 1, but differs in that a detector 12 that wirelessly detects position information and/or posture information of the manipulator 102 is provided in the manipulator 102, and a transmitter 13 that transmits the position information and/or posture information of the manipulator 102 detected by the detector 12 to the control device 110 is provided. The detector 12 may be, for example, a gyro sensor or a camera.

In modification 2, the detector 12 and the transmitter 13 constitute a sensor 103.

The robot system 100 of modification example 2 configured as described above also has the same operational advantages as the robot system 100 according to embodiment 1.

(Embodiment 2)

In the robot system according to embodiment 2 (including the modifications), the work area is divided into a plurality of work sections, the operation area is divided into a plurality of operation sections, the robot is arranged for each work section, the sensor is arranged for each operation section, and the control device is configured to operate the robot arranged in the Nth work section based on the position information and the posture information of the manipulator detected by the sensor arranged in the Nth operation section. Here, N is a natural number.

In the robot system according to embodiment 2, the manipulator may further include a switch that switches on and off the output of the position information and the posture information of the manipulator detected by the sensor.

An example of the robot system according to embodiment 2 will be described below with reference to fig. 4.

[Structure of the robot system]

Fig. 4 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 2.

As shown in fig. 4, the robot system 100 according to embodiment 2 is basically the same as the robot system 100 according to embodiment 1, except that the work area 201 is divided into a plurality of (here, three) work sections 201A to 201C by a plurality of (here, two) wall members 205A and 205B, the operation area 202 is divided into a plurality of (here, three) operation sections 202A to 202C by a plurality of (here, two) wall members 206A and 206B, a robot 101 is arranged in each work section, and a sensor 103 is arranged in each operation section.

When it is necessary to distinguish the robots 101 arranged in the respective work sections, the robot arranged in the work section 201A is referred to as a robot 101A, the robot arranged in the work section 201B is referred to as a robot 101B, and the robot arranged in the work section 201C is referred to as a robot 101C. Similarly, when it is necessary to distinguish between the sensors 103 arranged in the operation sections, the sensor arranged in the operation section 202A is referred to as a sensor 103A, the sensor arranged in the operation section 202B is referred to as a sensor 103B, and the sensor arranged in the operation section 202C is referred to as a sensor 103C.

In the robot system 100 according to embodiment 2, the manipulator 102 further includes a switch 102C that switches on and off the output of the position information and the posture information of the manipulator 102 detected by the sensor 103.

For example, when the operator moves from the operation section 202A to the operation section 202C, the operator can operate the switch 102C to turn off the output of the position information and the posture information in the operation section 202A, and after moving to the operation section 202C, can operate the switch 102C again to turn on the output from the sensor 103C.

In the robot system 100 according to embodiment 2, a control device 110 is arranged for each work section. When it is necessary to distinguish between the control devices 110 arranged in the respective work sections, the control device arranged in the work section 201A is referred to as a control device 110A, the control device arranged in the work section 201B is referred to as a control device 110B, and the control device arranged in the work section 201C is referred to as a control device 110C.

In embodiment 2, the control devices 110A to 110C arranged in the respective work sections 201A to 201C are configured to control the robots 101A to 101C arranged in the work sections 201A to 201C, but the present invention is not limited to this. The robots 101A to 101C arranged in the respective work sections 201A to 201C may instead be controlled by a single control device 110.

In embodiment 2, the control device 110A is configured to operate the robot 101A arranged in the work section 201A (1st work section) based on the position information and the posture information output from the sensor 103A arranged in the operation section 202A (1st operation section). Similarly, the control device 110B is configured to operate the robot 101B arranged in the work section 201B (2nd work section) based on the position information and the posture information output from the sensor 103B arranged in the operation section 202B (2nd operation section). Further, the control device 110C is configured to operate the robot 101C arranged in the work section 201C (3rd work section) based on the position information and the posture information output from the sensor 103C arranged in the operation section 202C (3rd operation section).

That is, in embodiment 2, the control device 110 is configured to operate the robot 101 arranged in the Nth work section based on the position information and the posture information output from the sensor 103 arranged in the Nth operation section.
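A sketch of this per-section routing (together with the output switch 102C described above) might look as follows; the sensor and robot interfaces and the way the switch state is read are assumptions made for illustration.

```python
def route_sections(sensors, robots, output_enabled) -> None:
    """Drive the robot in the Nth work section from the sensor in the Nth
    operation section; sections are paired by index."""
    for sensor, robot in zip(sensors, robots):
        if not output_enabled():     # switch 102C: pose output turned off
            continue
        pose = sensor.get_pose()     # position/posture of manipulator 102 (assumed API)
        if pose is None:             # manipulator not present in this operation section
            continue
        robot.move_to(pose)          # robot in the corresponding work section follows
```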

The robot system 100 according to embodiment 2 configured as described above also has the same operational advantages as the robot system 100 according to embodiment 1.

In the robot system 100 according to embodiment 2, the control device 110 is configured to operate the robot 101 arranged in the Nth work section based on the position information and the posture information output from the sensor 103 arranged in the Nth operation section.

This allows an operator to be placed in each operation section, and the operators can simultaneously operate the robots 101 placed in the respective work sections. Further, a single operator can, by moving from one operation section to another, operate the robot 101 arranged in each work section with one manipulator 102.

In the robot system 100 according to embodiment 2, the manipulator 102 further includes the switch 102C, which switches on and off the output of the position information and the posture information of the manipulator 102 detected by the sensor 103.

Thus, as the operator moves between operation sections, the operator can operate the robot 101 arranged in each work section with one manipulator 102.

(Embodiment 3)

The robot system according to embodiment 3 is the robot system according to embodiment 1 (including the modifications), wherein the work area is divided into a plurality of work sections, a robot is arranged for each work section, the manipulator further includes a designator that designates which robot, among the plurality of robots, is to be operated, and the control device is configured to operate the robot designated by the designator in real time based on the position information and the posture information of the manipulator detected by the sensor.

An example of the robot system according to embodiment 3 will be described below with reference to fig. 5.

[Structure of the robot system]

Fig. 5 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 3.

As shown in fig. 5, the robot system 100 according to embodiment 3 has the same basic configuration as the robot system 100 according to embodiment 1, but differs in that the work area 201 is divided into a plurality of (here, three) work sections 201A to 201C by a plurality of (here, two) wall members 205A and 205B, and a robot 101 is arranged in each work section.

When it is necessary to distinguish the robots 101 arranged in the respective work sections, the robot arranged in the work section 201A is referred to as a robot 101A, the robot arranged in the work section 201B is referred to as a robot 101B, and the robot arranged in the work section 201C is referred to as a robot 101C.

In the robot system 100 according to embodiment 3, the manipulator 102 further includes a designator 102D that designates which robot 101, among the plurality of robots 101, is to be operated. The designator 102D may be constituted by, for example, a numeric keypad, a jog dial (rotary selector), or a cross key.
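As a rough sketch only, a designator selection could be mapped to motion commands for the designated robot roughly as follows; the robot identifiers and the `current_selection` interface are assumptions.

```python
def drive_designated_robot(robots: dict, designated_id: str, pose) -> None:
    """Operate only the robot designated by the designator 102D (sketch)."""
    robot = robots.get(designated_id)
    if robot is None:
        return                       # nothing designated, or unknown identifier
    robot.move_to(pose)              # only the designated robot follows the manipulator

# Usage sketch: robots keyed by the value entered on the numeric keypad.
# robots = {"101A": robot_a, "101B": robot_b, "101C": robot_c}
# drive_designated_robot(robots, designator.current_selection(), sensor.get_pose())
```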

Further, an annunciator may be provided on the robot 101 and/or in each work section to notify the operator which robot 101 has been made operable by the designator 102D. The annunciator may be a display device (screen) that displays character data, image data, or the like, a speaker or the like that gives a voice notification, or a light or color indication. The notification may also be sent to a smartphone, a mobile phone, a tablet computer, or the like by mail or through an application via a communication network.

Further, in the robot system 100 according to embodiment 3, the control devices 110A to 110C arranged in the respective work sections 201A to 201C control the robots 101A to 101C arranged in the work sections 201A to 201C, but the present invention is not limited to this. The robots 101A to 101C arranged in the respective work sections 201A to 201C may instead be controlled by a single control device 110.

The robot system 100 according to embodiment 3 configured as described above also has the same operational advantages as the robot system 100 according to embodiment 1.

In the robot system 100 according to embodiment 3, the manipulator 102 further includes the designator 102D, which designates the robot 101 to be operated among the plurality of robots 101. Thus, the operator can operate the robot 101 arranged in each work section with one manipulator 102.

(Embodiment 4)

In the robot system according to embodiment 4, which is based on any one of embodiments 1 to 3 (including the modifications), a camera that images the equipment placed in the work area is disposed in the work area, and a display device that displays image information captured by the camera is disposed in the operation area.

An example of the robot system according to embodiment 4 will be described below with reference to fig. 6.

[Structure of the robot system]

Fig. 6 is a schematic diagram showing a schematic configuration of a robot system according to embodiment 4.

As shown in fig. 6, the robot system 100 according to embodiment 4 has the same basic configuration as the robot system 100 according to embodiment 1, but differs in that a camera 105 that images the equipment arranged in the work area (for example, the robot 101 and the workpiece 104) is arranged in the work area 201, and a display device 106 that displays the image information captured by the camera 105 is arranged in the operation area 202.

The camera 105 may be provided on, for example, the ceiling, a side wall surface (wall member 203), or the distal end portion of the robot 101. The display device 106 may be a stationary display used while fixed to a desk, a floor, or the like, or may be a head-mounted display or glasses worn by the operator.

Further, the control device 110 may display image information on the display device 106. The image information may be, for example, a virtual workpiece, a virtual robot, a work procedure, or information such as the material or dimensions of the workpiece 104.

The robot system 100 according to embodiment 4 configured as described above also has the same operational advantages as the robot system 100 according to embodiment 1.

In the robot system 100 according to embodiment 4, the camera 105 that captures images of the devices disposed in the work area is disposed in the work area 201, and the display device 106 that displays the image information captured by the camera 105 is disposed in the operation area 202.

Thus, even when the work area 201 and the operation area 202 are separated from each other, the operator can remotely operate the robot 101.

In embodiment 4, the control device 110 may calculate the trajectory of the manipulator 102 generated by the operator moving the manipulator 102, and may store in the memory 110b the work to be executed by the robot 101 based on the calculated trajectory (trajectory information of the manipulator 102). The control device 110 may operate the robot 101 based on the trajectory information of the manipulator 102 stored in the memory 110b.

Further, the control device 110 may operate the virtual robot displayed on the display device 106 based on the trajectory information of the manipulator 102 stored in the memory 110b. In this case, while the operator operates the robot 101 using the manipulator 102, the control device 110 may operate the virtual robot displayed on the display device 106 based on the trajectory information of the manipulator 102 stored in the memory 110b.
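A minimal sketch of replaying a stored manipulator trajectory on a virtual robot shown on the display device 106 (the virtual-robot and rendering interfaces are assumptions):

```python
import time

def replay_on_virtual_robot(trajectory, virtual_robot, display, period_s: float = 0.01) -> None:
    """Replay the trajectory stored in the memory 110b on a virtual robot (sketch)."""
    for pose in trajectory:
        virtual_robot.move_to(pose)    # update the virtual robot's pose (assumed API)
        display.render(virtual_robot)  # show the result on the display device 106
        time.sleep(period_s)           # play back at the recorded control rate (assumed)
```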

Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains having the benefit of the teachings presented in the foregoing descriptions. Accordingly, the foregoing description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the best mode of carrying out the invention. The details of the structure and/or function of the present invention can be changed without departing from the spirit thereof.

Industrial applicability of the invention

The robot system according to the present invention is useful in the field of robots because it can reduce the burden on the operator and improve the work efficiency.

Description of the reference numerals

11a... 1st link; 11b... 2nd link; 11c... 3rd link; 11d... 4th link; 11e... 5th link; 11f... 6th link; 12... detector; 13... transmitter; 15... base; 20... end effector; 21... pipe; 100... robot system; 101, 101A, 101B, 101C... robot; 102... manipulator; 102A... grip portion; 102B... switch; 102C... switch; 102D... designator; 103, 103A, 103B, 103C... sensor; 104... workpiece; 105... camera; 106... display device; 110... control device; 201... work area; 201A, 201B, 201C... work section; 202... operation area; 202A, 202B, 202C... operation section; 203... wall member; 204... window; 205A, 205B... wall member; 206A, 206B... wall member; JT1... 1st joint; JT2... 2nd joint; JT3... 3rd joint; JT4... 4th joint; JT5... 5th joint; JT6... 6th joint.
