Operating system of automated guided vehicle

Reading note: This technology, an operating system of an automated guided vehicle, was devised by 中川真人 on 2020-02-25. Its main content is as follows: The terminal (60) can operate the automated guided vehicle (10) simply by imaging the automated guided vehicle (10) with the imaging unit (62). Further, the operation state control unit (65) allows the terminal (60) to operate the automated guided vehicle (10) only while the imaging unit (62) continues to acquire an image of the mark (55) carrying the identification information. That is, when the mark (55) cannot be confirmed from the operator's position, or when the identification information cannot be confirmed because the vehicle is too far away, the operator cannot operate the automated guided vehicle. The operation state control unit (65) therefore lets the terminal (60) operate the automated guided vehicle (10) only after the positional relationship between the operator and the automated guided vehicle (10) has been brought into an appropriate state.

1. An operation system for automated guided vehicles, comprising a plurality of automated guided vehicles and a terminal for operating the automated guided vehicles, wherein

the terminal includes an imaging unit and a control unit,

identification information for identifying each of the automated guided vehicles is provided to the plurality of automated guided vehicles, and

the operation system includes an operation state control unit that enables the terminal to operate the automated guided vehicle while the imaging unit continues to acquire an image of the identification information.

2. The operation system of an automated guided vehicle according to claim 1, wherein

the operation system further includes an estimation unit that estimates a positional relationship of the automated guided vehicle with respect to the position of the terminal based on a display form of information in the image acquired by the imaging unit.

3. The operation system of an automated guided vehicle according to claim 2, wherein

the estimation unit acquires at least one of an orientation and a distance of the automated guided vehicle with respect to the terminal based on a display form of the identification information in the image acquired by the imaging unit.

4. The operation system of an automated guided vehicle according to any one of claims 1 to 3, wherein

the operation state control unit adjusts an operation mode of the terminal based on a result of comparison between a reference region set for the image acquired by the imaging unit and the content displayed in the image.

5. The operation system of an automated guided vehicle according to any one of claims 1 to 4, wherein

the operation state control unit enables the terminal to operate the automated guided vehicle for a predetermined time even when the identification information has gone out of the frame of the image acquired by the imaging unit.

6. The operation system of an automated guided vehicle according to any one of claims 1 to 5, wherein

the operation state control unit executes control for limiting the operation of the automated guided vehicle based on a display form of the identification information in the image acquired by the imaging unit.

Technical Field

The present invention relates to an operation system of an automated guided vehicle.

Background

An automated guided vehicle described in, for example, patent document 1 is known. An automated guided vehicle is used to transport a container unloaded from a ship to a desired position in, for example, a harbor. In the container terminal, a predetermined travel route is set, and a plurality of automated guided vehicles travel on the travel route in accordance with commands from a tower serving as an external commander.

Documents of the prior art

Patent document

Patent document 1: Japanese Laid-open Patent Publication No. 2005-239314

Disclosure of Invention

Problems to be solved by the invention

The automated guided vehicle described above is withdrawn from the travel path to a maintenance area for maintenance and the like. Since the automated guided vehicle has no cab, during this withdrawal a worker moves it to the maintenance area by operating a dedicated remote controller. The remote controller may be connected to the automated guided vehicle either by wire or wirelessly. In the case of a wired remote controller, its electric wire must be connected to a connector provided on the automated guided vehicle. To keep the worker from contacting the vehicle body, the remote controller must be operated from a position a predetermined distance away; but if the worker, moving along with the automated guided vehicle, falls behind it, the wire may be caught in the automated guided vehicle and the remote controller and connector may be damaged. In the case of a wireless remote controller, on the other hand, a receiver needed for pairing must be connected to the connector of the automated guided vehicle. Since receivers are generally associated with remote controllers one to one, a fleet of automated guided vehicles requires as many remote controllers and receivers as there are vehicles. In either case, connecting the remote controller to the automated guided vehicle requires a connector, and the connector must be highly waterproof. There is therefore a need for an operation system that can operate an automated guided vehicle with a simple configuration, without a connector or the like that requires waterproofing. In addition, when a worker operates an automated guided vehicle, it is preferable to do so only after the positional relationship between the worker and the automated guided vehicle has been brought into an appropriate state; merely connecting a remote controller to the vehicle does not allow an operation that takes this positional relationship into account.

An object of the present invention is to provide an operation system that enables an operator to operate an automated guided vehicle from an appropriate position with a simple configuration.

Means for solving the problems

An operation system of an automated guided vehicle according to an aspect of the present invention includes a plurality of automated guided vehicles and a terminal for operating the automated guided vehicles. The terminal includes an imaging unit. Identification information for identifying each of the plurality of automated guided vehicles is provided to the plurality of automated guided vehicles. The operation system includes an operation state control unit that enables the terminal to operate the automated guided vehicle while the imaging unit continues to acquire an image of the identification information.

In this operation system, identification information for identifying each automated guided vehicle is given to the plurality of automated guided vehicles. The terminal can therefore identify the automated guided vehicle to be operated, and operate it, by imaging the identification information with the imaging unit. Thus, with the simple configuration of providing identification information on the automated guided vehicles, and without a connector or the like requiring waterproofing, the operator can operate an automated guided vehicle through the terminal. The operation state control unit allows the terminal to operate the automated guided vehicle only while the imaging unit continues to acquire the image of the identification information. That is, when the identification information cannot be confirmed from the operator's position, or cannot be confirmed because the vehicle is too far away, the operator cannot operate the automated guided vehicle. The operation state control unit therefore lets the terminal operate the automated guided vehicle only once the positional relationship between the operator and the automated guided vehicle is in an appropriate state. In this way, the operator can operate the automated guided vehicle from an appropriate position with a simple configuration.

The operation system of the automated guided vehicle may further include an estimation unit that estimates a positional relationship of the automated guided vehicle with respect to the position of the terminal based on a display form of information in the image acquired by the imaging unit. The estimation unit can thereby estimate the positional relationship between the terminal (i.e., the operator) and the automated guided vehicle easily from the image acquired by the imaging unit, without complicated processing or special sensors. Moreover, the operation state control unit can control the operation state of the terminal by making effective use of the estimation result of the estimation unit.

The estimation unit may acquire at least one of an orientation and a distance of the automated guided vehicle with respect to the terminal based on a display form of the identification information in the image acquired by the imaging unit. Thus, the estimating unit can easily acquire at least one of the orientation and the distance of the automated guided vehicle with respect to the terminal based on the display form of the identification information, without performing complicated image processing or the like.

The operation state control unit may adjust the operation mode of the terminal based on a result of comparison between a reference region set for the image acquired by the imaging unit and the content displayed in the image. The operation state control unit can thus perform control that takes into account the positional relationship between the operator and the automated guided vehicle merely by comparing the reference region with the content displayed in the image, without complicated processing.

The operation state control unit may allow the terminal to operate the automated guided vehicle for a predetermined time even when the identification information has gone out of the frame of the image acquired by the imaging unit. Thus, even when the identification information accidentally goes out of frame, the operation can continue as long as the identification information is promptly recaptured.

The operation state control unit may perform control to restrict the operation of the automated guided vehicle based on a display form of the identification information in the image acquired by the imaging unit. This can improve safety when the automated guided vehicle is operated.

Advantageous Effects of Invention

According to the present invention, an operation system capable of allowing an operator to operate an automated guided vehicle from an appropriate position with a simple configuration is provided.

Drawings

Fig. 1 is a schematic plan view showing a container terminal using an operation system of an automated guided vehicle according to an embodiment of the present invention.

Fig. 2 is a schematic side view showing a state of the automated guided vehicle in the container terminal shown in fig. 1.

Fig. 3 is a schematic diagram showing an example of an automated guided vehicle to which identification information is added.

Fig. 4 is a block diagram showing a system configuration of an operating system.

Fig. 5 is a diagram showing an example of an operation screen of the terminal.

Fig. 6 is a diagram showing an example of an operation screen of the terminal.

Fig. 7 is a schematic plan view showing a positional relationship between the worker and the automated guided vehicle.

Fig. 8 is a diagram showing an example of an operation screen of the terminal.

Fig. 9 is a schematic view showing an automated guided vehicle according to a modification.

Fig. 10 is a schematic view showing an automated guided vehicle according to a modification.

Fig. 11 is a diagram showing an example of a pattern of the arrangement of the marks.

Detailed Description

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the drawings, the same or equivalent elements are denoted by the same reference numerals, and redundant description thereof is omitted.

Fig. 1 is a schematic plan view showing a container terminal using an operation system 100 of an automated guided vehicle 10 according to an embodiment of the present invention. Figs. 2(a) and 2(b) are schematic side views showing the situation of the automated guided vehicle 10 in the container terminal shown in fig. 1. First, the structure of the automated guided vehicle 10 and its operation in the container terminal will be described with reference to figs. 1, 2(a), and 2(b).

As shown in fig. 1, in a container terminal at a harbor, a plurality of automated guided vehicles (AGVs) 10 travel on a predetermined travel route R in accordance with commands S1 from a tower 101 serving as an external commander. The automated guided vehicle 10 wirelessly receives from the tower 101 a command S1 for controlling its traveling state, and travels on the travel path R. Each automated guided vehicle 10 stops at the loading position of a container W in accordance with the command S1. The position of the crane 103 is controlled by the tower 101. The crane 103 moves to the loading position of the container W while suspending the container W, and its position information is fed back to the tower 101. The container W is unloaded from the container ship 102 by the crane 103 and mounted on the automated guided vehicle 10. The automated guided vehicle 10 carrying the container W travels along the travel path R to a rubber-tired crane 104, which unloads the container W. The automated guided vehicle 10, now empty, returns to the crane 103 along the travel path R.

As shown in figs. 2(a) and 2(b), the body 11 of the automated guided vehicle 10 includes a loading platform 20, a plurality of drive wheels 30, and a plurality of motors 40. The loading platform 20 has a mounting surface 21 on which a container W suspended by the crane 103 is mounted. The mounting surface 21 is a flat surface extending horizontally and is formed, in plan view, in a quadrangular shape wider than the lower surface bs of the container W. The motor 40 generates power for operating the drive wheels 30 and includes a traveling motor 41 and a steering motor 42 as power sources. The travel motor 41 generates a driving force for causing the automated guided vehicle 10 to travel in the front-rear direction. The steering motor 42 generates power for steering the drive wheels 30 to change the traveling direction of the automated guided vehicle 10.

Next, the operation system 100 of the automated guided vehicle 10 according to the embodiment of the present invention will be described. The operation system 100 is a system by which an operator manually operates, through a terminal, a specific automated guided vehicle 10 among the plurality of automated guided vehicles 10. For example, the operation system 100 according to the present embodiment is used when maintenance or the like must be performed on a specific automated guided vehicle 10 among the automated guided vehicles 10 operating automatically as described above. As shown in fig. 1, the operation system 100 includes the plurality of automated guided vehicles 10 and a terminal 60 for operating them. Although only one terminal 60 is shown in fig. 1, the operation system 100 may include a plurality of terminals 60, one carried by each of a plurality of operators.

In the operation system 100, the automated guided vehicle 10 to be operated is specified using identification information individually given to each automated guided vehicle 10. Figs. 3(a) and 3(b) are schematic diagrams showing an example of the automated guided vehicle 10 to which identification information is given. As shown in figs. 3(a) and 3(b), identification information for identifying each automated guided vehicle 10 is given to the automated guided vehicle 10. The identification information is unique information given to one automated guided vehicle 10, and enables one automated guided vehicle 10 to be singled out from among the plurality of automated guided vehicles 10. In the example shown in figs. 3(a) and 3(b), the identification information is represented by a mark 55 in the form of a QR Code (Quick Response Code, a two-dimensional code; registered trademark). The mark 55 carries certain information expressed as a special graphic pattern; by reading the mark 55 from an image, specific information including the identification information can be obtained. Besides a QR code, the identification information may be represented by any readable combination of patterns, symbols, and the like whose specific content can be grasped, such as a barcode. Alternatively, the identification information may be expressed directly by characters, numbers, or the like; for example, the number assigned to the automated guided vehicle 10 may be written directly on the vehicle body 11. Furthermore, as long as one automated guided vehicle 10 can be singled out from among the plurality of automated guided vehicles 10, it may be identified by a combination of a plurality of pieces of identification information.

The marks 55 are provided on the body 11 of the automated guided vehicle 10; here, marks 55 are provided on both a side surface 11a and an end surface 11b of the vehicle body 11. The number of marks 55 differs between the side surface 11a and the end surface 11b: the side surface 11a carries two marks 55 at positions spaced apart in the longitudinal direction (see fig. 3(b)), while the end surface 11b carries one mark 55 (see fig. 3(a)). Each mark 55 may also include information other than the identification information. For example, the mark 55 may include information on its own mounting position, i.e., on which surface of the vehicle body 11 and at which position on that surface it is formed.
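
The patent leaves the encoding inside the mark 55 open. As a rough illustration only, the following Python sketch assumes a simple delimited QR payload that bundles a hypothetical vehicle ID with the mounting-position information mentioned above; the field names and the 'AGV-007|side|1' format are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MarkPayload:
    """Hypothetical content decoded from one mark 55."""
    vehicle_id: str  # singles out one AGV 10 among the fleet
    face: str        # "side" (11a) or "end" (11b)
    index: int       # which mark on that face (the side face carries two)

def parse_mark(text: str) -> MarkPayload:
    # Assumed 'AGV-007|side|1' style payload; the patent fixes no format.
    vehicle_id, face, index = text.split("|")
    return MarkPayload(vehicle_id, face, int(index))

print(parse_mark("AGV-007|side|1"))
```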

Next, a system configuration of the operation system 100 of the automated guided vehicle 10 according to the embodiment of the present invention will be described with reference to fig. 4. Fig. 4 is a block diagram showing the system structure of the operating system 100.

As shown in fig. 4, the automated guided vehicle 10 includes a drive control unit 71, a navigation unit 72, a sensor unit 73, and a communication unit 74. The drive control unit 71 controls the motors 40 to make the automated guided vehicle 10 travel in a desired direction at a desired speed. The navigation unit 72 determines the route along which the automated guided vehicle 10 is guided when it travels automatically. The sensor unit 73 comprises the various sensors and cameras used to grasp the surrounding conditions during automatic travel. The communication unit 74 exchanges signals for communicating with the tower 101 and other devices; various kinds of information are transmitted and received between the communication unit 74 and the terminal 60, and the communication unit 74 receives operation command signals from the terminal 60.

The terminal 60 includes an information processing unit 61, an imaging unit 62, an input unit 63, and an output unit 64. As the terminal 60, a general-purpose device such as a smartphone or tablet can be used; in that case, the operator installs a dedicated application program on the terminal 60.

The imaging unit 62 is a part that acquires an image by imaging. The imaging unit 62 is constituted by a camera or the like built in the terminal 60. An external camera may be used as the imaging unit 62. The input unit 63 is a part that receives an operation input from an operator. The input unit 63 is configured by a touch panel, a microphone for performing voice recognition, and the like. The output unit 64 is a part that outputs information to the operator. The output unit 64 is constituted by a monitor, a speaker, and the like.

The information processing unit 61 performs various processes for operating the automated guided vehicle 10. It includes a processor, internal memory, and external storage. The processor is an arithmetic unit such as a CPU (Central Processing Unit). The internal memory is a storage medium such as a ROM (Read Only Memory) or RAM (Random Access Memory), and the external storage is a storage medium such as an HDD (Hard Disk Drive). Together with a communication interface and a user interface, these realize the functions of the information processing unit 61 described later. In the information processing unit 61, the various functions are realized, for example, by loading a program stored in the ROM into the RAM and having the CPU execute it.

The information processing unit 61 includes an operation state control unit 65, an estimation unit 66, an operation content reception unit 67, an operation command unit 68, and a communication unit 69.

The operation state control unit 65 controls the state in which the terminal 60 can operate the identified automated guided vehicle 10. Based on the image acquired by the imaging unit 62, the operation state control unit 65 performs control so that the automated guided vehicle 10 in the image can be operated. The image is preferably a frame extracted from continuously captured video, but the acquisition method is not limited to this. While the imaging unit 62 continues to acquire the image of the identification information (here, the mark 55), the operation state control unit 65 allows the terminal 60 to operate the automated guided vehicle 10. That is, the operation state control unit 65 identifies the automated guided vehicle 10 whose identification information can be acquired from the image and establishes a communication connection with that automated guided vehicle 10. When, for example, the operator stops imaging the automated guided vehicle 10 and the image of the identification information can no longer be acquired, the operation state control unit 65 releases the connection with the automated guided vehicle 10 and ends the operable state. The specific operation of the operation state control unit 65 will be described later.
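
The behavior just described amounts to a small state machine gated by mark visibility. The following is a minimal Python sketch of that idea; it assumes mark decoding happens upstream and that a `comm` object offers `connect()`/`disconnect()`, names that are illustrative rather than from the patent.

```python
class OperationStateControl:
    """Minimal sketch of the visibility-gated operable state."""

    def __init__(self, comm):
        self.comm = comm
        self.connected_id = None  # ID of the AGV currently operable

    def on_frame(self, visible_ids):
        """Call once per captured frame with the decoded vehicle IDs."""
        if self.connected_id is None and visible_ids:
            # Identification information acquired: establish the connection.
            self.connected_id = visible_ids[0]
            self.comm.connect(self.connected_id)
        elif self.connected_id is not None and self.connected_id not in visible_ids:
            # Mark no longer in the image: release and end the operable state.
            self.comm.disconnect(self.connected_id)
            self.connected_id = None

    def can_operate(self) -> bool:
        return self.connected_id is not None
```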

The estimation unit 66 estimates the positional relationship of the automated guided vehicle 10 with respect to the position of the terminal 60 based on the display form of information in the image acquired by the imaging unit 62. Here, information in the image means the various information that can be obtained from the image: besides the mark 55 itself, for example, a characteristic portion shown on the vehicle body 11 (such as a company logo, which carries no identification information but is easily recognized in the image) or the edge (outer shape) of the vehicle body 11. The display form means the size, position, number, and the like of such information in the image, specifically the size, number, and position of the marks 55, the position of the characteristic portion on the vehicle body 11, and so on. The positional relationship of the automated guided vehicle 10 with respect to the position of the terminal 60 means the orientation, distance, and the like of the automated guided vehicle 10 relative to the terminal 60. The specific operation of the estimation unit 66 will be described later.

The operation content reception unit 67 receives the operation content when the operator performs an input for operating the automated guided vehicle 10 using the input unit 63. The operation command unit 68 generates an operation command signal for the automated guided vehicle 10 based on the operation content received by the operation content reception unit 67. The communication unit 69 exchanges signals for communicating with other devices; various kinds of information are transmitted and received between the communication unit 69 and the automated guided vehicle 10, and the communication unit 69 transmits the operation command signal to the automated guided vehicle 10. The method of establishing the connection between the communication units 69 and 74 is not particularly limited; any method may be used, such as Wi-Fi, Bluetooth (registered trademark), or 4G or 5G communication optimized for communication within the yard of the container terminal. A short-range connection such as Bluetooth is preferable, since it allows operation of an automated guided vehicle 10 located near the terminal 60 regardless of radio wave conditions.

Next, the operation of the automated guided vehicle 10 in the operation system 100 will be described in detail with reference to figs. 5 to 8(b). Fig. 5 shows an example of an operation screen of the terminal 60. As shown in fig. 5, the screen 80 of the terminal 60 displays an image display unit 81, which shows the image acquired by the imaging unit 62, and an operation unit 82, which shows an interface for the operator to perform operations.

In the state shown in fig. 5, a mark 55 is displayed in the image; that is, the imaging unit 62 has acquired an image of the mark 55 containing the identification information. In this state, the operation state control unit 65 establishes a connection, via the communication units 69 and 74, with the automated guided vehicle 10 displayed in the image. The operator can then operate that automated guided vehicle 10 through the operation unit 82. When the operator operates the operation unit 82, the operation content reception unit 67 receives the operation content, the operation command unit 68 generates an operation command signal based on it, and the communication unit 69 transmits the operation command signal to the communication unit 74 of the automated guided vehicle 10. The drive control unit 71 of the automated guided vehicle 10 generates a drive signal based on the operation command signal. The automated guided vehicle 10 thus operates according to the operator's input.
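
To make this flow concrete, here is a minimal Python sketch of how an operation command signal might be shaped and handed from the operator's input to the transmitting side. The patent does not specify the payload, so the `OperationCommand` fields and the `send` callback are assumptions.

```python
from dataclasses import dataclass

@dataclass
class OperationCommand:
    # Assumed payload of the operation command signal; not specified in the patent.
    vehicle_id: str
    direction: str  # e.g. "forward", "reverse", "steer_left"
    speed: float    # requested speed

def on_operator_input(vehicle_id, direction, speed, send):
    # Input unit 63 -> reception unit 67 -> command unit 68 ->
    # communication unit 69, collapsed into one step for the sketch.
    send(OperationCommand(vehicle_id, direction, speed))

# Using print as a stand-in for transmission to communication unit 74:
on_operator_input("AGV-007", "forward", 0.5, send=print)
```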

The image display unit 81 sets a first reference region SE1 and a second reference region SE2 for the image acquired by the imaging unit 62, and the operation state control unit 65 adjusts the operation mode of the terminal 60 based on a comparison between these reference regions and the content displayed in the image. The first reference region SE1 defines the region within which the automated guided vehicle 10 should be contained in the image; if the automated guided vehicle 10 in the image extends beyond the first reference region SE1, the terminal 60 (i.e., the worker) is too close to the automated guided vehicle 10. The second reference region SE2 is a region set inside the first reference region SE1 and defines a reference for the size at which the automated guided vehicle 10 should appear in the image; if the automated guided vehicle 10 in the image fits entirely within the second reference region SE2, the terminal 60 (i.e., the worker) is too far from the automated guided vehicle 10.

The operation state control unit 65 detects the edges of the vehicle body 11 to recognize its outer shape, and decides whether to permit or restrict operation by the terminal 60 based on a comparison between the outer shape of the vehicle body 11 displayed in the image and the reference regions SE1 and SE2.

For example, in the state shown in fig. 6(a), the vehicle body 11 in the image is contained within the first reference region SE1 and extends beyond the second reference region SE2. In this case, the operation state control unit 65 determines that the distance between the terminal 60 and the automated guided vehicle 10 is appropriate, and permits operation of the automated guided vehicle 10 without any restriction.

In the state shown in fig. 6(b), the vehicle body 11 in the image extends beyond the first reference region SE1. In this case, the operation state control unit 65 determines that the distance between the terminal 60 and the automated guided vehicle 10 is too short, and imposes an operation restriction. An operation restriction here means that transmission of the operation command signal to the automated guided vehicle 10 is limited so that the vehicle cannot be operated or moved. The operation restriction includes not only disabling operation entirely but also adjusting the operation mode, for example by limiting the traveling speed or displaying a warning to the operator. For instance, when the vehicle body 11 extends only slightly beyond the first reference region SE1, the operation state control unit 65 applies a speed limit, a warning display, or the like.

In the state shown in fig. 6(c), the vehicle body 11 in the image fits within the second reference region SE2. In this case, the operation state control unit 65 determines that the distance between the terminal 60 and the automated guided vehicle 10 is too long, and imposes an operation restriction in the same sense as in fig. 6(b) above. Further, as shown in fig. 6(d), the second reference region SE2 may track the vehicle body 11 in the image, moving so that its center point coincides with the center point of the vehicle body 11 in the image.
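
The comparison in figs. 6(a) to 6(c) reduces to two rectangle-containment tests. The following Python sketch illustrates that logic, assuming the body outline and the reference regions are available as axis-aligned rectangles in image coordinates; the `Rect` type and all coordinate values are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, other: "Rect") -> bool:
        return (self.left <= other.left and self.top <= other.top
                and other.right <= self.right and other.bottom <= self.bottom)

def judge_distance(body: Rect, se1: Rect, se2: Rect) -> str:
    """Classify the detected body outline against SE1/SE2 (figs. 6(a)-(c))."""
    if not se1.contains(body):
        return "too close: restrict operation"   # fig. 6(b)
    if se2.contains(body):
        return "too far: restrict operation"     # fig. 6(c)
    return "appropriate: permit operation"       # fig. 6(a)

se1 = Rect(0, 0, 100, 60)    # outer reference region
se2 = Rect(30, 15, 70, 45)   # inner reference region, inside SE1
print(judge_distance(Rect(10, 5, 90, 55), se1, se2))  # -> appropriate
```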

Even when the identification information (mark 55) goes out of the frame of the image acquired by the imaging unit 62, the operation state control unit 65 allows the terminal 60 to operate the automated guided vehicle 10 for a predetermined time. Out of frame means that no mark 55 is displayed in the image. Likewise, even when the vehicle body 11 in the image extends beyond the first reference region SE1 or fits within the second reference region SE2, the operation state control unit 65 allows operation of the automated guided vehicle 10 for a predetermined time.
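
A minimal sketch of this grace-period behavior in Python follows. The patent says only "a predetermined time"; the 2-second value below is an assumed placeholder.

```python
import time

class GracePeriod:
    """Keeps operation permitted for a short time after the mark leaves the frame."""

    def __init__(self, grace_s: float = 2.0):
        self.grace_s = grace_s   # assumed length of the predetermined time
        self.last_seen = None    # monotonic time of the last frame with a mark

    def operable(self, mark_visible: bool, now: float = None) -> bool:
        """Return True while operation remains permitted."""
        now = time.monotonic() if now is None else now
        if mark_visible:
            self.last_seen = now
            return True
        # Mark out of frame: still operable until the grace period lapses.
        return self.last_seen is not None and now - self.last_seen <= self.grace_s
```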

The operation state control unit 65 executes control for restricting the operation of the automated guided vehicle 10 based on the display form of the identification information (mark 55) in the image acquired by the imaging unit 62. For this, the estimation unit 66 acquires at least one of the orientation and the distance of the automated guided vehicle 10 with respect to the terminal 60 based on the display form of the identification information (here, the mark 55) in the image acquired by the imaging unit 62.

Specifically, as shown in fig. 7, when the worker P stands facing the side surface 11a of the automated guided vehicle 10 and the automated guided vehicle 10 moves toward the worker P, the two may come too close to each other. The same applies when the worker P stands on the end surface 11b side. To prevent such an approach, the operation state control unit 65 restricts movement of the automated guided vehicle 10 in that direction, either by reducing the speed of the automated guided vehicle 10 or by preventing it from moving in that direction regardless of the operator's input.

As shown in figs. 8(a) and 8(b), the estimation unit 66 acquires at least one of the orientation and the distance of the automated guided vehicle 10 with respect to the terminal 60 based on the display form of the marks 55 in the image. For example, as shown in fig. 8(a), when only one mark 55 is displayed in the image, the estimation unit 66 estimates that the end surface 11b of the automated guided vehicle 10 faces the terminal 60. As shown in fig. 8(b), when two marks 55 are displayed in the image, the estimation unit 66 estimates that the side surface 11a faces the terminal 60. In either case, the operation state control unit 65 restricts the operation of the automated guided vehicle 10 based on the estimation result. In this way, by referring to the estimation result of the estimation unit 66, the operation state control unit 65 can execute control that limits the operation of the automated guided vehicle 10 based on the display form of the marks 55 in the image acquired by the imaging unit 62.

The estimation unit 66 may additionally use the outer shape of the vehicle body 11 in the image to estimate the orientation of the automated guided vehicle 10; that is, it refers to the lateral length of the outer shape of the vehicle body 11 in the image. In the state shown in fig. 8(a), the outer shape of the vehicle body 11 in the image is short, so the end face 11b is most likely what is displayed. In the state shown in fig. 8(b), the outer shape is long, so the side surface 11a is most likely displayed. This matters because a single mark 55 in the image is by itself ambiguous: it may be the lone mark 55 on the end face 11b, or it may be one mark on the side surface 11a with the other cut off because the automated guided vehicle 10 is too close to the terminal 60. By combining the number of marks 55 with the outer shape of the vehicle body 11, the estimation unit 66 can estimate the orientation easily.
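
A minimal Python sketch of this combined estimation follows. The 1.5 aspect-ratio threshold is an assumed tuning value, not from the patent.

```python
def estimate_orientation(mark_count: int, body_w_px: float, body_h_px: float) -> str:
    """One mark plus a short outline suggests the end face 11b; two marks,
    or one mark with a long outline, suggest the side face 11a."""
    if mark_count >= 2:
        return "side face 11a faces the terminal"
    if body_w_px / body_h_px < 1.5:   # assumed threshold for 'short' outline
        return "end face 11b faces the terminal"
    # Long outline but only one mark: side face with the second mark cut off
    # (terminal too close), cf. the ambiguity discussed above.
    return "side face 11a, terminal probably too close"

print(estimate_orientation(mark_count=1, body_w_px=400, body_h_px=300))
```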

As shown in fig. 8(a), when the end face 11b is displayed in the image, the reference regions SE1 and SE2 as sized for the side surface 11a may not be suitable. In that case, the image display unit 81 may shorten the lateral length of the reference regions SE1 and SE2, and the operation state control unit 65 performs the same processing as described for figs. 6(a) to 6(d) using the shortened regions. Since the end surface 11b and the side surface 11a have the same height, the vertical size of the reference regions SE1 and SE2 is left fixed.

The estimation unit 66 may also estimate the distance of the automated guided vehicle 10 from the terminal 60 based on the display form of the marks 55 in the image: for example, from the size of a mark 55 in the image, or, when two marks 55 are displayed, from the spacing between them. The operation state control unit 65 may then control the operation state of the automated guided vehicle 10 in view of the estimated distance. For example, when applying a speed limit to the automated guided vehicle 10, the operation state control unit 65 may change the level of the limit according to the distance of the automated guided vehicle 10 from the terminal 60.
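
As a sketch of how mark size could map to distance and then to a speed-limit level, the following Python assumes a pinhole-camera model with example values for the physical mark width and focal length; the distance-to-speed mapping is likewise assumed, since the patent only says the level of the limit may change with distance.

```python
def estimate_distance_m(mark_px: float, mark_width_m: float = 0.30,
                        focal_px: float = 1000.0) -> float:
    # Pinhole-camera relation: distance = focal_length * real_size / image_size.
    # Mark width and focal length are assumed example values.
    return focal_px * mark_width_m / mark_px

def speed_limit_mps(distance_m: float) -> float:
    # Assumed mapping from estimated distance to an allowed speed.
    if distance_m < 3.0:
        return 0.3   # very close: creep speed only
    if distance_m < 10.0:
        return 1.0
    return 2.0

d = estimate_distance_m(mark_px=60.0)  # mark 55 spans 60 px in the image
print(f"estimated {d:.1f} m -> speed limit {speed_limit_mps(d)} m/s")
```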

Next, the operation and effects of the operation system 100 of the automated guided vehicle 10 according to the present embodiment will be described.

In the operation system 100 of the automated guided vehicle 10, the plurality of automated guided vehicles 10 are provided with marks 55 representing identification information for identifying each automated guided vehicle 10. The terminal 60 can therefore identify the automated guided vehicle 10 to be operated, and operate it, by imaging the mark 55 of the identification information with the imaging unit 62. In this way, with the simple configuration of providing identification information on the automated guided vehicles 10, and without a connector or the like requiring waterproofing, the operator can operate the automated guided vehicle 10 through the terminal 60. While the imaging unit 62 continues to acquire the image of the mark 55, the operation state control unit 65 allows the terminal 60 to operate the automated guided vehicle 10. That is, when the mark 55 cannot be confirmed from the operator's position, or cannot be confirmed because the operator is too far away, the operator cannot operate the automated guided vehicle. The operation state control unit 65 therefore lets the terminal 60 operate the automated guided vehicle 10 only after the positional relationship between the operator and the automated guided vehicle 10 has been brought into an appropriate state. Thus, the automated guided vehicle 10 can be operated by the operator from an appropriate position with a simple configuration.

The operation system 100 of the automated guided vehicle 10 further includes an estimation unit 66 that estimates a positional relationship of the automated guided vehicle 10 with respect to the position of the terminal 60 based on a display form of information in the image acquired by the imaging unit 62. In this way, the estimating unit 66 can easily estimate the positional relationship between the terminal and the automated guided vehicle 10 from the image acquired by the imaging unit 62 without using complicated processing, special sensors, and the like. In addition, the operation state control unit 65 can control the operation state of the terminal 60 by effectively using the estimation result of the estimation unit 66.

The operation state control unit 65 adjusts the operation mode of the terminal 60 based on the result of comparing the reference regions SE1 and SE2, set for the image acquired by the imaging unit 62, with the content displayed in the image. The operation state control unit 65 can thus perform control that takes into account the positional relationship between the operator and the automated guided vehicle 10 merely by comparing the reference regions SE1 and SE2 with the displayed content, without complicated processing.

The estimation unit 66 acquires at least one of the orientation and the distance of the automated guided vehicle 10 with respect to the terminal 60 based on the display form of the identification information in the image acquired by the imaging unit 62. The estimation unit 66 can thus easily acquire at least one of the orientation and the distance of the automated guided vehicle 10 with respect to the terminal 60 from the display form of the mark 55 of the identification information, without complicated image processing or the like.

Even when the mark 55 of the identification information goes out of the frame of the image acquired by the imaging unit 62, the operation state control unit 65 allows the terminal 60 to operate the automated guided vehicle 10 for a predetermined time. Thus, even when the mark 55 accidentally goes out of frame, the operation can continue as long as the mark 55 is promptly recaptured.

The operation state control unit 65 executes control for restricting the operation of the automated guided vehicle 10 based on the display form of the mark 55 of the identification information in the image acquired by the imaging unit 62. For example, as described with reference to fig. 7, the operation state control unit 65 disables any operation that would move the automated guided vehicle toward the operator. This improves safety when the automated guided vehicle 10 is operated.

The present invention is not limited to the above embodiments.

For example, the operation state control unit 65 and the estimation unit 66 may be provided on the automated guided vehicle 10 side. Alternatively, the functions of the operation state control unit 65 and the estimation unit 66 may be allocated to the automated guided vehicle 10 and the terminal 60.

The method of providing the identification information to the vehicle body 11 in the above-described embodiment is merely an example, and the identification information may be provided in any manner. The operation state control unit may perform characteristic control according to the manner of giving the identification information.

For example, the configuration shown in fig. 9 may be adopted so that the worker can slowly move the automated guided vehicle 10 from a close position. In fig. 9, a mark 55A smaller than the mark 55 described above is provided on the side surface 11a of the vehicle body 11. The mark 55A is made small enough that it is difficult to recognize in the image during normal operation (for example, in fig. 6(a)) but easy to recognize when the worker approaches the vehicle body 11. When the worker approaches the side surface 11a and captures the mark 55A in the image, the estimation unit 66 estimates that the side surface 11a faces the terminal 60 and that the terminal 60 is very close to the side surface 11a. In that case, the operation state control unit 65 restricts operation so that the automated guided vehicle 10 moves in a slow travel mode. Using the slow travel mode, the operator can move the automated guided vehicle 10 slowly and stop it accurately at a desired position while closely checking its surroundings, for example when the automated guided vehicle 10 is stored in a warehouse. A mark 55B with the same meaning as the mark 55A may further be provided on the upper surface 11c of the vehicle body 11; in that case, the operator images the mark 55B after climbing onto the upper surface 11c.

The method by which the estimation unit 66 determines whether a mark displayed in the image is the mark 55A or 55B is not particularly limited. For example, as shown in figs. 6(a) to 6(d), when a mark 55 appears at a given size in the image, the edges of the vehicle body 11 are also present in the image; by contrast, when the small mark 55A or 55B appears at that size, only the wall surface of the vehicle body 11 surrounds it and no edge appears in the image. Therefore, when no edge is detected around a mark in the image, the estimation unit 66 may determine that the mark is the mark 55A or 55B. Alternatively, the pattern of the marks 55A and 55B may itself encode the information that it is the small mark 55A or 55B, and the estimation unit 66 may acquire that information from the image.

Although in the above embodiment the edges of the vehicle body 11 are also detected when performing the processes shown in figs. 6(a) to 6(d), the control may instead be based only on the image of the mark 55 representing the identification information, without edge detection. For example, the operation state control unit 65 may permit operation by the terminal 60 when the mark 55 in the image is displayed at a size within a prescribed range. In this case, if the mark 55 on the side surface 11a and the mark 55 on the end surface 11b differ in pattern or color, or carry different information, the estimation unit 66 can estimate the orientation of the automated guided vehicle 10 from a single mark 55.

For example, as shown in figs. 10(a) and 10(b), a plurality of marks 55 may be provided on the vehicle body 11 so as to be visible from every direction. In this case, the operation state control unit 65 may permit operation by the terminal 60 when a predetermined number or more of the marks 55 are captured in the image; for example, when only one or two marks 55 can be captured because the terminal 60 is too close to the vehicle body 11, the operation state control unit 65 may restrict operation. The estimation unit 66 may estimate the distance between the terminal 60 and the automated guided vehicle 10 from the size of the marks 55 in the image, the spacing between them, and the like, and may estimate the orientation of the automated guided vehicle 10 from the number of marks 55 in the image. For example, as shown in fig. 10(a), three marks 55 are provided on the end surface 11b, so when four or more marks 55 are displayed in the image, the estimation unit 66 determines that the side surface 11a of the vehicle body 11 faces the terminal 60.

In addition, as shown in figs. 11(a) and 11(b), the arrangement of the plurality of marks 55 can itself form a pattern. For example, by giving the side surface 11a of the vehicle body 11 the arrangement shown in fig. 11(a) and the end surface 11b the arrangement shown in fig. 11(b), the estimation unit 66 can estimate the orientation of the automated guided vehicle 10 from the arrangement pattern of the marks 55 in the image.

The estimation unit 66 may also estimate the position of the edges of the vehicle body 11 not only from the specific pattern (mark 55) representing the identification information but also from the display form of other information in the image. For example, when a characteristic portion such as a drawing, logo, or lettering on the vehicle body 11, or an easily recognized component, is displayed in the image, the estimation unit 66 may use it for the estimation.

In the above embodiment, a general-purpose device such as a smartphone or tablet is used as the terminal 60. Instead, a terminal dedicated to the operation system 100 may be used, with a configuration and functions suited to operations in the operation system 100. For example, the dedicated terminal may place the camera on an edge surface of the tablet rather than on the back, so that the worker can image the automated guided vehicle 10 with the edge-surface camera while holding the screen roughly horizontal. The dedicated terminal may also have physical buttons and levers instead of the on-screen operation buttons and levers. Alternatively, instead of manufacturing the whole terminal as a dedicated product, an accessory device providing the edge-surface camera, physical buttons, and lever may be manufactured, and a general-purpose smartphone or tablet attached to it.

When the operation state control unit 65 of the terminal 60 identifies an automated guided vehicle 10 and attempts to establish a communication connection with it, the attempt may be invalidated if a different terminal has already established a connection with that automated guided vehicle 10. That is, only the terminal that first establishes the connection can operate the automated guided vehicle.
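
This first-terminal-wins rule can be sketched in Python as below. In practice the ownership state would live on the vehicle side; the in-memory dictionary and the method names here are assumptions for illustration.

```python
class ConnectionRegistry:
    """Once a terminal holds the connection to an AGV, later attempts
    by other terminals are invalidated."""

    def __init__(self):
        self._owner = {}  # vehicle_id -> terminal_id

    def try_connect(self, vehicle_id: str, terminal_id: str) -> bool:
        current = self._owner.get(vehicle_id)
        if current is not None and current != terminal_id:
            return False  # another terminal already holds the connection
        self._owner[vehicle_id] = terminal_id
        return True

    def disconnect(self, vehicle_id: str, terminal_id: str) -> None:
        if self._owner.get(vehicle_id) == terminal_id:
            del self._owner[vehicle_id]

reg = ConnectionRegistry()
assert reg.try_connect("AGV-007", "terminal-A")      # first wins
assert not reg.try_connect("AGV-007", "terminal-B")  # invalidated
```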

The region in which the operation state control unit 65 of the terminal 60 can establish a communication connection with an automated guided vehicle 10 can be set as appropriate. For example, the connection may be allowed only in a region within a predetermined distance of a maintenance area (not shown), and may be prohibited with any automated guided vehicle traveling on the travel path R.

Description of the reference numerals

10: unmanned carrying vehicle

55: mark (identification information)

60: terminal device

62: image pickup unit

65: operation state control section

66: estimation part

100: and (4) operating the system.
