Vehicle for measuring inclination angle and distance profile

Document No.: 1145911 | Publication date: 2020-09-11

Abstract: This technology, "Vehicle for measuring inclination angle and distance profile," was created by S. Patel, F. Abdellatif, and B. Parrott on 2019-02-06. Systems and methods for profiling a surface are disclosed herein. In some embodiments, the systems and methods perform profile measurements using a robotic vehicle. The vehicle may include a drive system, one or more wheel encoders, and one or more distance sensors and/or inertial measurement units for capturing measurement data, such as the slope of the surface or the angle of the robotic vehicle relative to the surface or gravity vector. A control computing system is included, having one or more processors that execute instructions stored in software modules to process movement data. In some embodiments, the processed movement data determines a plurality of snapshots of the surface at different times and locations as the robotic vehicle traverses the surface. These snapshots are combined to generate a profile of the surface.

1. A system for profiling a surface, the system comprising:

a robotic vehicle having a drive system, at least one wheel, and one or more wheel encoders housed within each wheel for measuring movement of the robotic vehicle as the robotic vehicle is driven by the drive system;

one or more distance sensors coupled to the robotic vehicle to measure a slope of the surface;

a control computing system, the control computing system comprising:

a non-transitory computer-readable storage medium,

one or more processors in electronic communication with the one or more distance sensors, each wheel encoder, the robotic vehicle, and the computer-readable storage medium,

one or more software modules comprising executable instructions stored in the storage medium, wherein the one or more software modules are executable by the processor and comprise:

a movement module that configures the processor to receive movement data from the one or more wheel encoders to determine a position of the robotic vehicle,

a sensor control module that configures the processor to instruct the one or more distance sensors to transmit one or more signals toward the surface at respective pulse times and detect times at which the one or more signals are reflected from the surface, and receive movement data from the movement module, wherein the sensor control module determines a snapshot of the surface by calculating a slope of the surface using the times at which the one or more signals are reflected and the movement data,

wherein the robotic vehicle is driven by the drive system to traverse the surface while the control computing system continuously determines a plurality of snapshots of the surface as the position of the robotic vehicle changes to generate a profile of the surface.

2. The system of claim 1, wherein a first distance sensor of the one or more distance sensors is mounted on a front portion of the robotic vehicle and a second distance sensor of the one or more distance sensors is mounted on a rear portion of the robotic vehicle.

3. The system of claim 1, wherein the one or more distance sensors are arranged linearly along a longitudinal axis of the robotic vehicle.

4. The system of claim 1, wherein the one or more distance sensors are arranged linearly along a transverse axis of the robotic vehicle.

5. The system of claim 1, wherein the one or more distance sensors are arranged substantially equidistant from each other.

6. The system of claim 1, wherein the distance sensor is disposed on an exterior surface of the robotic vehicle.

7. The system of claim 1, wherein the distance sensor is disposed within the robotic vehicle.

8. The system of claim 1, further comprising an inertial measurement unit housed in the robotic vehicle for capturing orientation data of the robotic vehicle.

9. The system of claim 8, wherein the inertial measurement unit is an accelerometer.

10. The system of claim 1, wherein the movement data includes a speed of the robotic vehicle or a travel distance of the robotic vehicle.

11. The system of claim 1, wherein the control computing system further comprises:

a profile analysis module that configures the processor to calculate respective times of flight (TOFs) of the one or more signals traveling between the sensor and the surface, determine respective slopes of the surface at a given point based on the respective TOFs and the movement data, and store the respective slopes of the surface at the given point in the non-transitory computer-readable storage medium.

12. A system for profiling a surface, the system comprising:

a robotic vehicle having a drive system, at least one wheel, and one or more wheel encoders housed within each wheel for measuring movement of the robotic vehicle as the robotic vehicle is driven by the drive system;

an inertial measurement unit coupled to the robotic vehicle, the inertial measurement unit to measure an angle of the robotic vehicle relative to a direction of gravity;

a control computing system, the control computing system comprising:

a non-transitory computer-readable storage medium,

one or more processors in electronic communication with the inertial measurement unit, each wheel encoder, the robotic vehicle, and the computer-readable storage medium,

one or more software modules comprising executable instructions stored in the storage medium, wherein the one or more software modules are executable by the processor and comprise:

a movement module that configures the processor to receive movement data from the one or more wheel encoders to determine a changing position of the robotic vehicle,

a sensor control module that configures the processor to instruct the inertial measurement unit to determine a first angle relative to gravity at a first location on the surface and to determine a second angle relative to gravity at a second location on the surface,

a contour analysis module that configures the processor to determine an angular offset between the first angle and the second angle and calculate a change in slope between the first position and the second position, receive the movement data from the movement module, and determine a snapshot of the surface between the first position and the second position using the calculated change in slope and the movement data and store the snapshot in the non-transitory computer-readable storage medium,

wherein the robotic vehicle is driven by the drive system to traverse the surface while the control computing system continuously determines a plurality of snapshots of the surface as the position of the robotic vehicle changes to generate a profile of the surface.

13. The system of claim 12, wherein the contour analysis module determines an offset by calculating a sine of an average of the first and second angles multiplied by a distance between the first and second positions measured by the one or more wheel encoders.

14. The system of claim 12, further comprising one or more distance sensors mounted to or within the robotic vehicle to measure a slope of the surface.

15. The system of claim 14, wherein the sensor control module configures the processor to instruct the one or more distance sensors to emit one or more signals toward the surface at respective pulse times, wherein the sensor control module further configures the processor to detect reflections of the one or more signals from the surface using the one or more distance sensors.

Technical Field

The present invention generally relates to systems and methods for profiling surfaces. In particular, the invention relates to a vehicle for profiling and measuring surfaces in a non-destructive manner.

Background

In the oil and gas industry, storage tanks for crude oil and refinery products play a key role in the hydrocarbon supply chain. Knowing the exact volume of these storage units is crucial when transferring product to and/or from the tanks. Due to variations and aging in external and internal conditions (e.g., temperature) and also due to the weight of the liquid product (i.e., hydrostatic pressure), the tank volume can vary by as much as +/- 0.2%. For a 250,000 barrel tank, this variation corresponds to +/- 500 barrels.

Due to the high value of petroleum hydrocarbons, calibration of storage tanks is mandatory. Tanks used for custody transfer must be calibrated so that the volume transferred is known very accurately (e.g., with an error of less than 0.1%). The most common techniques for performing this calibration are manual strapping (API MPMS 2.2A); optical techniques, including the Optical Reference Line Method (ORLM, API Chapter 2.2B), the Optical Triangulation Method (OTM, API Chapter 2.2C), and the Electro-Optical Distance Ranging Method (EODR, API Chapter 2.2D); and liquid calibration (API Standard 2555). However, these measurements have been found to be error prone and have other drawbacks. In some cases, the aforementioned techniques require tank downtime (e.g., emptying the tank or otherwise temporarily stopping tank operation), which incurs additional costs. In addition, many of the foregoing techniques are invasive in that they require access to the internal volume of the tank and may also be destructive.

Existing methods for tank calibration suffer from significant drawbacks. For example, using current standards, performing a calibration may take 1-2 days of work. Consequently, tanks are rarely calibrated, resulting in inaccurate measurements of the actual volume stored in or transferred to and from the tank, which can result in high costs. For example, a conventional time period between calibrations may be between five and fifteen years.

What is needed is a system and method for profiling surfaces for inspection, calibration, and construction tasks. Additionally, what is needed are systems and methods for calibrating tank volume that address the efficiency limitations of performing calibrations using existing systems. More specifically, what is needed is a system and method for accurately performing tank calibration that can be deployed and operated in a relatively fast, low-cost, and non-invasive manner. What is also needed is a system that can be deployed quickly and on demand and thus facilitates more frequent (e.g., daily or at each filling) detection of changes in tank volume.

It is with respect to these and other considerations that the disclosure herein is presented.

Disclosure of Invention

According to one broad aspect of the present invention, a system and method for generating a profile of a surface is provided. In one or more embodiments, this profile measurement is accomplished using a robotic vehicle.

According to one aspect of the invention, a system for profiling a surface according to one or more embodiments is provided. In one or more embodiments, the system includes a robotic vehicle having a drive system, at least one wheel, and one or more wheel encoders housed within each wheel for measuring movement of the robotic vehicle as the robotic vehicle is driven by the drive system. Further, one or more distance sensors are coupled to the robotic vehicle to measure a slope of the surface. For example, a first sensor may be mounted at a front portion of the robotic vehicle and a second sensor may be mounted at a rear portion of the robotic vehicle. In one or more embodiments, the one or more distance sensors are arranged linearly along a longitudinal axis of the robotic vehicle or linearly along a lateral axis of the robotic vehicle. The one or more sensors may also be arranged substantially equidistant from each other. In one or more embodiments, the sensors are disposed on an exterior surface of the robotic vehicle. In one or more embodiments, the sensors are disposed within the robotic vehicle. The system for profiling a surface may also comprise one or more inertial measurement units. For example, the inertial measurement unit may be an accelerometer or a gyroscope. The one or more inertial measurement units may be housed within the vehicle or may be attached to an exterior surface of the vehicle.

Continuing with this aspect of the invention, the system additionally comprises a control computing system. The control computing system includes a non-transitory computer-readable storage medium, one or more processors in electronic communication with the one or more sensors, each wheel encoder, the robotic vehicle, and the computer-readable storage medium, and one or more software modules comprising executable instructions stored in the storage medium.

The one or more software modules are executable by a processor and include: a movement module that configures the processor to receive movement data from the one or more wheel encoders to determine a position of the robotic vehicle; and a sensor control module that configures the processor to instruct the one or more distance sensors to transmit one or more signals towards the surface at respective pulse times and detect times at which the one or more signals are reflected from the surface, and to receive movement data from the movement module, wherein the sensor control module determines a snapshot of the surface by calculating a slope of the surface using the times at which the one or more signals are reflected and the movement data. In one or more embodiments, the movement data includes a speed of the robotic vehicle or a travel distance of the robotic vehicle. In one or more embodiments, the control computing system further includes a profile analysis module that configures the processor to calculate respective times of flight (TOF) of the one or more signals traveling between the sensor and the surface, determine a respective slope of the surface at a given point based on the respective TOF and the movement data, and store the respective slope of the surface at the given point in the non-transitory computer-readable storage medium. The robotic vehicle is then driven by the drive system to traverse the surface while the control computing system continuously determines a plurality of snapshots of the surface as the position of the robotic vehicle changes to generate a profile of the surface.

In another aspect of the invention, a system for profiling a surface in accordance with one or more embodiments is provided. In one or more embodiments, the system includes a robotic vehicle having a drive system, at least one wheel, and one or more wheel encoders housed within each wheel for measuring movement of the robotic vehicle as the robotic vehicle is driven by the drive system. The system further includes an inertial measurement unit coupled to the robotic vehicle for measuring an angle of the robotic vehicle relative to a direction of gravity. In one or more embodiments, the system includes one or more distance sensors mounted to or within the robotic vehicle to measure the slope of the surface.

Continuing with this aspect of the invention, the system additionally comprises a control computing system. The control computing system includes a non-transitory computer-readable storage medium, one or more processors in electronic communication with the one or more sensors, each wheel encoder, the robotic vehicle, and the computer-readable storage medium, and one or more software modules comprising executable instructions stored in the storage medium, wherein the one or more software modules are executable by the processor.

Continuing with this aspect of the invention, the one or more software modules include a movement module, a sensor control module, and a contour analysis module. The movement module configures the processor to receive movement data from the one or more wheel encoders to determine a changing position of the robotic vehicle. The sensor control module configures the processor to instruct the inertial measurement unit to determine a first angle with respect to gravity at a first location on the surface and to determine a second angle with respect to gravity at a second location on the surface. In one or more embodiments, the sensor control module configures the processor to instruct the one or more distance sensors to emit one or more signals toward the surface at respective pulse times, wherein the sensor control module further configures the processor to detect reflections of the one or more signals from the surface using the one or more distance sensors. The contour analysis module configures the processor to determine an angular offset between the first angle and the second angle and calculate a change in slope between the first position and the second position, to receive the movement data from the movement module, to determine a snapshot of the surface between the first position and the second position using the calculated change in slope and the movement data, and to store the snapshot in the non-transitory computer-readable storage medium. In one or more embodiments, the contour analysis module determines the offset by calculating the sine of an average of the first angle and the second angle multiplied by a distance between the first position and the second position as measured by the one or more wheel encoders. The robotic vehicle is then driven by the drive system to traverse the surface while the control computing system continuously determines a plurality of snapshots of the surface as the position of the robotic vehicle changes to generate a profile of the surface.

Drawings

The present invention is illustrated in the figures of the accompanying drawings which are meant to be exemplary and not limiting, in which like references are intended to refer to like or corresponding parts, and in which:

FIG. 1 presents a high-level diagram illustrating an exemplary configuration of a system for profiling a surface of an exemplary storage container in accordance with one or more embodiments;

FIG. 2 presents a block diagram illustrating an exemplary configuration of a control computing system in accordance with one or more embodiments;

FIG. 3 presents a side view schematically illustrating an exemplary robotic vehicle of a surface profiling system according to one embodiment;

FIG. 4 is a flow diagram illustrating a routine of a system and method for profiling a surface in accordance with one or more embodiments;

FIG. 5 presents a side view schematically illustrating an exemplary robotic vehicle of a surface profiling system according to another embodiment;

FIG. 6 presents a side view schematically illustrating an exemplary robotic vehicle of a surface profiling system according to another embodiment; and

Fig. 7 presents a front view schematically illustrating an exemplary robotic vehicle of a surface profiling system according to another embodiment.

Detailed Description

Throughout this specification, terms may have nuanced meanings suggested or implied in context beyond the meanings explicitly set forth herein. Likewise, the phrase "in one embodiment" as used herein does not necessarily refer to the same embodiment, and the phrase "in another embodiment" as used herein does not necessarily refer to a different embodiment. Similarly, the phrase "one or more embodiments" as used herein does not necessarily refer to the same embodiment, and the phrase "at least one embodiment" as used herein does not necessarily refer to a different embodiment. It is intended that claimed subject matter encompass combinations of all or portions of the example embodiments.

The present disclosure details systems and methods for generating a profile of a surface. Because current methods in the field of surface profilometry cannot efficiently or effectively profile a surface using robotic inspection devices without a remote base station to process the collected data, the present systems and methods employ hardware, software, and/or a combination of both to provide a surface profilometry system that does not require remote processing or a base station. In particular, the present disclosure details an improved robotic system in which one or more sensors are arranged on the robot and configured to collect robot orientation data relative to a surface as the robot moves along the surface, so as to generate a "snapshot" of the surface, in other words, to capture and optionally further generate data regarding the orientation of the robot at a given location on the surface. Advantageously, a device having a sensor with a broader response spectrum than an optical capture device (e.g., a camera) may be used to capture and/or generate the "snapshots" disclosed herein. The robot orientation data includes a measurement of the inclination of the vehicle relative to the direction of gravity as the vehicle traverses the surface. The system then merges these snapshots under the control of a programmed processor to provide the profile of the surface. Although the systems and methods described herein may be used for profiling any type of surface, they have particular application in the fields of surface inspection, tank calibration, construction, and shipbuilding, for example.

In one aspect, the system disclosed herein includes an accelerometer or other inertial measurement unit integrated into a robotic vehicle that also includes one or more distance sensors for rapidly profiling a surface without an external reference. The system uses a processor executing code that configures the processor to measure surface curvature and calculate the absolute offset of the vehicle position relative to the surface. The measured offset can be used to calculate the magnitude of the surface deformation and thereby perform a profile measurement of the surface. While external references may be added to the system to improve accuracy, the vehicle can generate the surface profile using only on-board sensors and gravity. In one or more embodiments, the robotic vehicle includes wheel encoders that, again via a processor executing configuring code, determine the movement of the vehicle over the surface.

Thus, in some configurations, the system 100 may include one or more robotic vehicles or "robots" configured to automatically or semi-automatically traverse a surface being profiled. For example, as shown in fig. 1, a robot 110 is deployed on a cylindrical container 105. As will be understood by those skilled in the art of robotics, the robot 110 is a mobile robotic device that includes a body and a drive system for moving the robot during operation. The drive system includes at least one wheel and at least one motor for powering the at least one wheel. The wheels may be driven wheels, omni wheels, or other types of robotic wheels known in the art. The robot may be powered by, for example, a solar cell, a battery, or any other suitable power source. The robot may contain functional hardware components specifically designed to facilitate performing operational tasks, such as sensors for detecting the height, position, orientation, etc. of the robot. In addition to profiling the surface, the operational tasks may also include, for example, performing surface inspection (e.g., wall thickness measurement, surface geometry) or coating porosity measurement. The robotic hardware may also include on-board sensors and accelerometer/inertial measurement units used in the surface profilometry process, and additionally or alternatively, components suitable for transporting and deploying other devices configured to operate in a standalone fashion. In one or more embodiments, the robot 110 includes one or more distance sensors 120. For example, the distance sensor may be an optical sensor, an ultrasonic sensor, a LIDAR sensor, or another sensor capable of determining distance. The robot 110 may include electronic circuitry within the body, including memory and/or a computer-readable storage medium configured to store information related to the operation of the robot, such as configuration settings and one or more control programs, and a processor that facilitates performance of a container volume calibration operation, as previously described. System 100 also contains one or more software modules comprising executable instructions stored in the storage medium and executable by the processor.

Referring now to FIG. 2, a control computing system 200 is depicted in accordance with one or more embodiments. As shown, the control computing system 200 may be arranged with various hardware and software components for implementing the operations of system 100, including a processor 210, a memory 220, a communication interface 250, and a computer-readable storage medium 290. In one or more embodiments, the processor 210, memory 220, and communication interface 250 are integrated onto a single circuit board.

Processor 210 is operative to execute software instructions that may be stored in storage area 290 and loaded into memory 220. Processor 210 may be a plurality of processors, a plurality of processor cores, or some other type of processor, depending on the particular implementation. In one or more embodiments, the processor 210 is in electronic communication with one or more distance sensors (e.g., distance sensor 120), wheel encoders, and other components of the robotic vehicle 110.

Preferably, memory 220 and/or storage 290 are accessible to processor 210, thereby enabling processor 210 to receive and execute instructions stored on memory 220 and/or storage 290. The memory 220 may be, for example, a Random Access Memory (RAM) or any other suitable volatile or non-volatile computer-readable storage medium. Additionally, the memory 220 may be fixed or removable. Storage area 290 may take various forms depending on the embodiment. For example, storage area 290 may contain one or more components or devices, such as a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above. Storage 290 may also be a fixed or removable, local storage area, or a remote storage area, such as a cloud-based data storage system.

In one or more embodiments, the control computing system 200 also includes a display 235 and a user interface 225. The display 235 may be a touch screen or other display (not shown) operatively coupled to an input device. For example, the display may be positioned at a robot (e.g., robot 110) and used to output surface contour measurements with which a user may interact via the user interface 225.

One or more software modules 230 are encoded in storage 290 and/or memory 220. Software module 230 may include one or more software programs or applications having computer program code, scripts, or interpretable instruction sets that execute in processor 210. Such computer program code or instructions for carrying out operations and implementing aspects of the systems and methods disclosed herein may be written in any combination of one or more programming languages or scripts. The program code may execute entirely on the control computing system 200, partly on the control computing system and partly on a remote computer/device (e.g., sensors, transducers, and/or robots), or entirely on such a remote computer/device as a stand-alone software package. In the latter scenario, the remote computer system may be connected to the control computing system 200 through any type of electronic data connection or network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made through an external computer (for example, through the Internet using an Internet Service Provider). In those scenarios, the control computing system 200 may include a network card or other means for wirelessly transmitting data, as is known in the art.

Included among the software modules 230, in one or more embodiments, are a sensor control module 270, a movement module 272, and a profile analysis module 274 that are executed by processor 210. During execution of the software modules 230, the processor 210 is configured to perform various operations related to surface profile measurement, as will be described in greater detail below.

In one or more embodiments, the sensor control module 270 configures the processor 210 to instruct one or more distance sensors (e.g., the distance sensor 120) to transmit one or more signals toward the surface being traversed at respective pulse times. The sensor control module 270 further configures the processor to use the one or more distance sensors to detect the arrival of each emitted signal as it reflects back from the surface being traversed. In this way, the time taken for the signal to reflect is measured, and in turn the distance from the vehicle to the surface. As the robotic vehicle travels along the surface, the distance measurements change, and the topology of the surface can be determined accordingly. For example, using the known distance formula (distance = speed × time) and assuming the signal travels at the speed of light, the sensor control module 270 can easily calculate the distance by measuring the time it takes for the signal to reflect.
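As a concrete illustration of this time-of-flight calculation, the following minimal Python sketch (hypothetical function and variable names, not part of the disclosed system) assumes an optical signal traveling at the speed of light and that the measured interval covers the round trip from the sensor to the surface and back:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # assumed signal speed for an optical distance sensor


def distance_from_tof(pulse_time_s: float, echo_time_s: float) -> float:
    """Estimate the sensor-to-surface distance from one time-of-flight measurement.

    Assumes the interval between emitting the pulse and detecting its reflection
    covers the round trip, so the one-way distance is half of speed * elapsed time.
    """
    elapsed = echo_time_s - pulse_time_s
    return SPEED_OF_LIGHT_M_S * elapsed / 2.0


# Example: a reflection detected 2 ns after the pulse corresponds to roughly 0.30 m.
print(distance_from_tof(0.0, 2e-9))
```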

In one or more embodiments, each of the distance sensors is instructed to transmit a signal simultaneously. In other embodiments, each of the distance sensors is instructed to transmit signals at alternating intervals. For example, the sensor control module 270 may cause the distance sensors to transmit signals based on the current vehicle speed and direction, such that when each sensor is estimated to be passing a particular location, the respective sensor data may be compared to refine the accuracy of the surface measurements.
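As a rough illustration of such speed-based scheduling, the sketch below (hypothetical names; it assumes a single front and rear sensor separated by a known spacing and an approximately constant vehicle speed) computes the delay after which the rear sensor could fire so that it samples roughly the same point the front sensor just measured:

```python
def rear_pulse_delay_s(sensor_spacing_m: float, vehicle_speed_m_s: float) -> float:
    """Delay after the front sensor's pulse at which the rear sensor should fire
    so that both sensors sample approximately the same point on the surface."""
    if vehicle_speed_m_s <= 0.0:
        raise ValueError("vehicle must be moving forward")
    return sensor_spacing_m / vehicle_speed_m_s


# Example: sensors 0.2 m apart on a vehicle moving at 0.05 m/s -> fire 4 s later.
print(rear_pulse_delay_s(0.2, 0.05))
```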

In one or more embodiments, the sensor control module 270 configures the processor to instruct the inertial measurement unit to determine a first angle relative to the gravity vector at a first location and to determine a second angle relative to the gravity vector at a second location once the vehicle has moved from the first location to the second location. In this way, the relative inclination of the vehicle with respect to rest caused by the surface can be obtained, and further information about the topology of the surface can be obtained in turn.

The movement module 272 configures the processor to receive movement data from one or more wheel encoders to determine the position of the robotic vehicle. The wheel encoders measure how much each wheel of the vehicle has turned and thereby provide an indicator of the distance traveled by the vehicle. The movement module receives information about the wheel size from memory and, in conjunction with a determination of how much each wheel has turned, can calculate the distance traveled by the vehicle.
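A minimal sketch of this encoder-based distance calculation is shown below, assuming an idealized encoder with a fixed number of ticks per revolution and no wheel slip (names are illustrative, not taken from the disclosure):

```python
import math


def distance_from_encoder(tick_count: int, ticks_per_revolution: int,
                          wheel_diameter_m: float) -> float:
    """Convert wheel-encoder ticks into distance traveled.

    Assumes no wheel slip: each full revolution advances the vehicle by one
    wheel circumference (pi * diameter).
    """
    revolutions = tick_count / ticks_per_revolution
    return revolutions * math.pi * wheel_diameter_m


# Example: 512 ticks on a 1024-tick encoder with a 0.10 m wheel -> about 0.157 m.
print(distance_from_encoder(512, 1024, 0.10))
```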

In one or more embodiments, the profile analysis module 274 configures the processor to calculate respective times of flight (TOF) of one or more signals traveling between the respective distance sensor and the surface. This may be combined with the operation of the sensor control module 270. The profile analysis module 274 then determines the respective slope of the surface at a given point based on the respective TOF and the movement data provided by the movement module 272. In this way, the topology of the surface can be stitched together. The profile analysis module can also store the respective slopes of the surface at each given point in a non-transitory computer-readable storage medium (e.g., in memory).

In one or more embodiments, the profile analysis module 274, in conjunction with the sensor control module 270 and/or the inertial measurement unit, configures the processor to determine an angular offset between a first angle and a second angle, where the first angle and the second angle are the orientations of the vehicle relative to the gravity vector measured at two different vehicle positions. The profile analysis module 274 may store the respective angles measured with respect to the surface at each given point in a non-transitory computer-readable storage medium (e.g., in a non-volatile memory device). In one or more embodiments, the offset is determined relative to a longitudinal axis of the robotic vehicle.

An angular offset between the first angle and the second angle may be used to determine the profile of the surface. Since the respective angles are indicative of measurements of the inclination of the robotic vehicle relative to the surface, the profile analysis module 274 can use this information to determine the profile of the surface. For example, if the inertial measurement unit is placed in the center of the robotic vehicle and the wheels positioned at the front of the vehicle begin to move up an incline, the inertial measurement unit will measure the tilt angle relative to the rest of the vehicle. Thereafter, the profile analysis module 274 may combine the angle data with the wheel encoder movement data and determine that the surface has the measured slope angle over the distance provided by the movement data.
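The following sketch illustrates one way such a snapshot could be expressed, under the assumption that the vehicle moves in a straight line at the measured inclination so that the encoder distance decomposes into horizontal and vertical components (hypothetical names):

```python
import math


def surface_snapshot(tilt_angle_deg: float, travel_distance_m: float):
    """Return the (horizontal, vertical) displacement implied by one snapshot.

    Assumes the vehicle moved in a straight line at the measured inclination,
    so the wheel-encoder distance decomposes into a horizontal advance
    (distance * cos(angle)) and an elevation change (distance * sin(angle)).
    """
    angle_rad = math.radians(tilt_angle_deg)
    dx = travel_distance_m * math.cos(angle_rad)
    dz = travel_distance_m * math.sin(angle_rad)
    return dx, dz


# Example: 0.05 m of travel at a 3-degree incline rises by roughly 2.6 mm.
print(surface_snapshot(3.0, 0.05))
```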

It can also be said that the program code of software module 230 and one or more of the non-transitory computer-readable storage devices (e.g., memory 220 and/or storage area 290) form a computer program product that can be manufactured and/or distributed in accordance with the present disclosure, as is known to those of ordinary skill in the art.

It should be appreciated that in some demonstrative embodiments, one or more of software modules 230 may be downloaded over a network from another device or system to storage area 290 via communication interface 250 for use within the system in configuring field robot 110.

Additionally, it should be noted that other information and/or data related to the operation of the systems and methods of the present invention, such as various control programs for operating the system 100 (e.g., sensors, encoders, transducers) and/or the robot during use, may also be stored in storage area 290.

Database 285 may also be stored on storage 290. The database 285 may contain and/or maintain various data items and elements used in various operations throughout the system 100. The information stored in database 285 may include, but is not limited to, software and information for coordinating the operation of the sensors, software and information for coordinating the movement of the robot while deploying the sensors into their respective locations during surface profiling, known characteristics for performing profiling and calculating surface dimensions (e.g., surface slope, surface geometry, and surface dimensions). It should be noted that although the database 285 is depicted as being configured locally with a memory area of the control computing system 200, in certain embodiments, the database 285 and/or various data elements stored therein may be remotely located and connected to the control computing system 200 over a network in a manner known to those of ordinary skill in the art.

While an advantage of one or more embodiments herein is that the control computing system 200 is able to perform surface contour measurements without the need for remote input or processing, in other embodiments, the communication interface 250 is also operatively connected to the processor 210. Communication interface 250 may be any interface capable of enabling communication between control computing system 200 and external devices, machines, and/or elements (such as transducers, sensors, and any robots used in connection with profilometry operations). Preferably, communication interface 250 includes, but is not limited to: a modem; a Network Interface Card (NIC); an integrated network interface; radio frequency transmitter/receiver (e.g., bluetooth, cellular, NFC); a satellite communication transmitter/receiver; an infrared port; a USB connection, and/or any other such interface for connecting control computing system 200 to other computing devices and/or communication networks, such as a private network and the internet. Although such a connection may comprise a wired connection or a wireless connection (e.g., using the IEEE 802.11 standard), it should be appreciated that communication interface 250 may be virtually any interface that enables communication to and from a controlling computer.

Referring now to FIG. 3, a system 300 for profiling a surface 305 in accordance with one or more embodiments is provided. The system 300 includes a robotic vehicle 310 having a set of wheels 315, a distance sensor 320, and a control computing system 340. The robotic vehicle 310 contains a drive system (not shown), such as an on-board motor, for powering and moving the vehicle. The set of wheels 315 may include one or more wheel encoders (not shown) that gather movement information about the vehicle 310 (e.g., vehicle speed and direction) and report it to the processor 210 of the control computing system 200, thereby defining the movement of the vehicle along the surface 305. In one or more embodiments, the set of wheels 315 includes one or more omni-wheels, drive wheels, treads, and the like. In one or more embodiments, the set of wheels 315 includes magnets to improve adhesion to ferromagnetic surfaces. The robot may include other attachment mechanisms such as clamps, hooks, springs, cords, suction cups, or other attachment mechanisms known in the art.

In one or more embodiments, the distance sensor 320 is a single optical sensor, ultrasonic sensor, or LIDAR sensor. In other embodiments, the distance sensor 320 encompasses multiple optical or ultrasonic sensors. The distance sensor 320 may be mounted at the front, rear, and/or other defined locations of the vehicle 310, depending on the desired configuration, so long as the distance sensor can transmit a signal toward the surface 305 and receive a subsequent return signal. In a particular embodiment, the first distance sensor is mounted at a front portion of the robotic vehicle and the second distance sensor is mounted at a rear portion of the robotic vehicle.

As shown in exemplary fig. 3, a distance sensor 320 is mounted to the front of the robotic vehicle 310 and is arranged to transmit a signal along a distance 330 to the surface 305. The distance 330 measured by the sensor indicates how the slope of the surface changes relative to the current position and inclination of the vehicle on the surface, and the change in inclination of the surface is calculated therefrom. For example, in fig. 3, the robotic vehicle 310 is first located at position A along the surface 305 and distance 330A is measured. The drive system then activates the wheels 315 and moves the robotic vehicle 310 along the surface 305 to position B at a different location, where the vehicle makes a second measurement of distance 330B with the distance sensor 320. The corresponding "location" of the vehicle 310 should be understood to refer to a location (e.g., a point or area) on the surface from which the distance sensor 320 transmits and/or receives signals. In one or more embodiments, the wheel encoder may also make positioning measurements at each position (e.g., by measuring wheel rotation).

It should be appreciated that the distance traveled by the robotic vehicle 310 between measurements (e.g., from position a to position B) should be very small, and that many surface measurements should be taken in order to maximize the accuracy of the surface profile measurements. As more measurements are made, and as the robotic vehicle changes less in position between each measurement, the result is a higher resolution of the data output (i.e., finer integration of the data to produce the surface profile).

The system 300 further includes a control computing system 340 (e.g., the control computing system 200). The control computing system 340 includes a processor, a memory, a non-transitory computer-readable storage medium, and one or more software modules comprising executable instructions stored in the storage medium, wherein the one or more software modules are executable by the processor.

Preferably, included among the software modules are a sensor control module, a movement module, and a profile analysis module. The sensor control module configures the processor to instruct the one or more distance sensors to transmit one or more signals at respective pulse times, wherein the sensor control module further configures the processor to detect the arrival of the one or more signals using the one or more distance sensors. The movement module configures the processor to receive movement data from the one or more wheel encoders to determine a position of the robot. The profile analysis module configures the processor to calculate respective times of flight (TOFs) of the one or more signals traveling between the sensor and the surface, determine respective slopes of the surface at a given point based on the respective TOFs and the movement data, and store the respective slopes of the surface at the given point in the non-transitory computer-readable storage medium. As the robotic vehicle moves from one location to another on the surface, the control computing system 340 executes the modules described above to continuously determine the respective slopes at different locations along the surface. The changes in the measured slope are then integrated, and the position of the robot is calculated using the previous readings, to generate a profile of the surface. The profile may be generated during operation of the robotic vehicle or after completion of the operation. Generating the profile may include "solving" for the profile by finding the profile that would produce the observed readings. During execution of the software modules, the processor is configured to perform various operations related to surface profiling, as will be described in greater detail below.
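For illustration, a minimal sketch of this integration step is given below; it assumes each snapshot has already been reduced to a (slope angle, step distance) pair and simply accumulates the horizontal and vertical components (hypothetical names, not the disclosed implementation):

```python
import math


def build_profile(snapshots):
    """Integrate per-step (slope_angle_rad, step_distance_m) snapshots into an
    (x, z) surface profile, starting at the origin.

    The profile is the running sum of the horizontal and vertical components
    of each step, so earlier readings determine where later points are placed.
    """
    profile = [(0.0, 0.0)]
    x, z = 0.0, 0.0
    for slope_angle_rad, step_m in snapshots:
        x += step_m * math.cos(slope_angle_rad)
        z += step_m * math.sin(slope_angle_rad)
        profile.append((x, z))
    return profile


# Example: three 10 cm steps at 0, 2, and 4 degrees of inclination.
steps = [(math.radians(a), 0.10) for a in (0.0, 2.0, 4.0)]
print(build_profile(steps))
```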

The operation of the exemplary surface profile measurement systems 100 and 300 illustrated in fig. 1 and 3 will be understood with further reference to fig. 4. FIG. 4 is a high-level flow diagram of a routine 400 for generating a surface contour in accordance with one or more embodiments of the present invention.

The routine 400 begins at step 405, where a robotic vehicle is deployed on a surface to be profiled. Thereafter, one or more signals are generated using one or more distance sensors (e.g., distance sensor 320) and transmitted toward the surface at the current location of the vehicle. At step 415, one or more signals returned from the surface to the one or more sensors are received. This information is passed to a control computing system, such as the control computing system 200, where one or more software modules containing code are executed by a processor to perform the profile calculations. In one or more embodiments, steps 415 and 420 are performed by the sensor control module described herein.

At step 420, the control computing system calculates respective times of flight (TOF) of the one or more signals received at the distance sensor. After or concurrently with this step, the control computing system receives movement data from the wheel encoders (step 425). The movement data may contain the current speed and direction of the robotic vehicle and is used to determine where the robotic vehicle is positioned on the surface. In one or more embodiments, the movement module configures the processor to receive the movement data from the one or more wheel encoders to determine the position of the robot.

After receiving this information, the control computing system calculates the respective slope of the surface at the current location based on the respective TOF (step 430). For example, a longer measured TOF at a second location compared to the measured TOF at a first location indicates that the surface is at a lower elevation at the second location than at the first location. At step 435, the calculated slope data and movement data for the current robotic vehicle position are stored in a storage medium. This information constitutes a "snapshot" of the surface. If the surface has not been fully captured, i.e., the snapshots are insufficient to determine the profile of the entire surface, the robotic vehicle is repositioned to a new position (step 440) and the routine 400 branches back to step 405. This process may be repeated as many times as necessary to capture the desired surface within a specified threshold of accuracy.

Once the robotic vehicle has made measurements along the entire surface to be profiled, the control computing system uses the data collected at each location to generate the profile of the surface (step 445). The profile measurement may involve solving equations based on the received TOF and movement data to determine the shape of the surface. In one or more embodiments, steps 430, 435, and 445 may be implemented by a profile analysis module that includes code executable in processor 210 to perform those steps.

Referring now to fig. 5, a system 500 for profiling a surface 505 in accordance with one or more embodiments is provided. The system 500 includes a robotic vehicle 510 having a plurality of distance sensors 520 arranged linearly along a longitudinal axis of the vehicle, each sensor configured to measure a respective distance 530A, 530B, 530C, etc., to the surface 505, as described elsewhere herein. The longitudinal axis of the vehicle may be coaxial with the center frame of the vehicle 510. In this manner, each of the plurality of distance sensors 520 generates a separate data stream. Each distance sensor 520 may be calibrated to measure distance at a different rate (e.g., at regular time intervals) or with a different accuracy. For example, one sensor may be configured to collect distance measurements every 0.001 seconds, while another sensor may be configured to collect distance measurements every 0.01 or 0.1 seconds. Similarly, each distance sensor may be individually configured to measure the distance to centimeter or millimeter accuracy, or the like. The number of distance sensors 520 may be increased to provide a finer "snapshot" measurement. In one or more embodiments, the distance sensors 520 are arranged substantially equidistant from each other. In one or more embodiments, the distance sensors 520 are disposed on an exterior surface of the robotic vehicle 510. In one or more embodiments, the distance sensors 520 are disposed within the robotic vehicle 510.

To accommodate the plurality of distance sensors 520, the robotic vehicle 510 is elongated relative to the robotic vehicle 310. The robotic vehicle 510 also includes a control computing system 540 (e.g., control computing systems 200, 340) as described elsewhere herein. In certain embodiments, the rate or accuracy of data collection may be optimized to manage the amount of data generated by the plurality of distance sensors 520 for communication to the control computing system 540.

The robotic vehicle 510 also houses an accelerometer 550 for capturing vehicle orientation data. Although accelerometers are contemplated herein, other inertial measurement units (e.g., gyroscopes) may be implemented. The accelerometer 550 is calibrated with respect to the direction of gravity, which means that the accelerometer measures the horizontal and/or vertical offset of the robotic vehicle 510 as it traverses the surface. This provides additional data beyond the distance measurement 530 itself that may be used by the profile analysis module 274 to determine the profile of the surface 505. In this way, the control computing system 540 may adjust the captured "snapshot" of the surface to improve profile measurement accuracy. In one or more embodiments, the accelerometer 550 is positioned substantially centrally on the robotic vehicle 510 in the x and y planes.

In one or more embodiments, a plurality of accelerometers 550 are disposed on or within the robotic vehicle 510. In one or more embodiments, the accelerometers 550 are arranged substantially equidistant from each other.

Referring now to fig. 6, a system 600 for profiling a surface 605 in accordance with one or more embodiments is provided. The system 600 includes a robotic vehicle 610 having a set of wheels housing wheel encoders and a control computing system 640 (e.g., control computing systems 200, 340) and an accelerometer 650 as described elsewhere herein. Advantageously, the present embodiment enables profiling of a surface purely by measuring vehicle inclination via an accelerometer and wheel encoder, and therefore does not have to incorporate any distance sensor to profile the surface (although as described below, the distance sensor may further improve accuracy).

More specifically, the control computing system 640 uses the accelerometer to calibrate the current vehicle angle to a zero point at a first position (position 1). Preferably, the zero point matches the zero point of the gravity vector, although this is not required. Fig. 6 illustrates this zero point as first angle 660. Then, as shown in fig. 6, the robotic vehicle 610 is driven a first distance 665 to a second position (position 2). Since the shape of the surface has changed from position 1 to position 2 (e.g., elevation, orientation), the angle measured by the accelerometer 650 of the robotic vehicle 610 relative to the zero point has shifted by a second angle 670. The robotic vehicle 610 is then driven a second distance 675 to a third position (position 3), where the angle measured by the accelerometer 650 of the robotic vehicle 610 relative to the zero point has shifted by a third angle 680. The angular tilt data captured at each of positions 1, 2, and 3, as measured by the accelerometer 650, is communicated to the control computing system 640 and processed by a processor executing one or more software modules (e.g., a profile analysis module). For example, using this information, the system 600 may be configured to assume that the robotic vehicle 610 has moved along a line defined by the average of the tilt readings between two positions, determining an increment in offset (e.g., between position 1 and position 2) as the sine of the average angle (the average of the angles at position 1 and position 2) multiplied by the travel distance estimated from the wheel encoders as the vehicle moves from position 1 to position 2. These incremental offsets are then summed to estimate the absolute offset relative to the starting position of the vehicle. The foregoing algorithm assumes that the robotic vehicle 610 has moved along a line defined by the average of the tilt readings between two positions; other algorithms may be employed to account for more complex movements, and the foregoing is by way of example only.
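A minimal sketch of this incremental-offset calculation is shown below, assuming one tilt reading per position and one encoder distance per move between consecutive positions (hypothetical names, not the disclosed implementation):

```python
import math


def absolute_offset(tilt_angles_rad, step_distances_m):
    """Estimate the vehicle's total offset from its starting position.

    Implements the averaging scheme described above: for each move, the vehicle
    is assumed to travel along a line inclined at the mean of the tilt readings
    at the two endpoints, so the incremental offset is sin(mean angle) * encoder
    distance; the increments are then summed.
    """
    if len(step_distances_m) != len(tilt_angles_rad) - 1:
        raise ValueError("need one tilt reading per endpoint of each step")
    offset = 0.0
    for i, step in enumerate(step_distances_m):
        mean_angle = (tilt_angles_rad[i] + tilt_angles_rad[i + 1]) / 2.0
        offset += math.sin(mean_angle) * step
    return offset


# Example: zeroed at position 1, tilted 2 deg at position 2 and 5 deg at position 3,
# with 0.1 m of travel between consecutive positions.
print(absolute_offset([0.0, math.radians(2.0), math.radians(5.0)], [0.10, 0.10]))
```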

To improve measurement accuracy, the system 600 preferably implements a short robotic vehicle 610 so that the tilt angle is measured locally at any given location, allowing an accurate determination of how far the vehicle has moved in both the x and y directions. Similarly, it is advantageous for the system 600 to keep the distance variations small (e.g., the distances 665, 675 may be 0.1 mm or 0.1 cm apart) when performing new measurements. Keeping the distance variation small limits the opportunity for the system 600 to miss information, such as when the surface is similar at two locations but varies between those locations, and also provides more information to the processor 210 for creating a more refined, detailed surface profile. As another way to improve accuracy, the system 600 may be configured with code to generate a contour as the robotic vehicle 610 moves in a forward direction of travel and then retrace the route in the reverse direction of travel to generate two contours. The information generated in each contour is then fused to improve the accuracy of the overall generated contour. If the two generated contours differ by more than a certain amount, the vehicle can rerun the measurements and add the generated information to the data fusion to continue to improve accuracy.
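One simple way to realize this fusion is sketched below, under the assumption that the two passes sample the same positions and that point-wise averaging is an acceptable fusion rule (the function name and tolerance are hypothetical):

```python
def fuse_profiles(forward_z, reverse_z, tolerance_m=0.001):
    """Fuse elevation profiles from a forward pass and a reversed return pass.

    Assumes both passes sample the same positions, with the reverse pass stored
    from the far end back toward the start. Returns the point-wise average if
    the passes agree within the tolerance, or None to signal that the
    measurement should be rerun and added to the fusion.
    """
    reverse_aligned = list(reversed(reverse_z))
    if len(reverse_aligned) != len(forward_z):
        raise ValueError("passes must cover the same sample positions")
    if any(abs(f - r) > tolerance_m for f, r in zip(forward_z, reverse_aligned)):
        return None  # disagreement too large: rerun the measurement
    return [(f + r) / 2.0 for f, r in zip(forward_z, reverse_aligned)]


# Example: two passes that agree within 1 mm are averaged point by point.
print(fuse_profiles([0.000, 0.003, 0.007], [0.0068, 0.0031, 0.0002]))
```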

Although the system 600 is contemplated primarily without a distance sensor, in one or more embodiments a distance sensor may be added to the system to improve accuracy. For example, a distance sensor may be mounted on the robotic vehicle 610 facing the ground (e.g., passively mounted so as to point parallel to the gravity vector) to measure the height of the vehicle above the surface. This information is then used to determine the angle along which the vehicle is moving by comparing the wheel encoder readings with this distance measurement relative to the ground.

Turning now to fig. 7, a system 700 for profiling a surface 705 in accordance with one or more embodiments is provided. Fig. 7 illustrates a front view of a robotic vehicle 710 having a plurality of distance sensors 720 spaced linearly along a shorter lateral axis of the vehicle (e.g., the x-axis in fig. 7). For example, the plurality of distance sensors 720 may be substantially equidistant from each other across the lateral axis. The transverse axis may be coaxial with the central frame or axle of the robotic vehicle 710. By placing the distance sensor across the narrow axis of the robotic vehicle rather than the long axis, the sensor can collect distance information in both the x and y directions. The system 700 further includes a wheel encoder, a control computing system 740, and an accelerometer 750 as described elsewhere herein and providing information to the processor for calculating inclination and distance information associated with generating the surface profile. By collecting data in two directions, control computing system 740 can generate two surface contours, one for each direction, thereby generating a 2D contour of the surface.

In one or more embodiments, accuracy may be further improved by including a first set of distance sensors 720 arranged linearly along the longitudinal axis of the robotic vehicle 710 and a second set of distance sensors 720 arranged linearly along the lateral axis of the robotic vehicle and perpendicular to the first set of distance sensors. In this manner, each of the plurality of distance sensors 720 generates a separate data stream. Each distance sensor 720 may be calibrated to measure distance at a different rate or with a different accuracy. For example, one sensor may be configured to collect distance measurements every 0.001 seconds, while another sensor may be configured to collect distance measurements every 0.01 or 0.1 seconds. Similarly, each distance sensor may be individually configured to accurately measure the distance to centimeters, millimeters, or the like. The number of distance sensors 720 may be increased to provide a finer "snapshot" measurement. In one or more embodiments, the distance sensors 720 are arranged equidistant from each other. In one or more embodiments, the distance sensor 720 is disposed on an exterior surface of the robotic vehicle 710. In one or more embodiments, a distance sensor 720 is disposed within the robotic vehicle 710.

In one or more embodiments, the surface profile measurement implements a simultaneous localization and mapping (SLAM) algorithm to map the entire surface of an object (e.g., a tank wall) automatically and within a specified accuracy tolerance. The SLAM algorithm is a real-time positioning approach that acquires environmental sensor data received at the vehicle in real time and compares the data to previously known data in order to approximate the geometry of the current environment while tracking the trajectory of the vehicle within the environment. According to embodiments disclosed herein, robotic vehicles are deployed that can use SLAM technology to automatically generate 2D ("spread out") or 3D contours of a surface with high accuracy. SLAM techniques may provide greater accuracy than LIDAR techniques, depending on the sensor and system sensitivity. In this manner, the systems and methods described herein are not limited to the linear profile or the 2D profile associated with fig. 7.

Additionally, in one or more embodiments, the systems and methods herein include a laser reference device external to the robotic vehicle for estimating vehicle position. The laser reference device is preferably positioned at a known distance from a known location of the surface (e.g., the center of the surface). In operation, the laser reference device would emit a vertical reference laser line (the beam has a linear width in the horizontal direction) rather than being oriented parallel to the surface. The light from the laser reference device is continuously detected by a sensor (e.g., an optical sensor) coupled to the robotic device as the robotic device moves along the surface. Any protrusions, depressions and unevenness on the surface will shift the position at which the sensor captures and detects the laser radiation emitted by the laser reference device. The measured offset can be used to calculate the magnitude of the deformation of the surface from a reference calculation made at the surface. Since the calculation depends on the optical sensor detecting the exact reading of the position of the laser light, this method depends to a significant extent on the design of the optical sensor.

Fig. 1 to 7 are conceptual diagrams allowing the explanation of the present invention. Those skilled in the art will appreciate that aspects of embodiments of the present invention may be implemented in hardware, firmware, software, or a combination thereof. In such embodiments, the various components and/or steps would be implemented in hardware, firmware, and/or software to perform the functions of the present invention. That is, the same piece of hardware, firmware, or software module may perform one or more of the illustrated blocks (e.g., components or steps).

In software embodiments, computer software (e.g., programs or other instructions) and/or data is stored on a machine-readable medium as part of a computer program product and loaded into a computer system or other device or machine via a removable storage drive, hard drive, or communications interface. Computer programs (also called computer control logic or computer readable program code) are stored in main and/or secondary memory and executed by one or more processors (controllers, etc.) to cause the one or more processors to perform the functions of the present invention as described herein. In this document, the terms "machine-readable medium," "computer program medium," and "computer-usable medium" are used generally to refer to media such as Random Access Memory (RAM), Read Only Memory (ROM), removable storage units (e.g., magnetic or optical disks, flash memory devices, etc.); a hard disk; and the like.

Notably, the above figures and examples are not intended to limit the scope of the present invention to a single embodiment, as other embodiments are possible by interchanging some or all of the elements described or illustrated. Furthermore, where certain elements of the present invention may be partially or fully implemented using known components, only those portions of such known components that are necessary for an understanding of the present invention are described, and detailed descriptions of other portions of such known components are omitted so as not to obscure the invention. In this specification, unless explicitly stated otherwise herein, an embodiment showing a single component should not necessarily be limited to other embodiments containing multiple of the same component, and vice versa. Moreover, applicants do not intend for any term in the specification or claims to be ascribed an uncommon or special meaning unless explicitly set forth as such. Further, the present invention encompasses present and future known equivalents to the known components referred to herein by way of illustration.
