System and method for calibrating attitude of sensor associated with materials handling vehicle

Document No.: 261284 | Publication date: 2021-11-16

Note: This disclosure, "System and method for calibrating attitude of sensor associated with materials handling vehicle" (用于校准与物料搬运车辆相关的传感器的姿态的系统和方法), was designed and created by R·M·埃斯泰普, T·W·范西罗, J·J·汤姆森, and C·J·罗宾逊 on 2020-02-13. Its main content is summarized as follows: Methods and systems for a materials handling vehicle including a processor and a sensor to record warehouse features. The processor is configured to generate sensor data and extract features from it, create a factor graph (FG) that includes a sensor extrinsics node (e0), and generate an initial vehicle node (v0), an initial sensor frame node (c0), and an initial sensor feature node (f0), the initial sensor feature node including selected extracted features associated with c0 and v0 in an initial data association. A subsequent vehicle node (v1) is generated based on accumulated odometry, and a subsequent sensor frame node (c1) is generated and associated in a subsequent data association with one of f0 or a subsequent sensor feature node (f1), as well as with e0 and v1. The FG is optimized based on the data associations to provide a sensor calibration output, and the vehicle is navigated based on the sensor calibration output.

1. A materials handling vehicle comprising a sensor, a vehicle position processor, one or more vehicle dead reckoning components, and a drive mechanism configured to move the materials handling vehicle along an inventory transit surface, wherein:

the sensor is configured to record one or more features of the warehouse; and

the vehicle position processor is configured to

generate sensor data from the recording of the one or more features recorded by the sensor,

extract the one or more features from the sensor data,

create a factor graph including a sensor extrinsics node (e0) comprising initial seed extrinsics associated with the sensor,

generate an initial vehicle node (v0), an initial sensor frame node (c0), and an initial sensor feature node (f0) in the factor graph, the initial sensor feature node (f0) comprising selected ones of the one or more extracted features that are associated in an initial data association with the initial sensor frame node (c0) and the initial vehicle node (v0),

navigate the materials handling vehicle along the inventory transit surface using the drive mechanism,

generate, in the factor graph, a subsequent vehicle node (v1) based on accumulated odometry from the one or more vehicle dead reckoning components,

generate, in the factor graph, a subsequent sensor frame node (c1) that is associated in a subsequent data association with one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1), the sensor extrinsics node (e0), and the subsequent vehicle node (v1),

optimize the factor graph based on the initial data association and the subsequent data association to provide a calibration output associated with the sensor, and

navigate the materials handling vehicle along the inventory transit surface based on the calibration output associated with the sensor.

2. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

when a vehicle seed location and a feature map comprising one or more map features are provided, generate the initial vehicle node (v0) in the factor graph as (i) a map-based initial vehicle node at the vehicle seed location having (ii) an associated vehicle prior factor, and generate (iii) the one or more map features from the site map as corresponding sensor feature nodes with (iv) corresponding map feature prior factors.

3. The materials handling vehicle as set out in claim 2, wherein the associated vehicle prior factor of the map-based initial vehicle node at the vehicle seed location comprises an error function regarding the accuracy of the vehicle seed location of the materials handling vehicle relative to the site map.

4. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

when no vehicle seed location and feature map are provided, generate the initial vehicle node (v0) in the factor graph as a map-less initial vehicle node at the origin.

5. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

iteratively generate, in the factor graph, one or more subsequent sensor feature nodes and one or more additional subsequent vehicle nodes based on accumulated odometry from the one or more vehicle dead reckoning components between a previous vehicle node and an immediately subsequent vehicle node; and

iteratively generate, in the factor graph, one or more additional subsequent sensor frame nodes associated in subsequent data associations with the sensor extrinsics node (e0), the associated immediately subsequent vehicle node, and an associated one of the one or more subsequent sensor feature nodes.

6. The materials handling vehicle as set out in claim 5, wherein said vehicle position processor is further configured to:

generate, in the factor graph, an inter-vehicle factor between the immediately subsequent vehicle node and the previous vehicle node based on the accumulated odometry.

7. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

select one of the one or more extracted features;

when the one feature matches an existing feature, associate the one feature with the existing feature as a matching sensor feature node; and

generate, in the factor graph, a factor linking the generated sensor frame node with the matching sensor feature node.

8. The materials handling vehicle as set out in claim 7, wherein the generated sensor frame node comprises one of the initial sensor frame node (c0) or a subsequent sensor frame node (c1).

9. The materials handling vehicle as set out in claim 7, wherein the matching sensor feature node comprises one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1).

10. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

select one of the one or more extracted features;

when the one feature does not match an existing feature, generate a new sensor feature node associated with the one feature in the factor graph; and

generate, in the factor graph, a factor linking the generated sensor frame node with the new sensor feature node.

11. The materials handling vehicle as set out in claim 10, wherein the new sensor feature node comprises one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1).

12. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

optimize the factor graph when none of the one or more extracted features remain to be associated with an existing feature as a corresponding matching sensor feature node or to be added to the factor graph as a corresponding new sensor feature node.

13. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

optimize the factor graph by generating a constrained optimization problem based on one or more variables and one or more factors defined in the factor graph.

14. The materials handling vehicle of claim 13, wherein the one or more variables represent unknown random variables in the constrained optimization problem and comprise one or more nodes, including one or more of the sensor extrinsics node (e0), the initial vehicle node (v0), the subsequent vehicle node (v1), the initial sensor frame node (c0), the subsequent sensor frame node (c1), the initial sensor feature node (f0), and the subsequent sensor feature node (f1).

15. The materials handling vehicle as set out in claim 13, wherein the one or more factors represent probabilistic information about selected ones of the one or more variables and comprise a prior factor, an inter factor, a reference frame factor, a projection factor, a bearing-range factor, or a combination thereof.

16. The materials handling vehicle as set out in claim 13, wherein the constrained optimization problem is constructed using one or more smoothing and mapping (SAM) libraries and optimizers.

17. The materials handling vehicle as set out in claim 13, wherein the one or more features comprise one or more overhead lights of the warehouse, and the constrained optimization problem is constructed using image recognition algorithms, data association algorithms, modeling techniques, or a combination thereof.

18. The materials handling vehicle as set out in claim 1, wherein said vehicle position processor is further configured to:

terminate the optimization when the internal graph state over a period of time is determined to be acceptable based on a predetermined threshold.

19. The materials handling vehicle as set out in claim 1, wherein:

the sensor is a camera, a laser-based sensor, or a combination thereof;

the camera is configured to capture the one or more features of the warehouse;

the laser-based sensor is configured to detect the one or more features of the warehouse; and

the vehicle position processor is further configured to generate the sensor data from a recording of the one or more features captured by the camera, detected by the laser-based sensor, or a combination thereof.

20. A method of operating a materials handling vehicle comprising a sensor, a vehicle position processor, one or more vehicle dead reckoning components, and a drive mechanism configured to move the materials handling vehicle along an inventory transit surface, wherein the sensor is configured to record one or more features of a warehouse, the method comprising, via the vehicle position processor:

generating sensor data from the recording of the one or more features recorded by the sensor;

extracting the one or more features from the sensor data;

creating a factor graph including a sensor extrinsics node (e0) comprising initial seed extrinsics associated with the sensor;

generating an initial vehicle node (v0), an initial sensor frame node (c0), and an initial sensor feature node (f0) in the factor graph, the initial sensor feature node (f0) comprising selected ones of the one or more extracted features that are associated in an initial data association with the initial sensor frame node (c0) and the initial vehicle node (v0);

navigating the materials handling vehicle along the inventory transit surface using the drive mechanism;

generating, in the factor graph, a subsequent vehicle node (v1) based on accumulated odometry from the one or more vehicle dead reckoning components;

generating, in the factor graph, a subsequent sensor frame node (c1) that is associated in a subsequent data association with one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1), the sensor extrinsics node (e0), and the subsequent vehicle node (v1);

optimizing the factor graph based on the initial data association and the subsequent data association to provide a calibration output associated with the sensor; and

navigating the materials handling vehicle along the inventory transit surface based on the calibration output associated with the sensor.

Technical Field

The embodiments described herein relate generally to materials handling vehicle sensor calibration and, more particularly, to systems and methods for calibrating a sensor of a materials handling vehicle relative to the vehicle in a warehouse environment.

Background

Materials handling vehicles may utilize one or more sensors to capture features in the warehouse environment to facilitate positioning, attitude determination, and navigation. However, if a sensor is not properly calibrated, operational challenges can be introduced. Manufacturing or installation tolerances on the roll, pitch, yaw, x, y, and/or z position of a sensor with respect to the materials handling vehicle may include too much variation to reliably position the materials handling vehicle in a warehouse environment. Thus, there is a need in the industry for accurate calibration of sensors on a materials handling vehicle regardless of location, such as, but not limited to, the factory in which the materials handling vehicle is built.

Disclosure of Invention

In accordance with the presently disclosed subject matter, in a first aspect, a materials handling vehicle includes a sensor, a vehicle position processor, one or more vehicle dead reckoning components, and a drive mechanism configured to move the materials handling vehicle along an inventory transit surface. The sensor is configured to record one or more features of the warehouse. The vehicle position processor is configured to generate sensor data from the recording of the one or more features recorded by the sensor, extract the one or more features from the sensor data, create a factor graph including a sensor extrinsics node (e0) comprising initial seed extrinsics associated with the sensor, and generate an initial vehicle node (v0), an initial sensor frame node (c0), and an initial sensor feature node (f0) in the factor graph. The initial sensor feature node (f0) comprises selected ones of the one or more extracted features that are associated in an initial data association with the initial sensor frame node (c0) and the initial vehicle node (v0). The vehicle position processor is further configured to navigate the materials handling vehicle along the inventory transit surface using the drive mechanism, generate a subsequent vehicle node (v1) in the factor graph based on accumulated odometry from the one or more vehicle dead reckoning components, and generate a subsequent sensor frame node (c1) in the factor graph that is associated in a subsequent data association with one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1), the sensor extrinsics node (e0), and the subsequent vehicle node (v1). The vehicle position processor is further configured to optimize the factor graph based on the initial data association and the subsequent data association to provide a calibration output associated with the sensor, and to navigate the materials handling vehicle along the inventory transit surface based on the calibration output associated with the sensor.

In a second aspect, including the materials handling vehicle of the first aspect, the vehicle position processor is further configured to: when a vehicle seed location and a feature map comprising one or more mapped features are provided, generate the initial vehicle node (v0) in the factor graph as (i) a map-based initial vehicle node at the vehicle seed location having (ii) an associated vehicle prior factor, and generate (iii) the one or more map features from a site map as corresponding sensor feature nodes with (iv) corresponding map feature prior factors.

In a third aspect, including the materials handling vehicle of the second aspect, the associated vehicle prior factor of the map-based initial vehicle node at the vehicle seed position comprises an error function regarding the accuracy of the vehicle seed position of the materials handling vehicle relative to the site map.

In a fourth aspect, including the materials handling vehicle of the first aspect, the vehicle position processor is further configured to: when no vehicle seed location and feature map are provided, generate the initial vehicle node (v0) in the factor graph as a map-less initial vehicle node at the origin.

In a fifth aspect, including the materials handling vehicle of any of the first to fourth aspects, the vehicle position processor is further configured to: iteratively generate, in the factor graph, one or more subsequent sensor feature nodes and one or more additional subsequent vehicle nodes based on accumulated odometry from the one or more vehicle dead reckoning components between a previous vehicle node and an immediately subsequent vehicle node, and iteratively generate, in the factor graph, one or more additional subsequent sensor frame nodes associated in subsequent data associations with the sensor extrinsics node (e0), the associated immediately subsequent vehicle node, and an associated one of the one or more subsequent sensor feature nodes.

In a sixth aspect, including the materials handling vehicle of the fifth aspect, the vehicle position processor is further configured to generate, in the factor graph, an inter-vehicle factor between the immediately subsequent vehicle node and the previous vehicle node based on the accumulated odometry.

In a seventh aspect, including the materials handling vehicle of any of the first to sixth aspects, the vehicle position processor is further configured to select one of the one or more extracted features, associate the one feature with an existing feature as a matching sensor feature node when the one feature matches the existing feature, and generate, in the factor graph, a factor linking the generated sensor frame node with the matching sensor feature node.

In an eighth aspect, including the materials handling vehicle of the seventh aspect, the generated sensor frame node comprises one of the initial sensor frame node (c0) or a subsequent sensor frame node (c1).

A ninth aspect includes the materials handling vehicle of the seventh or eighth aspect, wherein the matching sensor feature node comprises one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1).

In a tenth aspect, including the materials handling vehicle of any of the first to ninth aspects, the vehicle position processor is further configured to select one of the extracted one or more features, generate a new sensor feature node associated with the one feature in the factor graph when the one feature does not match an existing feature, and generate a factor in the factor graph that links the generated sensor frame node with the new sensor feature node.

In an eleventh aspect, including the materials handling vehicle of the tenth aspect, the new sensor feature node comprises one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1).

In a twelfth aspect, including the materials handling vehicle of any of the first to eleventh aspects, the vehicle position processor is further configured to optimize the factor graph when none of the extracted one or more features remain to be associated with an existing feature as a corresponding matching sensor feature node or to be added to the factor graph as a corresponding new sensor feature node.

In a thirteenth aspect, including the materials handling vehicle of any of the first to twelfth aspects, the vehicle position processor is further configured to optimize the factor graph by generating a constrained optimization problem based on the one or more variables and the one or more factors defined in the factor graph.

In a fourteenth aspect, including the materials handling vehicle of the thirteenth aspect, the one or more variables represent unknown random variables in the constrained optimization problem and include one or more nodes, including one or more of the sensor extrinsics node (e0), the initial vehicle node (v0), the subsequent vehicle node (v1), the initial sensor frame node (c0), the subsequent sensor frame node (c1), the initial sensor feature node (f0), and the subsequent sensor feature node (f1).

In a fifteenth aspect, including the materials handling vehicle of the thirteenth or fourteenth aspect, the one or more factors represent probabilistic information about selected ones of the one or more variables and include a prior factor, an inter factor (between factor), a reference frame factor, a projection factor, a bearing-range factor, or a combination thereof.

In a sixteenth aspect, including the materials handling vehicle of any of the thirteenth through fifteenth aspects, the constrained optimization problem is constructed using one or more smoothing and mapping (SAM) libraries and optimizers.

In a seventeenth aspect, including the materials handling vehicle of any of the thirteenth to sixteenth aspects, the one or more features include one or more overhead lights of the warehouse, and the constrained optimization problem is constructed using an image recognition algorithm, a data association algorithm, a modeling technique, or a combination thereof.

In an eighteenth aspect, including the materials handling vehicle of any of the first to seventeenth aspects, the vehicle position processor is further configured to terminate the optimization upon determining that the internal graph state over a period of time is acceptable based on a predetermined threshold.

In a nineteenth aspect, including the materials handling vehicle of any of the first through eighteenth aspects, the sensor is a camera, a laser-based sensor, or a combination thereof; the camera is configured to capture the one or more features of the warehouse; the laser-based sensor is configured to detect the one or more features of the warehouse; and the vehicle position processor is further configured to generate the sensor data from a recording of the one or more features captured by the camera, detected by the laser-based sensor, or a combination thereof.

According to another embodiment of the present disclosure, and in a twentieth aspect, a method of operating the materials handling vehicle of any of the first through nineteenth aspects comprises: generating, via the vehicle position processor, sensor data from the recording of the one or more features recorded by the sensor; extracting the one or more features from the sensor data; creating a factor graph including a sensor extrinsics node (e0) comprising initial seed extrinsics associated with the sensor; and generating an initial vehicle node (v0), an initial sensor frame node (c0), and an initial sensor feature node (f0) in the factor graph. The method further includes navigating the materials handling vehicle along the inventory transit surface using the drive mechanism; generating, in the factor graph, a subsequent vehicle node (v1) based on accumulated odometry from the one or more vehicle dead reckoning components; and generating, in the factor graph, a subsequent sensor frame node (c1) that is associated in a subsequent data association with one of the initial sensor feature node (f0) or a subsequent sensor feature node (f1), the sensor extrinsics node (e0), and the subsequent vehicle node (v1). The method also includes optimizing the factor graph based on the initial data association and the subsequent data association to provide a calibration output associated with the sensor, and navigating the materials handling vehicle along the inventory transit surface based on the calibration output associated with the sensor.

Although the concepts of the present disclosure are described herein primarily with reference to materials handling vehicles in a warehouse environment, it is contemplated that the concepts will apply to any automated, partially automated, or manual vehicle in an environment.

Drawings

The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the disclosure. The following detailed description of illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:

FIG. 1 depicts a materials handling vehicle in a warehouse environment including overhead features in accordance with one or more embodiments shown and described herein;

FIG. 2 depicts a computing infrastructure that may be used for a materials handling vehicle in accordance with one or more embodiments shown and described herein;

FIG. 3 depicts an illustrative time-lapse view of a materials handling vehicle at multiple locations during mapping and calibration in the warehouse environment of FIG. 1;

FIG. 4 depicts a graphical representation of a calibration path performed by a materials handling vehicle within a calibration coordinate frame in accordance with one or more embodiments shown and described herein;

FIG. 5A schematically depicts a factor graph for mapped sensor calibration of an image-based sensor of a materials handling vehicle according to one or more embodiments shown and described herein;

FIG. 5B schematically depicts a factor graph for mapped sensor calibration of a laser-based sensor of a materials handling vehicle according to one or more embodiments shown and described herein;

FIG. 6 schematically depicts a factor graph for map-less sensor calibration of an image-based sensor of a materials handling vehicle in an environment according to one or more embodiments shown and described herein;

FIG. 7 schematically depicts a factor graph for map-less calibration of laser-based sensors of a materials handling vehicle in an environment according to one or more embodiments shown and described herein;

FIG. 8 schematically depicts a factor graph for a map-less combined sensor calibration including image-based sensors and laser-based sensors of a materials handling vehicle in an environment, according to one or more embodiments shown and described herein;

FIG. 9 depicts a flow diagram illustrating another method of calibrating sensors of a materials handling vehicle in accordance with one or more embodiments shown and described herein;

FIG. 10 depicts another flow diagram illustrating a method of calibrating sensors of a materials handling vehicle by applying a calibration algorithm process in accordance with one or more embodiments shown and described herein;

FIG. 11A depicts a first subset of a flow chart illustrating a method of constructing a factor graph and calibrating sensors of a materials handling vehicle using the factor graph in accordance with one or more embodiments shown and described herein;

FIG. 11B depicts a second subset of the flow chart of FIG. 11A; and

FIG. 11C depicts a third subset of the flow chart of FIG. 11A.

Detailed Description

Embodiments disclosed herein include systems and methods for calibrating sensors relative to a materials handling vehicle. Some embodiments relate to calibration of image-based sensors, such as image capture devices (e.g., cameras) disposed on a materials handling vehicle. The materials handling vehicle may be positioned within the warehouse environment by tracking a plurality of static features in the environment through the use of one or more sensors, such as cameras that capture image data. Such features may be overhead features and/or other static infrastructure. The overhead features may include, for example, skylights, ceiling lights, reflectors, and/or other suitable overhead components for capture or detection. The static infrastructure may include such features, whether overhead or not, and may include, for example, rack legs and/or readable tags. Other position and location sensor systems disposed on materials handling vehicles, e.g., laser-based sensors (such as LIDAR) and/or other sensor systems (such as RADAR or ultrasonic sensor systems), may also be calibrated using the systems and methods described herein. For purposes of defining and describing the concepts and scope of the present disclosure, it should be noted that a "warehouse" includes any indoor or otherwise covered facility in which materials handling vehicles transport goods, including, but not limited to, warehouses primarily used for storage of goods, such as those in which single- or multi-tiered warehouse racks or storage units are disposed in aisles or elsewhere, and manufacturing facilities in which goods are transported around the facility by materials handling vehicles for use in one or more manufacturing processes.

More specifically, embodiments described herein may be configured to compensate for manufacturing tolerances of sensors mounted on a materials handling vehicle, such that a calibrated sensor may position the materials handling vehicle in an environment with increased accuracy compared to a sensor that is installed but not calibrated with respect to the materials handling vehicle. Through these systems and methods, a sensor may be calibrated with respect to its location on, and relative to, a particular materials handling vehicle. For example, the roll, pitch, and yaw of the sensor relative to the materials handling vehicle may be determined more accurately than the initial manufacturing tolerances of the sensor allow, so that the vehicle attitude may be determined more accurately based on use of the sensor in a warehouse environment. Further, the systems and methods for calibrating sensors described herein can be readily employed in near real-time processing when the materials handling vehicle is delivered and set up in the environment, for maintenance, when performing part replacements, when retrofitting existing trucks, when repositioning sensors (e.g., to account for changes in sensor extrinsics over a long period of time), and after a period of time (e.g., to adjust for sensor drift). In some embodiments, deployment and calibration of the positioning sensors of a materials handling vehicle may be performed with or without a site map of the environment.

The term "positioning" is used herein to refer to any of a variety of system configurations that enable position tracking of a materials handling vehicle in a warehouse environment. The concepts of the present disclosure are not limited to any particular positioning system configuration. A system implementing one or more positioning sensors may be applicable to any of a variety of conventional and yet to be developed positioning systems. Such POSITIONING systems may include those described in U.S. patent No.9,349,181 entitled "LOST VEHICLE RECOVERY using assisted supplied VEHICLEs pair" issued 24/5/2016, U.S. patent No.9,984,467 entitled "VEHICLE position OR NAVIGATION identifying assisted supplied VEHICLEs pair" issued 29/5/2018, AND/OR U.S. published patent application No.2012/0303255 entitled "METHOD AND APPARATUS FOR PROVIDING current position of VEHICLE location using a VEHICLE contact VEHICLE" issued 29/11/2012. Further, such a positioning system may utilize sensors such as image capture devices to determine the pose of the materials handling vehicle in the warehouse environment through image capture using overhead features, as described in the above-referenced positioning system patents. Such sensors may be calibrated to provide more accurate location determinations by the positioning system. An example of such a calibration technique is described in U.S. patent publication No.2016/0353099, assigned to Crown Equipment, inc, which involves comparing image data to a stored site map of a warehouse environment to determine calibration values for an image capture device. The concepts described herein provide calibration techniques for positioning sensors on a materials handling vehicle that can be calibrated regardless of whether reference is made to a storage site map of a warehouse environment. Thus, regardless of whether a site map has been created for the warehouse environment, calibration can be simplified and universally implemented in the environment.

Positioning sensors operating with such positioning systems may be used to position and/or navigate a materials handling vehicle through a warehouse environment, such as a warehouse, yard, or the like. In some embodiments, image-based sensors (e.g., cameras) and/or laser-based sensor systems may be mounted to a materials handling vehicle (e.g., an automated guided vehicle or a manually guided vehicle as a materials handling vehicle) that navigates through a warehouse and may assist in vehicle positioning. The laser-based sensor system may include a laser scanner, a laser rangefinder, a 2D/3D mapping laser, a LIDAR, or a combination thereof.

In some embodiments, calibration of an image-based sensor (e.g., a camera) may be used for a materials handling vehicle that utilizes overhead feature detection for position determination and/or routing. The camera calibration parameters may include roll, pitch, yaw, x position, y position, z position, focal length, and other camera extrinsics and intrinsics. Camera calibration may include intrinsic calibration and extrinsic calibration. Intrinsic calibration involves determining the parameters of the camera model itself within a suitable error range. Extrinsic calibration may involve determining the pose of the camera on the materials handling vehicle within a suitable error range. The embodiments described herein focus at least on extrinsic calibration of the camera. Systems and methods for vehicle calibration incorporating the same are described in more detail below.
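For illustration only, the seed extrinsics described here could be represented as a six-degree-of-freedom pose. The sketch below uses the GTSAM C++ library that this disclosure references later; all numeric values are hypothetical mounting estimates, not values from this disclosure.

```cpp
#include <gtsam/geometry/Pose3.h>
#include <gtsam/geometry/Rot3.h>

// Build a seed extrinsic pose from nominal mounting values (hypothetical numbers).
// Roll/pitch/yaw are in radians; translation is in meters, in the vehicle frame.
gtsam::Pose3 seedExtrinsics() {
  const double roll = 0.0, pitch = 0.0, yaw = 1.5708;  // camera rotated ~90 degrees
  const double x = 0.25, y = 0.0, z = 2.10;            // mounted ~2.1 m above the vehicle origin
  return gtsam::Pose3(gtsam::Rot3::RzRyRx(roll, pitch, yaw),
                      gtsam::Point3(x, y, z));
}
```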

Referring now to FIG. 1, a materials handling vehicle 100 is illustrated that includes materials handling hardware configured to utilize overhead features, such as lighting, for positioning and navigation services in accordance with embodiments described herein. Overhead features may include light fixtures, skylights, and the like of various styles, types, shapes, and sizes. As shown, the materials handling vehicle 100 may be configured to navigate through a warehouse environment 10, such as a warehouse. The materials handling vehicle 100 may be configured as a materials handling vehicle for lifting and moving cargo. Examples of materials handling vehicles include, but are not limited to, forklifts, reach trucks, turret trucks, walkie stackers, tractors, pallet trucks, high/low stacker trucks, trailer loaders, side loaders, fork hoists, and the like. The materials handling vehicle 100 may be configured to automatically navigate along a desired route around the floor of the warehouse environment 10 and/or to be operated manually or remotely. Thus, the materials handling vehicle 100 may be driven forward and rearward by rotation of one or more wheels, and may change direction by steering one or more wheels. The materials handling vehicle 100 may be autonomously controlled and/or include operator controls for controlling functions of the materials handling vehicle 100 such as, but not limited to, the speed and orientation of the wheels.

Operator controls may include inputs and outputs assigned to functions of the materials handling vehicle 100, such as, for example, switches, buttons, joysticks, handles, pedals, and calibration indicators. The operator controls may additionally include a position system, a positioning system, an accelerator, a brake, an autonomous mode option, and/or other controls, outputs, hardware, and software for manually, semi-autonomously, and/or fully autonomously operating the materials handling vehicle 100. The operator controls may additionally include odometry for determining the distance traveled by the materials handling vehicle 100. The odometry may be determined from the number of rotations of one or more wheels (e.g., using one or more motion sensors 114 (FIG. 2), such as encoders coupled with a drive axle or one or more wheels), calculating the distance traveled based on a predetermined circumference of the wheel, as sketched below. In some embodiments, the odometry may be determined based on one or more inertial measurement units (IMUs), laser scan matching, and/or visual odometry to calculate the distance traveled.
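A minimal sketch of the encoder-based distance calculation described above follows; the names are hypothetical, and the model deliberately ignores steering geometry and wheel slip.

```cpp
// Hypothetical encoder state: cumulative ticks and the encoder resolution.
struct WheelEncoder {
  long ticks;        // cumulative encoder ticks since reset
  long ticksPerRev;  // ticks per full wheel revolution
};

// Dead-reckoned travel distance from wheel revolutions and a predetermined
// wheel circumference, as described in the paragraph above.
double distanceTraveled(const WheelEncoder& enc, double wheelCircumferenceM) {
  const double revolutions =
      static_cast<double>(enc.ticks) / static_cast<double>(enc.ticksPerRev);
  return revolutions * wheelCircumferenceM;  // meters
}
```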

The materials handling vehicle 100 may also include a sensor 110 disposed on, and positioned relative to, the materials handling vehicle 100, whereby the sensor 110 may be an image-based sensor such as a camera 110A. FIG. 2 depicts a computing infrastructure that may be implemented and used for the materials handling vehicle 100 and that is configured to calibrate the sensor 110, as described in more detail further below. The sensor 110 may additionally or alternatively be another sensor type configured to assist in vehicle positioning, such as a laser-based sensor 110B as described in more detail below.

As a non-limiting example described with respect to FIGS. 1-4, the sensor 110 may be an image-based sensor, such as a camera. With respect to such image-based sensor embodiments, the sensor 110 may be a digital still camera, a digital video camera, an analog still camera, an analog video camera, a red-green-blue (RGB) camera, an RGB-depth (RGB-D) camera, and/or another device for capturing overhead images. The captured image may be formatted as JPEG, JPEG 2000, Exif, TIFF, raw image format, GIF, BMP, PNG, Netpbm format, WebP, raster format, vector format, and/or other types of formats. Thus, the sensor 110 may comprise an image sensor as an image capture device, such as a charge-coupled device (CCD), a complementary metal oxide semiconductor (CMOS) sensor, or a functional equivalent thereof. In some embodiments, the materials handling vehicle 100 may be located within the warehouse environment 10 and configured to capture overhead images of the ceiling of the warehouse environment 10. To capture an overhead image, the sensor 110 may be mounted to the materials handling vehicle 100 and focused toward the ceiling. The sensor 110 may capture sensor data 144 (e.g., image data) of the ceiling and/or the warehouse environment 10 within an image area 112 (e.g., a camera field of view, such as the image areas 112A-112L of FIG. 3), which may be based on camera specifications and the mounting location on the materials handling vehicle 100. For example, the image area 112 is depicted with reference to the sensor 110 focused on the ceiling of the warehouse environment 10, which includes features 150, such as the ceiling lights 150A-150S of FIG. 3.

The ceiling of the warehouse environment 10 may include overhead lights such as, but not limited to, ceiling lights 150A-S for providing illumination from the ceiling, or generally from above the materials handling vehicle 100 operating in the warehouse. The ceiling lights 150A-S may include skylights, fluorescent lights, industrial lighting fixtures, and/or other types of lights of various shapes and sizes, and may be mounted in or suspended from a ceiling or wall structure to provide illumination. FIG. 4 depicts an embodiment of a plurality of locations of the materials handling vehicle 100 in a coordinate reference frame, and the locations of the ceiling lights 150A-S in that frame, as determined by one or more of the map-less calibration processes described in further detail below.

The systems and methods described herein may also employ any of a variety of image recognition, data association, and modeling techniques to construct the optimization problem. In some data association embodiments, tracking of static overhead features (such as ceiling lights between frames) may be accomplished using a pixel-velocity nearest-neighbor image space tracking algorithm, or any algorithm capable of tracking features from frame to frame without a priori knowledge of the location of the features in the calibration coordinate frame, as sketched below. Further, image recognition embodiments may include edge detection, feature detection, and/or object recognition routines for tracking static overhead features, or portions thereof, on a frame-by-frame basis. Modeling techniques may include techniques such as structure from motion (SFM), simultaneous localization and mapping (SLAM), the Georgia Tech smoothing and mapping (GTSAM) library, or other smoothing and mapping (SAM) libraries that may be implemented using factor graphs and/or Bayesian networks as an underlying paradigm. It should be understood that other modeling techniques are contemplated within the scope of the present disclosure.
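The following is a hedged sketch of the pixel-velocity nearest-neighbor association idea mentioned above; the gating threshold and data layout are illustrative assumptions, not specifics from this disclosure.

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// A feature track carries its last observed pixel position and its
// image-space velocity from the previous pair of frames.
struct Track {
  double u, v;    // last observed pixel position
  double du, dv;  // pixel velocity per frame
  int id;
};

// Predict where each track should appear in the new frame from its pixel
// velocity, then match a new detection (u, v) to the nearest prediction
// within a gate. Returns the matched track index, or -1 for a new feature.
int matchDetection(double u, double v, const std::vector<Track>& tracks,
                   double gatePx) {
  int best = -1;
  double bestDist = std::numeric_limits<double>::max();
  for (std::size_t i = 0; i < tracks.size(); ++i) {
    const double pu = tracks[i].u + tracks[i].du;  // predicted position
    const double pv = tracks[i].v + tracks[i].dv;
    const double d = std::hypot(u - pu, v - pv);
    if (d < gatePx && d < bestDist) {
      bestDist = d;
      best = static_cast<int>(i);
    }
  }
  return best;
}
```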

In particular, libraries such as the C++ GTSAM library or other smoothing and mapping (SAM) libraries may be utilized to construct the optimization problem. The GTSAM library uses factor graphs (FGs), derived from Bayesian networks, as the underlying computing paradigm to achieve smoothing and mapping in robotics and vision.
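As a minimal illustration of constructing and optimizing a factor graph with GTSAM, the sketch below is patterned on the library's basic odometry examples; the noise sigmas and poses are illustrative assumptions, not values from this disclosure.

```cpp
#include <gtsam/geometry/Pose2.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

int main() {
  using gtsam::Pose2;
  using gtsam::Symbol;
  gtsam::NonlinearFactorGraph graph;

  // Prior factor on the initial vehicle node v0 (seed pose with uncertainty).
  auto priorNoise =
      gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.3, 0.3, 0.1));
  graph.add(gtsam::PriorFactor<Pose2>(Symbol('v', 0), Pose2(0, 0, 0), priorNoise));

  // Between factor from accumulated odometry: v0 -> v1.
  auto odomNoise =
      gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.2, 0.2, 0.05));
  graph.add(gtsam::BetweenFactor<Pose2>(Symbol('v', 0), Symbol('v', 1),
                                        Pose2(2.0, 0.0, 0.0), odomNoise));

  // Initial estimates for the unknown variables.
  gtsam::Values initial;
  initial.insert(Symbol('v', 0), Pose2(0.1, -0.1, 0.05));
  initial.insert(Symbol('v', 1), Pose2(2.2, 0.1, -0.02));

  // Solve the constrained optimization problem.
  gtsam::Values result =
      gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize();
  result.print("optimized: ");
  return 0;
}
```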

The SAM library and other similar libraries provide a common framework for building optimization problems related to satisfying multiple spatial constraints with various associated uncertainties. It is contemplated that the portions of such libraries intended for SLAM may also be utilized; the SLAM portion of a SAM library provides functionality specifically suited to optimizing the sensor extrinsic calibration using a reference frame factor.

Referring now to FIGS. 5A-8, illustrative examples of different factor graphs for calibrating various sensors (e.g., cameras and/or laser systems) coupled to a materials handling vehicle are depicted. FIG. 5A relates to a factor graph (FG) 500 constructed from a mapped configuration using an image-based sensor, such as the camera 110A, in which reference is made to a site map 142 of the warehouse environment 10 that includes one or more map features, such as stored overhead feature locations, and a known initial vehicle seed location. FIG. 5B relates to a factor graph (FG) 500' constructed from a mapped configuration using a laser-based sensor 110B (such as a laser rangefinder, LIDAR, etc.), in which a site map 142 of the warehouse environment 10 is likewise referenced. The site map of the warehouse environment 10 includes one or more map features, such as stored overhead feature locations, and a calibration coordinate frame, so that the initial vehicle seed position may be known relative to the site map. FIG. 6 relates to a factor graph (FG) 600 constructed from a map-less configuration using an image-based sensor, such as a camera, in which the initial vehicle position is unknown and set to the origin of the calibration coordinate frame, and features are extracted by the sensor, as described in more detail below, with reference made to the calibration coordinate frame. FIG. 7 relates to a factor graph (FG) 700 constructed from a map-less configuration similar to that of FIG. 6, but using a laser-based sensor 110B configured to determine a bearing and range from features that act as laser-reflective surfaces disposed within the environment. FIG. 8 relates to a factor graph (FG) 800 constructed from either a map-less configuration (e.g., if the dashed position prior factor 804 and the dashed prior factors 805, 806, 807 are not included) or a mapped configuration (e.g., if the dashed position prior factor 804 and the dashed prior factors 805, 806, 807 are included), which combines FIGS. 6-7 (map-less) or FIGS. 5A-5B (mapped) in that both the image-based sensor 110A and the laser-based sensor 110B are used to construct the factor graph. FIGS. 11A-11C, described in more detail below, describe how each of the factor graphs of FIGS. 5A-8 is constructed by one or more calibration processes as described herein.

Although reference is initially made to FIG. 5A, the following generally applies to each of the factor graphs depicted and described herein. The factor graph 500 may include nodes 510, 512, 514, 516, 518, 520, 522, 524, 526 (e.g., observations) and factors 502, 504, 506, 508, 519, 521, 523, 525, 527, 529, 531, 533. A factor set between nodes encodes a probability function, with uncertainty, for the transformation or relationship between those nodes. An initial factor (e.g., a prior factor) attached to a node represents the initial uncertainty of the corresponding node; in an embodiment, the prior factors model the initial uncertainties of the various states utilized by the optimizer as described herein. Node 510 represents the sensor extrinsics. By way of non-limiting example, the camera extrinsics may reflect the manufacturing tolerances and specifications of the camera model and the pose (e.g., roll, pitch, yaw, x, y, and/or z position) of the camera (or, generally, the sensor) as mounted, within tolerance, on the materials handling vehicle 100. Other nodes are the vehicle nodes 512, 514, and 516, which indicate the vehicle coordinates at respective locations in the calibration coordinate frame (CCF); these locations may be derived from sensors that determine the odometry of the materials handling vehicle 100. Still other nodes are the sensor frame nodes 518, 520, 522 (also referred to as camera frame nodes, CFNs), which may be defined by frames captured by the camera at a vehicle node. In addition, image-based sensor feature nodes 524 and 526 of an image-based sensor, such as a camera, may be determined and populated within the factor graph 500.

As described herein, a "frame" may be one or more frames captured with respect to a unique image region 112A as depicted in FIG. 3, which may include at least one static overhead feature captured by an image sensor (e.g., an image-based sensor) or other type of sensor being calibrated (e.g., a laser-based sensor). That is, a frame is an instance of data generated by the sensor to be calibrated, provided on the materials handling vehicle 100 during travel of the materials handling vehicle 100. A particular instance may be selected when the materials handling vehicle 100 exceeds a linear or angular absolute distance-traveled threshold while capturing images, and/or when a set of observations (e.g., not necessarily from a single sensor) passes a set of heuristics indicating that the observations should provide information that will help reduce the optimization error; a sketch of such a selection gate follows below. In an embodiment, a combination of such frames may be used to determine the position of one or more materials handling vehicles 100 in the calibration coordinate frame (CCF) and the pose of one or more image-capturing sensors relative to the one or more vehicles on which they are positioned.
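The sketch below illustrates one plausible form of the frame-selection gate just described; the threshold values are illustrative assumptions, not values from this disclosure.

```cpp
#include <cmath>

// Minimal 2D pose for gating purposes (hypothetical helper type).
struct Pose2D {
  double x, y, yaw;
};

// Keep a new frame only after the vehicle has traveled or turned enough
// since the last kept frame, as described in the paragraph above.
bool shouldKeepFrame(const Pose2D& lastKept, const Pose2D& current) {
  const double kMinDistM = 0.5;    // linear travel threshold (illustrative)
  const double kMinYawRad = 0.26;  // angular threshold, ~15 degrees (illustrative)
  const double kTwoPi = 6.283185307179586;
  const double dist = std::hypot(current.x - lastKept.x, current.y - lastKept.y);
  const double dyaw = std::fabs(std::remainder(current.yaw - lastKept.yaw, kTwoPi));
  return dist > kMinDistM || dyaw > kMinYawRad;
}
```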

In embodiments utilizing an image-based sensor, such as a camera, in a map-less configuration (e.g., FIG. 6), frame data may be utilized to determine the location of the materials handling vehicle 100 in the CCF as a vehicle node, and a camera frame node (CFN) may be attached to the vehicle node and to a camera extrinsics node (CEN) to model the pose of the camera in the CCF relative to the materials handling vehicle 100. Features from the current image may be associated with those of previous images to append to the CFN and construct the factor graph (FG). This association may occur, for example, by using a pixel-velocity nearest-neighbor image space tracking algorithm, or any algorithm that tracks features from frame to frame without a priori knowledge of the location of the features in the CCF. In some embodiments, features may be grouped over multiple (e.g., consecutive or non-consecutive) frames. The associated features may then be appended to the CFN, which may result in multiple CFNs referencing a single sensor feature node; in a map-less configuration, this may be accomplished by using a projection factor, for example.

In general, the factor graph FG may be populated as each frame is selected from the data collected by the sensor 110. The factor graph models observations with uncertainty, where the observations are directly related to each other by factors. Furthermore, the factor graph does not include time as a variable and need not rely on a forward-time factor. A factor graph is a graphical model for modeling complex estimation problems, such as simultaneous localization and mapping (SLAM) or structure from motion (SFM). A factor graph is a bipartite graph comprising factors connected to nodes (also called variables), as described below. The variables represent unknown random variables in the estimation problem, and the factors represent probabilistic information about those variables derived from measurements or a priori known information. For example, with reference to FIG. 5A, the nodes may include a sensor extrinsics node (e0) 510, a first vehicle node (v0) 512, a second vehicle node (v1) 514, a third vehicle node (v2) 516, a first sensor frame node (c0) 518, a second sensor frame node (c1) 520, a third sensor frame node (c2) 522, a first sensor feature node (f0) 524, and a second sensor feature node (f1) 526, as well as factors 513, 515, 519, 521, 523, 525, 527, 529, 531, and 533 that link the nodes to each other.

A node (e.g., 510, 512, 514, 516, 518, 520, 522, 524, 526) may be linked to one or more other nodes by factors (e.g., 519, 521, 523, 525, 527, 529, 531, 533). The factor graph FG can thus be represented and used to construct the constrained optimization problem to be solved. The optimization problem can be constructed using the GTSAM library and an optimizer, then solved, and an error metric can be derived from the factor graph FG. The error metric can then be used to determine when to stop calibration, either by displaying calibration factors to a user so that a minimum error metric may be selected, or by having the algorithm automatically stop calibration when the error is within or below a predetermined value. Non-limiting examples of such calibration algorithms are described below with respect to FIGS. 9-10.
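One plausible way to realize the stop criterion just described with GTSAM is to compare the residual error of the optimized graph against a threshold; the threshold value below is an illustrative assumption.

```cpp
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>

// Optimize the factor graph and use the residual error of the optimized
// graph as the metric that decides whether calibration can stop.
// Returns true when calibration can stop (error below threshold).
bool optimizeAndCheck(const gtsam::NonlinearFactorGraph& graph,
                      gtsam::Values& estimate) {
  gtsam::LevenbergMarquardtOptimizer optimizer(graph, estimate);
  estimate = optimizer.optimize();
  const double kAcceptableError = 1e-2;  // illustrative threshold
  return graph.error(estimate) < kAcceptableError;
}
```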

In embodiments, a toolkit such as the GTSAM library, a factor-graph-based C++ library, may provide solutions to SLAM and SFM problems and may also be used to model and solve other optimization problems. In general, optimization of the factor graph aims to reduce the overall error across all factors in the factor graph. A factor is a value element, such as a direct or indirect observation of the optimization problem. The factors and variables/nodes within the factor graph FG may exist in multiple coordinate systems. The factor graph may include multiple types of nodes and multiple types of factors.

As non-limiting examples, the types of nodes may include one or more vehicle nodes, one or more sensor frame nodes (image-based and/or laser-based as described herein), a sensor extrinsics node for each respective sensor, and one or more sensor feature nodes (image-based and/or laser-based). These nodes may be applied in the mapped or map-less configurations described herein.

Each vehicle node may be associated with a two-dimensional (2D) pose, a 3D pose (e.g., x, y, z, roll, pitch, yaw, or any subset thereof), etc. in a global reference frame, including, for example, an x-position value, a y-position value, and yaw. Yaw indicates rotation about the z-axis, also known as heading. In an embodiment, the x and y position values may be in meters and the yaw may be in radians.

Each sensor frame node, whether image-based or laser-based, is associated with a three-dimensional (3D) pose in a global reference frame, including, for example, x-position values, y-position values, z-position values, roll, pitch, and yaw. Roll indicates rotation about the x-axis and pitch indicates rotation about the y-axis. In an embodiment, the x, y, and z position values may be in meters, and the roll, pitch, and yaw values may be in radians.

Each sensor extrinsics node, whether for an image-based or a laser-based sensor, is associated with a three-dimensional position or pose (e.g., x, y, z, roll, pitch, yaw, or any subset thereof), such as in a local reference frame relative to the vehicle, describing the pose delta between the kinematic center of the vehicle and the stored origin of the sensor. The three-dimensional pose includes, for example, an x position value, a y position value, a z position value, roll, pitch, and yaw.

Each sensor feature node, whether image-based or laser-based, is associated with a three-dimensional pose of a feature in a global reference frame. In embodiments, the features may be overhead features including, but not limited to, ceiling lights, skylights, etc. as described herein for capture by image-based sensors and/or laser-based sensors.
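For illustration, the node types above could be keyed in GTSAM with typed symbols mirroring the e/v/c/f labels used in the factor graphs of FIGS. 5A-8; this naming convention is an assumption for the sketch, not mandated by this disclosure.

```cpp
#include <gtsam/inference/Symbol.h>

// One possible keying of the four node types, matching the e/v/c/f labels.
const gtsam::Key e0 = gtsam::Symbol('e', 0);  // sensor extrinsics node (3D pose)
const gtsam::Key v0 = gtsam::Symbol('v', 0);  // initial vehicle node (2D pose)
const gtsam::Key c0 = gtsam::Symbol('c', 0);  // initial sensor frame node (3D pose)
const gtsam::Key f0 = gtsam::Symbol('f', 0);  // initial sensor feature node (3D point)
```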

By way of example and not limitation, the types of factors may include prior factors, between factors (BF), reference frame factors (RFF), projection factors (PF), bearing-range factors (BRF), and the like. A prior factor provides an initial estimate of a value (node/variable). A prior factor may be a 2D pose estimate (e.g., including x, y, and yaw) of the initial vehicle position in a mapped calibration process, as will be further described with respect to the position prior factor 504 in FIG. 5A. In embodiments, the position prior factor 504 may be provided by a user or automatically based on the location of the vehicle relative to the site map. Another example of a prior factor is a 3D point estimate (e.g., including x, y, and z) of a map feature in a mapped calibration process, as will be further described with respect to the prior factors 506 and 508 of FIG. 5A and 506' and 508' of FIG. 5B.

Further, a prior factor in a mapped or map-less configuration may be a 3D pose estimate of the seed sensor extrinsics (e.g., including x, y, z, roll, pitch, and yaw), as may be provided based on manufacturing tolerances; this will be further described with respect to the prior factor 502 of the image-based sensor 110A of FIG. 5A, the prior factor 502 of the laser-based sensor 110B of FIG. 5B, the prior factor 602 of the image-based sensor 110A of FIG. 6, the prior factor 702 of the laser-based sensor 110B of FIG. 7, the prior factor 802 of the laser-based sensor 110B of FIG. 8, and the prior factor 803 of the image-based sensor 110A of FIG. 8.

The prior factors (e.g., factors 502, 504, 506, and 508 of FIG. 5A) comprise prior and/or initial values that encode known values and/or uncertainties when the factor graph is created, such as the position prior factor 504 that sets the location of the materials handling vehicle 100. With respect to the factor graphs shown in FIGS. 5A-8 and described in detail below, each factor graph merely illustrates a calibration of particular sensors for the respective mapped and map-less configurations, with and without a site map of the environment in which the materials handling vehicle is deployed. Unless otherwise noted, the description of the nodes and/or factors of FIG. 5A applies to the nodes and factors of FIGS. 5B-8.
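A hedged sketch of adding the three kinds of prior factors discussed above with GTSAM follows; all poses, points, and noise sigmas are illustrative assumptions, not values from this disclosure.

```cpp
#include <gtsam/geometry/Point3.h>
#include <gtsam/geometry/Pose2.h>
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/slam/PriorFactor.h>

void addPriors(gtsam::NonlinearFactorGraph& graph) {
  using gtsam::Symbol;

  // 2D pose prior on the seeded vehicle position (x, y, yaw).
  auto vehNoise =
      gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.5, 0.5, 0.1));
  graph.add(gtsam::PriorFactor<gtsam::Pose2>(
      Symbol('v', 0), gtsam::Pose2(10.0, 4.0, 0.0), vehNoise));

  // 3D point prior on a mapped feature (x, y, z), e.g., a ceiling light.
  auto featNoise = gtsam::noiseModel::Isotropic::Sigma(3, 0.05);
  graph.add(gtsam::PriorFactor<gtsam::Point3>(
      Symbol('f', 0), gtsam::Point3(12.0, 5.0, 9.0), featNoise));

  // 3D pose prior on the seed extrinsics, from manufacturing tolerances
  // (rotation sigmas first, then translation, per GTSAM's Pose3 convention).
  auto extNoise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.05, 0.05, 0.05, 0.02, 0.02, 0.02).finished());
  graph.add(gtsam::PriorFactor<gtsam::Pose3>(
      Symbol('e', 0), gtsam::Pose3(), extNoise));
}
```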

A between factor (BF) describes the relationship between two nodes, such as two vehicle nodes. For example, a between factor may represent a 2D pose delta (e.g., provided by accumulated odometry) indicating the x, y, and yaw change between two vehicle nodes. Embodiments of such between factors are depicted and described with respect to factors 513 and 515 in FIG. 5A.
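A minimal sketch of encoding accumulated odometry as a between factor in GTSAM follows; the noise sigmas are illustrative assumptions.

```cpp
#include <gtsam/geometry/Pose2.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/slam/BetweenFactor.h>

// Encode the accumulated odometry between two consecutive vehicle nodes
// (v_prev -> v_next) as a 2D pose delta with dead-reckoning uncertainty.
void addOdometryBetween(gtsam::NonlinearFactorGraph& graph, int prev, int next,
                        const gtsam::Pose2& accumulatedOdometry) {
  auto noise =
      gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.1, 0.1, 0.02));
  graph.add(gtsam::BetweenFactor<gtsam::Pose2>(
      gtsam::Symbol('v', prev), gtsam::Symbol('v', next),
      accumulatedOdometry, noise));
}
```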

A reference frame factor (RFF) may be used to calculate a relationship between two reference frames, which may belong to two different types of nodes. For example, a reference frame factor may indicate a 3D pose that describes the relationship (e.g., as an incremental change) between the kinematic center of the vehicle and the reference frame of the sensor. In an embodiment, the reference frame factors 519, 521, and 523 of FIG. 5A are used to calculate and/or optimize the sensor extrinsics node 510 based on the relationships between the vehicle nodes 512, 514, and 516 and the sensor frame nodes 518, 520, and 522.
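The source does not specify the reference frame factor at the API level, so the sketch below expresses the same constraint (sensor frame equals vehicle pose composed with extrinsics, i.e., c = v * e) with GTSAM expression factors, treating the vehicle node as a 3D pose for simplicity; this is an assumed formulation, not necessarily the exact factor used.

```cpp
#include <gtsam/geometry/Pose3.h>
#include <gtsam/nonlinear/ExpressionFactor.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/expressions.h>

// Constrain sensorFrame (c) to equal vehicle (v) composed with extrinsics (e),
// so between(v * e, c) should be the identity pose within noise.
void addReferenceFrameConstraint(gtsam::NonlinearFactorGraph& graph,
                                 gtsam::Key vKey, gtsam::Key eKey,
                                 gtsam::Key cKey) {
  gtsam::Expression<gtsam::Pose3> v(vKey), e(eKey), c(cKey);
  auto predicted = gtsam::compose(v, e);  // predicted sensor frame in the CCF
  auto noise = gtsam::noiseModel::Isotropic::Sigma(6, 1e-3);
  graph.addExpressionFactor(noise, gtsam::Pose3(),
                            gtsam::between(predicted, c));
}
```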

A projection factor (PF) may be used to describe the relationship between an observed feature and the reference frame of the sensor in which it is observed. The projection factor may be used for sensors without range information; that is, the projection factor uses a projection model (e.g., a pinhole camera model) to calculate the observation error. In some embodiments, the feature locations are given in local sensor coordinates (e.g., in pixels (u, v) in a u-v coordinate space, as in UV mapping of 2D images to a 3D model surface) and may be projected as infinite rays into global 3D coordinates (e.g., x, y, and z position values in meters). Referring to FIG. 5A, the projection factors 525, 527, 529, 531, and 533 describe how the image-based sensor feature nodes 524 and 526 are observed from the sensor frame nodes 518, 520, and 522 (e.g., camera frame nodes, CFNs), taking into account the intrinsic characteristics of the image-based sensor. In an embodiment, the projection factor may be used in a mapped image-based calibration process (e.g., FIG. 5A). The projection factor may also be used in a map-less image-based calibration process (e.g., FIG. 6), which does not require initial prior factor estimates of the matched features.
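A hedged sketch using GTSAM's generic projection factor follows (GTSAM 4.x with Boost smart pointers assumed); the camera intrinsic values and pixel noise are illustrative assumptions.

```cpp
#include <boost/make_shared.hpp>
#include <gtsam/geometry/Cal3_S2.h>
#include <gtsam/geometry/Point2.h>
#include <gtsam/geometry/Point3.h>
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/slam/ProjectionFactor.h>

// Link a sensor frame node (c) and a feature node (f) through a pixel
// observation (u, v), given pinhole camera intrinsics.
void addProjection(gtsam::NonlinearFactorGraph& graph, int frame, int feature,
                   const gtsam::Point2& pixel) {
  // Illustrative intrinsics: fx, fy, skew, principal point (u0, v0).
  static auto K =
      boost::make_shared<gtsam::Cal3_S2>(500.0, 500.0, 0.0, 320.0, 240.0);
  auto noise = gtsam::noiseModel::Isotropic::Sigma(2, 1.0);  // 1 px std dev
  graph.add(gtsam::GenericProjectionFactor<gtsam::Pose3, gtsam::Point3,
                                           gtsam::Cal3_S2>(
      pixel, noise, gtsam::Symbol('c', frame), gtsam::Symbol('f', feature), K));
}
```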

As another type of factor, a Bearing-Range Factor (BRF) describes the relationship between an observed feature and the reference frame of the sensor observing it, for example, a sensor providing range and bearing information. Referring to fig. 7, the bearing-range factors may include factors 725, 727, 729, 731, and 733, and the similar factors of fig. 5B and 8 associated with the laser-based sensor and the features detected by that sensor. A bearing-range factor may provide a range measurement (e.g., in meters) and a bearing measurement (e.g., in radians) in the local coordinates of the respective sensor, and may calculate the error based on a transformation to global 3D space including x, y, and z positions (e.g., in meters).
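A corresponding sketch of a bearing-range residual, assuming the sensor pose is expressed as a 4x4 homogeneous transform and the feature as a 3D point; the function name and wrap convention are illustrative assumptions:

```python
import numpy as np

def bearing_range_error(T_world_sensor, p_world, bearing_meas, range_meas):
    """Predict the bearing (rad) and range (m) of a feature in the sensor's
    local frame and compare with the laser measurement."""
    p_local = np.linalg.inv(T_world_sensor) @ np.append(p_world, 1.0)
    bearing_pred = np.arctan2(p_local[1], p_local[0])
    range_pred = np.hypot(p_local[0], p_local[1])
    # Wrap the bearing residual to (-pi, pi].
    bearing_err = (bearing_pred - bearing_meas + np.pi) % (2 * np.pi) - np.pi
    return np.array([bearing_err, range_pred - range_meas])
```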

Still referring to FIG. 5A, a factor graph 500 for map-based camera calibration of a materials handling vehicle is depicted and described in greater detail. In the present embodiment, calibration of the sensor as a camera is completed by using a site map. The camera external reference node 510 includes the roll, pitch, yaw, x, y, and/or z position of the camera, and the expected variance (i.e., standard deviation) is set with reference to the a priori factor 502 of the camera seed position with uncertain external reference values. When using the site map, the materials handling vehicle is seeded within the site map by associating the vehicle with a location within the site map, as indicated by the position a priori factor 504. Since the materials handling vehicle is seeded (i.e., the actual position of the vehicle in the environment is associated with a position in the site map), there is a level of uncertainty modeled by the position a priori factor 504. The seed position indicates the origin of the initial vehicle pose (vO). As referred to herein, the "origin" refers to the seed position (or initial position vO) of the vehicle in the calibration coordinate system. The origin vO may be set at the origin O of the calibration coordinate system CCF or may be an arbitrary origin vO that is not the origin O of the calibration coordinate system CCF. The seed external reference values (e.g., sensor external reference node eO) are based on manufacturing tolerances of the sensor mounted on the materials handling vehicle. An initial sensor frame (e.g., sensor frame node cO) may be calculated by composing the sensor external reference node eO onto the initial vehicle pose vO. This becomes the origin vO of the calibration coordinate system CCF (at the origin O, or at any starting position set in another way), and the initial vehicle pose vO models the position of the vehicle at the seed position. The observed Vx (e.g., v1, 514) and Vx-1 (e.g., vO, 512) can be used to calculate new Vx values by dead reckoning, and a between factor (e.g., 513 and 515) indicating the uncertainty of the observed dead reckoning is added between them. Sensor frame nodes 518, 520, and 522, representing camera frame nodes, model the pose of the camera in the same coordinate system for each Vx, and are linked by the reference frame factors 519, 521, and 523 of each camera frame to the respective vehicle nodes 512, 514, 516. Observations, such as overhead features observed by the camera, are linked by common projection factors across multiple observations, correlating the frames. In an embodiment, logic is provided to derive the location of an observed feature from a plurality of observations. Projection factors 525, 527, 529, 531, 533 are created to link to each sensor feature node 524 and 526 observed in the current frame. There may be a many-to-many, one-to-one, one-to-many, or many-to-one relationship between the sensor frame nodes 518, 520, and 522 and the sensor feature nodes 524 and 526.
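For illustration, the skeleton of such a factor graph might be assembled as follows. This is a hedged sketch using GTSAM (an assumed library choice) with hypothetical seed values; it omits the reference frame and projection factors linking camera frame nodes to feature nodes, whose residuals were sketched above.

```python
import numpy as np
import gtsam

# Keys mirroring FIG. 5A: e0 extrinsic node, v0..v2 vehicle nodes.
E0 = gtsam.symbol('e', 0)
V = [gtsam.symbol('v', i) for i in range(3)]

graph = gtsam.NonlinearFactorGraph()

# A priori factor 502: seed extrinsic from manufacturing tolerances, with
# loose standard deviations (rotation in rad, then translation in m).
seed_extrinsic = gtsam.Pose3(gtsam.Rot3.Ypr(0.0, 0.0, 0.0),
                             gtsam.Point3(0.5, 0.0, 2.0))  # hypothetical mount
extrinsic_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.1, 0.1, 0.1, 0.05, 0.05, 0.05]))
graph.add(gtsam.PriorFactorPose3(E0, seed_extrinsic, extrinsic_noise))

# Position a priori factor 504: the vehicle is seeded in the site map.
seed_pose = gtsam.Pose2(10.0, 4.0, 0.0)  # hypothetical seed in map coordinates
seed_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.3, 0.3, 0.1]))
graph.add(gtsam.PriorFactorPose2(V[0], seed_pose, seed_noise))

# Between factors 513 and 515 from accumulated odometry.
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.02]))
graph.add(gtsam.BetweenFactorPose2(V[0], V[1], gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(V[1], V[2], gtsam.Pose2(1.0, 0.0, 0.1), odom_noise))
# Reference frame and projection factors tying camera frame nodes and
# feature nodes into the graph would be added here.
```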

In some embodiments, once there is a fully formed factor graph, an optimizer is run to minimize the error across all factors by moving all nodes. However, in some embodiments, the optimizer may be run at any time during construction of the factor graph. Thus, when sufficient information has been built into the factor graph, construction of the factor graph can be stopped so that the optimizer can formulate an optimization problem for determining the calibration. Further, the optimized camera external parameters for camera calibration may be determined by reading eO once the error is minimized, as described in more detail below.
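Continuing the assumed GTSAM sketch above, running the optimizer and reading back the extrinsic node might look like the following; the initial guesses are hypothetical.

```python
initial = gtsam.Values()
initial.insert(E0, seed_extrinsic)
for i, key in enumerate(V):
    initial.insert(key, gtsam.Pose2(10.0 + i, 4.0, 0.0))  # rough initial guesses

optimizer = gtsam.LevenbergMarquardtOptimizer(graph, initial)
result = optimizer.optimize()
calibrated_extrinsic = result.atPose3(E0)  # optimized camera extrinsic e0
print(graph.error(result))                 # total weighted error after optimization
```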

Referring now to FIG. 5B, a factor graph 500' for map-based laser calibration of a materials handling vehicle is depicted and described in greater detail. In some embodiments, the image-based sensor/camera 110A described with respect to fig. 5A may instead be a laser-based sensor 110B calibrated using a site map. For example, the sensor external reference node 510' may include the roll, pitch, yaw, x, y, and/or z position and expected variance (i.e., standard deviation) of the laser, which are set with reference to the a priori factor 502' of the laser seed position with uncertain external reference values. Further, when using the site map, the materials handling vehicle is seeded within the site map by associating the vehicle with a location within the site map, as indicated by the position a priori factor 504'.

The laser records the sensor frame nodes (e.g., 518', 520', 522') as laser nodes associated with the vehicle nodes 512', 514', and 516', such that the laser node 518' is associated with the vehicle node vO, 512'; the laser node 520' is associated with the vehicle node v1, 514'; and the laser node 522' is associated with the vehicle node v2, 516'. The laser-based sensor feature nodes 524' and 526' are determined by the laser, plotted against each laser node 518', 520', and 522', and can be associated by bearing-range factors 525', 527', 529', 531', and 533' defining the relationships between the laser nodes 518', 520', and 522' and the laser-based sensor feature nodes 524' and 526'.

Referring now to FIG. 6, a factor graph 600 for map-less camera calibration of a materials handling vehicle in an environment is depicted. The calibration depicted in factor graph FG 600 does not include the use of a site map (e.g., is map-less), and therefore the initial position information of the materials handling vehicle 100 is represented as vehicle node vO, 612. For example, the initial position vO is shown on the calibration coordinate system CCF of fig. 4, in which case it is not the origin O. By way of non-limiting example, the initial position information may be any position within the environment and need not be located relative to the site map 142 or any other reference position. As the materials handling vehicle 100 travels, the estimated positions of the materials handling vehicle 100 as vehicle positions 0, 1, 2, etc. (i.e., vO, v1, v2, etc.) are recorded as, for example, odometry-based vehicle nodes 612, 614, 616. Between factors 613, 615 are set between such vehicle nodes 612, 614, 616. Further, at the positions of the vehicle nodes in the calibration coordinate system CCF, the sensor as a camera records the sensor frame nodes 618, 620, 622 associated with each vehicle node 612, 614, 616, such as the sensor frame node cO, 618 associated with the vehicle node vO, 612; the sensor frame node c1, 620 associated with the vehicle node v1, 614; and the sensor frame node c2, 622 associated with the vehicle node v2, 616. The image-based sensor feature nodes 624, 626, depicted relative to each sensor frame node 618, 620, 622, may be associated by projection factors 625, 627, 629, 631, 633 defining the relationships between adjacent sensor frame nodes 618, 620, 622 and the image-based sensor feature nodes 624, 626.

Referring now to FIG. 7, a factor graph 700 for map-less laser calibration of a materials handling vehicle in an environment is depicted. Factor graph 700 is similar to factor graph 600 (FIG. 6), except that the sensor being calibrated here is part of a laser-based system modeled in the factor graph. The sensor external reference node 710, referred to as the laser external reference node (LEN), may include the roll, pitch, yaw, x, y, and/or z position and expected variance (i.e., standard deviation) of the laser, which are set with reference to the a priori factor 702 of the laser seed position with uncertain external reference values. The calibration depicted in the factor graph 700, FG, does not include the use of a site map (e.g., is map-less). Thus, the initial position information as the vehicle node 712 is set as the origin of the materials handling vehicle 100 and is represented as a vehicle node vO (e.g., having the coordinate position vO of FIG. 4). By way of non-limiting example, the initial position information may be any position within the environment and need not be located relative to the site map 142 or any other reference position. As the materials handling vehicle 100 travels, the estimated positions of the materials handling vehicle 100 as vehicle positions 0, 1, 2, etc. (i.e., vehicle nodes vO, v1, v2, etc.) are recorded as, for example, odometry-based vehicle nodes 712, 714, 716. Between factors 713 and 715 are provided between such vehicle nodes 712, 714, 716. Further, the laser records the respective laser nodes 718, 720, 722 associated with each vehicle node 712, 714, 716, such as the laser node L0, 718 associated with the vehicle node vO, 712; the laser node L1, 720 associated with the vehicle node v1, 714; and the laser node L2, 722 associated with the vehicle node v2, 716. Laser-based sensor feature nodes 724, 726, which represent feature information determined by the laser, are depicted relative to each sensor frame node 718, 720, 722 and may be associated by bearing-range factors 725, 727, 729, 731, 733 that define the relationship(s) between the sensor frame nodes 718, 720, 722 and the laser-based sensor feature nodes 724, 726.

The laser positioning system may identify features similar to the features identified from the camera's image data, such as edges of objects or features within the environment, by employing a laser system such as a LIDAR system. In some embodiments, the laser-based positioning system may identify ranges to reflector objects positioned throughout the environment. Reflector objects serve the laser system much as overhead lights serve the camera: they are features used by the laser system to map the environment and calibrate the laser-based sensor 110B, just as the lights are used to calibrate the camera 110A.

Referring now to FIG. 8, a factor graph 800 for calibrating a map-less positioning system that implements a combination of cameras and lasers for a materials handling vehicle within an environment is depicted. In an embodiment, the inclusion of the dashed position a priori factor 804 (indicated by the dashed line and described herein with respect to using a position a priori factor related to the initial vehicle position and using a site map) would allow calibration of a mapped localization system that implements the combination of cameras and lasers of the materials handling vehicle in the environment.

The sensor external reference node 808 may include the roll, pitch, yaw, x, y, and/or z position and expected variance (i.e., standard deviation) of the laser, which are set with reference to the a priori factor 802 of the laser seed position with uncertain external reference values. The sensor external reference node 810 may include the roll, pitch, yaw, x, y, and/or z position and expected variance (i.e., standard deviation) of the camera, which are set with reference to the a priori factor 803 of the camera seed position with uncertain external reference values. Similar to the factor graphs in fig. 6 and 7, the calibration modeled in the factor graph 800 described with reference to fig. 8 does not include the use of a site map (e.g., is map-less). Thus, the initial position information for the vehicle node 812 is set as the origin of the materials handling vehicle 100 and is represented as the vehicle node vO (e.g., having the coordinate position vO on the calibration coordinate system CCF of FIG. 4). As the materials handling vehicle 100 travels, the estimated positions of the materials handling vehicle 100 as vehicle positions 0, 1, 2, 3, etc. (i.e., vehicle nodes vO, v1, v2, v3, etc.) are recorded as, for example, odometry-based vehicle nodes 812, 814, 816, 818. Between factors 813, 815, 817 are arranged between such vehicle nodes 812, 814, 816, and 818.

At some of the positions of the vehicle nodes 812, 814, 816, 818 in the calibration coordinate system CCF, the camera records camera frame nodes 820, 824 associated with the vehicle nodes 812, 816, such as the camera frame node cO, 820 associated with the vehicle node vO, 812, and the camera frame node c1, 824 associated with the vehicle node v2, 816. The feature node 828, depicted with respect to the camera frame nodes 820, 824, may be associated by projection factors 827, 829 that define the relationships between adjacent camera frame nodes 820, 824 and the feature node 828.

Further, with respect to the vehicle nodes 812, 814, 816, 818, the laser records the laser nodes 822, 826 associated with the vehicle nodes 814, 818, such that the laser node L0, 822 is associated with the vehicle node v1, 814 and the laser node L1, 826 is associated with the vehicle node v3, 818. Laser-based sensor feature nodes 830, 832 determined by the laser are depicted relative to each laser node 822, 826 and may be associated by bearing-range factors 831, 833, 835 that define the relationships between the laser nodes 822, 826 and the laser-based sensor feature nodes 830, 832.

It should be appreciated that the factor graph provides a method of correlating observed information collected by the one or more sensors being calibrated with vehicle positions and features within the environment. The laser and camera may be calibrated through the relationships between the vehicle positions (i.e., the vehicle nodes) and the observed features (e.g., as laser-based sensor feature nodes and/or image-based sensor feature nodes). As described above, the relationships between variables/nodes are defined by factors. These factors may represent probabilistic information about these variables. In some embodiments, a factor may define a factor function related to the transformation between nodes. During optimization, the error for each variable may be minimized such that the probability information associated with the variables is maximized.

Turning now to fig. 9-11C, several flow charts illustrating methods of calibrating a sensor of a materials handling vehicle are depicted. FIGS. 9-10 depict a process of applying a calibration algorithm by using an optimizer in a map or map-less configuration. FIGS. 11A-11C depict a calibration method that constructs a factor graph, such as the factor graphs of FIGS. 5A-8 described above, and uses the factor graph to calibrate the pose of the sensor relative to the materials handling vehicle on which the sensor is mounted, as described herein.

Referring to fig. 9, a flow chart 900 of a sensor calibration method that can be applied to a map or map-less calibration system is depicted. Sensor calibration begins at block 902. Calibration of the sensor may be initiated automatically by the system, or may be initiated manually through a GUI or command prompt interface of a computing device. In some embodiments, calibration may be initiated automatically upon the first activation of the materials handling vehicle. In some embodiments, as shown in block 910, an initial step of calibration includes determining and/or confirming the vehicle model of the materials handling vehicle. The vehicle model may be determined manually by inputting or selecting the vehicle model through a computing device. Alternatively, the vehicle model may be determined automatically by a preprogrammed identifier or by the computing device completing a scan and registration of the components connected thereto. That is, the computing device may be able to automatically determine the vehicle model of the materials handling vehicle through a scan of the installed hardware. For example, a model matching method may be implemented to determine the model of the materials handling vehicle. That is, when sensor data is collected, for example, from encoders, IMUs, imaging devices, laser sensors, etc., the data and their sources may be compared to a set of known mathematical models to select the type of materials handling vehicle.

Once the vehicle model is determined, two tasks may be initiated. One task may include initiating a kinematic model so that the sensors and systems used to determine odometry may be calibrated, as shown at block 920. For example, odometry may be calibrated by moving the materials handling vehicle back and forth between two known points several times and comparing the difference between the wheel encoder readings and the actual distance. Such odometry calibration may be performed as described in U.S. Patent No. 9,921,067, entitled "SYSTEMS AND METHODS FOR MATERIALS HANDLING VEHICLE ODOMETRY CALIBRATION," issued March 20, 2018 and assigned to Crown Equipment Corporation.
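As a simple illustration of that back-and-forth comparison (a generic sketch, not the method of the cited patent), a scale correction could be derived as follows; all distances are hypothetical:

```python
def calibrate_encoder_scale(actual_distance_m, implied_distances_m):
    """Scale correction for wheel-encoder odometry: drive a known distance
    several times and compare the encoder-implied travel to ground truth."""
    mean_implied = sum(implied_distances_m) / len(implied_distances_m)
    return actual_distance_m / mean_implied

# Hypothetical passes between two points 10 m apart.
scale = calibrate_encoder_scale(10.0, [10.02, 9.97, 10.05])
# Subsequent odometry increments would be multiplied by `scale`.
```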

At block 930, the odometry calibration determined at block 920 and the sensor seed extrinsic parameters may be received as inputs to the calibration algorithm. In a mapped embodiment, the calibration algorithm at block 930 may also receive a site map and a truck seed position as inputs from block 940. However, as described above, calibration of sensors for materials handling vehicle positioning may not include a site map (e.g., may be map-less). In this case, block 940 may provide only an input of the truck seed position as having a coordinate reference (which may be any coordinate reference, or set to (0,0,0) with respect to x, y, and yaw) on the calibration coordinate system as described herein. In some embodiments, the seed position may be any arbitrary coordinate within the calibration coordinate system. At block 930, the calibration algorithm is executed and the calibration external parameters of the sensor are determined. The example calibration methods described herein may be used as the calibration algorithm.

If the calibration external parameters are determined to be acceptable at block 950, they are saved as the calibrated external parameters at block 970. However, if it is determined at block 950 that the calibration external parameters determined at block 930 are not acceptable, then calibration may be restarted at block 960, returning to block 910 or block 930. An analysis of the calibration external parameters determined at block 930 and/or of the previously completed calibration process may be completed at block 960 before the calibration is restarted. The analysis at block 960 may be configured to determine whether a particular problem of the calibration process that resulted in the unacceptable calibration external parameters can be identified. For example, the analysis may determine that an incorrect truck vehicle model was determined at block 910, or that the calibration path executed when running the calibration algorithm was insufficient (e.g., not including movement in a particular direction, i.e., the vehicle only traveled in a straight line and no turn was performed during calibration). Thus, block 960 may determine that calibration may be restarted at an intermediate step, rather than from scratch. For example, if the vehicle did not complete a sufficiently diverse path to generate acceptable calibration parameters, then the calibration may be restarted by continuing the calibration algorithm at block 930. However, if the vehicle model was determined incorrectly, then calibration may be restarted at block 910 to determine the correct vehicle model.

Referring now to FIG. 10, another example flowchart 1000 illustrating a method of calibrating a sensor of a materials handling vehicle is depicted as a non-limiting example of a calibration algorithm. For example, the flow chart depicted in fig. 10 may be an embodiment of the calibration algorithm implemented at block 930 in fig. 9. Calibration is initiated at block 1002. As discussed with reference to fig. 9, calibration may be initiated automatically or manually. Once initiated, sensor data is captured by the sensor(s) of the materials handling vehicle at block 1010. In some embodiments, the sensor data may be captured by the sensor being calibrated as well as by other sensors. For example, sensor data from the sensors used to determine odometry (e.g., as determined at block 1070) and sensor data from the camera may be collected at block 1010. Features are then extracted from the raw sensor data collected at block 1010. For example, at block 1020, features are extracted by one or more algorithms configured to identify, isolate, and/or extract features from the sensor data. Feature extraction may employ object and image detection and identification techniques such as the scale-invariant feature transform ("SIFT"), speeded-up robust features ("SURF"), blob detection methods, and edge-detection algorithms. Such algorithms may identify features and/or detect objects within the sensor data that may be used as features that the calibration method can associate with future or previously observed features from future or previously collected sensor data. Examples of such feature extraction techniques that may be used herein are described in U.S. Patent Nos. 9,170,581 and 9,606,540, assigned to Crown Equipment Corporation.
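As one hedged example of such extraction, assuming OpenCV is available and that ceiling lights appear as bright blobs in an overhead grayscale image, a blob-based extractor might look like the following; the area threshold is a hypothetical tuning value:

```python
import cv2

def extract_light_features(gray_image):
    """Detect bright blobs (candidate ceiling lights) in an overhead image."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 255      # look for bright regions
    params.filterByArea = True
    params.minArea = 50.0       # hypothetical: tune to light fixture size
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray_image)
    return [(kp.pt[0], kp.pt[1]) for kp in keypoints]  # (u, v) pixel centers
```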

At block 1030, the calibration method associates the extracted sensor data with previously observed features. Some data association methods that may be implemented include, but are not limited to, image-to-image matching, pixel-velocity nearest-neighbor tracking, joint compatibility branch and bound ("JCBB"), and the like. By way of non-limiting example, the extracted sensor data may include features defining a first luminaire, and the previously observed features may also include the first luminaire observed from a previous iteration of sensor capture and feature extraction. Features from the current image may be associated with the previous image to attach to a camera frame node and construct the factor graph FG. At block 1030, the extracted features and previously observed features may be matched. This association may occur, for example, by using a pixel-velocity nearest-neighbor image space tracking algorithm, or any algorithm that tracks features from frame to frame without prior knowledge of the location of the features in the calibration coordinate system (CCF). The associated features may then be appended to a Camera Frame Node (CFN), which may result in multiple camera frame nodes referencing a single sensor feature node, and this may be accomplished by using a projection factor. In some embodiments, such as those utilizing a map or map-less system, the data association technique may match observations from the current frame to features already present in the model (e.g., utilizing an algorithm such as nearest neighbor or JCBB). That is, data association of sensor feature nodes may be accomplished by image-to-image mapping or image-to-model mapping.
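A minimal sketch of such frame-to-frame nearest-neighbor association in image space, with a hypothetical gating distance, might look like the following; function and parameter names are illustrative:

```python
import numpy as np

def associate_nearest_neighbor(tracked, detected, max_px=30.0):
    """Greedy frame-to-frame nearest-neighbor matching in image space.
    `tracked` holds predicted (u, v) positions of features from the previous
    frame (optionally advanced by their pixel velocity); `detected` holds the
    current extractions. Returns (tracked_idx, detected_idx) pairs."""
    matches, used = [], set()
    for i, t in enumerate(tracked):
        dists = [np.hypot(t[0] - d[0], t[1] - d[1]) for d in detected]
        if not dists:
            break
        j = int(np.argmin(dists))
        if dists[j] <= max_px and j not in used:
            matches.append((i, j))
            used.add(j)
    return matches
```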

That is, at block 1040, with reference to the factor graphs discussed herein, the projection factor and/or the bearing-range factor may associate sensor data (e.g., the camera frame at the current location of the materials handling vehicle) with the first luminaire as previously observed from a different location and as observed at the current location of the materials handling vehicle. If there is no previous observation, the extracted sensor data can be added as a new node to the factor graph in a map-less calibration. One or more frames may be created accordingly and added to the factor graph FG to create an optimization problem.

The position of the materials handling vehicle may be provided to the model based on odometry information generated by movement of the materials handling vehicle at block 1070 and, in an initial iteration of the method, from the raw sensor data captured at block 1010. The data association at block 1030 may be repeated for one or more extracted features and previously observed features, with each matching feature added to the factor graph or calibration model and/or each unmatched feature added as a new feature of the factor graph or calibration model. Further, at block 1040, model error values may be determined and an optimization problem may be defined from the model, e.g., the factor graph.

The optimization problem is solved at block 1050 to produce a calibration output. Sensor external parameters and/or confidence values are determined based on the calibration output determined by solving the optimization problem. If the sensor external parameters and/or confidence values indicate a local minimum at block 1060, such that adding additional information to the factor graph does not result in a significant change in the overall graph error, then the calibration is complete; the sensor external parameters based on the calibration output are stored as calibration values at block 1080 and the calibration ends. The calibration results may then continue to block 950 of fig. 9. However, in some embodiments, if adding additional information to the factor graph at block 1060 does result in a significant change in the overall graph error, the materials handling vehicle continues to move through the environment at block 1070, generating further odometry and raw sensor data from at least the sensor being calibrated, and the process returns to block 1010. This may continue until adding additional information to the factor graph no longer results in a significant change in the overall error of the graph, at which point the calibration may be ended. It should be understood that the foregoing method is merely a non-limiting example of a method of calibrating a sensor of a materials handling vehicle.
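The local-minimum test at block 1060 could be approximated by watching the overall graph error over recent iterations, as in this illustrative sketch; the window size and tolerance are hypothetical:

```python
def calibration_converged(error_history, window=5, tol=1e-3):
    """Treat the optimization as being at a local minimum when adding new
    frames no longer changes the overall graph error appreciably."""
    if len(error_history) < window + 1:
        return False
    recent = error_history[-(window + 1):]
    deltas = [abs(b - a) for a, b in zip(recent, recent[1:])]
    return max(deltas) < tol
```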

Referring now to fig. 11A-11C, a flow diagram is depicted illustrating subsets of an example method of constructing a factor graph FG and using the factor graph to calibrate the sensor 110 of the materials handling vehicle 100. The materials handling vehicle 100 includes the sensor 110, a vehicle position processor 130, one or more vehicle dead reckoning components, and a drive mechanism configured to move the materials handling vehicle 100 along an inventory transit surface. The one or more dead reckoning components may include an odometer that includes odometer logic 126. The sensor 110 is configured to record one or more features of a warehouse (e.g., the warehouse environment 10), and the vehicle position processor 130 is configured to generate sensor data 144 from the recording of the one or more features 150 recorded by the sensor 110. In an embodiment, the sensor 110 is a camera configured to capture one or more features 150 of the warehouse, and the vehicle position processor is further configured to generate the sensor data 144 from a recording of the one or more features captured by the camera. The corresponding image-based factor graphs FG generated in the embodiments described herein and in more detail below are depicted in fig. 5A and 6 as respective factor graphs 500, 600. Additionally or alternatively, the sensor 110 is a laser configured to detect one or more features of the warehouse, and the vehicle position processor is further configured to generate the sensor data 144 from a recording of the one or more features detected by the laser. The corresponding laser-based factor graphs FG generated in the embodiments described herein and in more detail below are depicted in fig. 5B and 7 as respective factor graphs 500', 700. The corresponding combined image-based and laser-based factor graph FG generated in the embodiments described herein and in more detail below is depicted in fig. 8 as factor graph 800.

With respect to fig. 11A, calibration is initiated at block 1102. As discussed with reference to fig. 9, calibration may be initiated automatically or manually. Once initiated, sensor data 144 from the sensor 110 being calibrated is collected and received at block 1104. The sensor data may include image data collected by a camera and/or range data collected by a laser sensor (for example, by laser scanning), or a combination of both. The sensor data is processed and features are extracted at block 1106. For example, the vehicle position processor 130 is configured to extract one or more features 150 from the sensor data 144.

Features can be extracted in several ways. For example, as described above, feature extraction may employ object and image detection and recognition techniques such as the scale-invariant feature transform ("SIFT"), speeded-up robust features ("SURF"), blob detection methods, and edge detection algorithms. Such algorithms may identify features and/or detect objects within the sensor data that may be used as features that the calibration method can associate with future or previously observed features from future or previously collected sensor data.

At block 1108, the system determines whether one or more features have been extracted from the sensor data. If no features are extracted, or at least none successfully, the system collects additional sensor data by returning the process to block 1104. If one or more features are extracted from the sensor data, as determined at block 1108, the system continues to determine whether the factor graph has been initialized at block 1110. If the factor graph has not been initialized, the system continues with the step of initializing the factor graph at block 1112. At block 1112, the system creates the sensor external reference node (e.g., the a priori factor 502 and sensor external reference node eO, 510 shown in fig. 5A) with an a priori factor that includes the known seed external reference. For example, the vehicle position processor 130 is configured to create a factor graph FG that includes a sensor external reference node (eO), which includes an initial seed external reference associated with the sensor 110.

Next, at block 1114, the system determines whether a site map 142 (which may incorporate a feature map of one or more features within the warehouse environment) and materials handling vehicle seed locations for the warehouse environment 10 are provided. In other words, this step determines whether the calibration method is a map-less calibration or a map-based calibration. The vehicle position processor 130 may be configured to generate an initial vehicle node (vO), an initial sensor frame node (cO), and an initial sensor feature node (fO) in a factor graph FG of a map or map-less configuration as described below, the initial sensor feature node (fO) comprising selected ones of the extracted one or more features associated in the initial data association with the initial sensor frame node (cO) and the initial vehicle node (vO).

The vehicle position processor 130 is further configured to, when the vehicle seed position and the feature map are not provided, generate the initial vehicle node (vO) in the factor graph FG as a map-less initial vehicle node at the origin. Thus, if the site map 142 and the materials handling vehicle seed position are not provided, the system operates in a map-less configuration and creates a first vehicle node (e.g., node vO, 612 shown in fig. 6) in the factor graph. In this case, at block 1116, the first vehicle node may be placed at an origin point defined by the origin coordinate value (0,0,0) in the sensor calibration frame associated with the sensor 110.

The vehicle position processor 130 is further configured to, when the vehicle seed position and a feature map comprising one or more map features are provided, generate in the factor graph FG (i) the initial vehicle node (vO) as a map-based initial vehicle node at the vehicle seed position, having (ii) an associated vehicle a priori factor, and (iii) the one or more map features from the site map 142 as corresponding sensor feature nodes with (iv) corresponding map feature a priori factors. The associated vehicle a priori factor for the map-based initial vehicle node at the vehicle seed position may comprise an error function with respect to the accuracy of the materials handling vehicle's seed position relative to the site map 142. Thus, if the site map 142 and the materials handling vehicle seed position are instead provided, the system operates in a mapped configuration at block 1118 and creates a first vehicle node (e.g., node vO, 512 shown in FIG. 5A) at the seed position. In the mapped configuration, the vehicle node will also include an a priori position factor (e.g., the position a priori factor 504 shown in fig. 5A). The a priori position factor may define a probability or error function with respect to the accuracy of the seed position of the materials handling vehicle with reference to the site map 142. At block 1120, each map feature based on the site map 142 is added as a node to the factor graph along with an a priori factor for each map feature. In embodiments that include a site map 142 for calibrating the sensor, the map features defined in the site map 142 are the features with which the system will associate sensor data from the sensor during calibration.

The vehicle position processor 130 is further configured to navigate the materials handling vehicle 100 along the inventory transit surface using the drive mechanism and to generate a subsequent vehicle node (v1) in the factor graph FG based on the accumulated odometry from the one or more vehicle dead reckoning components. Referring back to block 1110, if the factor graph is initialized at item B, a second vehicle node (e.g., node v1, 514, or 614 shown in fig. 5A and 6, respectively), transformed relative to the first vehicle node based on the accumulated odometry from vehicle movement (described in more detail with respect to blocks 1146, 1148, and 1150 between items C and D), is created and added to the factor graph at block 1122. Further, a between factor linking the first vehicle node to the second vehicle node, which defines a probability relationship or an error function of the vehicle position transformation, is generated at block 1124. The vehicle position processor 130 may be configured to generate a between factor between the subsequent vehicle node (v1) and the initial vehicle node (vO) in the factor graph FG based on the accumulated odometry. The between factor may define the error in determining the second vehicle node from the first vehicle node. In other words, the between factor describes the transformation between the two vehicle poses and the associated uncertainty of the transformation.

The vehicle position processor 130 is further configured to generate, in the factor graph FG, a subsequent sensor frame node (c1) associated in a subsequent data association with one of the initial sensor feature node (fO) or a subsequent sensor feature node (f1) and with the sensor external reference node (eO) and the subsequent vehicle node (v1). Once the factor graph is initialized at block 1110, such as during the first iteration of the calibration process or after determining and creating a second vehicle node position within the factor graph via blocks 1112 through 1124, a factor corresponding to item A is added to the factor graph at block 1126. By way of example and not limitation, adding a reference frame factor 619 linking the first vehicle node vO, 612 to the sensor node cO, 618 is shown in fig. 6, and adding a reference frame factor 719 linking the first vehicle node vO, 712 to the sensor node L0, 718 is shown in fig. 7. The linking reference frame factor defines the relationship between the position of the materials handling vehicle 100 and the sensor node.

At block 1128, one of the extracted features determined at block 1106 is selected. The selected extracted feature is intended to be associated with an existing feature, for example, from one or more previously extracted features and/or features already defined from the site map 142. If the selected extracted feature matches an existing feature at block 1132, a factor (e.g., the projection factor 525 shown in FIG. 5A) is generated and added to the factor graph at block 1136, linking the sensor node to the matching feature defined in the factor graph. As non-limiting examples, the added factor may be a projection factor for a camera feature or a bearing-range factor for a laser feature. Thus, the vehicle position processor 130 may be configured to select one of the extracted one or more features, associate the one feature with an existing feature as a matching sensor feature node when the one feature matches the existing feature, and generate, in the factor graph FG, a projection factor that links the generated sensor frame node with the matching sensor feature node. The generated sensor frame node may comprise one of the initial sensor frame node (cO) or the subsequent sensor frame node (c1), the matching sensor feature node may comprise one of the initial sensor feature node (fO) or the subsequent sensor feature node (f1), and the existing sensor feature may be defined in the site map 142.

Alternatively, in embodiments, if the selected one of the extracted features does not match an existing feature, a new feature node may be created and added to the factor graph at block 1134. However, in other embodiments, it is not necessary to add extracted features that do not match existing features. Subsequently, at block 1136, a factor (e.g., the projection factor 625 shown in fig. 6) is generated and added to the factor graph, linking the sensor node to the new feature node defined in the factor graph. Thus, the vehicle position processor 130 may be configured to select one of the extracted one or more features, generate a new sensor feature node associated with the one feature in the factor graph when the one feature does not match an existing feature, and generate a projection factor in the factor graph FG linking the generated sensor frame node with the new sensor feature node. The new sensor feature node may comprise one of the initial sensor feature node (fO) or the subsequent sensor feature node (f1). Although the above steps and processes are described with respect to a map-less calibration method, it should be understood that similar steps and process flows may be employed in processes implementing a mapped calibration system and method.

The system then determines whether there are more extracted features at block 1138. For example, the system determines whether there are any additional extracted features identified during the current iteration of the calibration method to be associated with existing features or added to the factor graph. If additional extracted features remain to be associated with existing features or added to the factor graph, block 1138 returns the system to block 1128 for further processing of the next extracted feature. However, if no additional extracted features remain to be associated or added, block 1138 moves the system to block 1140, at which the vehicle position processor 130 is configured to optimize the factor graph FG to provide a calibration output associated with the sensor 110 based on the initial data association and the subsequent data association. The vehicle position processor 130 may also be configured to navigate the materials handling vehicle 100 along the inventory transit surface based on the calibration output associated with the sensor 110.

Referring to block 1140, the system optimizes the factor graph to provide a solution as the calibration output. Further, the vehicle position processor 130 may be configured to optimize the factor graph when no others of the extracted one or more features remain to be associated with existing features as corresponding matching sensor feature nodes or added to the factor graph as corresponding new sensor feature nodes. In some embodiments, the optimization may occur at any time rather than at each step; for example, optimization may be triggered at a particular time. For example, the system may optimize the factor graph by generating a constrained optimization problem based on the factors and variables defined in the factor graph. At block 1142, the system determines whether a local minimum is reached during optimization of the factor graph. If a local minimum is detected and, for example, a threshold minimum is reached, an acceptable calibration error is determined, etc., then the optimized external reference values in the factor graph are returned at block 1144 and may then be passed to a second filter at block 950 (FIG. 9) to determine acceptability. In some local-minimum embodiments, determining at block 1142 that the optimizer has reached a local minimum may include identifying a condition in which providing any more data to the optimizer will not result in a significant change in the output. In some embodiments, block 1142 determines that the local minimum is reached when the threshold is reached and maintained for a predefined period of time. In an acceptability embodiment, block 950 may determine that the calibration values determined from the optimization of the factor graph are acceptable based on the rate of change of the error being below a predetermined value. In some acceptability embodiments, tolerances matching expected calibration values, optimized path characteristic analysis (total travel distance, total angular travel, time used, speed, etc.), sensor convergence, feature diversity, characteristics of the factor graph (distribution weights, structures, discontinuities), spot checks using existing positioning algorithms, or other mechanisms may be implemented to determine when the calibration values are acceptable. In embodiments, the acceptability determination determines that the calibration values are acceptable based on the provided context through one or more heuristic checks including, but not limited to, path characteristics, sensor coverage, geometric constraint checks, comparison with a predetermined acceptable calibration, comparison with calibrated averages for a fleet population, and/or spot checks using other positioning systems. Given the broader context of a particular truck or sensor, such acceptability determinations may utilize domain-specific knowledge to detect one or more failures of one or more optimizers.
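As an illustration of the path-characteristic heuristics mentioned above (total travel distance, total angular travel), an acceptability pre-check might be sketched as follows; the thresholds are hypothetical and yaw is assumed unwrapped:

```python
import math

def calibration_acceptable(path, min_distance_m=20.0, min_turn_rad=3.14):
    """Heuristic acceptability check on the calibration path: require enough
    total travel and enough angular excursion so the extrinsic is observable.
    `path` is a list of (x, y, yaw) vehicle poses from the calibration run."""
    dist = sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(path, path[1:]))
    turn = sum(abs(b[2] - a[2]) for a, b in zip(path, path[1:]))
    return dist >= min_distance_m and turn >= min_turn_rad
```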

The vehicle position processor 130 may also be configured to optimize the factor graph by generating a constrained optimization problem based on one or more factors and one or more variables defined in the factor graph. The one or more variables represent unknown random variables in the constrained optimization problem and include one or more nodes as described herein, such as one or more of the sensor external reference node (eO), the initial vehicle node (vO), the subsequent vehicle node (v1), the initial sensor frame node (cO), the subsequent sensor frame node (c1), the initial sensor feature node (fO), and the subsequent sensor feature node (f1). The one or more factors as described herein may represent probabilistic information about selected ones of the one or more variables and may include an a priori factor, a between factor, a reference frame factor, a projection factor, a bearing-range factor, or a combination thereof.

As described further herein, the constrained optimization problem may be constructed using one or more smoothing and mapping (SAM) libraries and optimizers. In an embodiment, the one or more features 150 include one or more overhead lights of the warehouse, and the constrained optimization problem is constructed using an image recognition algorithm, a data association algorithm, a modeling technique, or a combination thereof. As described herein, the image recognition algorithm is configured to track at least a portion of a static overhead light as one of the one or more features on a frame-by-frame basis and includes an edge detection routine, a feature detection routine, an object recognition routine, or a combination thereof. Further, the data association algorithm includes tracking the feature frame-by-frame using a pixel-velocity nearest-neighbor image space tracking algorithm without prior knowledge of the location of the feature in the sensor calibration frame associated with the sensor. Further, the modeling technique includes a structure-from-motion (SFM) technique, a simultaneous localization and mapping (SLAM) technique, a smoothing and mapping (SAM) library technique, or a combination thereof.

Returning to block 1142 (FIG. 11B), if no local minimum is detected at block 1142, the materials handling vehicle moves to a new location, for example, via the command associated with item C executed at block 1146. Odometry is accumulated from the last vehicle location to the new vehicle location at block 1148. In some embodiments, the system determines at block 1150 whether the materials handling vehicle has moved a predetermined distance from the last position. If the materials handling vehicle has not moved at least the predetermined threshold distance, the materials handling vehicle continues to move in response to block 1146. When the vehicle has moved at least the predetermined threshold distance, as determined at block 1150, the calibration method returns to block 1104 associated with item D, and a new set of sensor data is collected by the sensor being calibrated. It should also be understood, and is within the scope of this disclosure, that in embodiments the data provided to the optimizer may include observations from sensors other than the odometry sensors described herein.
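The distance-gated capture at blocks 1146-1150 could be sketched as follows; the 0.5 m spacing is a hypothetical threshold, not a value from the disclosure, and the class name is illustrative:

```python
import math

class OdometryAccumulator:
    """Accumulate dead-reckoned travel and signal when the vehicle has moved
    far enough that a new sensor frame is worth adding to the factor graph."""
    def __init__(self, threshold_m=0.5):  # hypothetical capture spacing
        self.threshold_m = threshold_m
        self.distance = 0.0

    def update(self, dx, dy):
        self.distance += math.hypot(dx, dy)
        if self.distance >= self.threshold_m:
            self.distance = 0.0
            return True   # capture a new frame, extend the graph
        return False
```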

It should be understood that the calibration method illustrated and described with respect to the flow chart 1100 in fig. 11A-11C is only an example embodiment of the calibration of a sensor of a materials handling vehicle utilizing a factor graph. Those skilled in the art will appreciate that modifications to the method are contemplated within the scope of the present disclosure.

In an embodiment, a factor graph (FG) is used to construct and solve a constrained optimization problem to reduce an error metric (e.g., of the sensor external reference node), such as by reducing the error below a desired threshold, to determine the pose of the sensor relative to the materials handling vehicle 100. FIG. 4 illustrates an example calibration coordinate system CCF created by travel of the materials handling vehicle 100 from the origin O, mapping overhead features 150 such as ceiling lights 150A-S along the travel path shown in FIG. 3, to generate the factor graph FG of FIG. 5A and a constrained optimization problem, and to solve for an error metric indicative of a calibrated and acceptable pose of the camera with respect to the materials handling vehicle 100.

Referring to FIGS. 2-4, the materials handling vehicle 100 may include and/or be coupled with a vehicle computing device 102, the vehicle computing device 102 being shown and described in greater detail with reference to FIG. 2. The vehicle computing device 102 may include a processor 130, which may be implemented as one or more processors communicatively coupled to the sensor 110. The processor 130 may be configured to execute logic that implements any of the methods or functions described herein. Memory 120 may also be included, and the memory 120 may be utilized to store logic, including machine readable instructions, and may be communicatively coupled to the processor 130, the sensor 110, or any combination thereof.

In particular, FIG. 2 depicts a computing infrastructure that may be implemented and used for the materials handling vehicle 100. The computing infrastructure may include a vehicle computing device 102 that includes and/or is coupled to sensors 110, motion sensors 114, memory 120, processor 130, accelerometer 132, gyroscope 134, input/output hardware 136, network interface hardware 138, and data storage component 140, in communication with each other through a local communication interface 105. The components of the vehicle computing device 102 may be physically coupled together and/or may be communicatively coupled via a local communication interface 105 (e.g., which may be implemented as a bus) or other interface to facilitate communication between the components of the vehicle computing device 102.

The sensors 110 may be any device capable of collecting sensor data that may be used for the positioning of the materials handling vehicle 100. The sensors 110 may be communicatively coupled to the processor 130 and other components of the materials handling vehicle 100. The sensor 110 may include a digital still camera, a digital video camera, an analog still camera, an analog video camera, and/or other devices for capturing overhead images. Additionally or alternatively, the sensors 110 may include other position and location sensor systems coupled to the materials handling vehicle, such as LIDAR systems, RADAR systems, ultrasonic sensor systems, and the like.

Processor 130 may include an integrated circuit, microchip, and/or other device capable of executing machine-readable instructions or that has been configured to perform functions in a manner similar to machine-readable instructions. The memory 120 may be configured as volatile and/or non-volatile memory and thus may include random access memory (including SRAM, DRAM, and/or other types of RAM), ROM, flash memory, a hard drive, Secure Digital (SD) memory, registers, compact discs (CDs), digital versatile discs (DVDs), and/or other types of non-transitory computer-readable media capable of storing logic, such as machine-readable instructions. Depending on the particular embodiment, these non-transitory computer-readable media may reside within the vehicle computing device 102 and/or external to the vehicle computing device 102, as described in more detail below. Thus, the memory 120 may store operating logic 122, sensor logic 124, odometer logic 126, and calibration logic 128 for providing instructions and facilitating the functions described herein.

For example, the operating logic 122 may include an operating system and/or other software for managing components of the vehicle computing device 102. The sensor logic 124 may cause the materials handling vehicle 100 to determine the positional location of the materials handling vehicle 100 relative to the warehouse environment 10 (e.g., the ceiling lights 150A-S) via the sensor data 144 captured by the sensors 110.

As described herein, the odometer logic 126 may be used to generate and calibrate odometer data. The odometer logic 126 may also cause the materials handling vehicle 100 to navigate along the floor of the warehouse environment 10. In some embodiments, the odometer logic 126 may receive one or more signals from one or more sensors, such as one or more motion sensors 114, one or more accelerometers 132, one or more gyroscopes 134, one or more wheel angle sensors, etc., to determine the path traveled by the materials handling vehicle 100 over a period of time. Some other odometry methods may include laser scan matching and/or visual odometry. More particularly, the odometer logic 126 may be configured to determine a change in position of the materials handling vehicle 100 from a first position to a second position. The sensor logic 124 and the odometer logic 126 may each include a number of different logic blocks, each of which may be implemented as a computer program, firmware, and/or hardware, as examples.
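As a generic illustration of dead reckoning from wheel travel (assuming, purely for illustration, a differential-drive approximation; actual materials handling vehicles may use other kinematic models per the vehicle model determined at block 910), the odometer update might look like the following:

```python
import math

def dead_reckon(x, y, yaw, d_left, d_right, track_width):
    """Advance a planar pose from left/right wheel travel (m), a standard
    differential-drive approximation of an odometer update."""
    d_center = 0.5 * (d_left + d_right)
    d_yaw = (d_right - d_left) / track_width
    x += d_center * math.cos(yaw + 0.5 * d_yaw)
    y += d_center * math.sin(yaw + 0.5 * d_yaw)
    return x, y, yaw + d_yaw
```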

In operation, for example, the materials handling vehicle 100 may determine its current location via user input, via the vehicle computing device 102 (such as when the materials handling vehicle 100 passes a radio frequency identifier, via a location system, etc.), and/or via the remote computing device 170. In some embodiments, the sensor 110 may be a camera configured to capture images of a ceiling, which may include one or more ceiling lights 150A-S. Some embodiments are configured such that sensor data 144 (e.g., image data) captured by the sensor 110 may additionally be compared to the site map 142 to determine the current vehicle location. However, to accurately determine the vehicle position based on the sensor data 144 captured by the sensor 110, whether or not such a site map 142 is relied upon, the sensor 110 may first need to be calibrated with respect to the materials handling vehicle 100 with which it operates. Systems and methods described herein calibrate the sensor 110 and, in some embodiments, assist in generating the site map 142.

Still referring to fig. 2, the calibration logic 128 as described herein may be configured to cause tracking of multiple static features (e.g., ceiling lights 150A-S) in the warehouse environment 10 using the sensor 110 to be calibrated, such as an image-based system (e.g., a camera) and/or a laser-based system (e.g., a LIDAR system), to build a constrained optimization problem over multiple observations of the multiple static features from different locations in the warehouse environment 10. This data may be used to create an optimization problem that may be used to optimize the pose of the sensor 110 as it relates to the materials handling vehicle 100. In the embodiments described herein, the calibration logic 128 may not use or rely on a priori knowledge of the warehouse environment 10, such as the site map 142. In some embodiments, the process of calibrating the sensor 110 may also result in the formation of a site map 142, which site map 142 may later be used to locate the materials handling vehicle 100 in the warehouse environment 10 or to supplement its localization.

Referring again to FIG. 2, the motion sensor 114 may be any sensor capable of detecting or determining motion of the materials handling vehicle 100. For example, the motion sensor 114 may include an optical, microwave, acoustic, or light-based sensor configured to detect motion, or changes in the warehouse environment 10 as a function of detected motion. Some such sensors may include infrared sensors, ultrasonic ranging sensors, photoresistors, and the like. In some embodiments, the motion sensor 114 may include one or more encoders (e.g., optical, mechanical, or electromechanical) positioned to detect movement of one or more wheels of the materials handling vehicle 100. The motion sensor 114 may also include one or more sensors capable of and configured to detect the angular or directional heading of the materials handling vehicle 100. Further, the materials handling vehicle 100 may include at least an accelerometer 132 and/or a gyroscope 134 in communication with the vehicle computing device 102.

The input/output hardware 136 may include and/or be configured to interface with the components of fig. 2, including the sensor 110, an odometer that provides odometry data used by the odometer logic 126, and the like. The network interface hardware 138 may include and/or be configured to communicate with any wired or wireless networking hardware, including an antenna, a modem, a LAN port, a Wi-Fi card, a WiMax card, Bluetooth™ modules, mobile communication hardware, and/or other hardware for communicating with other networks and/or devices. Through this connection, communication between the vehicle computing device 102 and other computing devices (such as the remote computing device 170) may be facilitated over, for example, the network 160. The network 160 may include a wide area network such as the internet, a local area network (LAN), a mobile communications network, a public service telephone network (PSTN), and/or other networks, and may be configured to electronically and/or communicatively connect the vehicle computing device 102, the remote computing device 170, and/or any other network-enabled devices.

Still referring to fig. 2, the data storage component 140 may include any storage device capable of storing logic, such as machine-readable instructions, and/or data. The data storage component 140 can be located locally to the vehicle computing device 102 and/or remotely from the vehicle computing device 102, and can be configured to store one or more pieces of data for access by the vehicle computing device 102 and/or other components. As shown, for example, the data storage component 140 may include data defining the site map 142, which site map 142 may include information generated and stored during calibration, sensor data 144 captured by the sensors 110 of the materials handling vehicle 100 during calibration, and/or calibration data 146 for the sensors 110. The calibration data 146 may include manufacturer tolerances for the sensors to be calibrated and positioned on the materials handling vehicle 100, installation calibration data 146, and/or calibration data 146 determined from calibrating the sensors 110 relative to the materials handling vehicle 100 in the warehouse environment 10.

It should be understood that while the components in fig. 2 are illustrated as residing as part of the vehicle computing device 102, this is merely an example. In some embodiments, as described above, one or more components may reside external to the vehicle computing device 102. It should also be understood that while the vehicle computing device 102 is illustrated as a single device, this is also merely an example. In some embodiments, the operating logic 122, sensor logic 124, odometer logic 126, and calibration logic 128 may reside on different computing devices. As an example, one or more of the functions and/or components described herein may be provided by the remote computing device 170 and/or other devices, which may be communicatively coupled to the vehicle computing device 102. These computing devices may also include hardware and/or software for performing the functions described herein (e.g., implementing the methods illustratively shown in fig. 9-12).

Further, while the vehicle computing device 102 is illustrated as having the sensor logic 124, the odometer logic 126, and the calibration logic 128 as separate logic components, this is also an example. In some embodiments, a single logic may cause the vehicle computing device 102 to provide the described functionality.

Referring now to fig. 3, an illustrative time-lapse diagram of the materials handling vehicle 100 at multiple locations is depicted, from which calibration of the sensor is derived by mapping the warehouse environment 10. In particular, FIG. 3 depicts the materials handling vehicle 100 and the associated image area 112 at discrete locations in the warehouse environment 10, the image area 112 defining the region of the ceiling, and the overhead features therein such as ceiling lights 150A-S, captured as sensor data 144 by the sensor 110 (in this example, camera 110A) coupled to the materials handling vehicle 100 at each location. That is, each of the materials handling vehicles 100A depicted in FIG. 3 is the same materials handling vehicle 100A, simply positioned at a different location within the warehouse environment 10. Further, as shown, a number of the image areas 112A-112L overlap, capturing the same static overhead features (e.g., ceiling lights 150A-150S) in more than one image from different locations along the path of the materials handling vehicle 100A through the warehouse environment 10.

As described in more detail below, the calibration method generally calibrates the camera 110A on the materials handling vehicle 100A by collecting image data from the camera 110A at multiple locations in the warehouse environment 10. The image data captured by the camera 110A includes static overhead features that appear in more than one image captured from different locations.

Still referring to fig. 3, the calibration method in the warehouse environment 10 is generally described with respect to the illustrated time-lapse diagram of the materials handling vehicle 100 at a plurality of locations. For purposes of explanation and not limitation, each unique location of the materials handling vehicle 100A, the camera 110A, and the image area 112 defining the image data captured by the camera 110A depicted in fig. 3 is identified by an additional letter ("A"-"L") indicating the unique location (e.g., the materials handling vehicle 100A at location A captures the image data in image area 112A). More specifically, the camera 110A at location A has an image area 112A that includes two static features captured in the image data: ceiling lights 150A and 150B. Location A may be referred to as the initial position and serves as the origin of the coordinate system for the remainder of the calibration.

Unlike calibration with the site map 142, the materials handling vehicle 100A may be seeded anywhere within the warehouse environment 10 and need not be associated with a corresponding location in the site map 142. Thus, errors associated with seeding the materials handling vehicle 100A within the site map 142 may be eliminated, removing the possibility of improper calibration and/or calibration failure resulting from improper or inaccurate initial seeding of the materials handling vehicle 100A. However, it should be understood that the optimizer embodiments described herein may be used in mapped and/or map-less configurations.

As the materials handling vehicle 100A begins to move from location A to location B, its path relative to the reference coordinate system is determined and recorded. The path may be determined using any positioning method that does not depend on the sensor being calibrated. For example, the path may be determined based at least on the use of an odometer, the motion sensor 114, a wheel angle sensor, the gyroscope 134, and/or the accelerometer 132. Thus, vehicle nodes representing vehicle poses within the coordinate system can be determined. For example, if location A is represented as an (x, y, z) coordinate with the initial location seeded at the origin (O) of the coordinate reference frame (CRF) (e.g., as shown in fig. 4) as (0, 0, 0), and the materials handling vehicle 100A is determined to have traveled, for example, 4 meters in the x-direction, location B may be (4, 0, 0).
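Purely as a hedged illustration of the dead-reckoning step just described, the short sketch below accumulates reported motion into successive vehicle poses. It simplifies to planar (x, y, heading) poses rather than the (x, y, z) coordinates above, and the function and variable names are assumptions for this example only, not identifiers from this disclosure:

```python
import math

def advance(pose, distance, dheading):
    # Advance a planar (x, y, heading) pose by a traveled distance and a
    # heading change, as might be reported by wheel encoders and a gyroscope.
    x, y, h = pose
    h += dheading
    return (x + distance * math.cos(h), y + distance * math.sin(h), h)

origin = (0.0, 0.0, 0.0)                # location A, seeded at the CRF origin
location_b = advance(origin, 4.0, 0.0)  # 4 meters straight in the x-direction
print(location_b)                       # (4.0, 0.0, 0.0), matching the example
```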

In some embodiments, the path and/or distance traveled from a previous location to a next location may be determined by an algorithm configured to cause the materials handling vehicle 100A to travel a linear or angular distance such that at least one static overhead feature within the image area 112A captured at location A remains visible within the image area 112B captured at location B. As referred to herein, these unique image areas 112, each of which includes at least one static overhead feature, are referred to as "frames." Although discrete locations are depicted in fig. 3, the camera 110A may continuously record image data as the materials handling vehicle 100A traverses the warehouse environment 10, from which frames including overlapping static overhead features may be selected for calibration purposes, as sketched below.
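One plausible way to realize such frame selection, sketched here under the assumption that a cumulative travel distance is available from odometry; the spacing threshold and all names are illustrative, not from this disclosure:

```python
def select_frames(frames_with_odometry, keyframe_spacing=2.0):
    # frames_with_odometry yields (image, cumulative_distance_m) pairs from
    # the continuously recording camera. A frame is kept each time the
    # vehicle has moved keyframe_spacing meters, chosen small enough that
    # consecutive kept frames still share at least one overhead feature.
    kept, last_d = [], None
    for image, d in frames_with_odometry:
        if last_d is None or d - last_d >= keyframe_spacing:
            kept.append((image, d))
            last_d = d
    return kept
```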

As described above with respect to fig. 3, static overhead features (i.e., ceiling lights 150A-S) captured in one image area 112F may also be captured and visible in another image area 112G. For example, image area 112F and image area 112G share visibility of ceiling light 150J, which lies within the overlap of image area 112F and image area 112G. Tracking of static overhead features such as ceiling lights between frames can be accomplished using a pixel-velocity nearest-neighbor image-space tracking algorithm, or any algorithm capable of tracking features from frame to frame without prior knowledge of the feature locations or of the camera calibration. Some image tracking algorithms may include edge detection, feature detection, and/or object recognition routines for tracking static overhead features, or portions thereof, on a frame-by-frame basis.
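As one hedged sketch of the nearest-neighbor idea described above: each track carries the last observed pixel velocity of a feature, that velocity predicts where the feature should appear in the next frame, and new detections are matched greedily to the nearest prediction. The gating radius and all names are assumptions for illustration only:

```python
import numpy as np

def match_features(tracks, detections, gate_px=30.0):
    # tracks: list of dicts with 'pos' and 'vel' as 2-element numpy arrays
    # (pixels and pixels/frame); detections: (N, 2) array of feature
    # centroids in the new frame. Returns (track_index, detection_index)
    # pairs; unmatched detections would start new tracks.
    matches, used = [], set()
    for ti, track in enumerate(tracks):
        predicted = track['pos'] + track['vel']  # constant-velocity prediction
        dist = np.linalg.norm(detections - predicted, axis=1)
        di = int(np.argmin(dist))
        if dist[di] < gate_px and di not in used:  # greedy nearest-neighbor gate
            matches.append((ti, di))
            used.add(di)
    return matches
```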

Further, as the materials handling vehicle 100A moves throughout the warehouse environment 10, a coordinate position relative to the initial position (e.g., the seed location) may be associated with the image data of each image area 112 (i.e., with each frame). Graphs similar to those depicted in fig. 4 visually represent the path of the materials handling vehicle 100A and the data collected (e.g., image data of the camera 110A) as the materials handling vehicle 100A travels around the warehouse environment 10. As discussed with reference to fig. 1, the sensor 110 (e.g., camera 110A) of the materials handling vehicle 100 may capture views of the same static overhead features (e.g., ceiling lights 150A-S) from various locations within the warehouse environment 10. The graph in fig. 4 depicts the position of the materials handling vehicle 100 and the positions of the ceiling lights 150A-S in the same coordinate reference frame. Note, however, that the graphs in fig. 4 are merely example graphs and do not directly depict the warehouse environment 10 of fig. 3.
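The per-frame data association described above might be represented by a record like the following; the field names are hypothetical and chosen only to make the association explicit:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    # Tracked feature centroids in image space (pixels).
    features_px: List[Tuple[float, float]]
    # Dead-reckoned vehicle pose (x, y, heading) in the CRF at capture time.
    vehicle_pose: Tuple[float, float, float]
```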

It will be appreciated that in such a calibration method, i.e., one that does not make use of a priori knowledge of the environment, calibration of the camera may be achieved by observing the same static object from more than one location within the environment. Thus, the transformations between locations, and the uncertainties of those transformations, can be estimated within the factor graph FG. The factor graph FG may be used to build an optimization problem, which may be generated using the GTSAM library and an optimizer, in which each factor of the factor graph FG is optimized while satisfying the constraints defined by the uncertainties of the inter-node transformations and of the external references (such as the camera external reference, i.e., the camera extrinsics), until a calibration value within acceptable predefined bounds is determined.
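Although the disclosure names the GTSAM library, the following deliberately simplified planar sketch uses scipy.optimize.least_squares as a stand-in for the factor-graph optimizer, to keep the example self-contained. It jointly estimates the camera external reference (extrinsic) and the ceiling-light positions from repeated observations while, for brevity, holding the dead-reckoned vehicle poses fixed; a full factor graph would optimize those as well. All values and names are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

# Dead-reckoned vehicle poses (x, y, heading) at three frames (held fixed).
vehicle_poses = np.array([[0.0, 0.0, 0.0],
                          [4.0, 0.0, 0.0],
                          [8.0, 0.0, 0.1]])

# Observations: (frame index, light index, measured light position in the
# camera frame), e.g., ceiling lights projected onto a plane.
observations = [(0, 0, np.array([2.1, 0.4])),
                (1, 0, np.array([-1.9, 0.4])),
                (1, 1, np.array([2.0, -0.3])),
                (2, 1, np.array([-2.0, -0.2]))]
n_lights = 2

def compose(a, b):
    # Planar pose composition a ∘ b (b expressed in the frame of a).
    ca, sa = np.cos(a[2]), np.sin(a[2])
    return np.array([a[0] + ca * b[0] - sa * b[1],
                     a[1] + sa * b[0] + ca * b[1],
                     a[2] + b[2]])

def residuals(params):
    extrinsic = params[:3]                    # camera pose in the vehicle frame
    lights = params[3:].reshape(n_lights, 2)  # light positions in the CRF
    res = []
    for frame, li, measured in observations:
        cam = compose(vehicle_poses[frame], extrinsic)  # camera pose in CRF
        c, s = np.cos(cam[2]), np.sin(cam[2])
        d = lights[li] - cam[:2]
        predicted = np.array([c * d[0] + s * d[1],      # rotate world->camera
                              -s * d[0] + c * d[1]])
        res.extend(predicted - measured)
    return np.array(res)

# Seed with the nominal mounting pose (cf. the initial seed external
# reference) and rough light guesses, then solve jointly.
x0 = np.zeros(3 + 2 * n_lights)
result = least_squares(residuals, x0)
print("calibrated extrinsic (x, y, yaw):", result.x[:3])
```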

The sensor calibration techniques of the present disclosure are well suited for use at a customer site, in either a specialized or a general warehouse configuration. Using optimization and statistical techniques, the sensor calibration is estimated as the materials handling vehicle 100 travels through the environment. A calibration confidence for this estimate is also calculated and provided to the commissioning engineer as real-time feedback on the calibration progress, to help determine when to end the calibration process. In some embodiments, the calibration process may include both online and offline portions. For example, sensor data may be collected online by the sensor to be calibrated as the materials handling vehicle traverses the environment. The sensor data may be stored in a computer-readable medium for offline analysis. The stored sensor data is input to an offline system, such as a server or computing device. The offline system may then generate a factor graph or similar optimization model based on the sensor data and generate calibration values for the sensor, which may then be loaded into the computing device of the materials handling vehicle.
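One way the calibration confidence mentioned above could be approximated, continuing the hedged sketch from the previous code block (it reuses `result` from that sketch and assumes the Gauss-Newton approximation holds at the solution, with a well-excited trajectory keeping the normal matrix invertible):

```python
import numpy as np

J = result.jac                            # Jacobian at the solution
dof = max(J.shape[0] - J.shape[1], 1)     # residual degrees of freedom
sigma2 = 2.0 * result.cost / dof          # residual variance estimate
cov = sigma2 * np.linalg.inv(J.T @ J)     # approximate parameter covariance
extrinsic_sigma = np.sqrt(np.diag(cov)[:3])
# Shrinking 1-sigma bounds on the extrinsic signal growing calibration
# confidence; a commissioning tool could surface these in real time.
print("extrinsic 1-sigma (x, y, yaw):", extrinsic_sigma)
```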

Moreover, the systems and methods described herein improve the ability to determine the pose of a sensor (e.g., a camera coupled to a materials handling vehicle) relative to the materials handling vehicle by removing the need for a priori information about the positions of static features in the warehouse environment. Thus, calibration may be performed before the site map 142 is generated, which eliminates both the reliance on generating such a priori information and the time associated with bringing the materials handling vehicle to a functional state. Furthermore, potential errors associated with map-based calibration, such as using a SLAM map in one example and manually seeding the materials handling vehicle within that map, are not carried into the calibrated pose of the camera. Further, the information of the site map 142 may be generated in parallel with calibration of the camera 110A or similar sensor 110, since each frame and its associated location coordinates may be compiled into the site map 142 of the warehouse environment 10. This information may also be used, for example, to update an existing site map 142 or to help build a new site map 142 for a new warehouse environment 10. SLAM is just one example of how maps may be generated for use with the systems and methods described herein; other map generation techniques are contemplated and are within the scope of the present disclosure. Another example of a map source is a survey map.

For the purposes of describing and defining this disclosure, it is noted that reference herein to a variable being a "function" of a parameter or another variable is not intended to denote that the variable is exclusively a function of the listed parameter or variable. Rather, reference herein to a variable that is a "function" of a listed parameter is intended to be open ended, such that the variable may be a function of a single parameter or of multiple parameters.

It should also be noted that recitations herein of "at least one" component, element, etc., should not be used to create an inference that the alternative use of the articles "a" or "an" should be limited to a single component, element, etc.

It should be noted that recitations herein of a component of the present disclosure being "configured" in a particular way, to embody a particular property, or to function in a particular manner, are structural recitations as opposed to recitations of intended use. More specifically, the references herein to the manner in which a component is "configured" denote an existing physical condition of the component and, as such, are to be taken as a definite recitation of the structural characteristics of the component.

For the purposes of describing and defining the present invention it is noted that the term "substantially" is utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. The term "substantially" is also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.

Having described the subject matter of the present disclosure in detail and by reference to specific embodiments thereof, it should be noted that the various details disclosed herein should not be construed as implying that such details relate to elements that are essential components of the various embodiments described herein, even though specific elements are illustrated in each of the figures accompanying this specification. Furthermore, it will be apparent that modifications and variations are possible without departing from the scope of the disclosure, including but not limited to the embodiments defined in the appended claims. More specifically, although some aspects of the present disclosure are identified herein as preferred or particularly advantageous, it is contemplated that the present disclosure is not necessarily limited to these aspects.

While particular embodiments and aspects of the present disclosure have been illustrated and described herein, various other changes and modifications can be made without departing from the spirit and scope of the present disclosure. Further, while various aspects have been described herein, these aspects need not be used in combination. Accordingly, it is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the embodiments shown and described herein.

It should now be appreciated that the embodiments disclosed herein include systems, methods, and non-transitory computer-readable media for calibrating a sensor of a materials handling vehicle. It should also be understood that these embodiments are illustrative only and are not intended to limit the scope of the present disclosure.
