Information processing apparatus, control method of information processing apparatus, and storage medium

Document No. 447690 | Publication date: 2021-12-28

Reading note: This technique, "Information processing apparatus, control method of information processing apparatus, and storage medium," was created by 今井彰人 on 2021-06-21. Summary: The invention provides an information processing apparatus, a control method of the information processing apparatus, and a storage medium. In a case where a loop in the movement path of a sensor is detected, the information processing apparatus performs a first correction to correct a position and an orientation associated with a first measurement point used for estimating the position and the orientation, so as to be based on a position and an orientation of a second measurement point existing in the vicinity of the sensor when the loop is detected.

1. An information processing apparatus, comprising:

an acquisition unit configured to acquire sensor information that is output from a sensor configured to move and that is obtained by measuring a surrounding environment;

a generation unit configured to generate map data indicating a map based on a movement path of the sensor, the map data including measurement points that associate the sensor information with a position and an orientation of the sensor;

an estimation unit configured to estimate a position and an orientation of the sensor based on the measurement point and the sensor information acquired by the acquisition unit;

a detection unit configured to detect a first measurement point included in the map data based on an output of the sensor;

a first correction unit configured to, in a case where the first measurement point is detected by the detection unit, correct a position and an orientation, which are associated with a second measurement point used by the estimation unit to estimate a position and an orientation, and which are included in the map data, to be based on the position and the orientation of the first measurement point; and

a second correction unit configured to, in a case where the first measurement point is detected by the detection unit, correct a position and an orientation included in the map data and associated with a plurality of measurement points including a measurement point different from the second measurement point, the plurality of measurement points being greater in number than the measurement points corrected by the first correction unit.

2. The information processing apparatus according to claim 1, wherein in a case where a loop is detected by the detection unit, the generation unit does not generate a new measurement point until the correction by the first correction unit is completed.

3. The information processing apparatus according to claim 1, wherein the first correction unit corrects the position and orientation using rigid transformation.

4. The information processing apparatus according to claim 1, wherein the first correction unit calculates the position and orientation at the second measurement point after correction based on a relative position and orientation between the position and orientation at the second measurement point and the position and orientation at the first measurement point.

5. The information processing apparatus according to claim 4, wherein the first correction unit corrects the position and orientation at a third measurement point by adding, to the position and orientation at the third measurement point, the difference between the position and orientation at the second measurement point before correction and the position and orientation at the second measurement point after correction.

6. The information processing apparatus according to claim 1,

wherein the map data generated by the generation unit includes position information on feature points in the surrounding environment,

wherein the measurement point includes information on the feature point observed at the measurement point based on the sensor information, and

wherein the estimation unit estimates the position and orientation of the sensor based on the feature points.

7. The information processing apparatus according to claim 6,

wherein the second measurement point is one or more measurement points at which the feature point used for estimation by the estimation unit is observed, and

wherein the first correction unit further corrects the position of the feature point.

8. The information processing apparatus according to claim 7, wherein the first correction unit integrates a first feature point and a second feature point, whose positions are corrected, in a case where the first feature point and the second feature point included in the map data are within a predetermined distance and a feature of the first feature point observed based on the sensor information is similar to a feature of the second feature point observed based on the sensor information.

9. The information processing apparatus according to claim 8, wherein the first correction unit corrects the position and orientation by performing bundle adjustment on an integrated feature point and a measurement point at which the integrated feature point is observed.

10. The information processing apparatus according to claim 1, wherein the first correction unit performs correction according to a distance between the second measurement point and the first measurement point.

11. The information processing apparatus according to claim 1,

wherein the map data comprises a pose graph comprising relative positions and orientations between measurement points, and

wherein the second correction unit corrects the positions and orientations at the plurality of measurement points so as to minimize an error between the relative positions and orientations indicated in the pose graph and the relative positions and orientations calculated from the positions and orientations at the plurality of measurement points.

12. The information processing apparatus according to claim 1,

wherein the sensor is a camera, and

wherein the sensor information is an image.

13. The information processing apparatus according to claim 1, wherein the detection unit detects a loop in a case where an object observed based on the sensor information associated with the first measurement point is observed based on the sensor information acquired by the acquisition unit and output from the sensor that has moved.

14. The information processing apparatus according to claim 1, the information processing apparatus further comprising:

the sensor; and

a moving unit.

15. The information processing apparatus according to claim 14, wherein the moving unit moves based on an operation by a user.

16. The information processing apparatus according to claim 14, wherein the moving unit moves along a preset route based on the position and orientation estimated by the estimation unit.

17. The information processing apparatus according to claim 14, wherein the moving unit is a wheel or a propeller.

18. The information processing apparatus according to claim 1, wherein the correction by the first correction unit is performed before the correction by the second correction unit is performed.

19. An information processing method, comprising:

acquiring sensor information that is output from a sensor configured to move and that is obtained by measuring a surrounding environment;

generating map data indicative of a map based on a path of movement of the sensor, the map data including measurement points that associate the sensor information with a position and orientation of the sensor;

estimating a position and orientation of the sensor based on the measurement points and the acquired sensor information;

in a case where a first measurement point included in the map data is detected based on an output of the sensor, correcting a position and an orientation, which are associated with a second measurement point used for estimating a position and an orientation and are included in the map data, to be based on the position and the orientation of the first measurement point; and

in a case where the first measurement point included in the map data is detected based on the output of the sensor, correcting a position and an orientation included in the map data and associated with a plurality of measurement points including a measurement point different from the second measurement point.

20. A non-transitory storage medium storing a program for causing a computer to execute an information processing method, the information processing method comprising:

acquiring sensor information that is output from a moving sensor and is obtained by measuring a surrounding environment;

generating map data indicative of a map based on a path of movement of the sensor, the map data including measurement points that associate the sensor information with a position and orientation of the sensor;

estimating a position and orientation of the sensor based on the measurement points and the acquired sensor information;

in a case where a first measurement point included in the map data is detected based on an output of the sensor, correcting a position and an orientation, which are associated with a second measurement point used for estimating a position and an orientation and are included in the map data, to be based on the position and the orientation of the first measurement point; and

in a case where the first measurement point included in the map data is detected based on the output of the sensor, correcting a position and an orientation included in the map data and associated with a plurality of measurement points including a measurement point different from the second measurement point.

Technical Field

The present disclosure relates to techniques for estimating the position and orientation of a sensor and generating an electronic map for use in the estimation.

Background

Autonomous moving bodies such as Automated Guided Vehicles (AGVs) are used in factories or distribution warehouses. As a method for estimating the position and orientation of such an AGV and creating electronic environment map data used in the estimation, a Simultaneous Localization and Mapping (SLAM) technique using a camera or a laser range scanner as a sensor is known.

"Mur-ar, r., & Tardos, j.d. orb-SLAM2: an Open-Source SLAM System for simple, Stereo and RGB-D cameras. ieee Transactions on Robotics 33(5) 1255-" discusses a technique in which a computer identifies a correspondence between a position in a real space and a measurement point on environment map data based on information of each measurement point acquired by a sensor in a process of generating the environment map data by moving a mobile body mounted with the sensor. The above-mentioned prior art document also discusses a closed loop (loop closing) technique for correcting any one of a position and an orientation in environment map data based on a correspondence between a position in a real space recognized by a computer and a measurement point.

In the method discussed by "Mur-Artal, R., & Tardos, J.D.ORB-SLAM2: an Open-Source SLAM System for cellular, Stereo and RGB-D cameras. IEEE Transactions on Robotics 33(5) 1255-1262", it takes a long time to correct the environment map data when the loop is closed, which results in a decrease in the accuracy of the position/orientation estimation. Specifically, if the sensor is continuously moved after the correction process is started, a new measurement point may be generated which is affected by the error accumulated before the correction process is completed. In this case, the position/orientation estimation result may vary depending on whether the position and orientation are estimated based on a new measurement point or based on an existing measurement point.

Disclosure of Invention

The information processing apparatus includes: an acquisition unit configured to acquire sensor information that is output from a sensor configured to move and that is obtained by measuring a surrounding environment; a generation unit configured to generate map data indicating a map based on a movement path of the sensor, the map data including measurement points that associate the sensor information with a position and an orientation of the sensor; an estimation unit configured to estimate a position and an orientation of the sensor based on the measurement point and the sensor information acquired by the acquisition unit; a detection unit configured to detect a first measurement point included in the map data based on an output of the sensor; a first correction unit configured to, in a case where the first measurement point is detected by the detection unit, correct a position and an orientation, which are associated with a second measurement point used by the estimation unit to estimate a position and an orientation, and which are included in the map data, to be based on the position and the orientation of the first measurement point; and a second correction unit configured to, in a case where the first measurement point is detected by the detection unit, correct a position and an orientation included in the map data and associated with a plurality of measurement points including a measurement point different from the second measurement point, the plurality of measurement points being greater in number than the number of measurement points corrected by the first correction unit.

Other features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.

Drawings

Fig. 1 is a diagram illustrating an example of a moving path of a moving body and a state of an environment map before correction processing is performed according to one or more aspects of the present disclosure.

Fig. 2 is a diagram illustrating an example of a state of an environment map in a case where the first correction process is not performed according to one or more aspects of the present disclosure.

Fig. 3 is a diagram illustrating an example of a state of an environment map after first correction processing is performed according to one or more aspects of the present disclosure.

Fig. 4 is a diagram illustrating an example of a state of an environment map before performing second correction processing according to one or more aspects of the present disclosure.

Fig. 5 is a diagram illustrating an example of a state of the environment map after the second correction processing is performed according to one or more aspects of the present disclosure.

Fig. 6 is a block diagram illustrating a configuration of a mobile body system according to one or more aspects of the present disclosure.

Fig. 7 is a block diagram showing a logical configuration of an information processing apparatus according to one or more aspects of the present disclosure.

Fig. 8 is a block diagram illustrating details of a logical configuration of a closed loop processing unit in accordance with one or more aspects of the present disclosure.

Fig. 9 is a flow diagram illustrating an information processing flow according to one or more aspects of the present disclosure.

Fig. 10 is a flow diagram illustrating details of a closed loop process flow in accordance with one or more aspects of the present disclosure.

Fig. 11 is a flow diagram illustrating details of a first correction process flow according to one or more aspects of the present disclosure.

Fig. 12 is a flow diagram illustrating an information processing flow according to one or more aspects of the present disclosure.

Fig. 13 is a diagram illustrating an example of a state of an environment map before performing the first correction process according to one or more aspects of the present disclosure.

Fig. 14 is a flow diagram illustrating details of a first correction process flow according to one or more aspects of the present disclosure.

Fig. 15 is a diagram illustrating an example of a state of an environment map after correction processing using rigid transformation is performed according to one or more aspects of the present disclosure.

Fig. 16 is a diagram illustrating an example of a state of an environment map after a feature point integration process is performed according to one or more aspects of the present disclosure.

Detailed Description

An exemplary embodiment that prevents the generation of redundant measurement points caused by the long processing time needed to correct map data during loop closing will be described below.

Exemplary embodiments will be described in detail below with reference to the accompanying drawings. The following exemplary embodiments are not intended to limit the scope of the claims of the present disclosure, and not all combinations of features described in the following exemplary embodiments are essential to the present disclosure.

A mobile body system, an environment map creation system, an information processing apparatus, an information processing method, and a computer program according to a first exemplary embodiment will be described in detail below with reference to the accompanying drawings.

The first exemplary embodiment shows an example in which a moving body equipped with a sensor is operated externally by a user to move through an environment, and environment map data usable for position/orientation estimation and for autonomous traveling of the moving body is generated. Although the present exemplary embodiment uses a grayscale camera as the sensor, it is not limited to this example. For example, a compound-eye (multi-lens) camera may be used as the sensor, with depth used as the sensor information. Laser range scanners, laser range finders, Laser Imaging Detection and Ranging (LIDAR) devices, and the like may also be used as sensors.

The term "environment" used in the present exemplary embodiment refers to a three-dimensional space including an area in which the sensor moves and a surrounding area. The "position and orientation" used in the present exemplary embodiment is a value having six degrees of freedom as a combination of three-dimensional position information and orientation information having three degrees of freedom. In the present exemplary embodiment, the position and orientation in the three-dimensional space can be represented by an affine matrix of 4 × 4, and only rotation and parallel movement in the nature of affine transformation are used. If there are two affine matrices a and B respectively representing the position and orientation, an affine matrix d representing the relative position and orientation (position/orientation difference) between the affine matrices a and B can be obtained by adding the affine matrix B to the inverse matrix of the affine matrix a. Similarly, the position/orientation B may be obtained by adding the relative position and orientation d to the position/orientation a.

(environmental map data)

The environment map data used in calculating the position and orientation according to the present exemplary embodiment includes one or more measurement points and a pose graph. Each measurement point includes sensor information (image data captured by the sensor) and position/orientation information, which serve both as information for position/orientation measurement and as information for loop detection.

The pose graph is a simple graph in which the measurement points are nodes and the relative position and orientation between two measurement points is an edge. In the pose graph, each measurement point is connected to one or more other measurement points by edges, and every measurement point in the environment map data belongs to the pose graph.
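
For concreteness, the structures just described might be sketched as follows (field names are illustrative assumptions, not taken from the embodiment):

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class MeasurementPoint:
    """A node of the pose graph: sensor information plus an estimated pose."""
    image: np.ndarray  # sensor information captured at this point
    pose: np.ndarray   # 4x4 affine matrix (position and orientation)

@dataclass
class PoseGraphEdge:
    """An edge: the relative position/orientation between two measurement points."""
    src: int                   # index of one measurement point
    dst: int                   # index of the other measurement point
    relative_pose: np.ndarray  # 4x4 affine matrix, pose of dst in the frame of src

@dataclass
class EnvironmentMap:
    points: list = field(default_factory=list)  # MeasurementPoint nodes
    edges: list = field(default_factory=list)   # PoseGraphEdge edges
```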

(Drift & closed loop)

The relationship between the error (drift) of the position and orientation occurring when the environment map data is generated and the closed-loop processing will now be described. The environment map data is generated by repeatedly generating measurement points and estimating a current position and orientation using the measurement points. Therefore, errors in position and orientation that occur when a measurement point is generated result in errors in subsequent position/orientation estimates using that measurement point, such that the errors accumulate each time a measurement point is generated. The accumulation of errors is called "drift".

Fig. 1 is a diagram schematically showing, on a plane, the relationship between the measurement points generated in the environment map data and the real movement path of the moving body. In fig. 1, the solid lines represent the path of the estimated position and orientation of the moving body, and the broken lines represent its true position and orientation path. Each white circle represents a measurement point. The moving body generates the measurement points B, C, …, E, F, G, and H while moving clockwise in the environment with the measurement point A as a starting point, and then returns to a position near the measurement point A. The measurement point H' represents the true position and orientation at the time the measurement point H was generated. Drift is the phenomenon in which the errors of position and orientation occurring at each measurement point accumulate, so that the estimate gradually deviates from the true position and orientation as the amount of movement increases.

To correct drift, a process called "closed loop" (loop closing) is performed: when the moving body returns to (loops back on) a point already on the environment map data, the positions and orientations of the moving body and of the elements on the environment map data are corrected. In the closed-loop processing, when the sensor detects that the moving body has returned to an already mapped area (loop detection), the error accumulated during the traversal is corrected (loop correction). In the example shown in fig. 1, the position and orientation at each measurement point on the path from the measurement point A to the measurement point H are corrected (pose graph optimization), so that the position and orientation at the measurement point H substantially coincide with the measurement point H' and the relative position/orientation relationships between the corrected measurement point and the other measurement points are satisfied.

However, pose graph optimization uses the positions and orientations at a large number of measurement points on the environment map data as parameters, and can therefore take a long time, depending on the number of measurement points. Consequently, if the moving body keeps moving while pose graph optimization is performed after a loop is detected, a new measurement point may be generated before the optimization is completed.

Fig. 2 is a diagram showing the state of the environment map data in the above-described case. After the measurement point H is generated, a loop is detected between the measurement points A and H. The moving body continues to travel through positions on the environment map data distant from the measurement points A and B, so that new measurement points I and J are generated before the pose graph optimization process is completed. After the pose graph optimization process is completed, the positions and orientations at the measurement points I and J are corrected to the measurement points I' and J', respectively. However, on the pose graph, the measurement points I' and J' are not connected to the measurement points A', B', and C' existing in their vicinity. Therefore, if the moving body moves further forward from the measurement point J', yet another new measurement point may be generated in the vicinity of the measurement point C'. If the moving body later travels near these measurement points again and calculates its position and orientation, the position/orientation estimation result may change depending on whether the position and orientation are calculated with reference to the measurement points B' and C' or with reference to the measurement points I' and J'.

To avoid this problem, in the present exemplary embodiment, local correction processing that can be completed in a short period of time is performed, at the timing at which the loop is detected between the measurement points A and H, on the measurement point H used for measuring the position and orientation of the moving body. Fig. 3 is a diagram showing the state of the environment map data when the local correction processing ends. The position and orientation at the measurement point H are corrected to the measurement point H'. In the present exemplary embodiment, the moving body is still located near the measurement point H' at this time, and no new measurement point is generated.

Fig. 4 is a diagram showing a state in which the moving body continues to move forward along the route past the measurement points A and B after the local correction processing ends and before the pose graph optimization process is completed. At this time, in the present exemplary embodiment, the position and orientation of the moving body can be calculated with reference to the measurement points A and B, so no new measurement points corresponding to the measurement points I and J are generated.

At this time, the path of the moving body obtained from the position/orientation estimation result after the loop is detected exhibits an error toward the left of fig. 4, because the drift of the position and orientation at the measurement point B has not yet been corrected. However, this error is smaller than the error caused by drift at the measurement point H. Further, the relative position/orientation relationship between the path along which the measurement point B was generated and the path after the loop was detected is maintained.

Fig. 5 is a diagram showing the state when the pose graph optimization process is completed. The pose graph optimization corrects the positions and orientations at the measurement points B to G to the measurement points B' to G', respectively, which lie near the true movement path. At this time, the position and orientation of the moving body are also corrected based on the measurement point B.

Thus, in the present exemplary embodiment, local correction processing that completes in a short period of time is performed before pose graph optimization. Even if the pose graph optimization takes a long time, redundant measurement points are prevented from being generated, and the accuracy of position/orientation estimation using the environment map data is prevented from degrading. According to the present exemplary embodiment, in simultaneous localization and mapping (SLAM), when a loop is detected, only the loop source and its vicinity are subjected to local position/orientation correction processing, which completes with a smaller processing amount and in a shorter period of time than the pose graph optimization process, before the pose graph optimization is performed and before a new measurement point is generated. This prevents the generation of measurement points that reflect errors in the environment map data, and prevents the accuracy of the position/orientation estimation result from being degraded by such erroneously generated measurement points.

(construction of Mobile body System, Environment map creation System, and information processing apparatus)

Next, a configuration example of the moving body system 601 according to the present exemplary embodiment will be described with reference to fig. 6. The moving body system 601 includes an environment map creation system 602, a communication unit 605, a control device 606, and a moving unit 607; the environment map creation system 602 includes a sensor 603 and an information processing apparatus 604.

The sensor 603 outputs sensor information obtained by measuring the surrounding environment. The sensor 603 is a camera fixed to the front of the moving body system 601 and configured to continuously acquire grayscale luminance images. For simplicity of explanation, it is assumed that the internal parameters of the camera (such as the focal length and angle of view) are known, and that each image is output either undistorted or with its distortion corrected. The sensor 603 captures images 30 times per second, but other frame rates may be used.

The information processing apparatus 604 estimates the position and orientation of the moving body system 601 based on the information input from the sensor 603, and generates environment map data. During autonomous traveling of the moving body system 601, the information processing apparatus 604 issues movement instructions to the control device 606. The configuration of the information processing apparatus 604 will be described in detail below.

The communication unit 605 receives instructions from the user, such as instructions to move or rotate the moving body system 601 and instructions to start or end the environment map creation processing performed by the information processing apparatus 604. The communication unit 605 is, for example, a chip or an antenna for establishing communication based on the Institute of Electrical and Electronics Engineers (IEEE) 802.11 series. The communication method of the communication unit 605 is not limited to the IEEE 802.11 series; other methods, such as Bluetooth® or wired communication, may also be used.

The control device 606 controls the driving of the moving unit 607. The control device 606 drives the moving unit 607 based on instructions from the information processing apparatus 604 and the communication unit 605 to move or rotate the moving body system 601. The moving unit 607 is a set of tires, some or all of which are driven by power.

(Structure of information processing apparatus)

The information processing apparatus 604 has the functions of a general embedded personal computer (PC). The information processing apparatus 604 includes a Central Processing Unit (CPU) 611, a Read Only Memory (ROM) 612, a Random Access Memory (RAM) 613, a storage unit 614 such as a Hard Disk Drive (HDD) or a Solid State Drive (SSD), a general-purpose interface (I/F) 615 such as a Universal Serial Bus (USB), and a system bus 616.

The CPU 611 executes an Operating System (OS) and various computer programs stored in the ROM 612 and the storage unit 614 using the RAM 613 as a work memory, thereby calculating or processing information and controlling the units via the system bus 616. For example, the program to be executed by the CPU 611 includes a program for executing the processing described below.

The sensor 603, the communication unit 605, and the control device 606 are each connected to the information processing device 604 via a general-purpose I/F615.

Fig. 6 shows an example in which the information processing apparatus 604 is incorporated in the moving body system 601. However, the configuration of the information processing apparatus 604 is not limited to this example. The information processing apparatus 604 may be configured as an apparatus separate from the moving body system 601, in which case the general-purpose I/F 615 may serve as a wireless or wired communication interface for exchanging data with the communication unit 605 and the sensor 603. Alternatively, the information processing apparatus 604 may constitute the entire moving body system 601.

(logical Structure of information processing apparatus)

The logical configuration of the information processing apparatus 604 according to the present exemplary embodiment will be described below. The processing of each unit described below is executed as software by loading a computer program from the ROM 612 or the like into the RAM 613 and executing the loaded program on the CPU 611. Fig. 7 is a block diagram showing the logical configuration of the environment map creation system 602 and the information processing apparatus 604. Each of the logical components shown in fig. 7 may instead be implemented as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or the like.

The environment map data 701 is the environment map data that is stored in the RAM 613 and is currently being created.

The sensor information acquisition unit 702 acquires an output from the sensor 603. Specifically, the sensor information acquisition unit 702 acquires an image captured by the sensor 603.

The position/orientation calculation unit 703 selects a measurement point located near the sensor 603 from the measurement points included in the environment map data 701, and estimates the position and orientation of the sensor 603 based on the image captured by the sensor 603 and the selected measurement point.

The measurement point generation unit 704 adds a measurement point to the environment map data 701 as necessary, based on the image captured by the sensor 603 and the position/orientation information estimated by the position/orientation calculation unit 703. For example, a measurement point is preferably added when the distance between the position of the sensor 603 in the environment map and the existing measurement points is greater than or equal to a predetermined value, or when it is determined that estimating the position and orientation from the existing measurement points is difficult. Further, the measurement point generation unit 704 adds, to the pose graph in the environment map data 701, the relative position and orientation between the generated measurement point and the measurement point that was used for position/orientation estimation when the new measurement point was generated.

The closed-loop processing unit 705 performs closed-loop processing. When the sensor detects that the moving body has returned to an already mapped area (loop detection), the closed-loop processing unit 705 corrects the error accumulated during the traversal (loop correction). Specifically, when the closed-loop processing unit 705 determines that a newly generated measurement point is located near a previously generated measurement point, it calculates the relative position and orientation of the two measurement points and associates them on the pose graph. Then, the closed-loop processing unit 705 performs processing for correcting the shape of at least the area of the environment map that constitutes the loop. The configuration of the closed-loop processing unit 705 will be described in detail below.

(logical Structure of closed Loop processing Unit)

Fig. 8 is a block diagram showing a detailed logical configuration of the closed-loop processing unit 705.

The loop detection unit 801 checks a newly generated measurement point on the environment map data 701 and determines whether that measurement point closes a loop. The term "loop" as used herein refers to a state in which a newly generated measurement point and a previously created measurement point that are not associated on the pose graph are close to each other in real space. Specifically, the similarity between the image captured at the newly generated measurement point and the image captured at each already-created measurement point is first calculated using a known Bag-of-Words (BoW) model, and the measurement points whose similarity is greater than or equal to a certain level are extracted.

After that, the loop detection unit 801 calculates the relative position and orientation between the newly generated measurement point and each extracted measurement point in descending order of similarity. Specifically, image feature points are extracted from the sensor information of the two measurement points, and the relative position and orientation between the images are estimated based on the distribution of the feature points. If the calculation of the relative position and orientation succeeds for any one measurement point, the loop detection unit 801 determines that a loop has occurred between the two measurement points. In the following, the newly generated measurement point is referred to as the loop source, and the previously created measurement point as the loop destination. The path on the pose graph between the loop source and the loop destination as it existed before the loop was detected is called the loop range.
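
The detection logic described above might be sketched as follows (pseudocode-level Python; the similarity threshold and the two injected functions are assumptions standing in for the BoW comparison and the feature-based relative pose estimation):

```python
def detect_loop(new_point, created_points, bow_similarity, estimate_relative_pose,
                sim_threshold=0.8):
    """Sketch of loop detection: BoW ranking, then relative pose verification.

    bow_similarity(img_a, img_b) -> float and
    estimate_relative_pose(img_a, img_b) -> 4x4 matrix or None
    are stand-ins for the BoW comparison and the feature-based estimation.
    """
    # 1. Extract previously created measurement points with sufficient similarity.
    candidates = [(bow_similarity(new_point.image, p.image), p)
                  for p in created_points]
    candidates = [c for c in candidates if c[0] >= sim_threshold]
    candidates.sort(key=lambda c: c[0], reverse=True)  # descending similarity

    # 2. Try relative pose estimation in descending order of similarity.
    for _, candidate in candidates:
        rel = estimate_relative_pose(new_point.image, candidate.image)
        if rel is not None:        # estimation succeeded: a loop occurred
            return candidate, rel  # candidate becomes the loop destination
    return None
```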

The first correction unit 802 corrects the position/orientation information of the elements in the environment map data 701 that are referred to by the position/orientation calculation unit 703 to a position and orientation based on the loop destination measurement point. Further, the first correction unit 802 adds the relative position and orientation between the loop source measurement point and the loop destination measurement point to the pose graph in the environment map data 701.

The second correction unit 803 performs optimization processing (pose graph optimization) on the position/orientation information of the measurement points in the pose graph of the environment map data 701. The second correction unit 803 acquires the relative positions and orientations between measurement points, and performs pose graph optimization so as to minimize the error between the relative position/orientation information stored in the pose graph and the relative position/orientation information calculated from the positions and orientations at the measurement points.
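
In conventional pose-graph terms (our formulation, not quoted from the embodiment), this means choosing the node poses $T_1,\dots,T_N$ so as to minimize the disagreement between each stored edge relative pose $d_{ij}$ and the relative pose implied by the current node estimates:

```latex
\min_{T_1,\dots,T_N} \; \sum_{(i,j)\in\mathcal{E}}
\left\| \log\!\left( d_{ij}^{-1}\, T_i^{-1} T_j \right) \right\|^2
```

where $\mathcal{E}$ is the set of pose graph edges and $\log$ maps a rigid transform to its six-degree-of-freedom error vector.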

(Environment map data creation processing)

Next, a method of creating environment map data, relating to an information processing method using the information processing apparatus 604 according to the present exemplary embodiment, will be described with reference to figs. 9 to 11. Fig. 9 is a flowchart showing the flow of the environment map data creation processing according to the present exemplary embodiment. The processes shown in the flowcharts of figs. 9 to 11 are realized by the CPU 611 loading a program stored in the ROM 612 or the storage unit 614 and executing the program to calculate or process information and control each hardware module. Some or all of the steps shown in the flowcharts may be implemented by hardware modules such as ASICs or FPGAs.

As shown in fig. 9, in step S901, the information processing apparatus 604 performs initialization. The information processing apparatus 604 constructs empty environment map data 701 and generates a first measurement point (the measurement point A shown in fig. 1) on the environment map data 701. The position of the first measurement point is set as the origin of the environment, and its orientation is set in a predetermined direction (for example, the positive Y-axis direction). The information processing apparatus 604 initializes the current position/orientation information estimated by the position/orientation calculation unit 703 with the position and orientation at the first measurement point.

The subsequent processing is performed in parallel in three threads: a position/orientation calculation thread, a map creation thread, and a map correction thread. Steps S902 to S904 correspond to the processing in the position/orientation calculation thread.

In step S902, the position/orientation calculation unit 703 acquires an image from the sensor 603 and updates the current position/orientation information based on the image and the currently referenced measurement point. Specifically, the information processing apparatus 604 takes as the current position/orientation information the result of applying, to the position/orientation information associated with the currently referenced measurement point, the position/orientation difference between the image associated with that measurement point and the latest image captured at the current position. The position/orientation difference between the images is estimated from the distribution of the feature points on each image. To maintain the correspondence of the feature points between images, the feature points are tracked between the latest image and the previous image.

Instead of calculating, for every image, the position/orientation difference from the image associated with the measurement point, the position/orientation difference between the latest image and the previous image may be calculated and repeatedly applied to the current position/orientation information.

For the above-described extraction of feature points on an image, the Features from Accelerated Segment Test (FAST) algorithm is used in the present exemplary embodiment. To estimate the position/orientation difference between the images, a bundle adjustment process is used with the position of each feature point held fixed, so that only the position and orientation of the sensor 603 are optimized. Bundle adjustment is a method for estimating the parameters of a geometric model from the correspondences of image feature points extracted across a plurality of images, and is solved numerically as a nonlinear optimization problem. In the bundle adjustment processing, the three-dimensional coordinates of each feature point are calculated based on the correspondences of the feature points between the input images. The calculated three-dimensional coordinates of the respective feature points are re-projected onto the image plane, and the re-projection error, calculated as the distance between each re-projected point and the corresponding feature point, is repeatedly minimized, thereby estimating more accurate values of the three-dimensional coordinates of the respective feature points.
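
The re-projection error being minimized can be written compactly (a standard formulation; $\pi$ denotes projection by the camera with known internal parameters, $T_i$ the pose of image $i$, $X_j$ the three-dimensional coordinates of feature point $j$, and $x_{ij}$ its observed two-dimensional coordinates):

```latex
E \;=\; \sum_{i}\sum_{j} \bigl\| \pi\!\left( T_i^{-1} X_j \right) - x_{ij} \bigr\|^2
```

In the variant used in step S902, the $X_j$ are held fixed and only the sensor pose is optimized.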

The known Kanade-Lucas-Tomasi (KLT) algorithm is used to track feature points between images. The above-described algorithms for feature point extraction and tracking may be replaced by other algorithms.
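
Both algorithms are available in common libraries; a minimal OpenCV sketch of the FAST detection and KLT tracking steps might look like this (parameter values are illustrative, not taken from the embodiment):

```python
import cv2
import numpy as np

def detect_fast_features(gray):
    """Detect FAST corners and return them as an Nx1x2 float32 array for KLT."""
    fast = cv2.FastFeatureDetector_create(threshold=20)
    keypoints = fast.detect(gray, None)
    return np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

def track_klt(prev_gray, cur_gray, prev_pts):
    """Track feature points from the previous image into the current one."""
    cur_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, cur_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return prev_pts[ok], cur_pts[ok]  # corresponding point pairs
```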

In step S903, the position/orientation calculation unit 703 selects a measurement point to be referred to in subsequent position/orientation calculation. Specifically, the information processing apparatus 604 selects, from among the measurement points located near the current position and orientation on the environment map data 701, the measurement point having the largest number of feature point correspondences with the latest image.

The position/orientation calculation unit 703 then determines whether a new measurement point is to be added, and if so, issues a measurement point generation instruction to the measurement point generation unit 704. Specifically, the information processing apparatus 604 determines that a new measurement point is to be added if the number of feature point correspondences between the measurement point selected in step S903 and the latest image is less than or equal to a threshold, or if the difference between the position and orientation at that measurement point and the current position and orientation is greater than or equal to a threshold. If it is determined that a new measurement point is to be added and the measurement point generation unit 704 has no unprocessed generation instruction, the information processing apparatus 604 issues an instruction to generate a new measurement point in the map creation thread. With the generation instruction, the position/orientation calculation unit 703 passes information on the selected measurement point and the latest image to the measurement point generation unit 704.
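
The decision rule just described reduces to two thresholds; a sketch follows (threshold values are placeholders, and the pose-difference test here uses only translation for brevity):

```python
import numpy as np

def should_add_measurement_point(num_correspondences, cur_pose, point_pose,
                                 min_correspondences=50, max_translation=0.5):
    """Decide whether a new measurement point is needed.

    num_correspondences: feature correspondences between the selected
        measurement point and the latest image.
    cur_pose, point_pose: 4x4 affine matrices of the current sensor pose
        and the selected measurement point's pose.
    """
    too_few_matches = num_correspondences <= min_correspondences
    translation = np.linalg.norm(cur_pose[:3, 3] - point_pose[:3, 3])
    moved_too_far = translation >= max_translation
    return too_few_matches or moved_too_far
```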

In step S904, it is determined whether to end the processing in the position/orientation calculation thread. If the environment map data creation processing is ended in response to an instruction from the user (yes in step S904), the thread ends. If the environment map data creation processing has not ended (no in step S904), the processing returns to step S902 to repeat the processing.

Steps S905 and S906 are processing in the map creation thread.

In step S905, the measurement point generation unit 704 generates a new measurement point based on the instruction issued from the position/orientation calculation unit 703 in step S903. If the measurement point generation unit 704 has no unprocessed instruction, it waits to receive a new one. The new measurement point includes the image acquired in step S902 and position/orientation information. The measurement point generation unit 704 acquires the captured image from the sensor 603, calculates the position and orientation based on the captured image, and generates a measurement point containing the image and the calculated position and orientation.

Further, the measurement point generation unit 704 adds the relative position/orientation information between the new measurement point and the measurement point selected in step S903 to the pose graph.

In step S906, it is determined whether to end the processing in the map creation thread. If the position/orientation calculation thread has ended (yes in step S906), the map creation thread terminates. If the position/orientation calculation thread is not ended (no in step S906), the processing returns to step S905 to repeat the processing.

Steps S907 and S908 are processing in the map correction thread.

In step S907, upon detecting a loop in the environment map data updated by the map creation thread, the information processing apparatus 604 performs closed-loop processing. The closed-loop processing will be described in detail below.

In step S908, it is determined whether to end the map correction thread. If the position/orientation calculation thread and the map creation thread have ended (yes in step S908), the map correction thread terminates. If the position/orientation calculation thread and the map creation thread are not ended (no in step S908), the processing returns to step S907 to repeat the processing.

(details of closed Loop processing)

Next, the closed loop processing shown in step S907 will be described in detail. Fig. 10 is a flowchart showing a closed loop processing flow according to the present exemplary embodiment.

As shown in fig. 10, in step S1001, the loop detection unit 801 detects a loop and calculates the relative position and orientation between the loop source measurement point and the loop destination measurement point. In step S1002, it is determined whether a loop has been detected. If a loop has been detected (yes in step S1002), information on the relative position and orientation between the loop source and the loop destination is added to the pose graph, and the processing proceeds to step S1003. If no loop has been detected (no in step S1002), the closed-loop processing is terminated.

In step S1003, the first correction unit 802 performs the first correction processing to correct the position/orientation information of the measurement point currently referred to by the position/orientation calculation unit 703 to coordinates based on the loop destination measurement point. The first correction processing is completed in a shorter period of time than the interval at which measurement points are generated in the map creation thread. In other words, the information processing apparatus 604 does not generate any new measurement point between the start and the completion of the first correction processing. The first correction processing will be described in detail below.

In step S1004, the second correction unit 803 performs pose graph optimization processing (second correction processing) on the loop range in the environment map data 701 and on the measurement points generated after the loop range. The information on the relative position/orientation between the loop source and the loop destination calculated in step S1001 has been added to the pose graph, and the drift that occurred at each measurement point is corrected by the pose graph optimization processing. As a result, the position and orientation at each measurement point can be corrected to lie near the true position/orientation path of the sensor 603. If the position and orientation at the measurement point currently referred to by the position/orientation calculation unit 703 are corrected by the pose graph optimization processing, the current position/orientation information on the sensor 603 is corrected accordingly.

(details of the first correction processing)

Next, the first correction process shown in step S1003 will be described in detail with reference to a flowchart shown in fig. 11. In this case, the position and orientation at each measurement point are corrected by rigid transformation using the position/orientation difference.

First, consider the following case: as shown in fig. 1, the sensor 603 generates the measurement points B, C, …, E, F, G, and H while moving clockwise in the environment with the measurement point A as a starting point, and then returns to a position near the measurement point A. In step S1001, a loop in which the measurement point H is the loop source and the measurement point A is the loop destination is detected, and the relative position and orientation between the two measurement points are calculated.

As shown in fig. 11, in step S1101, it is determined whether the correction processing using a rigid transformation is to be performed. A rigid transformation is a transformation consisting only of translation and rotation.

As described above, the first correction processing corrects the position/orientation information of the measurement point currently referred to by the position/orientation calculation unit 703 to coordinates based on the loop destination measurement point. Therefore, if the position/orientation calculation unit 703 is already referring to the loop destination measurement point or to a measurement point in its vicinity on the pose graph, correcting the position and orientation at the referenced measurement point by rigid transformation is not effective. In the present exemplary embodiment, the distances on the pose graph from the currently referenced measurement point to the loop source measurement point and to the loop destination measurement point are compared, and the following processing of steps S1102 to S1104 is performed only when the referenced measurement point is closer to the loop source.

In step S1102, the first correction unit 802 first calculates the position and orientation at the corrected measurement point H', based on the position and orientation at the loop destination measurement point and the relative position and orientation calculated at loop detection. It then calculates the position/orientation difference dH used in the first correction, based on the position and orientation at the corrected measurement point H' and the position and orientation at the measurement point H before correction.

In step S1103, the elements of the environment map data to which the correction processing using the rigid transformation is applied are selected. It is assumed here that the period from the issuance of the instruction to generate the measurement point H (step S903) to the detection of the loop (step S1001) is sufficiently short, so that the sensor 603 is still located in the vicinity of the measurement point H and the position/orientation calculation unit 703 still refers to the measurement point H. In this case, the measurement point H is selected as the measurement point to be corrected.

In step S1104, the first correction unit 802 applies (composes) the position/orientation difference dH calculated in step S1102 to the position and orientation selected as correction targets in step S1103, and updates them. The current position/orientation information on the sensor 603 is updated in the same manner.

This processing can be realized by composing simple 4 × 4 matrices, and is completed in a much shorter period of time than the measurement point generation processing in step S905 or the processing of step S1004, which includes the pose graph optimization. Therefore, after the closed-loop processing unit 705 detects a loop, the correction is completed before a new measurement point needs to be generated in the position/orientation calculation thread, after which position/orientation calculation processing referring to the measurement point A can be performed. Generation of the redundant measurement points I and J can thus be prevented.
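
Using the composition convention introduced earlier (d = A⁻¹B, so that B = Ad), the entire first correction reduces to a few 4 × 4 matrix products. The sketch below follows that convention; the names are illustrative:

```python
import numpy as np

def first_correction(T_dest, rel_dest_to_src, T_src, targets):
    """Sketch of steps S1102-S1104 for a single corrected measurement point.

    T_dest:          pose at the loop destination measurement point (A)
    rel_dest_to_src: relative pose of the loop source seen from the destination,
                     measured at loop detection
    T_src:           pose at the loop source measurement point (H) before correction
    targets:         4x4 poses to update (the referenced measurement point and
                     the current sensor pose)
    """
    T_src_corrected = T_dest @ rel_dest_to_src   # H' = A d    (step S1102)
    dH = np.linalg.inv(T_src) @ T_src_corrected  # dH = H^-1 H'
    corrected = [T @ dH for T in targets]        # apply dH    (step S1104)
    return T_src_corrected, corrected
```

Applying dH to H itself recovers H' exactly; a world-frame variant would instead left-multiply each target by H'H⁻¹, which coincides with the above at H.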

In this way, when a loop is detected, the information processing apparatus 604 performs, before the pose graph optimization processing, local correction processing that corrects the position and orientation associated with some of the measurement points included in the environment map data. The first correction processing performed as this local correction corrects the positions and orientations at fewer measurement points than the pose graph optimization processing does. In other words, when a loop is detected, the information processing apparatus 604 first performs the first correction processing to correct the position and orientation at some measurement points; after the first correction processing is completed, it completes the second correction processing, which covers a larger number of measurement points than the first. Redundant measurement points can thereby be prevented from being generated.

The present exemplary embodiment shows an example in which the first correction unit 802 performs correction processing using a rigid transformation. However, the correction processing is not limited to this example. The correction by the first correction unit 802 may be any other correction processing whose processing amount is smaller than that of the correction processing (pose graph optimization processing) by the second correction unit 803. Further, it may be any other correction processing that can be completed before a new measurement point is generated and that can correct the position and orientation at the loop source measurement point, or at a measurement point near the loop source, even when the sensor 603 continues to move after the loop is detected.

(advantageous effects of the present exemplary embodiment)

As described above, according to the present exemplary embodiment, even when the environment map correction processing is performed during the creation of the environment map data, environment map data with which the position and orientation can be estimated with high accuracy can be generated. Further, according to the present exemplary embodiment, the first correction processing is completed before the second correction processing, thereby preventing the generation of redundant measurement points due to the long time taken to correct the map data during the closed-loop processing.

The moving body system 601 can travel autonomously with high accuracy by moving through a series of coordinates while performing position/orientation estimation processing based on the environment map data generated by the information processing apparatus 604 and the images captured by the sensor 603.

The second exemplary embodiment will be described below. The first exemplary embodiment described above shows an example in which the position/orientation calculation unit 703 performs position/orientation measurement processing with reference to a single measurement point in the environment map data 701. The present exemplary embodiment shows a method for correcting the environment map data in a case where three-dimensional position information on feature points in the environment is held as part of the environment map data and the moving body system 601 estimates position/orientation information from the three-dimensional feature point distribution and the sensor information. When creating such environment map data, the relative positions and orientations between measurement points are refined as needed by a method called bundle adjustment, enabling the environment map data to be generated with higher accuracy.

(environmental map data)

The environment map data used for the calculation of the position and orientation in the present exemplary embodiment includes a plurality of feature points in the environment, one or more measurement points, and a pose graph.

Each feature point in the environment includes three-dimensional position information. Each measurement point holds not only the sensor information (image) and the position/orientation information, which serve as information for feature point detection in the environment and for loop detection, but also observation information about each feature point in the environment. The observation information is a list of pairs, each associating a feature point in the environment with its two-dimensional coordinates on the image included in the sensor information. The two-dimensional coordinates on the image correspond to the direction of each feature point in the environment when viewed from the sensor.
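
As an illustrative aid (not part of the described apparatus), the environment map data of the present exemplary embodiment could be modeled with data structures along the following lines; all type and field names are assumptions made for the sketch.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class EnvironmentFeaturePoint:
    xyz: np.ndarray                      # 3-D position in the map frame

@dataclass
class Observation:
    feature_id: int                      # feature point in the environment
    uv: np.ndarray                       # 2-D coordinates on the image (pixels)

@dataclass
class MeasurementPoint:
    image: np.ndarray                    # sensor information, kept for loop detection
    pose: np.ndarray                     # 4x4 position/orientation matrix
    observations: list[Observation] = field(default_factory=list)

@dataclass
class EnvironmentMap:
    features: dict[int, EnvironmentFeaturePoint] = field(default_factory=dict)
    measurement_points: list[MeasurementPoint] = field(default_factory=list)
    # pose graph edges: (index_a, index_b, relative 4x4 pose)
    pose_graph_edges: list[tuple[int, int, np.ndarray]] = field(default_factory=list)
```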

(construction of Mobile body System, Environment map creation System, and information processing apparatus)

The physical and logical configuration of the mobile body system 601 according to the present exemplary embodiment is similar to that of the first exemplary embodiment, and thus the description thereof is omitted.

(Environment map data creation processing)

A method for creating environment map data according to the present exemplary embodiment will be described with reference to fig. 12 to 16. Fig. 12 is a flowchart showing the flow of the environment map data creation processing according to the present exemplary embodiment. Steps similar to those in the flowchart shown in fig. 9 are denoted by the same numerals in the flowchart shown in fig. 12, and detailed description thereof is omitted. Specifically, the processing of steps S903, S904, S906, and S908 is similar to that of the first exemplary embodiment, and thus the description thereof is omitted.

As shown in fig. 12, in step S1201, the information processing apparatus 604 performs initialization. The information processing apparatus 604 constructs empty environment map data 701, and generates, on the environment map data 701, a first measurement point (measurement point A shown in fig. 1) and a group of feature points in the environment observed from the first measurement point. The information processing apparatus 604 initializes the current position/orientation information estimated by the position/orientation calculation unit 703 with the position and orientation at the first measurement point. As in the first exemplary embodiment, the position of the first measurement point is set as the origin in the environment, and the orientation is set in a predetermined direction (for example, the Y-axis positive direction).

To generate the group of feature points in the environment, images are first acquired from the sensor 603 at two positions (i.e., at the position and orientation for generating the first measurement point A, and at a position/orientation A' slightly distant from the first measurement point A). Then, based on the image feature points extracted from the images, the three-dimensional coordinates of the image feature points matched between the images are estimated using bundle adjustment processing. Finally, the image feature points whose three-dimensional coordinates have been successfully estimated are added to the environment map data as feature points in the environment, and the two-dimensional coordinates, on the image of measurement point A, of the image feature points corresponding to those feature points in the environment are added as observation information when observed from measurement point A.
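
The estimation of three-dimensional coordinates from two matched views can be illustrated with standard linear (DLT) triangulation; the sketch below is a generic textbook method under the assumption of known 3x4 projection matrices (intrinsics times pose), not the specific processing of the apparatus.

```python
import numpy as np

def triangulate_dlt(P_a, P_b, uv_a, uv_b):
    """Linear (DLT) triangulation of one matched image feature point from
    two 3x4 projection matrices P_a, P_b and the pixel coordinates
    uv_a, uv_b of the match in the two images."""
    A = np.stack([
        uv_a[0] * P_a[2] - P_a[0],
        uv_a[1] * P_a[2] - P_a[1],
        uv_b[0] * P_b[2] - P_b[0],
        uv_b[1] * P_b[2] - P_b[1],
    ])
    # The 3-D point is the right singular vector with smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]   # dehomogenize to 3-D coordinates
```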

A position/orientation calculation thread according to the second exemplary embodiment will be described.

In step S1202, the position/orientation calculation unit 703 acquires an image from the sensor 603, and updates the current position/orientation information based on the image and the feature points in the environment. The position/orientation calculation unit 703 re-projects, onto the image, the feature points in the environment observed from the measurement point currently being referred to, and performs optimization processing to calculate the position and orientation so that the re-projection error between each re-projected point and the corresponding image feature point is minimized. In this case, the feature points are tracked between the latest image and the previous image to maintain the correspondence between the image and the feature points in the environment.
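
A minimal sketch of such re-projection-error minimization follows, using an axis-angle pose parameterization and SciPy's least-squares solver; the function names and the parameterization are assumptions made for illustration, not the implementation of the position/orientation calculation unit 703.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(pose6, K, pts3d, pts2d):
    """Residuals between the observed image feature points pts2d and the
    re-projection of the referenced environment feature points pts3d,
    for a world-to-camera pose [rx, ry, rz, tx, ty, tz]."""
    R = Rotation.from_rotvec(pose6[:3]).as_matrix()
    t = pose6[3:]
    cam = (R @ pts3d.T).T + t          # world -> camera frame
    proj = (K @ cam.T).T               # K is the 3x3 intrinsic matrix
    proj = proj[:, :2] / proj[:, 2:3]  # perspective division
    return (proj - pts2d).ravel()

def refine_pose(pose6_init, K, pts3d, pts2d):
    # Minimizes the total re-projection error (cf. step S1202).
    res = least_squares(reprojection_residuals, pose6_init,
                        args=(K, pts3d, pts2d), method="lm")
    return res.x
```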

Further, as in the first exemplary embodiment, steps S903 and S904 are performed.

Next, a map creation thread according to a second exemplary embodiment will be described.

In step S1203, the measurement point generation unit 704 generates feature points in the environment and measurement points based on the instruction issued from the position/orientation calculation unit 703 in step S903.

The measurement point to be generated includes the image acquired in step S1202 together with the position and orientation, the group of feature points in the environment tracked by the position/orientation calculation unit 703, and observation information about a newly generated group of feature points in the environment, which will be described below. The feature points in the environment are generated using the image of the measurement point selected in step S903 and the image captured at the measurement point to be generated.

First, the information processing apparatus 604 extracts, from the two images, image feature points that are not associated with existing feature points in the environment. After that, the information processing device 604 estimates the three-dimensional coordinates of the corresponding image feature points between the images based on the relative position and orientation between the images. Finally, the information processing apparatus 604 adds the image feature points whose three-dimensional coordinates have been successfully estimated to the environment map data as feature points in the environment, and adds the two-dimensional coordinates of the image feature points corresponding to those feature points on each image as observation information from each measurement point. Further, the measurement point generation unit 704 adds the relative position/orientation information between the new measurement point and the measurement point selected in step S903 to the pose graph.

In step S1204, the measurement point generation unit 704 performs bundle adjustment processing to correct the positions and orientations of the measurement point generated in step S1203 and the group of measurement points that share feature points in the environment with it. Further, the measurement point generation unit 704 adds the relative position/orientation information between the measurement points obtained as a result of the bundle adjustment to the pose graph, or overwrites the existing information with it.
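
For illustration, a local bundle adjustment over the shared measurement points and feature points can be sketched as follows; the flat parameter layout and the format of observations are assumptions, and this is a generic formulation rather than the measurement point generation unit 704 itself.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def local_bundle_adjust(poses6, pts3d, observations, K):
    """Jointly refines the poses (n_poses x 6, axis-angle plus translation)
    and the 3-D feature points (n_pts x 3), cf. step S1204.
    `observations` is a list of (pose_index, point_index, uv) tuples."""
    n_poses, n_pts = len(poses6), len(pts3d)
    x0 = np.concatenate([np.ravel(poses6), np.ravel(pts3d)])

    def residuals(x):
        P = x[:n_poses * 6].reshape(n_poses, 6)
        X = x[n_poses * 6:].reshape(n_pts, 3)
        out = []
        for ki, pi, uv in observations:
            R = Rotation.from_rotvec(P[ki, :3]).as_matrix()
            cam = R @ X[pi] + P[ki, 3:]          # world -> camera
            proj = K @ cam
            out.append(proj[:2] / proj[2] - uv)  # re-projection error
        return np.concatenate(out)

    res = least_squares(residuals, x0)
    return (res.x[:n_poses * 6].reshape(n_poses, 6),
            res.x[n_poses * 6:].reshape(n_pts, 3))
```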

Next, a map correction thread according to a second exemplary embodiment will be described.

In step S1205, a loop on the environment map data updated in the map creation thread is detected, and closed-loop processing is performed. The details of the closed-loop process are similar to those shown in fig. 10 according to the first exemplary embodiment, except for the details of the first correction process (step S1003). Therefore, only the details of the first correction process will now be described.

(details of the first correction processing)

The first correction processing in step S1003 according to the present exemplary embodiment will be described in detail with reference to figs. 13 to 16. In the present exemplary embodiment, in addition to correcting the positions and orientations at the respective measurement points and the feature points in the environment by a rigid transformation using the position/orientation difference, processing for integrating feature points in the environment and bundle adjustment processing using the measurement points near the loop source and the loop destination are performed.

As in the first exemplary embodiment, the second exemplary embodiment also shows an example in which, as shown in fig. 1, the sensor 603 generates the measurement points B, C, …, E, F, G, and H while moving clockwise in the environment starting from the measurement point a, and then returns to a position near the measurement point a.

When a loop in which measurement point H is set as the loop source and measurement point A is set as the loop destination is detected, the information processing apparatus 604 calculates the relative position and orientation between the measurement points (step S1001).

Fig. 13 is a diagram showing a state of the environment map in the vicinity of the sensor 603 when the loop is detected. In fig. 13, black points p, p', q, r, and s are feature points in the environment. Points p and q are observed from measurement point A. Point p' is observed from measurement points H and G and from the latest image of the sensor 603. Point r is observed from measurement point H and from the latest image of the sensor 603. Point s is observed from measurement point G. Points p and p' are registered in the environment map data 701 as different feature points in the environment at this time, but are derived from the same feature point of an object in the real space. Although there are actually a large number of feature points in the environment for position/orientation estimation, only some of them are shown here for ease of illustration.

Fig. 14 is a flowchart showing the flow of the first correction processing according to the present exemplary embodiment. As shown in fig. 14, in step S1401, it is determined whether or not the correction processing using the rigid transformation is to be performed. In the present exemplary embodiment, the information processing device 604 executes the processing of steps S1102, S1402, and S1403 as long as the feature points in the environment observed from the loop destination measurement point are not included in the feature points in the environment referred to by the position/orientation calculation unit 703.

If it is determined that the correction processing using the rigid transformation is to be performed (YES in step S1401), the processing proceeds to step S1102. Then, in step S1402, the information processing apparatus 604 selects the elements on the environment map data to be corrected by the rigid transformation. First, the feature points in the environment tracked by the position/orientation calculation unit 703 are extracted. In the example shown in fig. 13, points p' and r are extracted. Next, each measurement point from which any one of these feature points is observed is extracted as a measurement point to be corrected. In the example shown in fig. 13, measurement points H and G are extracted. Finally, the feature points in the environment observed from any one of the measurement points to be corrected are extracted as the feature points in the environment to be corrected. In the example shown in fig. 13, points p', r, and s are the feature points in the environment to be corrected. The current position/orientation information about the sensor 603 is also corrected. Therefore, in this case, the position and orientation at each of measurement points H and G, the position of each of feature points p', r, and s in the environment, and the position and orientation of the sensor 603 are corrected.

In step S1403, the first correction unit 802 adds the position/orientation difference dH calculated in step S1102 to each correction target selected in step S1402, and updates its position and orientation, or its position. This processing is a rigid transformation and maintains the relative position/orientation relationships among the elements to be corrected.
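
One common reading of "adding the position/orientation difference dH" is a left-multiplication by a 4x4 transform in SE(3), which indeed preserves the relative poses among the corrected elements; the sketch below illustrates that reading and is not asserted to be the exact computation of the first correction unit 802.

```python
import numpy as np

def apply_rigid_correction(dH, poses, points):
    """Applies the 4x4 position/orientation difference dH to every
    correction target: measurement-point poses are left-multiplied,
    feature-point positions are transformed as homogeneous points.
    Relative poses among the corrected elements are preserved, since
    (dH @ T1)^-1 @ (dH @ T2) == T1^-1 @ T2."""
    new_poses = [dH @ T for T in poses]                  # 4x4 pose matrices
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    new_points = (dH @ pts_h.T).T[:, :3]
    return new_poses, new_points
```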

Fig. 15 is a diagram showing the positional relationship between the measurement points obtained after the rigid transformation processing. The positions and orientations at measurement points H and G are corrected to H' and G', respectively. At this time, the relative position/orientation relationship between measurement point F and the elements to be corrected is broken. However, measurement point F and the position/orientation calculation unit 703 do not share any feature points in the environment at this time, and hence this breakage does not adversely affect the position/orientation calculation.

In step S1404, the first correction unit 802 integrates the feature points in the environment whose positions were corrected in step S1403 with the feature points in the environment existing in the vicinity of the corrected positions. First, the information processing apparatus 604 extracts, for each feature point in the environment whose position has been corrected, other feature points that are within a certain distance of the corrected position on the environment map. Then, based on the respective pieces of observation information, the information processing device 604 compares the image features in the regions where the feature points in the environment are observed on the images, between each extracted feature point and the corrected feature point. The image features are compared using patch matching. If the image features in the observation regions are determined to be sufficiently similar to each other, the two feature points in the environment are determined to refer to the same object in the real space and are integrated. The method of comparing the image features is not limited to this example. For example, Oriented FAST and Rotated BRIEF (ORB) features may be used.
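
A sketch of such integration follows, assuming the map structures from the earlier data-structure sketch; normalized cross-correlation stands in for the unspecified patch-matching score, and the distance and similarity thresholds are arbitrary assumptions.

```python
import numpy as np

def integrate_feature_points(env_map, id_keep, id_merge, patch_keep, patch_merge,
                             dist_thresh=0.05, ncc_thresh=0.9):
    """Merges environment feature point id_merge into id_keep when the two
    points are close in the map and their observed image patches are
    sufficiently similar (cf. step S1404). Returns True if merged."""
    fa = env_map.features[id_keep].xyz
    fb = env_map.features[id_merge].xyz
    if np.linalg.norm(fa - fb) > dist_thresh:
        return False
    # Normalized cross-correlation of the two observation patches.
    a = (patch_keep - patch_keep.mean()) / (patch_keep.std() + 1e-9)
    b = (patch_merge - patch_merge.mean()) / (patch_merge.std() + 1e-9)
    if float((a * b).mean()) < ncc_thresh:
        return False
    # Redirect every observation of the merged point to the kept point,
    # so the kept point carries the union of the observation information.
    for mp in env_map.measurement_points:
        for obs in mp.observations:
            if obs.feature_id == id_merge:
                obs.feature_id = id_keep
    del env_map.features[id_merge]
    return True
```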

Fig. 16 is a diagram showing a state of the environment map in the vicinity of the sensor 603 after the feature points are integrated. The position of point p' is corrected by the rigid transformation, and then points p' and p are integrated. Point p obtained after the integration includes the observation information about both points p' and p before the integration. In other words, point p is observed from measurement points A, H', and G'.

In step S1405, the first correction unit 802 performs bundle adjustment processing to correct the positions of the feature points in the environment integrated in step S1404, and the positions and orientations of the measurement points from which any of those feature points is observed. In the example shown in fig. 13, feature point p in the environment and measurement points A, H', and G' are the targets of the bundle adjustment. In practice, many more feature points in the environment, which are not shown, are integrated and bundle-adjusted, so that there is sufficient information to optimize the positions and orientations by the bundle adjustment. Further, the first correction unit 802 adds the relative position/orientation information between the measurement points obtained as a result of the bundle adjustment to the pose graph, or overwrites the existing information with it.

Correcting the position and orientation at each measurement point near the loop source and the loop destination using bundle adjustment may take longer than the correction using the rigid transformation, but can be completed in a shorter period than the processing of step S1004, which includes the pose graph optimization. Therefore, after the loop is detected, the closed-loop processing unit 705 can obtain the relative position and orientation at each measurement point near the loop source and the loop destination with high accuracy in a short period of time, which improves the accuracy of the position/orientation calculation by the position/orientation calculation unit 703. Even if a new measurement point is generated before the second correction processing is completed, the improved accuracy of the position/orientation calculation results in more accurate position/orientation information.

(advantageous effects of the present exemplary embodiment)

According to the present exemplary embodiment described above, even when the environment map correction processing is performed during the creation of the environment map data with a position/orientation calculation method whose environment map data includes feature point information, environment map data that enables highly accurate position/orientation estimation can be generated.

[ other exemplary embodiments ]

The first exemplary embodiment and the second exemplary embodiment show examples in which a camera fixed to the front of the moving body and configured to acquire a grayscale luminance image is used as the sensor 603. However, the type, number, and fixing method of the sensor 603 are not limited to this example. Any sensor may be used as the sensor 603 as long as the sensor can continuously acquire a luminance image or a depth image of the surrounding area from the moving body as digital data. For example, not only a grayscale camera but also a camera capable of acquiring color images, a depth camera, a two-dimensional (2D) light detection and ranging (LiDAR) sensor, or a three-dimensional (3D) LiDAR sensor may be used. A stereo camera may also be used, or a plurality of cameras may be arranged in each direction of the moving body. In addition, the number of times the information is acquired per second is not limited to 30.

For example, in the case where a stereo camera is used as the sensor 603, in the processing for generating the feature points in the environment in step S1201, the information processing apparatus 604 acquires the distance to each image feature point using a stereo image pair, instead of moving the sensor 603 to the position/orientation A'. The information processing apparatus 604 may then convert the distance into the three-dimensional coordinates of the image feature point.

Alternatively, in the case of using a sensor capable of obtaining distance information about each pixel of an image (such as a depth camera or 3D-LiDAR) as the sensor 603, the information processing device 604 calculates the three-dimensional coordinates based on the position, orientation, and angle of view of the sensor and the distance at each pixel. Further, the information processing apparatus 604 may generate a feature point in the environment corresponding to each pixel. In this configuration, a denser group of feature points in the environment than in the second exemplary embodiment can be obtained. In this case, the relative position and orientation between the measurement points are calculated using the known Iterative Closest Point (ICP) algorithm. The calculated relative position and orientation may be used for the loop detection processing and the first correction processing.
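
A minimal point-to-point ICP, as commonly described in the literature, could look like the following; this is a generic sketch, not the specific ICP variant used by the apparatus.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_point_to_point(src, dst, iters=20):
    """Estimates the rigid transform (R, t) aligning point cloud `src`
    (N x 3) to `dst` (M x 3) by alternating nearest-neighbour association
    and the Kabsch (SVD) alignment step."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)
    for _ in range(iters):
        moved = src @ R.T + t
        _, idx = tree.query(moved)            # nearest-neighbour matches
        matched = dst[idx]
        mu_s, mu_d = moved.mean(0), matched.mean(0)
        H = (moved - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:             # guard against reflections
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        dt = mu_d - dR @ mu_s
        R, t = dR @ R, dR @ t + dt            # accumulate the increment
    return R, t
```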

The first and second exemplary embodiments described above show examples in which the position and orientation in a three-dimensional space are measured and environment map data for the position/orientation measurement is created. However, the position/orientation measurement and the environment map data creation may be performed on a two-dimensional plane along the surface on which the moving body moves. For example, in a moving body system traveling on the ground, a 2D-LiDAR that scans data in the horizontal direction may be used as the sensor 603, and environment map data having a denser group of feature points in the environment, as described above, may be created on a two-dimensional plane.

In the first exemplary embodiment, the image is held at each measurement point as the information for loop detection. However, for example, when a measurement point is generated in step S905, a feature value vector may be calculated based on the BoW model together with the image feature points, and the feature value vector and the image feature points may be held instead of the image. In this case, the loop detection unit 801 may calculate the image similarity based on the feature value vector calculated in advance, and may calculate the relative position and orientation based on the feature points calculated in advance.
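
For example, with precomputed BoW feature value vectors, the image similarity could be scored by cosine similarity, as in the short sketch below; the thresholding policy that turns a score into a loop candidate is left to the loop detection unit 801 and is not shown.

```python
import numpy as np

def bow_similarity(v_a, v_b):
    """Cosine similarity between two precomputed BoW feature value
    vectors; measurement points scoring above a chosen threshold
    become loop-closure candidates."""
    denom = np.linalg.norm(v_a) * np.linalg.norm(v_b) + 1e-12
    return float(np.dot(v_a, v_b) / denom)
```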

In the first exemplary embodiment, in step S901, the position of the first measurement point is set as the origin in the environment, and the orientation is set in a predetermined direction (for example, the Y-axis positive direction). However, if the position and orientation of the first measurement point can be specified by another method, the values obtained by that method may be used. For example, the relative position and orientation with respect to the sensor 603 may be calculated using a marker that can be detected by the sensor 603 or another unit, and the origin and coordinate axes may be set based on the calculated relative position and orientation. In addition, the origin and coordinate axes may be set using the mounting position and orientation of the sensor 603 on the moving body system 601 as an offset.

A marker similar to that described above may also be used for the loop detection by the loop detection unit 801. For example, by detecting that the same marker is observed from two measurement points instead of calculating the image similarity, the relative position and orientation between the images can be calculated based on the relative position and orientation between the sensor and the marker in each image.

The second exemplary embodiment described above shows an example in which the determination as to whether or not to execute the rigid transformation processing in step S1401 is made by determining whether the position/orientation calculation unit 703 shares a feature point in the environment with the loop destination measurement point. However, the method for determining whether to perform the rigid transformation processing is not limited to this example. For example, the number of feature points in the environment shared between the position/orientation calculation unit 703 and the loop source measurement point may be compared with the number shared between the position/orientation calculation unit 703 and the loop destination measurement point, and the rigid transformation processing may be performed if the former is larger than the latter. Alternatively, as in the first exemplary embodiment, whether or not to perform the rigid transformation processing may be determined based on the distance between the measurement points on the pose graph.
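
The alternative decision rules in this paragraph could be sketched as follows; the function and argument names are hypothetical, and the feature points are assumed to be identified by integer IDs.

```python
def should_apply_rigid_transform(tracked_ids, src_ids, dst_ids):
    """Returns True when the rigid transformation processing should run:
    either the tracker shares no environment feature points with the loop
    destination, or it still shares more with the loop source than with
    the loop destination."""
    shared_src = len(set(tracked_ids) & set(src_ids))
    shared_dst = len(set(tracked_ids) & set(dst_ids))
    return shared_dst == 0 or shared_src > shared_dst
```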

Although the first exemplary embodiment shows an example in which the user operates the mobile body from the outside, the configuration of the mobile body system 601 is not limited to this example. For example, a manned mobile body on which a user can ride and which can be directly operated may be used. Alternatively, a moving body including a function for autonomously traveling along a preset route may be used. In this case, autonomous traveling can be achieved by generating control information for the moving body system 601 based on the environment map data 701 and the position/orientation information calculated by the position/orientation calculation unit 703, and driving the moving unit 607 by the control device 606. Further, the mobile body system 601 may update the environment map data 701 according to the method described in the first or second exemplary embodiment based on the sensor information acquired from the sensor 603 during autonomous travel.

Although the exemplary embodiments show a configuration in which the moving unit 607 is a wheel, a configuration may also be adopted in which a plurality of propellers or the like are mounted on the moving body system 601, the moving body system 601 flies in the air, and the sensor 603 is directed toward the ground.

The present disclosure can also be achieved by a process in which a program for realizing one or more functions according to the above-described exemplary embodiments is provided to a system or an apparatus via a network or a storage medium, and the program is read out and executed by one or more processors in a computer of the system or the apparatus. The present disclosure may also be implemented by a circuit (e.g., ASIC) for implementing one or more functions according to the above-described exemplary embodiments.

The processing may be performed using a trained model obtained by machine learning, instead of the position/orientation calculation unit 703 and the measurement point generation unit 704 included in the processing unit described above. In this case, for example, a plurality of combinations of the input data and the output data of the processing unit are prepared as learning data, and a trained model that acquires knowledge from the learning data through machine learning and, based on the acquired knowledge, outputs output data corresponding to input data as a result is generated. The trained model may be constructed using, for example, a neural network model. The trained model operates as a program for performing processing equivalent to the processing unit, in cooperation with a CPU or a graphics processing unit (GPU), thereby performing the processing corresponding to the processing unit. The trained model may be updated after a predetermined process as needed.

According to the exemplary embodiments of the present disclosure, it is possible to prevent the generation of redundant measurement points due to an increase in time to correct map data during closed-loop processing.

Other embodiments

Embodiments of the present disclosure may also be implemented by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a "non-transitory computer-readable storage medium") to perform the functions of one or more of the above-described embodiments, and/or that includes one or more circuits (e.g., an application-specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiments, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiments, and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiments. The computer may include one or more processors (e.g., a central processing unit (CPU), a micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

The embodiments of the present disclosure can also be realized by a method in which software (programs) that performs the functions of the above-described embodiments is supplied to a system or an apparatus through a network or various storage media, and a computer, or a central processing unit (CPU) or micro processing unit (MPU), of the system or the apparatus reads out and executes the programs.

While the present disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the appended claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
