Calibration method of laser sensor

Document No. 1713303 · Published 2019-12-13

Calibration method of laser sensor (激光传感器的校准方法), designed and created by 吴侃之 (Wu Kanzhi) and 马陆 (Ma Lu) on 2017-04-28.

A method for calibrating a laser sensor carried by a mobile platform, the method comprising: determining an overlap region of point cloud data generated by the laser sensor; comparing surface features of the point clouds within the overlap region; and generating a calibration rule based thereon. A method of automatically detecting interference to an emitter/detector unit carried by a mobile platform, comprising: transforming first point cloud information and second point cloud information into a first point cloud and a second point cloud in a reference frame associated with the mobile platform, the first and second point cloud information obtained from the emitter/detector unit at first and second points in time; determining an overlap region between the first point cloud and the second point cloud; comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and detecting interference to the emitter/detector unit based at least in part on comparing the surface attributes.

1. A computer-implemented method for calibrating a first laser unit and a second laser unit both carried by an autonomous vehicle, comprising:

transforming point cloud information obtained from the first laser unit into a first point cloud in a coordinate system associated with the autonomous vehicle based at least in part on a first transformation matrix defined at least in part by a position and orientation of the first laser unit relative to the autonomous vehicle;

transforming point cloud information obtained from the second laser unit into a second point cloud in a coordinate system associated with the autonomous vehicle based at least in part on a second transformation matrix defined at least in part by a position and orientation of the second laser unit relative to the autonomous vehicle;

determining an overlap region between the first point cloud and the second point cloud based at least in part on a pair of points between the first point cloud and the second point cloud;

deriving a plurality of first features characterizing, at least in part, a surface of the first point cloud in the overlap region;

deriving a plurality of second features characterizing, at least in part, a surface of the second point cloud in the overlap region;

generating at least one calibration rule for calibration between the first laser unit and the second laser unit based at least in part on evaluating a function comprising at least the first features and the second features; and

performing calibration between the first laser unit and the second laser unit based on the at least one calibration rule.

2. A computer-implemented method for calibrating at least a first emitter/detector unit and a second emitter/detector unit, both carried by a common mobile platform, comprising:

transforming point cloud information obtained from the first emitter/detector unit into a first point cloud in a reference frame associated with the mobile platform;

transforming point cloud information obtained from the second emitter/detector unit into a second point cloud in a reference frame associated with the mobile platform;

determining an overlap region between the first point cloud and the second point cloud;

comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and

generating at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on comparing the surface attributes.

3. The method of claim 2, wherein point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules.

4. The method of claim 3, wherein the first set of transformation rules is defined at least in part according to a position and orientation of the first emitter/detector unit relative to the mobile platform.

5. The method of any of claims 3 or 4, wherein the point cloud information obtained from the second emitter/detector unit is transformed based at least in part on a second set of transformation rules, and wherein the second set of transformation rules is different from the first set of transformation rules.

6. The method of any of claims 3-5, wherein the reference frame associated with the mobile platform comprises a coordinate system.

7. The method of any of claims 3-6, wherein the first set of transformation rules comprises a transformation matrix.

8. The method of claim 2, wherein the first emitter/detector unit comprises at least one laser sensor.

9. The method of claim 8, wherein the field of view (FOV) of the at least one laser sensor is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees.

10. The method of any one of claims 8 or 9, wherein the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other.

11. The method of claim 2, wherein determining an overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud.

12. The method of any of claims 2 or 11, wherein determining an overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud.

13. The method of any of claims 2, 11 or 12, wherein the tree-structured data structure comprises a K-dimensional (KD) tree data structure.

14. The method of claim 2, wherein comparing surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud.

15. The method of claim 14, wherein matching the surface associated with the first point cloud with the surface associated with the second point cloud comprises: determining normal vector information for at least a portion of the first point cloud.

16. The method of any of claims 2, 14 or 15, wherein comparing surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region.

17. The method of claim 16, wherein the objective function comprises a rotation component and a translation component.

18. The method of claim 17, wherein comparing surface attributes further comprises: holding the translation component fixed.

19. The method of any one of claims 16-18, wherein generating at least one calibration rule comprises: optimizing the objective function.

20. The method of claim 19, wherein optimizing the objective function is based at least in part on a least squares method.

21. The method of any of claims 2 or 14-20, wherein the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit.

22. The method of any of claims 2 or 14-21, further comprising detecting a difference between the generated at least one calibration rule and one or more previously generated calibration rules.

23. The method of claim 2, wherein the mobile platform comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous automobile, a self-balancing vehicle, a robot, a smart wearable device, a virtual reality (VR) head-mounted display, or an augmented reality (AR) head-mounted display.

24. The method of any of claims 2-23, further comprising calibrating the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

25. A non-transitory computer-readable medium storing computer-executable instructions that, when executed, cause one or more processors associated with a mobile platform to perform acts comprising:

transforming point cloud information obtained from a first emitter/detector unit into a first point cloud in a reference frame associated with the mobile platform;

transforming point cloud information obtained from a second emitter/detector unit into a second point cloud in a reference frame associated with the mobile platform;

determining an overlap region between the first point cloud and the second point cloud;

comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and

generating at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on comparing the surface attributes.

26. The computer-readable medium of claim 25, wherein point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules.

27. The computer-readable medium of claim 26, wherein the first set of transformation rules is defined at least in part according to a position and orientation of the first emitter/detector unit relative to the mobile platform.

28. The computer-readable medium of any of claims 26 or 27, wherein point cloud information obtained from the second emitter/detector unit is transformed based at least in part on a second set of transformation rules, and wherein the second set of transformation rules is different from the first set of transformation rules.

29. The computer-readable medium of any of claims 26-28, wherein the reference frame associated with the mobile platform comprises a coordinate system.

30. The computer-readable medium of any of claims 26-29, wherein the first set of transformation rules comprises a transformation matrix.

31. The computer-readable medium of claim 25, wherein the first emitter/detector unit comprises at least one laser sensor.

32. The computer-readable medium of claim 31, wherein the field of view (FOV) of the at least one laser sensor is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees.

33. The computer-readable medium of any of claims 31 or 32, wherein the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other.

34. The computer-readable medium of claim 25, wherein determining an overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud.

35. The computer-readable medium of any of claims 25 or 34, wherein determining an overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud.

36. The computer-readable medium of any of claims 25, 34, or 35, wherein the tree-structured data structure comprises a K-dimensional (KD) tree data structure.

37. The computer-readable medium of claim 25, wherein comparing surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud.

38. The computer-readable medium of claim 37, wherein matching the surface associated with the first point cloud with the surface associated with the second point cloud comprises: determining normal vector information for at least a portion of the first point cloud.

39. The computer-readable medium of any of claims 25, 37, or 38, wherein comparing surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region.

40. The computer-readable medium of claim 39, wherein the objective function includes a rotation component and a translation component.

41. The computer-readable medium of claim 40, wherein comparing surface attributes further comprises: holding the translation component fixed.

42. The computer-readable medium of any one of claims 39-41, wherein generating at least one calibration rule comprises: optimizing the objective function.

43. The computer-readable medium of claim 42, wherein optimizing the objective function is based at least in part on a least squares method.

44. The computer-readable medium of any one of claims 25 or 37-43, wherein the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit.

45. The computer-readable medium of any one of claims 25 or 37-44, wherein the actions further comprise detecting a difference between the generated at least one calibration rule and one or more previously generated calibration rules.

46. The computer-readable medium of claim 25, wherein the mobile platform comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous automobile, a self-balancing vehicle, a robot, a smart wearable device, a virtual reality (VR) head-mounted display, or an augmented reality (AR) head-mounted display.

47. The computer-readable medium of any of claims 25-46, wherein the actions further comprise calibrating the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

48. A vehicle comprising a programmed controller that controls, at least in part, one or more motions of the vehicle, wherein the programmed controller comprises one or more processors configured to:

transforming point cloud information obtained from a first emitter/detector unit into a first point cloud in a reference frame associated with the vehicle;

transforming point cloud information obtained from a second emitter/detector unit into a second point cloud in a reference frame associated with the vehicle;

determining an overlap region between the first point cloud and the second point cloud;

comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and

generating at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on comparing the surface attributes.

49. The vehicle of claim 48, wherein the point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules.

50. The vehicle of claim 49, wherein the first set of transformation rules is defined at least in part according to a position and orientation of the first emitter/detector unit relative to the vehicle.

51. The vehicle of any of claims 49 or 50, wherein the point cloud information obtained from the second emitter/detector unit is transformed based at least in part on a second set of transformation rules, and wherein the second set of transformation rules is different from the first set of transformation rules.

52. The vehicle of any one of claims 49-51, wherein the reference frame associated with the vehicle comprises a coordinate system.

53. The vehicle of any of claims 49-52, wherein the first set of transformation rules comprises a transformation matrix.

54. The vehicle of claim 48, wherein the first emitter/detector unit comprises at least one laser sensor.

55. The vehicle of claim 54, wherein the field of view (FOV) of the at least one laser sensor is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees.

56. The vehicle of any one of claims 54 or 55, wherein the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to one another.

57. The vehicle of claim 48, wherein determining an overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud.

58. The vehicle of any one of claims 48 or 57, wherein determining an overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud.

59. The vehicle of any of claims 48, 57, or 58, wherein the tree-structured data structure comprises a K-dimensional (KD) tree data structure.

60. The vehicle of claim 48, wherein comparing the surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud.

61. The vehicle of claim 60, wherein matching the surface associated with the first point cloud with the surface associated with the second point cloud comprises: determining normal vector information for at least a portion of the first point cloud.

62. The vehicle of any one of claims 48, 60, or 61, wherein comparing surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region.

63. The vehicle of claim 62, wherein the objective function includes a rotation component and a translation component.

64. The vehicle of claim 63, wherein comparing surface attributes further comprises: holding the translation component fixed.

65. The vehicle of any one of claims 62-64, wherein generating at least one calibration rule comprises: optimizing the objective function.

66. The vehicle of claim 65, wherein optimizing the objective function is based at least in part on a least squares method.

67. The vehicle of any one of claims 48 or 60-66, wherein the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit.

68. The vehicle of any one of claims 48 or 60-67, wherein the one or more processors are further configured to detect a difference between the generated at least one calibration rule and one or more previously generated calibration rules.

69. The vehicle of claim 48, wherein the vehicle comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous vehicle, a self-balancing vehicle, or a robot.

70. The vehicle of any one of claims 48-69, wherein the one or more processors are further configured to calibrate the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

71. A computer-implemented method for automatically detecting interference to an emitter/detector unit carried by a mobile platform, comprising:

transforming first point cloud information into a first point cloud in a reference frame associated with the mobile platform, the first point cloud information obtained from the emitter/detector unit at a first point in time;

transforming second point cloud information into a second point cloud in a reference frame associated with the mobile platform, the second point cloud information obtained from the emitter/detector unit at a second point in time;

determining an overlap region between the first point cloud and the second point cloud;

comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and

detecting interference to the emitter/detector unit based at least in part on comparing the surface attributes.

72. The method of claim 71, wherein the first point cloud information obtained from the emitter/detector unit is transformed based at least in part on a set of transformation rules.

73. The method of claim 72, wherein the set of transformation rules is defined at least in part according to a position and orientation of the emitter/detector unit relative to the mobile platform.

74. The method of any one of claims 71-73, wherein the reference frame associated with the mobile platform comprises a coordinate system.

75. The method of any of claims 71-74, wherein a reference frame associated with the mobile platform corresponds to a reference frame for the first point in time.

76. The method of any of claims 71-75, wherein the set of transformation rules comprises a transformation matrix.

77. The method of claim 71, wherein the emitter/detector unit comprises at least one laser sensor.

78. The method of claim 77, wherein the field of view (FOV) of the at least one laser sensor is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees.

79. The method of any one of claims 77 or 78, wherein the emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other.

80. The method of claim 71, wherein determining an overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud.

81. The method of any of claims 71 or 80, wherein determining an overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud.

82. The method of any one of claims 71, 80, or 81, wherein the tree-structured data structure comprises a K-dimensional (KD) tree data structure.

83. The method of claim 71, wherein comparing surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud.

84. The method of claim 83, wherein matching the surface associated with the first point cloud with the surface associated with the second point cloud comprises: determining normal vector information for at least a portion of the first point cloud.

85. The method of any one of claims 71, 83, or 84, wherein comparing surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region.

86. The method of claim 85, wherein the objective function includes a rotation component and a translation component.

87. The method of any one of claims 85 or 86, wherein detecting interference to the emitter/detector unit comprises optimizing the objective function.

88. The method of claim 87, wherein optimizing the objective function is based at least in part on a least squares method.

89. The method of any one of claims 71 or 83-88, wherein detecting interference to the emitter/detector unit further comprises: generating at least one rule for transforming between the first point cloud and the second point cloud.

90. The method of claim 89, wherein interference to the emitter/detector unit is detected further based on a relationship between the generated at least one rule and a threshold.

91. The method of claim 71, wherein the mobile platform comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous automobile, a self-balancing vehicle, a robot, a smart wearable device, a virtual reality (VR) head-mounted display, or an augmented reality (AR) head-mounted display.

92. The method of any of claims 71-91, further comprising issuing a warning in response to detecting interference to the emitter/detector unit.

Technical Field

The technology of the present disclosure relates generally to calibration of emitter/detector sensors (e.g., laser sensors) carried by a mobile platform.

Background

Laser sensors, such as LiDAR sensors, typically emit pulsed laser signals outward, detect pulsed signal reflections, and measure three-dimensional information (e.g., laser scan points) in the environment to facilitate environment mapping. To achieve accurate mapping of the environment around a mobile platform, an omnidirectional laser sensor with a 360 degree horizontal field of view (FOV) is typically mounted on the mobile platform to constantly scan its surroundings. Omnidirectional laser sensors are typically expensive, non-customizable, and have a limited vertical FOV. Accordingly, there remains a need for improved sensing techniques and apparatus for mobile platforms.

Disclosure of Invention

The following summary is provided for the convenience of the reader and indicates some representative embodiments of the presently disclosed technology.

In some embodiments, a computer-implemented method for automatically calibrating at least a first emitter/detector unit and a second emitter/detector unit, both carried by a common mobile platform, comprises: transforming point cloud information obtained from the first emitter/detector unit into a first point cloud in a reference frame associated with the mobile platform; transforming point cloud information obtained from the second emitter/detector unit into a second point cloud in a reference frame associated with the mobile platform; determining an overlap region between the first point cloud and the second point cloud; comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and generating at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on comparing the surface attributes. In some embodiments, the point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules, the first set of transformation rules defined at least in part according to a position and orientation of the first emitter/detector unit relative to the mobile platform. In some embodiments, the reference frame associated with the mobile platform comprises a coordinate system. In some embodiments, the first set of transformation rules comprises a transformation matrix. In some embodiments, the first emitter/detector unit comprises at least one laser sensor having a field of view (FOV) that is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees. In some embodiments, the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other. In some embodiments, determining the overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud. In some embodiments, determining the overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud. In some embodiments, comparing the surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud. In some embodiments, comparing the surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region. In some embodiments, generating at least one calibration rule comprises optimizing the objective function. In some embodiments, the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit. In some embodiments, the computer-implemented method further comprises detecting a difference between the generated at least one calibration rule and one or more previously generated calibration rules. In some embodiments, the computer-implemented method further comprises calibrating the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

In other embodiments, a non-transitory computer-readable medium stores computer-executable instructions. The instructions, when executed, cause one or more processors associated with a mobile platform to perform acts comprising: transforming point cloud information obtained from a first emitter/detector unit into a first point cloud in a reference frame associated with the mobile platform; transforming point cloud information obtained from a second emitter/detector unit into a second point cloud in a reference frame associated with the mobile platform; determining an overlap region between the first point cloud and the second point cloud; comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and generating at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on comparing the surface attributes. In some embodiments, the point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules, the first set of transformation rules defined at least in part according to a position and orientation of the first emitter/detector unit relative to the mobile platform. In some embodiments, the reference frame associated with the mobile platform comprises a coordinate system. In some embodiments, the first set of transformation rules comprises a transformation matrix. In some embodiments, the first emitter/detector unit comprises at least one laser sensor having a field of view (FOV) that is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees. In some embodiments, the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other. In some embodiments, determining the overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud. In some embodiments, determining the overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud. In some embodiments, comparing the surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud. In some embodiments, comparing the surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region. In some embodiments, generating the at least one calibration rule comprises optimizing the objective function. In some embodiments, the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit. In some embodiments, the actions further include detecting a difference between the generated at least one calibration rule and one or more previously generated calibration rules. In some embodiments, the actions further comprise calibrating the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

In still other embodiments, a vehicle includes a programmed controller that at least partially controls one or more motions of the vehicle. The programmed controller comprises one or more processors configured to: transform point cloud information obtained from a first emitter/detector unit into a first point cloud in a reference frame associated with the vehicle; transform point cloud information obtained from a second emitter/detector unit into a second point cloud in a reference frame associated with the vehicle; determine an overlap region between the first point cloud and the second point cloud; compare surface attributes of the first point cloud and the second point cloud in the overlap region; and generate at least one calibration rule for calibration between the first emitter/detector unit and the second emitter/detector unit based at least in part on the comparison of the surface attributes. In some embodiments, the point cloud information obtained from the first emitter/detector unit is transformed based at least in part on a first set of transformation rules, the first set of transformation rules defined at least in part according to a position and orientation of the first emitter/detector unit relative to the vehicle. In some embodiments, the point cloud information obtained from the second emitter/detector unit is transformed based at least in part on a second set of transformation rules, which is different from the first set of transformation rules. In some embodiments, the first set of transformation rules comprises a transformation matrix. In some embodiments, the first emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other. In some embodiments, determining the overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud. In some embodiments, comparing the surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud, wherein the matching comprises determining normal vector information for at least a portion of the first point cloud. In some embodiments, comparing the surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region, the objective function comprising a rotation component and a translation component. In some embodiments, the at least one calibration rule comprises: rules for transforming between the coordinate systems of the first emitter/detector unit and the second emitter/detector unit. In some embodiments, the one or more processors are further configured to detect a difference between the generated at least one calibration rule and one or more previously generated calibration rules. In some embodiments, the vehicle comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous vehicle, a self-balancing vehicle, or a robot. In some embodiments, the one or more processors are further configured to calibrate the first emitter/detector unit and the second emitter/detector unit according to the at least one calibration rule.

In still other embodiments, a computer-implemented method for automatically detecting interference to an emitter/detector unit carried by a mobile platform comprises: transforming first point cloud information into a first point cloud in a reference frame associated with the mobile platform, the first point cloud information obtained from the emitter/detector unit at a first point in time; transforming second point cloud information into a second point cloud in a reference frame associated with the mobile platform, the second point cloud information obtained from the emitter/detector unit at a second point in time; determining an overlap region between the first point cloud and the second point cloud; comparing surface attributes of the first point cloud and the second point cloud in the overlap region; and detecting interference to the emitter/detector unit based at least in part on comparing the surface attributes. In some embodiments, the first point cloud information obtained from the emitter/detector unit is transformed based at least in part on a set of transformation rules, the set of transformation rules defined at least in part according to a position and orientation of the emitter/detector unit relative to the mobile platform. In some embodiments, the reference frame associated with the mobile platform comprises a coordinate system. In some embodiments, the reference frame associated with the mobile platform corresponds to a reference frame for the first point in time. In some embodiments, the set of transformation rules comprises a transformation matrix. In some embodiments, the emitter/detector unit includes at least one laser sensor having a field of view (FOV) that is less than at least one of 360 degrees, 180 degrees, 90 degrees, or 60 degrees. In some embodiments, the emitter/detector unit comprises a plurality of laser sensors rigidly fixed relative to each other. In some embodiments, determining the overlap region comprises: determining, for at least one point in the first point cloud, at least a nearest neighbor point in the second point cloud. In some embodiments, determining the overlap region comprises: creating a tree-structured data structure for at least one of the first point cloud or the second point cloud, wherein the tree-structured data structure comprises a K-dimensional (KD) tree data structure. In some embodiments, comparing the surface attributes comprises: matching a surface associated with the first point cloud with a surface associated with the second point cloud, wherein the matching comprises determining normal vector information for at least a portion of the first point cloud. In some embodiments, comparing the surface attributes further comprises: evaluating an objective function defined at least in part by a plurality of points of the first point cloud and the second point cloud within the overlap region, the objective function comprising a rotation component and a translation component. In some embodiments, detecting interference to the emitter/detector unit comprises: optimizing the objective function based at least in part on a least squares method. In some embodiments, detecting interference to the emitter/detector unit further comprises: generating at least one rule for transforming between the first point cloud and the second point cloud. In some embodiments, the mobile platform comprises at least one of: an unmanned aerial vehicle (UAV), a manned aircraft, an autonomous vehicle, a self-balancing vehicle, a ground robot, a smart wearable device, a virtual reality (VR) head-mounted display, or an augmented reality (AR) head-mounted display. In some embodiments, the computer-implemented method further comprises issuing a warning in response to detecting interference to the emitter/detector unit.

Drawings

Fig. 1A illustrates a scan pattern of a laser sensor that may be used in accordance with some embodiments of the disclosed technology.

Fig. 1B illustrates a three-dimensional point cloud generated by a laser sensor, in accordance with some embodiments of the presently disclosed technology.

Fig. 2 illustrates a layout of multiple laser sensors carried by a mobile platform to achieve wide-angle (e.g., omnidirectional) horizontal field of view (FOV) coverage, in accordance with some embodiments of the presently disclosed technology.

Fig. 3 illustrates a laser unit including multiple laser sensors that may be used in accordance with some embodiments of the disclosed technology.

Fig. 4 illustrates a layout of multiple laser units carried by a mobile platform to achieve wide-angle (e.g., omnidirectional) horizontal FOV coverage, in accordance with some embodiments of the presently disclosed technology.

Fig. 5 illustrates a mobile platform with two laser units (or laser sensors) having overlapping FOVs in accordance with some embodiments of the presently disclosed technology.

Fig. 6 illustrates a calibration process for two laser units (or laser sensors) in accordance with some embodiments of the disclosed technology.

Fig. 7A illustrates movement of a mobile platform from a first point in time to a second point in time, in accordance with some embodiments of the presently disclosed technology.

Fig. 7B illustrates a process for detecting interference to a laser unit in accordance with some embodiments of the disclosed technology.

Fig. 8 illustrates one frame of laser scan points from a laser sensor in accordance with some embodiments of the disclosed technology.

Fig. 9 illustrates a sequence of point data frames generated by a laser unit (or laser sensor) carried by a mobile platform moving over a period of time, in accordance with some embodiments of the presently disclosed technology.

Fig. 10 illustrates a combined point cloud generated in accordance with some embodiments of the presently disclosed technology.

Fig. 11 illustrates a mobile platform carrying multiple sensors in addition to a laser unit, in accordance with some embodiments of the disclosed technology.

Fig. 12 illustrates information that may be provided by the plurality of sensors of fig. 11 in accordance with some embodiments of the presently disclosed technology.

Fig. 13 illustrates data collection frequency differences for the multiple sensors and laser units of Fig. 11 in accordance with some embodiments of the presently disclosed technology.

Fig. 14 illustrates a process for combining temporally sequential point information generated by laser units to form a point cloud in accordance with some embodiments of the presently disclosed technology.

Fig. 15 illustrates an example of a mobile platform configured in accordance with some embodiments of the presently disclosed technology.

FIG. 16 is a block diagram illustrating a representative architecture of a computer system or other control device that may be used to implement portions of the techniques of this disclosure.

Detailed Description

1. Overview

To achieve accurate and comprehensive environmental mapping while overcoming the drawbacks associated with omnidirectional laser sensors, multiple laser sensors (e.g., strategically attached to a mobile platform) may be used to achieve wide horizontal field of view (FOV) coverage, omnidirectional horizontal coverage, partial or full spherical coverage, or any other suitable customized coverage of the surrounding environment. As used herein, a laser sensor with a limited FOV generally refers to a laser sensor with a horizontal FOV of less than 360 degrees, 180 degrees, 90 degrees, or 60 degrees; such sensors can be much cheaper than omnidirectional laser sensors. Because multiple sensors are typically used to achieve the desired angular coverage, proper calibration between the sensors is needed to accurately align the point cloud data generated by the different sensors and thereby provide a meaningful and reliable map of the surrounding environment. Incorrect calibration between sensors may distort the alignment of the point clouds, causing errors in the environment mapping process and, in turn, undesirable changes to the navigation, movement, and/or other functions of the mobile platform. Furthermore, when a mobile platform is actively deployed, external vibrations or other disturbances may change the position or orientation of an originally fixed sensor, causing calibration errors. Accurately detecting such changes in real time can therefore further contribute to the reliability and safety of the mobile platform.

The technology disclosed herein relates generally to calibrating laser sensors carried by a mobile platform and/or detecting errors in such sensors. As discussed in further detail below, some embodiments of the presently disclosed technology include a multi-laser calibration method that accounts for at least: 1) the limited FOV of individual laser sensors; and 2) the desire or requirement for omnidirectional coverage, spherical coverage, and/or other customizable coverage achieved by a multi-laser system. In some embodiments, the techniques of the present disclosure reduce the number of calibrations within a system by using individual laser units that each include a plurality of fixed laser sensors. In some embodiments, the techniques of the present disclosure use methods for detecting a common overlap region between point clouds generated by at least two laser sensors or units, and for matching or comparing the surfaces of the point clouds within the overlap region, to generate high-precision calibration rules for the laser sensors or units.

For clarity, the following description does not set forth several details of structures or processes that are well known and commonly associated with mobile platforms (e.g., UAVs or other types of movable objects) and corresponding systems and subsystems, because doing so may unnecessarily obscure some important aspects of the techniques of this disclosure. Furthermore, while the following disclosure sets forth several embodiments of different aspects of the techniques of the disclosure, some other embodiments may have configurations or components different from those described herein. Accordingly, the techniques of this disclosure may have other embodiments with additional elements and/or without several of the elements described below with reference to Figs. 1-16.

Figs. 1-16 are provided to illustrate representative embodiments of the presently disclosed technology. The drawings are not intended to limit the scope of the claims in this application unless otherwise specified.

Many embodiments of the present technology described below may take the form of computer- or controller-executable instructions, including routines executed by a programmable computer or controller. The programmable computer or controller may or may not reside on the respective mobile platform. For example, the programmable computer or controller may be an on-board computer of the mobile platform, or a separate but dedicated computer associated with the mobile platform, or part of a network-based or cloud-based computing service. One skilled in the relevant art will appreciate that the techniques can be implemented on a computer or controller system other than those shown and described below. The techniques may be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions described below. Thus, the terms "computer" and "controller" are used generically herein to refer to any data processor, and may include internet devices and handheld devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, mini-computers, and the like). The information processed by these computers and controllers may be presented on any suitable display medium, including an LCD (liquid crystal display). Instructions for performing computer or controller-executable tasks may be stored on or in any suitable computer-readable medium including hardware, firmware, or a combination of hardware and firmware. The instructions may be contained in any suitable storage device, including, for example, a flash drive, a Universal Serial Bus (USB) device, and/or other suitable medium.

2. Representative examples

Fig. 1A illustrates a scan pattern 102a of a laser sensor that may be used in accordance with some embodiments of the disclosed technology. As shown in Fig. 1A, the FOV of the laser sensor is limited (e.g., no greater than 60 degrees in both the horizontal and vertical directions). Fig. 1B illustrates a three-dimensional point cloud generated by a laser sensor (e.g., the laser sensor shown in Fig. 1A). Compared with a conventional omnidirectional laser sensor, a laser sensor with a limited FOV cannot provide a uniformly distributed 360-degree three-dimensional point cloud, but may provide denser or sparser scan points in certain portions of its FOV (e.g., a higher point density in the forward region of the sensor FOV, as shown in Fig. 1B).

Fig. 2 illustrates a layout of multiple laser sensors 204 carried by a mobile platform 202 to achieve wide-angle (e.g., omnidirectional) horizontal FOV coverage, in accordance with some embodiments of the presently disclosed technology. The laser sensors 204 may be distributed and/or oriented differently to achieve wide-angle FOV coverage, partial or full spherical coverage, or other customizable FOV coverage. In the illustrative layout of Fig. 2, depending on the FOV of each laser sensor 204, the overlap region 206 between the FOVs of two adjacent sensors 204 may be limited, which may provide insufficient geometric information for high-precision calibration. In addition, depending on the number of laser sensors 204 needed to achieve wide-angle coverage, the number of calibrations between sensors may be large. For example, even if only pair-wise calibration between adjacent sensors 204 is performed, at least twelve sets of calibration rules need to be determined for the layout shown in Fig. 2.

The disclosed technology may: 1) achieve an overlap region of sufficiently large size between sensors (e.g., exceeding a threshold of overlapping FOVs); 2) strategically distribute and orient multiple sensors relative to a mobile platform to achieve wide-angle (e.g., omnidirectional) coverage, spherical coverage, and/or other customizable coverage; and/or 3) reduce the number of laser sensors used. In this regard, Fig. 3 illustrates a laser unit 300 including a plurality of laser sensors 302, 304, and 306 that may be used in accordance with some embodiments of the disclosed technology.

Referring to Fig. 3, the laser unit 300 may include a mechanical structure 308 (e.g., a metal frame) to which two or more laser sensors having a limited FOV are rigidly or fixedly connected (e.g., welded to the metal frame) in a certain position and orientation relative to each other. In some embodiments, the mechanical structure 308 may correspond to a portion of a mobile platform. In the embodiment shown in Fig. 3, the three laser sensors 302, 304, and 306 are positioned and oriented in a manner that extends the 60-degree horizontal FOV coverage of each sensor to an approximately 160-degree horizontal FOV for the laser unit 300. Illustratively, the angle between adjacent laser sensors may be set to 50 degrees to allow for proper FOV overlap (60° + 50° + 50° = 160°, leaving a 10-degree overlap between the FOVs of adjacent sensors).

The calibration rules for calibration between or among the laser sensors within the laser unit 300 may be known and fixed. For example, the laser sensors within one laser unit 300 may be pre-calibrated relative to each other manually using the same calibration techniques as disclosed herein, or based on calibration methods known to those skilled in the relevant art. As described above, the relative positions and orientations of the plurality of laser sensors are unlikely to change within the laser unit 300. Thus, based on pre-calibration, the laser unit 300 may consistently generate or otherwise output point cloud data covering a wider FOV than each constituent laser sensor.

Fig. 4 illustrates a layout of multiple laser units 404 carried by a mobile platform 402 to achieve wide-angle (e.g., omnidirectional) horizontal FOV coverage, in accordance with some embodiments of the presently disclosed technology. Each laser unit 404 may have a configuration similar to that of the laser unit 300 shown in Fig. 3. The laser units 404 (possibly in conjunction with the laser sensors 204 described above with reference to Fig. 2) may be distributed and/or oriented differently to achieve wide-angle FOV coverage, partial or full spherical coverage, or other customizable FOV coverage. Referring to Fig. 4, four laser units 404 are distributed and oriented in four respective directions, for example, at 45 degrees, 135 degrees, 225 degrees, and 315 degrees, according to a coordinate system centered on the mobile platform 402. As previously described, the laser sensors within each laser unit 404 are fixed relative to each other. Thus, the number of calibrations (e.g., between laser units 404) is reduced compared to the configuration of Fig. 2. For example, if only pair-wise calibration between adjacent laser units 404 is performed, only four sets of calibration rules need to be determined. At the same time, the overlap region between the FOVs of adjacent laser units 404 may be large enough to provide sufficient geometric information to enhance the accuracy of the calibration between the laser units 404.

Fig. 5 illustrates a mobile platform 512 having a first laser unit (or laser sensor) 508 and a second laser unit 510, the first and second laser units 508 and 510 having respective FOVs 504 and 506 that overlap, in accordance with some embodiments of the presently disclosed technology. As shown in Fig. 5, the two laser units 508 and 510 are arranged toward the front of the mobile platform, and the relative distance between them is limited so as to provide an overlap region 502 of sufficient size. In other embodiments, the laser units or sensors may be arranged differently to provide an overlap region of sufficient size. Referring to Fig. 5, illustratively, a set of indicators "x" represents the point cloud 514 generated by the first laser unit 508, and a set of indicators "o" represents the point cloud 516 generated by the second laser unit 510.

With continued reference to Fig. 5, the mobile platform 512 may be associated with a coordinate system $F^r$, the first laser unit 508 may be associated with a coordinate system $F^{l_1}$, and the second laser unit 510 may be associated with a coordinate system $F^{l_2}$. The initial transformation matrices for transforming coordinates between the mobile platform coordinate system $F^r$ and the laser unit coordinate systems $F^{l_1}$ and $F^{l_2}$ may be expressed as $T^r_{l_1}$ and $T^r_{l_2}$, respectively. In some embodiments, the initial transformation matrices may be determined prior to deployment of the mobile platform, for example, as a function of the position and orientation of the respective laser unit relative to the mobile platform. The initial transformation matrices may be determined using suitable manual or semi-automatic calibration methods known to those skilled in the relevant art. In some embodiments, the transformation matrices are homogeneous matrices of size 4 × 4. In some embodiments, an initial transformation matrix $T^{l_1}_{l_2}$ for transforming directly between the coordinate systems of the first and second laser units 508 and 510 may also be determined prior to deployment of the mobile platform, based on $T^r_{l_1}$ and $T^r_{l_2}$.
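As a concrete illustration, the sketch below (Python with NumPy; the mounting offsets and angle are hypothetical values, not taken from this disclosure) assembles a 4 × 4 homogeneous transformation matrix of the kind described above and uses it to project a laser unit's point cloud into the platform coordinate system $F^r$:

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])  # (N, 4)
    return (homogeneous @ T.T)[:, :3]

# Hypothetical mounting of the first laser unit: 0.2 m forward and 0.1 m left of
# the platform origin, rotated 45 degrees about the vertical (z) axis.
theta = np.deg2rad(45.0)
R_l1 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0,            0.0,           1.0]])
T_r_l1 = make_transform(R_l1, np.array([0.2, 0.1, 0.0]))  # plays the role of T^r_l1

cloud_l1 = np.random.rand(100, 3)             # points in the laser unit frame F^l_1
cloud_r = transform_points(T_r_l1, cloud_l1)  # the same points in the platform frame F^r
```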

Fig. 6 illustrates a calibration process for two laser units (or laser sensors) in accordance with some embodiments of the disclosed technology. The calibration process of Fig. 6 may be performed by a controller (e.g., an on-board computer of the mobile platform, an associated computing device, and/or an associated computing service). In step 605, the process includes transforming the point cloud information obtained by the two laser units into point clouds in a reference frame, based on initial transformation rules. For example, the laser point clouds 514 and 516 are captured by the two laser units 508 and 510 in their respective coordinate systems $F^{l_1}$ and $F^{l_2}$. The laser point clouds 514 and 516 may be projected onto the coordinate system $F^r$ of the mobile platform 512 based on the initial transformation matrices $T^r_{l_1}$ and $T^r_{l_2}$, respectively, to form a corresponding first transformed point cloud $P^r_1$ and second transformed point cloud $P^r_2$.

In step 610, the process includes determining an overlap region between the transformed point clouds in the reference frame. Illustratively, assuming the initial transformation matrices $T^r_{l_1}$ and $T^r_{l_2}$ are correctly determined (e.g., are close to their true values), the surface shapes of the transformed point clouds $P^r_1$ and $P^r_2$ in the overlap region should not deviate significantly from each other. Illustratively, the controller builds K-dimensional (KD) tree structures separately for the two point clouds $P^r_1$ and $P^r_2$ projected onto the coordinate system of the mobile platform 512. With this type of tree structure, the nearest neighbor in the second point cloud $P^r_2$ of any point in the first point cloud $P^r_1$ can be quickly retrieved, and vice versa. If the distance between a query point in the first point cloud (e.g., $P^r_1$) and its nearest neighbor in the second point cloud (e.g., $P^r_2$) is less than a specified threshold, the process may include marking or classifying the query point as being located in the overlap region. The process may identify all query points in one or both of the point clouds $P^r_1$ and $P^r_2$ that meet the threshold requirement, thereby determining the overlap region.
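A minimal sketch of this overlap test, using SciPy's cKDTree as the KD-tree implementation (the 0.1 m distance threshold is an assumed value; the disclosure does not fix a specific number):

```python
import numpy as np
from scipy.spatial import cKDTree

def overlap_mask(cloud_a, cloud_b, max_dist=0.1):
    """Mark points of cloud_a whose nearest neighbor in cloud_b lies within max_dist.
    Both clouds are assumed to be expressed in the platform frame F^r already."""
    tree_b = cKDTree(cloud_b)         # KD-tree over the second point cloud
    dists, _ = tree_b.query(cloud_a)  # nearest-neighbor distance for each query point
    return dists < max_dist           # True = point lies in the overlap region

# Stand-ins for the transformed point clouds P^r_1 and P^r_2.
cloud_1 = np.random.rand(1000, 3)
cloud_2 = np.random.rand(1000, 3)
overlap_1 = cloud_1[overlap_mask(cloud_1, cloud_2)]
overlap_2 = cloud_2[overlap_mask(cloud_2, cloud_1)]
```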

In step 615, the process includes evaluating the match between the surfaces of the transformed point clouds in the overlap region. Illustratively, for each point of a particular transformed point cloud (e.g., $P^r_1$) in the overlap region, its nearest neighbors within that same point cloud can be quickly retrieved by using the corresponding KD-tree structure established in step 610. Thus, for each point of a particular transformed point cloud, a specified number of (nearest) neighboring points may be selected to form a plane, from which the normal vector corresponding to that point can readily be determined.
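The plane-fitting step can be sketched as follows: each point's normal is taken as the eigenvector of its local neighborhood covariance with the smallest eigenvalue, a standard PCA-based estimate (the neighborhood size k = 10 is an assumption, not a value from the disclosure):

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(cloud, k=10):
    """Estimate a unit normal per point by fitting a plane to its k nearest neighbors."""
    tree = cKDTree(cloud)
    _, nbr_idx = tree.query(cloud, k=k)  # (N, k) indices of nearest neighbors
    normals = np.empty_like(cloud)
    for i, nbrs in enumerate(nbr_idx):
        pts = cloud[nbrs] - cloud[nbrs].mean(axis=0)  # centered neighborhood
        _, eigvecs = np.linalg.eigh(pts.T @ pts)      # eigenvalues in ascending order
        normals[i] = eigvecs[:, 0]                    # smallest-eigenvalue direction
    return normals
```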

In some embodiments, to match the surfaces represented or indicated by the point clouds in the overlap region, the controller may implement a point-to-plane iterative closest point (point-to-plane ICP) method. Illustratively, after finding the overlap region and determining the normal vectors of at least one of the point clouds within the overlap region, the controller may use the point-to-plane ICP method to minimize the following objective function:

$$\min_{\mathbf{R},\,\mathbf{t}} \; \sum_{i} \left( \left( \mathbf{R}\, p_i + \mathbf{t} - q_i \right) \cdot n_i \right)^2$$

where p_i represents a point within the overlap region of the first point cloud P1, q_i represents the closest point within the overlap region of the second point cloud P2, R represents a rotation matrix, t represents a translation vector, and n_i denotes the normal vector determined at p_i. When a sufficient number of points (e.g., exceeding a threshold number) is available in the overlap region, the minimization may be achieved based on, for example, a least squares method.
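Under a small-angle linearization, one iteration of the point-to-plane minimization above reduces to a linear least-squares problem in six unknowns. The sketch below shows one conventional way to solve that step and is not taken from the source; in a full implementation it would alternate with the closest-point search until the objective stops decreasing:

```python
import numpy as np

def point_to_plane_step(p: np.ndarray, q: np.ndarray, n: np.ndarray):
    """One linearized point-to-plane ICP step (small-angle approximation).

    p: (N, 3) points of the first cloud in the overlap region
    q: (N, 3) corresponding nearest points in the second cloud
    n: (N, 3) unit normals associated with the correspondences
    Returns R (3x3) and t (3,) approximately minimizing
    sum_i ((R p_i + t - q_i) . n_i)^2.
    """
    # Linearize R ~ I + [w]x, so the unknown is x = [w, t] in R^6:
    # residual_i = (p_i x n_i) . w + n_i . t + (p_i - q_i) . n_i
    A = np.hstack([np.cross(p, n), n])         # (N, 6) Jacobian rows
    b = -np.einsum('ij,ij->i', p - q, n)       # (N,) signed plane distances
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solve
    w, t = x[:3], x[3:]
    # Re-orthogonalize the small-angle rotation via Rodrigues' formula.
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3), t
    k = w / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return R, t
```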

In step 620, the process includes generating a calibration rule for calibrating between the two laser units or sensors. Illustratively, when the objective function is minimized, an optimal transformation solution (e.g., optimal values of the rotation matrix R and translation vector t) is achieved. In some embodiments, translational changes between laser units or sensors are less likely to occur because the different laser units or sensors are fixedly connected (e.g., by a bracket). However, the laser units or sensors are more likely to rotate, for example when they are connected to the bracket by screws. In these embodiments, the translation vector t may be fixed to a constant (e.g., a value determined from a prior minimization of the objective function) so that the controller may more efficiently estimate the rotation matrix R. In some embodiments, after generating the calibration rule, the controller calibrates the two laser units or sensors based on it. Illustratively, the controller may use the rotation matrix R and the translation vector t to align the point clouds generated by the two laser units or sensors until the rule is updated in the next round of calibration.

In step 625, the process includes comparing the newly generated calibration rule with one or more previously generated calibration rules. Illustratively, the on-board computer may compare the newly determined optimal values of the rotation matrix R and/or the translation vector t with the optimal values determined in an initial round of calibration, the most recent round of calibration, an average or weighted average of several recent rounds of calibration, and so on. In step 630, the process includes determining whether the difference resulting from the comparison in step 625 exceeds a threshold. If not, the process returns to step 605 for a new round of calibration. If the difference exceeds the threshold, the process proceeds to step 635.
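One concrete difference measure for steps 625/630 is the geodesic angle between the old and new rotation matrices together with the norm of the translation change. The sketch below assumes this measure, and the threshold values shown are illustrative, not from the source:

```python
import numpy as np

def calibration_drift(R_new, t_new, R_old, t_old):
    """Scalar measures of how far a new calibration (R_new, t_new) deviates
    from a previous one: the geodesic rotation angle between the two
    rotation matrices, and the translation difference norm."""
    dR = R_new @ R_old.T
    cos_angle = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)  # numerical safety
    return np.arccos(cos_angle), np.linalg.norm(t_new - t_old)

# Illustrative thresholds (assumed values):
ANGLE_THRESH_RAD = np.deg2rad(1.0)
TRANS_THRESH_M = 0.05
```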

In step 635, the process includes taking one or more further actions. A difference exceeding the threshold may indicate that the two laser units or sensors are no longer reliably calibrated with respect to each other. For example, the physical position or orientation of at least one of the laser units or sensors may have deviated substantially from its preset configuration. In this case, the controller may issue a warning to the operator of the mobile platform. Alternatively, the controller may suspend the navigation or other functions of the mobile platform in a safe manner.

Fig. 7A illustrates the movement of a mobile platform 702 from a first point in time t1 to a second point in time t2, in accordance with some embodiments of the presently disclosed technology. Illustratively, the transformation matrix T^r_12 for transforming between the coordinate system F^r_1 of the mobile platform at time t1 and the coordinate system F^r_2 of the mobile platform at time t2 is known or can be determined, for example, based on measurements taken by a global positioning system and/or an inertial measurement unit carried by the mobile platform 702. Based on this known transformation relationship (e.g., the transformation matrix T^r_12) between the two positions of the mobile platform at the two different points in time, representative systems can detect disturbances to an individual laser unit (or laser sensor) 704 and/or calibrate the transformation between the coordinate systems of the laser unit and the mobile platform body.

Fig. 7B illustrates a process for detecting disturbances to a laser unit (e.g., the laser unit 704 carried by the mobile platform 702 as shown in fig. 7A) in accordance with some embodiments of the presently disclosed technology. The process may be implemented by a controller (e.g., an on-board computer of the mobile platform, an associated computing device, and/or an associated computing service). The FOVs of the laser unit 704 at times t1 and t2 may have a sufficiently large overlap region to facilitate a disturbance detection process similar to the calibration process described above with reference to fig. 6. Similar to generating a calibration rule for calibration between two different laser units, the process of fig. 6 may be modified to generate a calibration rule for calibrating a single laser unit at two points in time (or at two positions/orientations), and to detect disturbances to the laser unit based thereon.

Step 705 of the disturbance detection process comprises transforming the point cloud information obtained by the laser unit at the two points in time into corresponding point clouds in a reference frame. Illustratively, given (1) the transformation matrix T^r_12 of the mobile platform 702 between the two times t1 and t2, (2) an initial transformation matrix T^r_l(t1) between the mobile platform coordinate system F^r_1 and the laser unit coordinate system F^l_1 at time t1, and (3) an initial transformation matrix T^r_l(t2) between the mobile platform coordinate system F^r_2 and the laser unit coordinate system F^l_2 at time t2, the controller may project the point clouds acquired by the laser unit 704 at times t1 and t2 onto the coordinate system F^r_1 of the mobile platform at time t1. As previously mentioned, the initial values of the transformation matrix between the mobile platform and the laser unit may be known or may be predetermined according to the position and orientation of the laser unit relative to the mobile platform. The initial transformation matrix may be determined using suitable manual or semi-automatic calibration methods known to those skilled in the relevant art, for example, prior to deployment of the mobile platform. Accordingly, in some embodiments, T^r_l(t1) and T^r_l(t2) may be the same.

Step 710 of the disturbance detection process includes determining an overlap region between the transformed point clouds. Illustratively, the projected point clouds may be represented as P_1 (for time t1) and P_2 (for time t2), which in theory should coincide with each other in their overlap region. Similar to the calibration process of fig. 6, the controller may build KD-tree structures for the two projected point clouds and determine their respective subsets of points in the overlap region: P_1^o and P_2^o.

Step 715 of the disturbance detection process includes evaluating the match between the surfaces in the overlap region indicated or represented by the transformed point clouds. Illustratively, similar to the calibration process of fig. 6, the controller may compare the surfaces of the two projected point clouds within the overlap region. After the controller estimates a normal vector n_{1,i} for each point p_{1,i} within the overlap subset P_1^o, the controller may then evaluate the following objective function:

$$H(\mathbf{R},\,\mathbf{t}) = \sum_{i} \left( \left( \mathbf{R}\, p_{1,i} + \mathbf{t} - p_{2,i} \right) \cdot n_{1,i} \right)^2$$

where p_{1,i} represents a point within the overlap subset P_1^o, p_{2,i} represents its closest point within the overlap subset P_2^o, R represents a rotation matrix, and t represents a translation vector.

Step 720 of the disturbance detection process includes detecting whether there is a disturbance to the laser unit. Illustratively, the controller may minimize the objective function H, for example based on a least squares method. If the minimum of the function H exceeds a threshold, or if the rotation matrix R and/or the translation vector t exceeds or deviates beyond a certain threshold (e.g., relative to the identity rotation and zero translation that would be expected in theory), then in step 725 of the process the controller may determine that the laser unit 704 was disturbed between times t1 and t2 (e.g., due to a loosened screw or an impact from an external object). In this case, the process proceeds to step 730, where one or more further actions may be taken. For example, the controller may issue an alert to the operator of the mobile platform 702 and/or take other actions. If no disturbance is detected in step 725, the process returns to step 705. Similar to the calibration process of fig. 6, this disturbance detection process, which uses the coordinate systems of a single laser unit and the mobile platform body, may be performed repeatedly and periodically while the mobile platform is deployed.
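A compact way to express the decision in steps 720/725, assuming the objective H has already been minimized to obtain (R, t) and its residual value; all threshold values here are illustrative assumptions, not from the source:

```python
import numpy as np

def disturbance_detected(R, t, residual_min,
                         angle_thresh=np.deg2rad(2.0),
                         trans_thresh=0.1,
                         residual_thresh=0.05):
    """Flag a disturbance when the minimized objective H stays large, or
    when the optimal (R, t) between the two projected clouds deviates from
    the identity transform they should theoretically equal."""
    cos_angle = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.arccos(cos_angle)  # rotation away from identity
    return (residual_min > residual_thresh or
            angle > angle_thresh or
            np.linalg.norm(t) > trans_thresh)
```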

When certain laser units or sensors are used, the number and/or distribution of laser scan points in a single frame may not provide a sufficiently dense point cloud to facilitate calibration, mapping, object detection, and/or localization. This problem can be particularly pronounced with low-cost, small-angle LiDAR sensors. For example, a typical low-cost, small-angle LiDAR may produce fewer than 4000, or even fewer than 2000, laser points in a single frame, whereas a more expensive omnidirectional LiDAR may produce 288,000 laser scan points in a single frame.

Fig. 8 shows a frame 800 of laser scan points produced by a laser sensor (e.g., a small-angle laser sensor implementing the scan pattern 102a of fig. 1). As shown, a sparse set of laser scan points 810 is distributed in a non-uniform manner in a three-dimensional coordinate system 820. In some embodiments, the sparse and non-uniform distribution of the points 810 may not provide enough data in the overlap region to reliably perform the calibration processes disclosed herein.

FIG. 9 illustrates a sequence of frames of scan point data generated by a laser unit (or laser sensor) carried by a mobile platform as it moves over a period of time. As shown in fig. 9, a laser unit (or laser sensor) 915 carried by the mobile platform 910 generates multiple frames 920 of scan point data during the time period from t_i to t_{i+k}. For example, frame 920a is generated at time t_i with the mobile platform 910 (and laser unit or sensor) in a first position/orientation, frame 920b is generated at a subsequent time t_{i+1} with the mobile platform 910 (and laser unit or sensor) in a second position/orientation, and frame 920c is generated at a further subsequent time t_{i+2} with the mobile platform 910 (and laser unit or sensor) in a third position/orientation. Portions of the techniques of this disclosure may generate a combined point cloud based on point data sets (e.g., the frame sequence 920) having a temporal order.

Fig. 10 illustrates a combined point cloud 1000 generated in accordance with some embodiments of the presently disclosed technology. As shown in fig. 10, a dense set of laser scan points 1010, which combines multiple sets of laser scan points (e.g., sets similar to the laser scan points of frame 800 in fig. 8), is distributed in a relatively uniform manner in a three-dimensional coordinate system 1020 to provide comprehensive three-dimensional environmental information. In some embodiments, the calibration processes disclosed herein use such a combined point cloud rather than single-frame point data.

To combine multiple frames of point data in a manner that reduces noise and error, techniques of the present disclosure include estimating a relative transformation matrix between successive frames by using multiple types of sensors carried by a mobile platform.

Fig. 11 illustrates a mobile platform 1120 that carries multiple sensors in addition to a laser unit (or sensor), in accordance with some embodiments of the disclosed technology. As shown, in addition to the laser unit 1108, the mobile platform may also carry a stereo camera 1104, an inertial measurement unit 1106, a wheel encoder 1110, and/or a Global Positioning System (GPS) 1102. One skilled in the relevant art will appreciate that fewer, more, or alternative sensors may be used with the techniques of this disclosure. For example, instead of using the stereo camera 1104, a group, array, or system of multiple cameras may be used.

FIG. 12 illustrates the information that may be provided by the multiple sensors of fig. 11. The stereo camera 1104 may provide three-dimensional coordinates of environmental features 1202 (e.g., at one or more distinct points in the three-dimensional space of the surrounding environment), which can establish constraint relationships between successive frames (e.g., corresponding to observations from two different positions 1220a and 1220b). Illustratively, the sampling frequency or data acquisition rate of the stereo camera 1104 is between 20 Hz and 40 Hz. The inertial measurement unit 1106 may provide high-frequency acceleration and angular velocity information. Illustratively, the sampling frequency or data acquisition rate of the inertial measurement unit is 200 Hz or higher. By integration, a transformation matrix of the mobile platform 1120 between two consecutive frames may be computed. The wheel encoder 1110 may provide the rotational speed of the powered wheels (e.g., the rear wheels) and steering information for the front wheels, and may provide constraints on the forward speed and yaw angle between successive frames based on the known wheel dimensions. Illustratively, the sampling frequency or data acquisition rate of the wheel encoder is about 20 Hz. Depending on outdoor signal conditions, the GPS 1102 may provide the position and attitude information of the mobile platform 1120 in a global system. Illustratively, the sampling frequency or data acquisition rate of the GPS is below 5 Hz. Illustratively, the laser unit 1108 (e.g., including one or more LiDAR sensors) has a sampling frequency or data acquisition rate of 10 Hz.

The following table summarizes typical data acquisition frequency information for the representative sensors shown in figs. 11 and 12:

Sensor                          | Typical data acquisition frequency
Stereo camera 1104              | 20–40 Hz
Inertial measurement unit 1106  | 200 Hz or higher
Wheel encoder 1110              | about 20 Hz
GPS 1102                        | below 5 Hz
Laser unit 1108                 | 10 Hz

fig. 13 illustrates data collection frequency differences for the multiple sensors and laser units of fig. 11 in accordance with some embodiments of the presently disclosed technology.

fig. 14 illustrates a process for combining temporally sequential point information generated by laser units to form a point cloud in accordance with some embodiments of the presently disclosed technology. The process may be implemented by a controller (e.g., an on-board computer of a mobile platform, an associated computing device, and/or an associated computing service). As part of the techniques of this disclosure, generating a combined point cloud may include estimating the relative state associated with the laser unit over a period of time, rather than estimating all subsequent states relative to a global coordinate system. Illustratively, embodiments of the presently disclosed technology estimate the relative position information of the laser unit with respect to two or more different frames it generates over the time period, thereby enabling accurate accumulation of laser point data from the different frames of the time period. The method may facilitate or enhance subsequent calibration, object detection, mapping, and/or positioning operations.

Step 1405 of the process includes obtaining observation data, corresponding to a time period, from multiple observation sensors (e.g., the sensors shown in fig. 11). In some embodiments, methods in accordance with the techniques of this disclosure approximate the data from the different sensors as synchronized. For example, in a representative case, the data acquisition frequency of the target laser unit is 10 Hz, the frequency of the stereo camera is 40 Hz, the frequency of the wheel encoder is 20 Hz, the frequency of the inertial measurement unit is 200 Hz, and the frequency of the GPS is 5 Hz. As an approximation, the observation data from the different sensors can be treated as accurately aligned according to the different frequency multipliers. Thus, taking a 1-second time window as an example, the controller may obtain 200 accelerometer and gyroscope readings (from the inertial measurement unit), 40 frames of stereo camera observations, 20 sets of velocity and yaw angle observations (from the wheel encoders), and 5 GPS position readings. Based on these, embodiments of the presently disclosed technology can estimate the relative positions between the 10 laser unit data acquisition events, or their positions relative to a particular local coordinate system (e.g., the local coordinate system corresponding to the first of the 10 data acquisition events).
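The frequency-multiplier approximation can be made concrete with a small indexing helper; the rates below restate the representative case in the text, and the helper name is hypothetical. A sensor slower than the laser, such as the 5 Hz GPS, has to be associated the other way around, with one reading spanning multiple laser events:

```python
# Representative data rates (Hz) from the text, assuming ideal alignment by
# integer frequency multipliers within a 1-second window.
rates = {"imu": 200, "camera": 40, "wheel": 20, "laser": 10, "gps": 5}

def aligned_indices(sensor: str, laser_event: int) -> range:
    """Indices of `sensor` samples treated as falling between laser
    acquisition events `laser_event` and `laser_event + 1`. Only valid for
    sensors at least as fast as the laser."""
    per_laser = rates[sensor] // rates["laser"]
    return range(laser_event * per_laser, (laser_event + 1) * per_laser)

print(list(aligned_indices("imu", 0)))     # 20 IMU readings in the first interval
print(list(aligned_indices("camera", 3)))  # camera frames 12..15 in the fourth interval
```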

In some embodiments, the techniques of the present disclosure include a further approximation that the position of the laser unit coincides with the position of the stereo camera, thereby further simplifying the problem to be solved. As discussed with reference to fig. 12, the observed data from the different sensors may be mathematically described as follows:

1) From the observation data of the stereo camera, illustratively, three-dimensional coordinates and/or descriptors of one or more environmental features (e.g., the feature 1202) may be extracted from the frames produced by the camera at positions 1220a and 1220b, respectively, and matched against the feature 1202. In the objective function for optimization, such observations can be embodied by error terms relating to the reprojection of the features onto the camera coordinate systems at the different positions. For example, a cost term based on an environmental feature and two consecutive stereo camera observation frames includes three parts: (a) the reprojection error of the left and right cameras at the frame corresponding to position 1220a; (b) the reprojection error of the left and right cameras at the frame corresponding to position 1220b; and (c) the reprojection error of the left (or right) camera between the two positions 1220a and 1220b (a minimal sketch of such a reprojection residual follows this list).

2) From the observation data of the inertial measurement unit, with known timestamps and initial values, the constraint relationships of the rotation matrix, the translation vector, and the velocity between two consecutive camera frames can be calculated, for example, by using suitable integration techniques known to those skilled in the relevant art. This type of observation can be represented in the objective function by an error term between the integrated state and the actual state. Illustratively, the variables to be estimated at each frame (e.g., the camera frames corresponding to positions 1220a and 1220b) include the orientation of the camera (e.g., an element of the special orthogonal group) and its position and velocity (e.g., elements of R^3 space). Integration of the observations captured by the inertial measurement unit provides constraints between the above variables. In some embodiments, appropriate pre-integration techniques are employed to improve computational efficiency while iteratively optimizing the states.

3) A motion model, including the velocity and yaw angle of the mobile platform, may be derived based on the observation data from the wheel encoder. Similarly, by integration, state constraints between successive camera frames may be obtained, and the representation of this type of observation may be similar to that of the inertial measurement unit. In some embodiments, in contrast to the case of the inertial measurement unit, only a subspace of the states (e.g., the position and yaw angle of the mobile platform) is constrained based on the wheel odometry observations. Because of possible noise in the wheel encoder, the covariance of this error term may be set relatively large in some embodiments.

4) The observations from the GPS may directly provide constraints on the state of the mobile platform at particular times. In the objective function, this type of observation can be expressed as the error between the state estimate provided by the GPS and the actual state value. Because the data collection frequency of the GPS is low in some embodiments, GPS observations may be used only when their noise level is below a certain threshold and/or their accuracy is guaranteed to be within a certain range.
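As referenced in item 1) above, a reprojection residual compares an observed pixel with the projection of a 3-D feature through a camera pose. A minimal sketch, assuming numpy, a pinhole model with intrinsic matrix K, and illustrative values (none taken from the source):

```python
import numpy as np

def reprojection_error(X_w, T_cw, K, z):
    """Residual between an observed pixel z and the projection of the 3-D
    feature X_w (world frame) through camera pose T_cw = (R, t) with
    intrinsic matrix K. This is the building block of parts (a)-(c) above."""
    R, t = T_cw
    X_c = R @ X_w + t        # world -> camera coordinates
    u = K @ (X_c / X_c[2])   # pinhole projection to homogeneous pixel
    return u[:2] - z         # 2-D reprojection residual

# Illustrative values (assumed):
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
X_w = np.array([1.0, 0.5, 4.0])
pose = (np.eye(3), np.zeros(3))
print(reprojection_error(X_w, pose, K, z=np.array([445.0, 302.5])))  # ~[0, 0]
```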

In embodiments where the position of the laser unit is approximated as coinciding with the position of the stereo camera, the controller (e.g., an on-board computer of the mobile platform, an associated computing device, and/or an associated computing service) obtains the observation data provided by the sensors over a time period from time 1 to time k. The observation data may be expressed as follows:

$$\mathcal{Z}_k = \left\{ C_{1:k},\; I_{1:k-1},\; W_{1:p},\; G_{1:q} \right\}$$

where:

1) The first element C_{1:k} represents the observation information obtained by the stereo camera, and can be defined as follows:

$$C_i = \left\{ z_{i,1},\; z_{i,2},\; \ldots,\; z_{i,l} \right\}$$

where z_{i,j} represents the observation of the j-th feature in the i-th frame obtained by the stereo camera;

2) The second element I_{1:k-1} represents the set of data acquired by the inertial measurement unit up to the k-th point in time, where I_i represents the set of all observations of the inertial measurement unit between the i-th frame produced by the camera and the (i+1)-th frame produced by the camera (e.g., a total of 20 readings from the inertial measurement unit between two consecutive camera observations);

3) The third element W_{1:p} represents the observations of the wheel encoder, which may be represented as follows:

$$W_i = \left\{ v_{i,j},\; q_{i,j} \right\}$$

where v_{i,j} represents the speed information obtained by the wheel encoder between the i-th and j-th time points, and q_{i,j} represents the rotational transformation (e.g., in quaternion form) that can be derived or otherwise obtained from the yaw angle calculation between the i-th and j-th time points; and

4) The last element G_{1:q} represents the observations acquired by the GPS:

$$G_i = \left\{ p^{G}_{i},\; q^{G}_{i} \right\}$$

where p^G_i represents the global position at the i-th point in time, and q^G_i represents the rotation relative to the global coordinate system.

Step 1410 of the process includes evaluating the states associated with the laser unit at different points in time within the time period, based on the observation data. For example, using a factor graph, the controller may establish the relationship between the prior probability and the posterior probability of the laser unit states X_k = {x_j}, j = 1, ..., k (the laser unit being taken to coincide with the stereo camera):

$$p(\mathcal{X}_k \mid \mathcal{Z}_k) \;\propto\; p(x_0) \prod_{i=1}^{k} p(C_i \mid x_i) \prod_{i=1}^{k-1} p(I_i \mid x_i, x_{i+1}) \prod_{i} p(W_i \mid x_i, x_{i+1}) \prod_{i \in m} p(G_i \mid x_i)$$

where i = 1, 2, ..., k indexes the observations of the camera, m denotes the set of observation indices of the GPS, and the state of the laser unit can be expressed as:

$$x_k = \left[\, p_k,\; v_k,\; q_k \,\right]$$

where p_k, v_k, and q_k respectively represent the position, velocity, and quaternion (rotation) of the laser unit with respect to a particular coordinate system at the k-th point in time. In the above formula, each p(·) is referred to as a factor of the factor graph.

In some embodiments, using a mathematical derivation based on the zero-mean white Gaussian noise assumption, the controller can compute the maximum a posteriori estimate of the above formula, based on the factor graph, by solving for the minimum of the following expression:

$$\min_{\mathcal{X}_k} \; \sum_{*} \sum_{i} \left\| r_{*,i} \right\|_{\Sigma_{*}}^{2}$$

where r_* represents the different residual types (one per sensor modality above), and Σ_* represents the covariance matrices corresponding to the different types of residuals, which describe the uncertainty of the observations. In this regard, one skilled in the relevant art can determine the residual models for the different sensors and determine the Jacobian matrices between optimization iterations. The controller may then compute the optimal values of the laser unit states based on the minimization (e.g., using a gradient-based optimization method).
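Structurally, this minimization is a stacked nonlinear least-squares problem in which each factor contributes a covariance-whitened residual block. The toy sketch below illustrates only that structure, using scipy's general-purpose solver on a one-dimensional state; real residual models and Jacobians would follow the sensor descriptions above:

```python
import numpy as np
from scipy.optimize import least_squares

def make_residual(observations):
    """Build a stacked residual function: each observation contributes
    r = Sigma^{-1/2} (h(x) - z), so the solver minimizes the sum of
    covariance-weighted squared residuals."""
    def residual(x):
        blocks = [sigma_inv_sqrt @ (h(x) - z)
                  for z, h, sigma_inv_sqrt in observations]
        return np.concatenate(blocks)
    return residual

# Illustrative 1-state example: fuse two noisy position observations.
obs = [
    (np.array([1.0]), lambda x: x, np.array([[2.0]])),   # precise sensor
    (np.array([1.4]), lambda x: x, np.array([[0.5]])),   # noisy sensor
]
sol = least_squares(make_residual(obs), x0=np.zeros(1))
print(sol.x)  # covariance-weighted fusion, close to 1.02
```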

Step 1415 of the process includes determining transformation rules for transforming between multiple reference frames (e.g., the reference frames at different points in time) and a target reference frame. Illustratively, under the following approximations: (1) the positions of the stereo camera and the laser unit coincide with each other; and (2) the timestamps of the data collected by the laser unit and the data collected by the camera are identical, the controller may use the determined corresponding states to calculate relative transformation matrices for the laser unit at the different points in time relative to a target point in time (e.g., when the targeted time period begins, at the middle of the targeted time period, or when the targeted time period ends).

In some embodiments, the above approximations are not used, i.e., it is not assumed that (1) the positions of the stereo camera and the laser unit coincide with each other, or that (2) the timestamps of the data acquired by the laser unit and the data acquired by the camera are identical. In these embodiments, the techniques of this disclosure may account for two factors: (1) the relative transformation (e.g., a transformation matrix) between the stereo camera and the laser unit; and (2) the timestamp differences between the different sensors. With respect to the first factor, because the laser unit and the stereo camera are unlikely to move relative to each other during the targeted time period, the controller can calculate the relative position of the laser unit at any q-th point in time relative to any p-th point in time within the targeted time period by simply calculating the relative positions of the cameras at time q and time p (and applying the fixed camera-to-laser transformation). With respect to the second factor, because the timestamps of the different sensors are not perfectly synchronized, the controller may use interpolation (e.g., based on polynomial fitting) to calculate the relative position information in a coordinate system (e.g., the coordinate system of the mobile platform) at any given timestamp.
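For the timestamp-interpolation step, one common realization (assumed here; the source mentions polynomial fitting as an option) is linear interpolation of positions combined with spherical linear interpolation (slerp) of rotations:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_pose(stamps, positions, rotations, query_t):
    """Interpolate a pose at an arbitrary timestamp: linear interpolation
    for the position, slerp for the rotation. `rotations` is a scipy
    Rotation sequence aligned with `stamps`."""
    slerp = Slerp(stamps, rotations)
    pos = np.array([np.interp(query_t, stamps, positions[:, d]) for d in range(3)])
    return pos, slerp([query_t])[0]

# Illustrative usage:
stamps = np.array([0.0, 0.1, 0.2])
positions = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0]], dtype=float)
rotations = Rotation.from_euler('z', [0, 5, 10], degrees=True)
pos, rot = interpolate_pose(stamps, positions, rotations, 0.05)
print(pos, rot.as_euler('zyx', degrees=True))
```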

Step 1420 of the process includes transforming the data acquired by the laser unit at the different points in time based on the transformation rules. Illustratively, using the relative transformation matrices determined in step 1415, the controller may re-project the data (e.g., laser scan points) acquired at the different points in time (e.g., different frames) in the targeted time period to the target point in time. In some embodiments, the controller may exclude certain points in time from the re-projection process due to excessive noise, data errors, or other factors. Step 1425 of the process includes generating a combined point cloud using the transformed data. Illustratively, the controller may add the re-projected data from the multiple (selected) frames to the frame of point data initially associated with the target point in time, accumulating the temporally ordered data frames to form a combined point cloud, as if all the data had been acquired by the laser unit at the target point in time.
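Steps 1420 and 1425 together amount to re-projecting each selected frame with its relative transform and concatenating the results. A minimal sketch, assuming each transform is given as a rotation matrix and translation vector that map frame coordinates into the target frame:

```python
import numpy as np

def combine_frames(frames, transforms):
    """Re-project each frame of laser points into the target frame using its
    relative transform (R, t) and accumulate the result.

    frames: list of (N_i, 3) point arrays, one per acquisition time
    transforms: list of (R, t) pairs mapping each frame to the target time
    """
    merged = [points @ R.T + t for points, (R, t) in zip(frames, transforms)]
    return np.vstack(merged)  # combined point cloud in the target frame
```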

Fig. 15 illustrates an example of a mobile platform configured in accordance with various embodiments of the presently disclosed technology. As shown, representative mobile platforms disclosed herein may include at least one of: unmanned Aerial Vehicle (UAV)1502, manned aerial vehicle 1504, autonomous car 1506, self-balancing vehicle 1508, ground robot 1510, smart wearable device 1512, Virtual Reality (VR) head mounted display 1514, or Augmented Reality (AR) head mounted display 1516.

Fig. 16 is a block diagram illustrating an example of the architecture of a computer system or other control device 1600 that may be used to implement portions of the techniques of this disclosure. In fig. 16, the computer system 1600 includes one or more processors 1605 and memory 1610 connected via an interconnect 1625. The interconnect 1625 may represent any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. Thus, the interconnect 1625 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus, a HyperTransport or Industry Standard Architecture (ISA) bus, a Small Computer System Interface (SCSI) bus, a Universal Serial Bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus (sometimes referred to as "FireWire").

The processor 1605 may include a Central Processing Unit (CPU) to control overall operation of, for example, a host computer. In certain embodiments, the processor 1605 accomplishes this by executing software or firmware stored in the memory 1610. The processor 1605 may be or include one or more programmable general or special purpose microprocessors, Digital Signal Processors (DSPs), programmable controllers, Application Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), and the like, or a combination of such devices.

The memory 1610 is or includes the main memory of the computer system. The memory 1610 represents any form of Random Access Memory (RAM), Read-Only Memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 1610 may contain a set of machine instructions that, when executed by the processor 1605, cause the processor 1605 to perform operations to implement embodiments of the present invention.

Also connected to the processor 1605 via the interconnect 1625 is an (optional) network adapter 1615. Network adapter 1615 provides computer system 1600 with the ability to communicate with remote devices, such as storage clients and/or other storage servers, and may be, for example, an ethernet adapter or a fibre channel adapter.

The techniques described herein may be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in dedicated hardwired circuitry, or in a combination of these forms. The dedicated hardwired circuitry may be in the form of, for example, one or more Application Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), or the like.

Software or firmware for implementing the techniques described herein may be stored on a machine-readable storage medium and executed by one or more general-purpose or special-purpose programmable microprocessors. The term "machine-readable storage medium" as used herein includes any mechanism that can store information in a form accessible by a machine (which can be, for example, a computer, network device, cellular telephone, Personal Digital Assistant (PDA), manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-accessible storage medium includes recordable/non-recordable media and the like (e.g., Read Only Memory (ROM), Random Access Memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).

The term "logic" as used herein may include, for example, programmable circuitry, dedicated hardwired circuitry, or a combination thereof, programmed with specific software and/or firmware.

Some embodiments of the disclosure have other aspects, elements, features, and steps in addition to or in place of those described above. These potential additions and substitutions are described in the remainder of the specification. Reference in this specification to "various embodiments," "certain embodiments," or "some embodiments" means that a particular feature, structure, or characteristic described in connection with those embodiments is included in at least one embodiment of the disclosure. These embodiments, and even alternative embodiments (e.g., those referred to as "other embodiments"), are not mutually exclusive of other embodiments. In addition, various features are described that may be exhibited by some embodiments and not by others. Similarly, various requirements are described that may be requirements for some embodiments but not for other embodiments.

As described above, the techniques of this disclosure can utilize low-cost laser sensors to achieve wide-angle FOV coverage, provide high-precision calibration between laser sensors, detect interference with laser sensors, and generate combined point clouds based on point data obtained at different times. While advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the present disclosure and associated techniques can encompass other embodiments not explicitly shown or described herein.

To the extent that any material incorporated herein conflicts with the present disclosure, the present disclosure controls.
