Autonomous navigation system

Document No.: 1844035    Publication date: 2021-11-16

Abstract: This technology, "Autonomous navigation system", was designed and created by A·阿-达勒尔, M·E·拉斯特, P·J·西赫 and B·莱昂 on 2015-12-04. Some embodiments of the invention provide an autonomous navigation system that enables autonomous navigation of a vehicle along one or more portions of a driving route based on monitoring various features of the route at the vehicle as the vehicle is manually navigated along the route. The characterization is gradually updated through repeated manual navigation along the route, and autonomous navigation of the route is enabled when a confidence indicator of the characterization meets a threshold indication. The characterization may be updated in response to the vehicle encountering a change in the route, and may include a set of driving rules associated with the route, wherein the driving rules are developed based on monitoring the navigation of one or more vehicles along the route. Characterizations may be uploaded to a remote system that processes the data to form and refine route characterizations and provides the characterizations to one or more vehicles.

1. An autonomous navigation system configured to be installed in a vehicle and to selectively enable autonomous navigation of the vehicle, wherein the autonomous navigation system comprises:

at least one processor; and

a memory storing program instructions that, when executed by the at least one processor, cause the autonomous navigation system to:

during manual navigation of a route to a destination:

evaluate features of an alternative route to at least a portion of the route to determine that a confidence indicator for the features of the alternative route exceeds a threshold confidence indication;

display an offer to enable autonomous navigation of the vehicle to the destination along the alternative route rather than along the portion of the route; and

in response to receiving a request to accept the offer, enable autonomous navigation of the vehicle to the destination along the alternative route.

2. The autonomous navigation system of claim 1, wherein the memory further stores program instructions that, when executed by the at least one processor, further cause the autonomous navigation system to:

generate the features of the alternative route from sensor data captured during one or more previous manual navigations of the vehicle along the alternative route.

3. The autonomous navigation system of claim 2, wherein the memory further stores program instructions that, when executed by the at least one processor, further cause the autonomous navigation system to send the features of the alternative route to a navigation monitoring system via a network interface of the navigation monitoring system.

4. The autonomous navigation system of claim 2, wherein the memory further stores program instructions that, when executed by the at least one processor, further cause the autonomous navigation system to:

display, prior to the one or more previous manual navigations of the vehicle along the alternative route, a proposal to manually navigate the alternative route so that autonomous navigation of the alternative route can subsequently be enabled.

5. The autonomous navigation system of claim 1, wherein the memory further stores program instructions that, when executed by the at least one processor, further cause the autonomous navigation system to:

receive the features of the alternative route from a navigation monitoring system via a network interface of the navigation monitoring system.

6. The autonomous navigation system of claim 5, wherein the features of the alternative route are generated from sensor data captured by another autonomous navigation system installed in another vehicle that navigates the alternative route.

7. The autonomous navigation system of claim 1, wherein the memory further stores program instructions that, when executed by the at least one processor, further cause the autonomous navigation system to identify that the alternative route shares a common destination location with the route.

8. A method, comprising:

performing, by an autonomous navigation system installed in a vehicle:

during manual navigation of a route to a destination:

evaluating features of an alternative route to at least a portion of the route to determine that a confidence indicator for the features of the alternative route exceeds a threshold confidence indication;

displaying an offer to enable autonomous navigation of the vehicle to the destination along the alternative route rather than along the portion of the route; and

in response to receiving a request to accept the offer, enabling autonomous navigation of the vehicle to the destination along the alternative route.

9. The method of claim 8, further comprising:

generating the features of the alternative route from sensor data captured during one or more previous manual navigations of the vehicle along the alternative route.

10. The method of claim 9, further comprising transmitting the generated features of the alternative route to a navigation monitoring system via a network interface of the navigation monitoring system.

11. The method of claim 9, further comprising:

displaying, prior to the one or more previous manual navigations of the vehicle along the alternative route, a proposal to manually navigate the alternative route so that autonomous navigation of the alternative route can subsequently be enabled.

12. The method of claim 8, further comprising receiving the features of the alternative route from a navigation monitoring system via a network interface of the navigation monitoring system.

13. The method of claim 12, wherein the features of the alternative route are generated from sensor data captured by another autonomous navigation system installed in another vehicle that navigates the alternative route.

14. The method of claim 8, further comprising identifying that the alternative route shares a common destination location with the route.

15. One or more non-transitory computer-readable storage media storing program instructions that, when executed on or across one or more computing devices, cause the one or more computing devices to implement an autonomous navigation system that implements:

during manual navigation of a route to a destination:

evaluating features of an alternative route to at least a portion of the route to determine that a confidence indicator for the features of the alternative route exceeds a threshold confidence indication;

displaying an offer to enable autonomous navigation of the vehicle to the destination along the alternative route rather than along the portion of the route; and

in response to receiving a request to accept the offer, enabling autonomous navigation of the vehicle to the destination along the alternative route.

16. The one or more non-transitory computer-readable storage media of claim 15, wherein the one or more non-transitory computer-readable storage media further store instructions that, when executed by the one or more computing devices, cause the autonomous navigation system to further implement:

generating the features of the alternative route from sensor data captured during one or more previous manual navigations of the vehicle along the alternative route.

17. The one or more non-transitory computer-readable storage media of claim 16, wherein the one or more non-transitory computer-readable storage media further store instructions that, when executed by the one or more computing devices, cause the autonomous navigation system to further implement:

displaying, prior to the one or more previous manual navigations of the vehicle along the alternative route, a proposal to manually navigate the alternative route so that autonomous navigation of the alternative route can subsequently be enabled.

18. The one or more non-transitory computer-readable storage media of claim 15, wherein the one or more non-transitory computer-readable storage media further store instructions that, when executed by the one or more computing devices, cause the autonomous navigation system to further implement receiving the features of the alternative route from a navigation monitoring system via a network interface of the navigation monitoring system.

19. The one or more non-transitory computer-readable storage media of claim 18, wherein the features of the alternative route are generated from sensor data captured by another autonomous navigation system installed in another vehicle that navigates the alternative route.

20. The one or more non-transitory computer-readable storage media of claim 15, wherein the one or more non-transitory computer-readable storage media further store instructions that, when executed by the one or more computing devices, cause the autonomous navigation system to further implement identifying that the alternative route shares a common destination location with the route.

Technical Field

The present disclosure relates generally to autonomous navigation of vehicles, and in particular to the formation and evaluation of autonomous navigation route representations with which a vehicle may autonomously navigate at least some portions of a route.

Background

The increasing interest in autonomous navigation of vehicles, including automobiles, has spurred the desire to develop autonomous navigation systems that can navigate a vehicle through various routes, including one or more roads in a network of roads (e.g., existing roads, streets, highways, and the like), by autonomous navigation (i.e., autonomous "driving"). However, systems that enable autonomous navigation of a vehicle (also referred to as autonomous driving) may be less than ideal.

In some cases, autonomous navigation may be accomplished by an autonomous navigation system that processes and responds in real time to static features (e.g., road lanes, road signs, etc.) and dynamic features (current locations of other vehicles on the road, current environmental conditions, road obstacles, etc.) encountered along the route, thereby mimicking the real-time processing and driving capabilities of a person. However, even if technically feasible, the processing and control capabilities required to mimic such processing and response may be impractical, and the complexity and size of the computer systems that would need to be included in the vehicle to achieve such real-time processing and response may push the capital cost investment for each vehicle well beyond a reasonable range, making such systems unsuitable for large-scale use.

In some cases, autonomous navigation is achieved by: forming detailed maps of various routes that include data indicative of various features of the roads (e.g., road signs, intersections, etc.); specifying various driving rules for the various routes (e.g., appropriate speed limits for a given portion of a given route, lane change speeds, lane locations, and changes to the driving rules based on various climate conditions and various time periods of the day); and providing the maps to the autonomous navigation systems of various vehicles to enable the vehicles to autonomously navigate the various routes using the maps.

However, the formation of such maps may require a significant expenditure of time and effort, as forming sufficient data for a single route may require dispatching a set of sensors, installed in a dedicated sensor vehicle, to traverse the route and collect data about the various features contained in the route, processing the collected data to form a "map" of the route, determining appropriate driving rules for various portions of the route, and repeating the process for each individual route included in the map. Such a process may take a significant amount of time and effort to form maps characterizing multiple routes, particularly when the multiple routes span some or all of the roads of a major city, region, country, etc.

Furthermore, as roads may change over time (e.g., due to road construction, accidents, weather, seasonal events, etc.), such maps may unexpectedly become outdated and thus unusable for safe autonomous navigation of routes. Updating the map may require dispatching a sensor suite to re-traverse the route, which may take a certain amount of time. When such expenditures are considered in view of the enormous number of potential routes in a road network, especially when multiple routes need to be updated simultaneously, it may be difficult to update the route map quickly enough that users of the vehicles do not lose the ability to safely navigate the routes autonomously.

Disclosure of Invention

Some embodiments provide a vehicle configured to autonomously navigate a driving route. The vehicle includes a sensor device that monitors features of the driving route as the vehicle navigates along the driving route, and an autonomous navigation system interoperable with the sensor device to: continuously update a virtual characterization of the driving route based on monitoring successive manual navigations of the vehicle along the driving route; associate a confidence indicator with the virtual characterization based on monitoring the successive updates to the virtual characterization; enable the vehicle to autonomously navigate along the driving route based at least in part on determining that the confidence indicator satisfies at least a threshold confidence indication; and, based on controlling one or more control elements of the vehicle and on receiving at the autonomous navigation system, via a user interface of the vehicle, a user-initiated command to engage in autonomous navigation of a portion of the driving route, autonomously navigate the vehicle along at least the portion of the driving route.

Some embodiments provide an apparatus comprising an autonomous navigation system configured to be installed in a vehicle and to selectively enable autonomous navigation of the vehicle along a driving route. The autonomous navigation system may include a route characterization module that implements successive updates of a virtual characterization of the driving route, wherein each update is based on a separate one of successive manually controlled navigations of the vehicle along the driving route, and implementing each of the successive updates includes associating a confidence indicator with the virtual characterization based on the monitored changes to the virtual characterization associated with the respective update. The autonomous navigation system may include a route evaluation module configured to enable user-initiated autonomous navigation of the driving route by the vehicle based on determining that the confidence indicator associated with the characterization of the driving route exceeds a threshold confidence indication.

Some embodiments provide a method comprising performing, by one or more computer systems installed in a vehicle: receiving a set of sensor data associated with a driving route from a set of sensors included in the vehicle, based at least in part on manual navigation of the vehicle along the driving route; processing the set of sensor data to update a stored characterization of the driving route, wherein the stored characterization is based on at least one set of previously generated sensor data associated with one or more historical manual navigations of the vehicle along the driving route; associating a confidence indicator with the updated characterization based on a comparison of the updated characterization with the stored characterization; and enabling user-initiated autonomous navigation of the driving route by the vehicle based at least in part on determining that the confidence indicator satisfies at least a predetermined threshold confidence indication.
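To make the summarized flow concrete, the following is a minimal, hypothetical Python sketch of one way such a method could be organized. The function names, the dictionary-based characterization, and the 0-to-1 confidence scale are assumptions made for illustration only; they are not taken from the disclosure.

```python
# Toy model of the summarized method; all names and data shapes are illustrative.
THRESHOLD_CONFIDENCE = 0.9  # assumed predetermined threshold confidence indication


def update_characterization(stored, sensor_data):
    """Merge newly observed route features into the stored characterization."""
    updated = dict(stored)
    updated.update(sensor_data)  # sensor_data: {feature_id: feature_value}
    return updated


def confidence_indicator(updated, stored):
    """Higher confidence when fewer of the observed features are new."""
    if not updated:
        return 0.0
    already_known = sum(1 for feature_id in updated if feature_id in stored)
    return already_known / len(updated)


def process_manual_navigation(sensor_data, stored):
    updated = update_characterization(stored, sensor_data)
    confidence = confidence_indicator(updated, stored)
    autonomous_enabled = confidence >= THRESHOLD_CONFIDENCE
    return updated, confidence, autonomous_enabled
```

For example, a first traversal that observes many previously unseen features would yield a low confidence value, while later traversals that observe mostly known features would drive the value toward the threshold.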

Drawings

Fig. 1 shows a schematic block diagram of a vehicle 100 including an Autonomous Navigation System (ANS) according to some embodiments.

Fig. 2 illustrates a schematic diagram of a vehicle including an ANS and a set of sensor devices, navigating through an area including a plurality of road portions of a plurality of roads, according to some embodiments.

Fig. 3 illustrates a schematic diagram of a vehicle including an ANS and a set of sensor devices, navigating through an area including a plurality of road portions of a road, according to some embodiments.

Fig. 4 illustrates a block diagram of an Autonomous Navigation System (ANS) according to some embodiments.

Fig. 5A-5C illustrate user interfaces associated with an autonomous navigation system, according to some embodiments.

FIG. 6 illustrates a user interface associated with an autonomous navigation system, according to some embodiments.

Fig. 7 illustrates forming a virtual representation of one or more road portions to enable autonomous navigation of the one or more road portions, according to some embodiments.

Fig. 8 illustrates a schematic diagram of an autonomous navigation network, according to some embodiments.

Fig. 9A-9B illustrate schematic diagrams of autonomous navigation networks, according to some embodiments.

Fig. 10 illustrates a "management spectrum" that may be used in the process of generating one or more virtual road portion representations, according to some embodiments.

Fig. 11 illustrates the receipt and processing of a virtual representation of one or more road portions, according to some embodiments.

Fig. 12 illustrates one or more virtual representations of one or more road portions to implement at least a portion of a management spectrum, according to some embodiments.

Fig. 13 illustrates an exemplary computer system configured to implement aspects of the systems and methods for autonomous navigation, according to some embodiments.

This specification includes references to "one embodiment" or "an embodiment". The appearances of the phrase "in one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. The particular features, structures, or characteristics may be combined in any suitable manner consistent with the present disclosure.

The term "comprising" is open ended. The term does not exclude additional structures or steps when used in the appended claims. Consider the claims as cited below: "a device comprising one or more processor units," such claims do not exclude the device from comprising additional components (e.g., network interface units, graphics circuits, etc.).

"configured to" various units, circuits, or other components may be described or recited as "configured to" perform a task or tasks. In such context, "configured to" is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs these tasks during operation. As such, the unit/circuit/component may be configured to perform this task even when the specified unit/circuit/component is not currently operational (e.g., not turned on). The units/circuits/components used with the expression "configured to" includes hardware-e.g., circuitry, memory storing program instructions executable to perform operations, etc. References to a unit/circuit/component being "configured to" perform one or more tasks are expressly intended for that unit/circuit/componentIs not limited toReference 35u.s.c. § 112 (f). Further, "configured to" may include a general-purpose structure (e.g., a general-purpose circuit) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that enables performance of one or more tasks to be solved. "configured to" may also include adapting a manufacturing process (e.g., a semiconductor manufacturing facility) to manufacture a suitable implementationA device (e.g., an integrated circuit) that performs one or more tasks.

"first", "second", etc. As used herein, these terms are used as labels to the nouns preceding them and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, the buffer circuit may be described herein as performing a write operation of a "first" value and a "second" value. The terms "first" and "second" do not necessarily imply that the first value must be written before the second value.

"based on" as used herein, the term is used to describe one or more factors that affect the determination. The term does not exclude other factors that may influence the determination. That is, the determination may be based solely on these factors or at least partially on these factors. Consider the phrase "determine a based on B. In this case, B is one factor that affects the determination of a, and the phrase does not exclude that the determination of a may also be based on C. In other examples, a may be determined based on B alone.

Detailed Description

Introduction

Some embodiments include one or more vehicles that include an autonomous navigation system ("ANS"), wherein the ANS enables autonomous navigation of various driving routes (also referred to herein as "routes") by forming virtual representations of the routes based on monitoring various real-world features of the routes during navigation of the vehicle along the routes. The ANS controls various control elements of the vehicle to autonomously drive the vehicle along one or more portions of the route based at least in part on the virtual representations of the one or more portions of the route (referred to herein as "autonomous navigation," "autonomously navigating," etc.). Such autonomous navigation may include controlling vehicle control elements based on the following: a representation of driving rules included in the virtual route representation (e.g., vehicle speed, separation from other vehicles, location on the road, corresponding adjustments based on environmental conditions, etc.); and a representation of static features of the route (locations of road lanes, road edges, road signs, landmarks, road grades, intersections, crosswalks, etc.) included in the virtual route representation, such that the ANS can safely and autonomously navigate the vehicle along at least a portion of the route.

As described herein, a "route" includes a path along which a vehicle is navigated. The route may extend from a starting location to a separate destination location, back to a destination location that is the same as the starting location, etc. The route may extend along one or more different portions of one or more different roads. For example, a route between a home location and a work site may extend from a home driveway, through one or more residential streets, along one or more portions of one or more thoroughfares, highways, toll roads, etc., to one or more parking spaces of one or more parking lots. Such a route may be a route that a user repeatedly navigates over time, including navigating multiple times in a given day (e.g., a route between home and work may be traveled more than once in a given day).

In some embodiments, the ANS in the vehicle enables autonomous navigation of one or more portions of the route based at least in part on the virtual representations of the one or more portions. Enabling autonomous navigation may include making autonomous navigation of one or more portions of the route available for selection by a user of the vehicle, such that the ANS may engage in autonomous navigation of the one or more portions based on receiving a user-initiated command to engage in autonomous navigation of the one or more portions.

The virtual representation of a route portion, referred to herein as a "virtual route portion representation," may include a virtual representation of a portion of a road included in the route between one or more locations. Such a representation may be referred to as a "virtual road portion representation." A given virtual road portion representation may be independent of any overall route navigable by the vehicle, such that a route navigated by the vehicle includes a set of consecutively navigated road portions, and the ANS may use one or more virtual road portion representations to autonomously navigate the vehicle along one or more different routes. A virtual route representation may include a set of one or more virtual road portion representations associated with one or more road portions for which autonomous navigation is enabled and one or more road portions for which autonomous navigation is not enabled, and the ANS may engage in autonomous navigation of the autonomous-navigation-enabled portions, interacting with a user of the vehicle through a user interface of the vehicle, such that control of various control elements of the vehicle is transferred between the user and the ANS depending on the road portion of the route along which the vehicle is being navigated.

In some embodiments, the ANS forms the virtual road portion representations, the virtual route representations, and the like by monitoring navigation of a vehicle comprising the ANS along one or more routes. Such monitoring may include monitoring one or more external environments, vehicle control elements, etc., as the vehicle is manually navigated by a user along the one or more routes. As described herein, a user may include a driver of the vehicle, a passenger of the vehicle, some combination thereof, and the like. As a user manually navigates the vehicle along a route that may extend along one or more road portions, the ANS may monitor various aspects of the manual navigation, including various static features encountered by the vehicle in the various road portions through which the manual navigation passes (e.g., road signs, curbs, lane markings, signal lights, trees, landmarks, the physical location of the vehicle, etc.), dynamic features encountered in the various road portions (other vehicles driving along the road, emergency vehicles, accidents, weather conditions, etc.), driving features of the user manually navigating the vehicle through the various road portions (e.g., driving speed at various road portions, lane changes and maneuvers, acceleration events and rates, deceleration events and rates, location on the road relative to static features, spacing from other vehicles, etc.), driving features of mobile entities navigating through the various road portions in proximity to the manually navigated vehicle (e.g., in different lanes, in front of or behind the vehicle, etc.), some combination thereof, and the like. As used herein, a "mobile entity" may include a motor vehicle, including cars, trucks, and the like; a human-powered vehicle, including bicycles, tricycles, and the like; a pedestrian, including humans, animals, etc.; some combination thereof; and the like. The driving characteristics of a mobile entity, including a vehicle, pedestrian, etc., may include data characterizing how the mobile entity navigates through at least a portion of one or more road portions. For example, a driving characteristic of a pedestrian may indicate that the pedestrian is traveling at a certain speed along a road at a certain distance from a certain edge of the road. The system may process input data generated at various vehicle sensors based on the monitoring to form a virtual representation of the route, which may include representations of static features associated with various portions of the route (referred to herein as "static feature representations"), representations of driving rules associated with various portions of the route (referred to herein as "driving rule representations"), and so on.
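As a purely illustrative aid, the monitored information described above could be organized in data structures along the following lines; the field names, types, and units are assumptions made for the example, not the representation used in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class StaticFeature:
    """A monitored static feature, e.g., a road sign, curb, or lane marking."""
    kind: str
    position: Tuple[float, float]  # assumed (latitude, longitude)


@dataclass
class DrivingSample:
    """One observation of how a vehicle (or other mobile entity) was driven."""
    speed_mps: float
    lane_offset_m: float  # lateral position relative to a static feature
    headway_m: float      # spacing from the vehicle ahead


@dataclass
class RoadPortionCharacterization:
    """Virtual road portion representation built up from monitored navigations."""
    portion_id: str
    static_features: List[StaticFeature] = field(default_factory=list)
    driving_samples: List[DrivingSample] = field(default_factory=list)
    confidence: float = 0.0  # assumed confidence indicator on a 0..1 scale
```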

In some embodiments, the ANS updates one or more virtual road portion representations of one or more road portions included in a route based on monitoring successive manual navigations of the route. Because the ANS continuously updates the virtual representations of the one or more road portions based on multiple successive navigations of the route, the ANS may develop and update confidence indicators associated with the one or more road portion representations associated with the one or more road portions included in the route. For example, where the number of new static features identified in a road portion of a regularly navigated route, based on processing input data from various vehicle sensors, decreases with successive manual navigations of the monitored route, the confidence indicator associated with the virtual road portion representation may increase with continued monitoring of navigation through the road portion. When the representations of the one or more road portions have confidence indicators that at least satisfy a threshold confidence indication, the ANS may enable autonomous navigation features of the vehicle for the one or more road portions, such that autonomous navigation of the vehicle along the one or more portions of the route that include the one or more road portions is enabled. The threshold level may be predetermined. In some embodiments, the ANS may adjustably establish a confidence indicator for one or more particular road portions based at least on monitoring navigation along the one or more particular road portions, signals received from one or more remote servers, systems, or the like, some combination thereof, or the like.
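One way to picture the behavior described above is the following sketch, in which confidence grows as successive traversals surface fewer new static features. The smoothing formula and the constants are assumptions chosen for illustration and are not taken from the disclosure.

```python
# Illustrative confidence update: stability rises as fewer new features appear.
def updated_confidence(previous_confidence, new_feature_count, total_feature_count,
                       smoothing=0.5):
    if total_feature_count == 0:
        return previous_confidence
    stability = 1.0 - (new_feature_count / total_feature_count)
    return (1 - smoothing) * previous_confidence + smoothing * stability


# Three successive manual traversals surfacing 12, 3, and then 0 new features.
confidence = 0.0
for new_count, total_count in [(12, 40), (3, 43), (0, 43)]:
    confidence = updated_confidence(confidence, new_count, total_count)
    print(round(confidence, 3))  # compared against the threshold confidence indication
```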

As used herein, an indicator may include one or more of a particular value, a grade, a level, some combination thereof, and the like. For example, a confidence indicator may include one or more of a confidence value, a confidence level, some combination thereof, and the like. Where an indicator includes one or more of a particular value, level, grade, some combination thereof, etc., the indicator may fall within a range of indicators. For example, where the confidence indicator includes a confidence level, the confidence indicator may include a particular level within a range of levels, where the particular level indicates the relative confidence associated with the indicator. In another example, where the confidence indicator includes a confidence value, the confidence indicator may include a particular value within a range of values, where the particular value indicates the relative confidence associated with the indicator with respect to one or more confidence extrema represented by the range limits.

In some embodiments, the threshold confidence indication may include one or more indicators, values, levels, etc., and determining that the confidence indicator at least satisfies the threshold confidence indication may include determining that a value, level, etc., included in the confidence indicator at least matches a value, level, etc., included in the threshold confidence indication. In some embodiments, determining that the confidence indicator at least satisfies the threshold confidence indication may include determining that a value, level, etc., included in the confidence indicator exceeds a value, level, etc., included in the threshold confidence indication. In some embodiments, the threshold confidence indication may be referred to as one or more of a threshold confidence indicator, a threshold level, some combination thereof, and the like.

A virtual route representation may comprise a set of virtual road portion representations of the individual road portions included in the route. The virtual route representation may include metadata that references the various virtual road portion representations and may represent driving rules associated with navigating between the various road portions. In some embodiments, autonomous navigation of one or more portions of a route is enabled based at least in part on determining that a sufficiently large portion of the route, including a set of one or more road portions, has associated virtual representations whose associated confidence indicators at least satisfy one or more thresholds. Such a set of road portions may comprise a limited selection of the road portions included in the route. For example, where the route includes a plurality of road portions each having a length of 100 feet, and the virtual road portion representation associated with a single road portion has a confidence indicator that satisfies the threshold confidence indication, autonomous navigation of the single road portion may remain disabled where the remaining road portion representations do not have confidence indicators that satisfy the threshold confidence indication. In another example, autonomous navigation of a portion of a route including a plurality of contiguous road portions may be enabled where the virtual road portion representations associated with the plurality of contiguous road portions each have a confidence indicator that satisfies a threshold, and where the contiguous length of the road portions at least satisfies a threshold. The "threshold confidence indication" may be referred to interchangeably herein as a "threshold." A threshold may be based at least in part on one or more of a distance of adjoining road portions, a driving speed through one or more road portions, an estimated time to navigate through one or more road portions, some combination thereof, and the like. The threshold may vary based on the various road portions included in the route portion for which enabling autonomous navigation is being determined.
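The contiguity requirement described above could, for instance, be checked along the following lines; the ordered-tuple input, the 0.9 confidence threshold, and the 500 m minimum contiguous length are assumed values for illustration only.

```python
# Illustrative check for runs of contiguous road portions that qualify for
# autonomous navigation; thresholds and data shapes are assumptions.
def enabled_route_segments(portions, confidence_threshold=0.9, min_length_m=500.0):
    """portions: ordered list of (portion_id, length_m, confidence) tuples."""
    segments, run, run_length = [], [], 0.0
    for portion_id, length_m, confidence in portions:
        if confidence >= confidence_threshold:
            run.append(portion_id)
            run_length += length_m
        else:
            if run_length >= min_length_m:
                segments.append(run)
            run, run_length = [], 0.0
    if run_length >= min_length_m:
        segments.append(run)
    return segments


# Example: only the contiguous run of portions B-C-D-E is long enough to enable.
print(enabled_route_segments([("A", 100, 0.95), ("X", 100, 0.2),
                              ("B", 200, 0.93), ("C", 150, 0.91),
                              ("D", 150, 0.97), ("E", 100, 0.92)]))
```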

In some embodiments, enabling autonomous navigation of one or more portions of a route enables user-initiated engagement of autonomous navigation of one or more particular road portions specified via user interaction with one or more user interfaces. For example, in response to enabling autonomous navigation of a road portion, the ANS may present to the user, via a user interface included in the vehicle, an option to engage in autonomous navigation of the vehicle along one or more route portions that include the one or more road portions for which autonomous navigation is enabled. Based on user interaction with the user interface, the ANS may receive a user-initiated command to autonomously navigate the vehicle along the one or more portions of the route and, in response, engage in autonomous navigation by controlling one or more control elements of the vehicle.

In response to detecting a change in a static feature of the route via monitoring the external environment, the ANS may update the virtual representation of the route. For example, where a portion of a road in a route that the vehicle regularly travels is undergoing roadwork, resulting in a road change, lane closure, etc., an ANS included in the vehicle may update the representation of the route in response to monitoring the portion as the vehicle travels through that portion of the road. Thus, the ANS may adapt to changes in the route independently of pre-existing route representations, "maps," etc., including independently of data received from remote servers, systems, etc., thereby reducing the amount of time required to enable autonomous navigation of the changed route. Further, since the route representations are formed by the ANS of the vehicle based on routes that the user of the vehicle repeatedly navigates, the routes for which autonomous navigation may be enabled include routes that the user of the vehicle tends to navigate, including routes that are regularly navigated. Thus, the ANS can autonomously navigate routes that are routinely navigated by a vehicle user without the need for pre-existing route characterizations. Further, because the ANS may update the virtual representations of one or more road portions, routes, etc., based on locally monitoring changes to the road portions via sensors included in the vehicle, the ANS may update these representations as soon as the vehicle encounters the changes, thereby providing updates to the virtual representations of the routes navigated by the user, in some embodiments without relying on distributed update information from remote systems, servers, etc. In some embodiments, the ANS may continue to update the virtual representation of a portion of a road based on monitoring autonomous navigation of the vehicle through that portion of the road.
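A minimal sketch of the local update behavior described above might look as follows, reusing the StaticFeature shape sketched earlier; the keying scheme and the boolean change flag are assumptions for illustration.

```python
# Illustrative reconciliation of stored static features with newly observed ones.
def reconcile_static_features(stored_features, observed_features):
    """Return the updated feature list and whether a change was detected."""
    def key(feature):
        return (feature.kind, feature.position)

    stored = {key(f): f for f in stored_features}
    observed = {key(f): f for f in observed_features}
    added = observed.keys() - stored.keys()    # e.g., new construction signage
    removed = stored.keys() - observed.keys()  # e.g., markings of a closed lane
    change_detected = bool(added or removed)
    return list(observed.values()), change_detected
```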

In some embodiments, the ANS uploads the virtual representations of one or more routes to one or more remote systems, servers, etc., implemented on one or more computer systems external to the vehicle in which the ANS is located. Such uploading may be performed in response to determining that forming a characterization with sufficient confidence to enable autonomous navigation of the route requires processing resources that are not available locally at the vehicle, in response to determining that the ANS cannot raise the confidence indicator associated with the characterization at greater than a certain rate with continued monitoring of route navigation, and so on. For example, where characterizing a road portion requires processing power beyond the capabilities of a computer system included in the vehicle, the ANS included in the vehicle may upload the characterization of the route and one or more sets of input data associated with the route to a remote server, and the remote server may process the data, evaluate the characterization, and so on, to form a virtual characterization of the route. In another example, where the ANS of the vehicle determines that the confidence indicator associated with a virtual road portion representation does not increase at greater than a certain rate as navigation of the road portion is continually monitored, the ANS may upload the representation, input data associated with the route, etc., to a remote server, system, etc., and the remote server, system, etc., may further evaluate the representation to enhance the confidence indicator of the representation.
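For illustration only, the two upload triggers described above could be combined in a decision helper like the one below; the rate threshold and the notion of required versus available compute are assumptions, not values from the disclosure.

```python
# Illustrative decision helper for uploading a characterization to a remote system.
def should_upload_to_remote(confidence_history, min_gain_per_traversal=0.02,
                            required_compute=1.0, available_compute=1.0):
    # Trigger 1: characterization needs more processing than the vehicle can supply.
    if required_compute > available_compute:
        return True
    # Trigger 2: confidence is no longer improving quickly enough with new traversals.
    if len(confidence_history) >= 2:
        gain = confidence_history[-1] - confidence_history[-2]
        if gain < min_gain_per_traversal:
            return True
    return False


print(should_upload_to_remote([0.35, 0.64, 0.82]))    # still improving -> False
print(should_upload_to_remote([0.80, 0.805, 0.807]))  # stalled -> True
```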

In the event that a remote system, server, or the like is unable to establish a sufficient confidence indicator for a characterization, the remote system may flag the characterization for manual evaluation of the characterization, and may modify the characterization in response to manual input from one or more operators. In some embodiments, the manual input may include a manually specified characterization confidence indicator. In the event that such modifications do not result in establishing a sufficient characterization confidence indicator, the remote system may schedule a dedicated sensor suite, which may be included in a dedicated sensor-bearing vehicle, to collect additional input data associated with one or more selected portions of the route in question, where the remote system may utilize the additional input data to modify the characterization. In the event that such modifications do not result in establishing a sufficient characterization confidence indicator, the remote system may flag the route and provide a proposed alternative route to the ANS for vehicle navigation.

The alternative route proposal may include characterizations of one or more alternative routes that may have sufficient confidence indicators, such that the ANS of the vehicle may enable autonomous navigation of an alternative route and propose, through the interface, that the user of the vehicle autonomously navigate the alternative route rather than traveling along the first route. In some embodiments, the ANS invites a user of the vehicle, through the user interface, to manually navigate one or more alternative routes so that the ANS may form virtual representations of the one or more alternative routes as part of enabling autonomous navigation of the one or more alternative routes.

In some embodiments, characterizing a route at the ANS of a vehicle includes monitoring driving characteristics of one or more vehicle users manually navigating the vehicle along one or more portions of the route. Such characterization may include monitoring the driving characteristics of one or more various mobile entities traveling in proximity to the vehicle at one or more portions of the route, including one or more motor vehicles, human-powered vehicles, pedestrians, some combination thereof, and so on. The driving characteristics may include the position of one or more mobile entities relative to one or more static route features along one or more portions of the route, acceleration events relative to the static route features, acceleration rates, driving speeds relative to the static features, dynamic features, etc. The ANS may process the monitored driving characteristics to form a set of driving rules associated with one or more portions of the route, wherein the set of driving rules determines the driving characteristics according to which the ANS autonomously navigates the vehicle along the one or more portions of the route. For example, based on monitoring the driving characteristics of a user of the local vehicle along a particular route, the driving characteristics of various other vehicles, etc., the ANS of the vehicle may form a set of driving rules associated with the route, which may include rules specifying: the driving speed range for each portion of the route, the location of the lanes along the route, the allowed separation distance between the vehicle and other vehicles along the route, the locations along the route where a particular range of acceleration is allowed, the locations where a certain amount of acceleration is to be applied (e.g., road slope), the likelihood of certain dynamic events occurring (accidents, sudden acceleration events, road obstacles, pedestrians, etc.), some combination thereof, and so on. Such sets of driving rules may be referred to as driving rule representations and may be included in a virtual road portion representation, a virtual route representation, some combination thereof, and so on.
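As a rough sketch of how monitored driving samples could be condensed into a driving rule representation, reusing the DrivingSample shape sketched earlier; the percentile-based speed range and the dictionary layout are assumed heuristics, not the method defined by the disclosure.

```python
from statistics import median, quantiles


# Illustrative derivation of driving rules for one road portion from samples.
def derive_driving_rules(driving_samples):
    speeds = sorted(sample.speed_mps for sample in driving_samples)
    cuts = quantiles(speeds, n=10)               # deciles of observed speeds
    return {
        "speed_range_mps": (cuts[0], cuts[-1]),  # ~10th to ~90th percentile
        "min_headway_m": min(sample.headway_m for sample in driving_samples),
        "typical_lane_offset_m": median(sample.lane_offset_m
                                        for sample in driving_samples),
    }
```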

Thus, driving rules for a route may be developed "empirically," i.e., based on monitoring the manner in which one or more users actually navigate one or more vehicles along the route. Such locally formed driving rules may provide an autonomous driving experience that is customized to the particular conditions of the autonomously navigated route, rather than relying on general driving rules formed independently of directly monitoring the manner in which vehicle users actually navigate the route. Further, in some embodiments, the driving characteristics may be processed to form representations of static features included in one or more portions of the route. For example, as a vehicle navigates a road that lacks at least some conventional static features (e.g., an unpaved road that lacks one or more of defined road edges, lane boundary markings, etc.), monitoring of the driving features of the local vehicle and one or more various external vehicles may be processed to form one or more static feature representations, including representations of road edges, representations of the boundaries of unmarked lanes of the road, etc.

The driving rule representations may be subject to predetermined driving limits, including driving speed limits. For example, based on processing input data generated by monitoring external environmental elements, the ANS may identify road signs along various portions of the route that specify a speed limit for the road extending along that portion of the route. The ANS may analyze the input data associated with the monitoring of the road signs to identify the indicated speed limit and incorporate the identified speed limit into the driving rules associated with that portion of the route as a driving speed limit, such that when the ANS uses the driving rule representation to autonomously navigate along at least that portion of the route, the ANS will at least not attempt to exceed the speed limit associated with that portion of the route.
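For example, a recognized posted limit could simply cap the empirically derived speed range, as in this hypothetical helper; the dictionary keys follow the earlier derive_driving_rules sketch and are assumptions.

```python
# Illustrative application of a recognized posted speed limit to driving rules.
def apply_speed_limit(driving_rules, posted_limit_mps):
    low, high = driving_rules["speed_range_mps"]
    driving_rules["speed_range_mps"] = (min(low, posted_limit_mps),
                                        min(high, posted_limit_mps))
    driving_rules["speed_limit_mps"] = posted_limit_mps  # never to be exceeded
    return driving_rules


rules = {"speed_range_mps": (24.0, 31.0)}
print(apply_speed_limit(rules, posted_limit_mps=29.1))  # assumed ~65 mph sign
```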

In some embodiments, a plurality of virtual road portion representations included in a plurality of navigation routes may be formed on a vehicle, and such plurality of representations may be incorporated into a set of road portion representations for respective portions of a plurality of different roads that are navigated via navigation of the plurality of different routes, wherein the ANS may use the various representations of the plurality of road portions to enable autonomous navigation along the respective portions of the various routes (including portions of the plurality of independent routes).

In some embodiments, virtual representations of one or more road portions may be uploaded from one or more ANSs included in one or more vehicles to a remote system, server, or the like. Such systems may include a navigation monitoring system, wherein a plurality of ANSs of a plurality of independent vehicles are communicatively coupled to one or more navigation monitoring systems in a navigation network. The various representations of the various road portions may be incorporated into a "map" of road portion representations at the remote system, server, or the like. The representation map can be distributed to the various ANSs of various vehicles. Where multiple road portion representations of one or more portions of a common road portion are received at the remote system, server, or the like, incorporating the representations into the map may include forming a composite representation of the road portion based on processing the multiple representations of the one or more portions. Thus, the ANSs of various vehicles can characterize the various routes traveled by those vehicles, and the various route representations formed locally on the various vehicles can be incorporated into a representation map of route representations that can be distributed to, and used by, the ANSs of other vehicles to enable autonomous navigation of the other vehicles over the various routes.
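One conceivable way a remote system could fold several uploads covering the same road portion into a composite entry of such a map is sketched below; the confidence-weighted merge and the dictionary layout are assumptions for illustration.

```python
# Illustrative merge of several uploaded characterizations of one road portion.
def merge_characterizations(uploads):
    """uploads: list of dicts like {"features": {feature_id: value}, "confidence": x}."""
    composite, best_weight = {}, {}
    for upload in uploads:
        weight = upload["confidence"]
        for feature_id, value in upload["features"].items():
            # Keep the view of each feature reported with the highest confidence.
            if weight > best_weight.get(feature_id, -1.0):
                composite[feature_id] = value
                best_weight[feature_id] = weight
    overall = sum(u["confidence"] for u in uploads) / max(len(uploads), 1)
    return {"features": composite, "confidence": overall}
```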

The ANS included in the vehicle may generate virtual representations locally to the vehicle based at least in part on local monitoring of the environment in the vicinity of the vehicle, thereby eliminating the need for an existing detailed "map" of road portions included in the road network, where the map may include a set of virtual representations of road portions organized and arranged according to their relative physical geographic locations, such that the map includes the virtual representations of the road network and the various routes that may be navigated therein. In the absence of a "map," the ANS may "guide" map generation by forming virtual representations of one or more portions of one or more routes navigated by the vehicle. Further, since the ANS characterizes routes navigated by the vehicle, the locally formed map characterizing the route may include routes that the vehicle user is inclined to navigate, but not routes that the user does not navigate, thereby enabling autonomous navigation of routes that the user conventionally travels.

In some embodiments, the ANS is communicatively coupled to one or more other ANSs and may exchange virtual route representations with the one or more other autonomous navigation systems. An ANS may be implemented by one or more computer systems external to one or more vehicles and may modify virtual route representations received from one or more other ANSs. Such modification may include incorporating multiple representations into one or more composite representations, modifying representations received from one or more sets of other remote ANSs based on input data received from one or more sets of remote ANSs, some combination thereof, and so on.

Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact may be termed a second contact, and, similarly, a second contact may be termed a first contact, without departing from the intended scope. The first contact and the second contact are both contacts, but they are not the same contact.

The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Depending on the context, the term "if" as used herein may be interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting". Similarly, depending on the context, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]".

Autonomous navigation system

Fig. 1 illustrates a schematic block diagram of a vehicle 100 including an Autonomous Navigation System (ANS) configured to control various control elements of the vehicle to autonomously navigate the vehicle along one or more driving routes based at least in part on one or more virtual representations of one or more portions of the one or more driving routes, according to some embodiments.

Vehicle 100 will be understood to include one or more vehicles having one or more various configurations that may house one or more persons, including but not limited to one or more automobiles, trucks, vans, and the like. The vehicle 100 may include one or more interior compartments configured to house one or more persons (e.g., passengers, drivers, etc.), collectively referred to herein as "users" of the vehicle. The interior compartment may include one or more user interfaces including vehicle control interfaces (e.g., steering wheel, throttle control device, brake control device), display interfaces, multimedia interfaces, climate control interfaces, some combination thereof, and the like. The vehicle 100 includes various control elements 120 that can be controlled to navigate ("drive") the vehicle 100 around the world, including navigating the vehicle 100 along one or more routes. In some embodiments, the one or more control elements 120 are communicatively coupled to one or more user interfaces included in the interior cabin of the vehicle 100, such that the vehicle 100 is configured to enable user interaction with the one or more user interfaces to control at least some of the control elements 120 and manually navigate the vehicle 100. For example, the vehicle 100 may include a steering device, a throttle device, and a braking device in the interior compartment with which a user may interact to control various control elements 120 to manually navigate the vehicle 100.

The vehicle 100 includes an Autonomous Navigation System (ANS) 110 configured to autonomously navigate the vehicle 100. ANS 110 may be implemented by any combination of hardware and/or software configured to perform the various features, modules, or other components discussed below. For example, one or more of various general-purpose processors, graphics processing units, or special-purpose hardware components, such as various Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), or other special-purpose circuits, may implement all of route characterization module 112 and driving control module 114, or portions thereof in conjunction with program instructions stored in memory and executed by the processors. One or more computing systems, such as computer system 1300 in fig. 13 below, may also implement ANS 110. ANS 110 is communicatively coupled to at least some of the vehicle control elements 120 and is configured to control one or more of the elements 120 to autonomously navigate vehicle 100. As used herein, autonomous navigation of the vehicle 100 refers to controlled navigation ("driving") of the vehicle 100 along at least a portion of a route based on active control of the control elements 120 of the vehicle 100 (including steering control elements, throttle control elements, brake control elements, transmission control elements, etc.), independent of control element input commands from a vehicle user. Autonomous navigation may include the ANS actively controlling the driving control elements 120 while still enabling manual control of the elements 120 through manual input from a user via interaction with one or more user interfaces included in the vehicle. For example, ANS 110 may autonomously navigate vehicle 100 without a vehicle user entering commands through one or more user interfaces of vehicle 100, and ANS 110 may cease control of one or more elements 120 in response to a user-initiated input command to one or more elements 120 from one or more user interfaces of vehicle 100.

ANS 110 includes a route characterization module 112 that forms and maintains virtual representations of various road sections, driving routes, and the like; and a driving control module 114 configured to control one or more control elements 120 of the vehicle 100 to autonomously navigate the vehicle along one or more portions of one or more driving routes based on the virtual representations associated with the one or more portions of the routes.
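Purely as an illustration of this division of responsibilities, the two modules could be outlined as classes like those below; the class and method names are invented for the example and do not correspond to the modules' actual interfaces.

```python
# Hypothetical outline of the ANS module split described above.
class RouteCharacterizationModule:
    """Forms and maintains virtual representations of road portions and routes."""

    def __init__(self):
        self.portions = {}  # portion_id -> characterization data

    def ingest_sensor_data(self, portion_id, sensor_data):
        self.portions.setdefault(portion_id, {}).update(sensor_data)


class DrivingControlModule:
    """Drives the vehicle along a route portion using its virtual representation."""

    def navigate(self, characterization, control_elements):
        for element in control_elements:
            element.apply(characterization)  # steering/throttle/brake commands


class AutonomousNavigationSystem:
    def __init__(self):
        self.route_characterization = RouteCharacterizationModule()
        self.driving_control = DrivingControlModule()
```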

The vehicle 100 includes a set of one or more external sensor devices 116, also referred to as external sensors 116, which may monitor one or more aspects of the environment external to the vehicle 100. Such sensors may include camera devices, video recording devices, infrared sensor devices, radar devices, light scanning devices including LIDAR devices, precipitation sensor devices, ambient wind sensor devices, ambient temperature sensor devices, location monitoring devices that may include one or more global navigation satellite system devices (e.g., GPS, BeiDou, DORIS, Galileo, GLONASS, etc.), some combination thereof, and so on. One or more external sensor devices 116 may generate sensor data associated with an environment as the vehicle 100 navigates through the environment. Sensor data generated by one or more sensor devices 116 can be transmitted as input data to ANS 110, where the input data can be used by route characterization module 112 to form, update, and maintain a virtual characterization of one or more portions of a route through which vehicle 100 is navigating, and the like. The external sensor devices 116 may generate sensor data when the vehicle 100 is manually navigated, autonomously navigated, etc.

The vehicle 100 includes a set of one or more interior sensors 118, also referred to as interior sensor devices 118, which may monitor one or more aspects of the vehicle 100. Such sensors may include: a camera device configured to collect image data of one or more users in an interior cabin of the vehicle; control element sensors that monitor the operating state of various control elements 120 of the vehicle; an accelerometer; a speed sensor; component sensors that monitor the status of various automotive components (e.g., sensors that monitor the rotational movement of one or more wheels of the vehicle); and the like. As the vehicle 100 navigates through the environment, the one or more interior sensor devices 118 may generate sensor data associated with the vehicle 100. Sensor data generated by one or more interior sensor devices 118 can be transmitted to ANS 110 as input data, where the input data can be used by route characterization module 112 to form, update, and maintain a virtual representation of one or more portions of a route through which vehicle 100 is navigating, and the like. The interior sensor devices 118 may generate sensor data when the vehicle 100 is manually navigated, autonomously navigated, etc.

The vehicle 100 includes one or more sets of interfaces 130. The one or more interfaces 130 can include one or more user interface devices, also referred to as user interfaces, with which a user of the vehicle 100 can interact to interact with one or more portions of the ANS 110, the control elements 120, and/or the like. For example, interface 130 may include a display interface with which a user may interact to command ANS 110 to autonomously navigate vehicle 100 along one or more particular routes based at least in part on one or more virtual representations of one or more portions of the routes.

In some embodiments, one or more interfaces 130 include one or more communication interfaces that can communicatively couple ANS 110 with one or more remote servers, systems, etc. via one or more communication networks. For example, interface 130 may include a wireless communication transceiver that may communicatively couple ANS 110, via one or more wireless communication networks, with one or more remote servers, which may include cloud servers. ANS 110 may transmit virtual route representations, various input data sets, etc. to remote servers, systems, etc. via one or more interfaces 130, and may receive virtual representations of one or more road portions, etc. from one or more remote servers, systems, etc.

Formation of route characterization

In some embodiments, as the vehicle navigates through one or more road portions, the ANS may form one or more virtual representations of the one or more road portions, which the ANS may then utilize to autonomously navigate the vehicle through the one or more road portions, based on monitoring various static features, dynamic features, driving features, and the like. Such monitoring may be accomplished as the vehicle is manually navigated through the one or more road portions by a vehicle user. The ANS may form a representation of the static features of the route by monitoring the static features as the vehicle is manually navigated along the route, and may form a set of driving rules, which specify the manner in which the ANS navigates the vehicle through the one or more road portions, based on monitoring the driving features of the user manually navigating the vehicle through that portion of the road, monitoring the driving features of other vehicles proximate to the local vehicle navigating through that portion of the road, and the like. Thus, based on monitoring manual navigation along the route and the various features observed as the vehicle navigates the route manually, the ANS of the vehicle may form both a characterization of the route's physical state (e.g., static features) and a characterization of the manner in which the route is navigated (e.g., driving rules). The ANS may therefore form a representation for autonomous navigation of the route independent of externally received or pre-existing representation data.

Fig. 2 shows a schematic diagram of a vehicle 202, according to some embodiments, including an ANS 201 and a set of sensor devices 203, navigating through an area 200 including a plurality of road portions 210A-210D of roads 208, 218. The vehicle 202 may be manually navigated through the route, and the sensor devices 203 may include one or more external sensor devices, vehicle sensor devices, and the like. The vehicle 202 and the ANS 201 may be included in any embodiment of a vehicle, ANS, etc.

As shown in the illustrated embodiment, an area 200 including one or more different roads 208, 218 may be divided into various road "portions" 210. The ANS may distinguish the various road portions based on location data received from one or more location sensors in the vehicle 202, one or more various static features included in the area 200, and so forth. Different road portions 210 may have different dimensions, which may be based at least in part on the driving speed of vehicles navigating the road that includes the portion, environmental conditions, and the like. For example, the road 208 may be an expressway with an average driving speed higher than the average driving speed of the road 218, which may be a ramp curve; thus, each of the road portions 210A-210C of the road 208 may be larger than the road portion 210D of the road 218.

In some embodiments, as the vehicle navigates (manually, autonomously, and the like) through the various road portions of one or more different roads, the ANS included in the vehicle monitors various static feature characteristics of the various road portions, various driving characteristics of a user of the vehicle, of other nearby vehicles, and the like, some combination thereof, and so forth. Such monitoring, which may be accomplished by the ANS based on input data received from the sensors 203, may include processing the various monitored features to form a virtual representation of one or more road portions through which the vehicle is navigated. The ANS may then autonomously navigate the vehicle through the one or more road portions using the virtual representations.

In some embodiments, monitoring various static feature characteristics of the road portion includes identifying various static features associated with the road portion. For example, in the illustrated embodiment, where the vehicle 202 is navigating through the road portion 210B of the road 208, the sensor device 203 may monitor various aspects of the external environment of the area 200 to identify various static features associated with the road portion 210B, including edges 212A-212B, lane boundaries 217A-217B, lanes 214A-214C of the road 208. In some embodiments, one or more sensor devices 203 may identify the material composition of one or more portions of roadway 208. For example, the sensor devices 203 of the vehicle 202 may include internal sensor devices that may monitor rotational movement of the wheels of the vehicle 202 to determine whether the vehicle is navigating through an asphalt surface, a gravel surface, a concrete surface, a dirt surface, and the like.

In some embodiments, the sensor devices 203 may monitor various aspects of the external environment of the area 200 to identify various static features that are associated with the road portion 210B but located outside of the road 208, including static landmarks 213, natural environment elements 215, road ramps 242, road signs 221, 223, and the like.

In some embodiments, identifying a static feature includes identifying information associated with the static feature, including identifying information presented on a road sign. For example, the area 200 includes road signs 221, 223, where the road sign 221 indicates the presence of the ramp curve 218 and the road sign 223 is a speed limit sign indicating a speed limit for at least the road portion 210B. Monitoring static features associated with the road portion 210B as the vehicle 202 navigates through the portion 210B includes the ANS determining a physical location of the road signs 221, 223 in the road portion 210B based on monitoring the external environment of the area 200, identifying the information presented on the road signs 221, 223, and including such information as part of a virtual representation of the road portion. For example, as vehicle 202 navigates through road portion 210B, ANS 201 may identify the physical location of road sign 223 in portion 210B based on monitoring of area 200 by sensors 203, identify that road sign 223 is a speed limit sign, identify that the speed limit indicated by the road sign is 55 miles per hour, and incorporate such information into the driving rule characterization associated with at least road portion 210B as the maximum driving speed when navigating through at least portion 210B.
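As an illustrative, non-limiting sketch of the incorporation described above, the following Python example shows one way an observed speed-limit sign might be folded into a driving rule characterization for a road portion. The class names, fields, and the most-restrictive-limit policy are assumptions introduced for illustration only, not details taken from this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DrivingRules:
    """Hypothetical driving-rule characterization for one road portion."""
    max_speed_mph: Optional[float] = None
    sign_locations: List[Tuple[float, float]] = field(default_factory=list)

def incorporate_speed_limit_sign(rules: DrivingRules,
                                 sign_position: Tuple[float, float],
                                 posted_limit_mph: float) -> None:
    """Record the sign's physical location and adopt the posted limit as the
    maximum driving speed for the associated road portion."""
    rules.sign_locations.append(sign_position)
    # Illustrative policy: keep the most restrictive limit observed so far.
    if rules.max_speed_mph is None or posted_limit_mph < rules.max_speed_mph:
        rules.max_speed_mph = posted_limit_mph

# Example corresponding to a 55 mph sign such as road sign 223 on portion 210B.
rules_210b = DrivingRules()
incorporate_speed_limit_sign(rules_210b, (37.3300, -122.0100), 55.0)
```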

In some embodiments, the sensor devices 203 may monitor driving characteristics of the vehicle 202, of other vehicles 232-236 that are proximate to the vehicle 202 while navigating through the road portions 210, and so on. ANS 201 may utilize such driving characteristics to form one or more portions of a virtual representation of one or more road portions, including one or more static feature representations, driving rules, and the like. For example, based on monitoring the driving speed of one or more of the vehicle 202, the vehicles 232-236, etc. navigating through the road portions 210A-210C, the ANS 201 may determine a driving speed range for autonomous navigation through the one or more road portions 210A-210C. Based on monitoring driving characteristics of one or more of the vehicles 202, 232-236, the ANS 201 may determine an allowable range of accelerations associated with navigating through particular portions 210A-210C, a location in a road portion at which an acceleration event may occur, a location of lanes 214A-214C in the road, an allowable range of separation distances 252, 254 between the vehicle 202 and other vehicles navigating through one or more road portions 210A-210C in a common lane 214 with the vehicle 202, an allowable range of separation distances 256A-256B between the vehicle 202 and one or more boundaries of the lane 214B through which the vehicle 202 is navigating, and so on.

In some embodiments, the driving characteristics monitored as the vehicle navigates through a road portion are associated with one or more other road portions. For example, where ANS 201 monitors a spacing 252 between vehicle 202 and a trailing vehicle 234 as the vehicle navigates through portion 210B, ANS 201 may develop a driving rule that specifies that spacing 252 is a minimum allowable spacing between vehicle 202 and a leading vehicle 236 as vehicle 202 navigates through road portions 210A and 210C. Such correlation may be based at least in part on similarities between road portions. For example, driving characteristics determined based on input data generated as the vehicle 202 navigates through one or more of the road portions 210A-210C may be used to form driving rule representations included in the virtual representations of any similar road portions 210A-210C, while such driving characteristics are not used to form driving rules included in the virtual representations of the dissimilar road portion 210D.
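The carry-over of a spacing-based driving rule between similar road portions might be sketched as follows. The similarity mapping, rule dictionary, and the minimum-of-observed-spacings policy are hypothetical, shown only to illustrate the idea rather than to define the disclosed method.

```python
from typing import Dict, List

def derive_min_spacing(observed_spacings_m: List[float]) -> float:
    """Take the smallest spacing observed during manual navigation as a
    conservative minimum allowable spacing (illustrative policy)."""
    return min(observed_spacings_m)

def apply_rule_to_similar_portions(rule_value: float,
                                   source_portion: str,
                                   similarity: Dict[str, List[str]],
                                   rules: Dict[str, dict]) -> None:
    """Associate a driving rule derived on one road portion with the portions
    judged similar to it; dissimilar portions are left unchanged."""
    for portion_id in similarity.get(source_portion, []):
        rules.setdefault(portion_id, {})["min_spacing_m"] = rule_value

# Example: a spacing monitored on portion 210B carried over to 210A and 210C,
# but not to the dissimilar ramp portion 210D.
similarity = {"210B": ["210A", "210B", "210C"]}
rules: Dict[str, dict] = {}
apply_rule_to_similar_portions(derive_min_spacing([32.0, 28.5, 30.2]),
                               "210B", similarity, rules)
```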

Fig. 3 shows a schematic diagram of a vehicle 302 including an ANS 301 and a set of sensor devices 303, navigating through an area 300 including a plurality of road portions 310A-310C of a road 308, according to some embodiments. The vehicle 302 may be manually navigated through a route, and the sensor devices 303 may include one or more external sensor devices, vehicle sensor devices, and the like. The vehicle 302 and ANS 301 may be included in any embodiment of a vehicle, ANS, etc.

In some embodiments, the ANS may utilize driving characteristics monitored as the vehicle navigates through a road portion to determine static feature characteristics of the road portion that are included in the virtual representation of the road portion.

For example, in the embodiment shown in fig. 3, vehicle 302 is navigating along an unpaved road 308 lacking well-defined edges and lane boundaries. ANS 301 may determine edges 312A-312B, lanes 314A-314B, and lane boundaries 317 of road 308 based at least in part on monitoring driving characteristics of vehicle 302 as a user of vehicle 302 navigates the vehicle through one or more road portions 310A-310C, on monitoring driving characteristics of one or more other vehicles 332 as the other vehicle 332 navigates through the one or more road portions 310A-310C, on some combination thereof, and so forth. As shown, ANS 301 may determine edges 312A-312B, lanes 314A-314B, and boundaries 317 based on monitoring driving characteristics of vehicle 302 and vehicle 332. Further, ANS 301 may determine that lane 314B is associated with driving in the opposite direction relative to driving in lane 314A.
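One possible, deliberately simplified way to infer lane geometry from manual drive traces on such an unpaved road is to average repeated traces into a centerline and offset it by half an assumed lane width. This sketch is an assumption for illustration; a production system would offset perpendicular to the local heading and fuse many more signals.

```python
import statistics
from typing import Dict, List, Tuple

def estimate_lane(traces: List[List[Tuple[float, float]]],
                  lane_width_m: float = 3.5) -> Dict[str, list]:
    """Average the positions of several drive traces, sampled at common
    stations, into an estimated centerline, then place lane boundaries at a
    fixed lateral offset of half a lane width. Traces are assumed to be
    resampled to the same number of points."""
    n_points = min(len(t) for t in traces)
    centerline = []
    for i in range(n_points):
        xs = [t[i][0] for t in traces]
        ys = [t[i][1] for t in traces]
        centerline.append((statistics.mean(xs), statistics.mean(ys)))
    half = lane_width_m / 2.0
    # Simplification: offset along one axis; a real system would offset
    # perpendicular to the local heading of the centerline.
    left = [(x - half, y) for x, y in centerline]
    right = [(x + half, y) for x, y in centerline]
    return {"centerline": centerline, "left_boundary": left, "right_boundary": right}

# Example: two manual-drive traces over the same stretch of road.
lane = estimate_lane([[(0.0, 0.0), (0.1, 10.0)], [(0.2, 0.0), (0.3, 10.0)]])
```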

Fig. 4 illustrates a block diagram of an Autonomous Navigation System (ANS) according to some embodiments. As noted above with respect to fig. 1, ANS 400 may be implemented by one or more computer systems and/or by any combination of hardware and/or software configured to execute the various features, modules, or other components discussed below, such as one or more various general purpose processors, graphics processing units, or special purpose hardware components, and may be included in any embodiment of the ANS.

ANS 400 includes various modules that may be implemented by one or more instances of hardware, software, etc. ANS 400 includes a route characterization module 401 configured to form a virtual characterization of each road portion based on monitoring input data generated as a vehicle including ANS 400 navigates through each road portion.

ANS 400 includes an input data module 410 configured to receive input data from various data sources, which may include one or more sensor devices. In some embodiments, module 410 is configured to process at least some of the input data based on one or more instances of the input data and determine various static feature characterizations, driving rule characterizations, and the like. Input data may be received from the various data sources based on the vehicle navigating over one or more road portions, where such navigation may be manual, autonomous, some combination thereof, and so forth.

The input data module 410 includes an external sensor module 412 configured to receive input data from one or more external sensors of the vehicle, wherein the input data may be generated by the one or more external sensors while the vehicle is navigating through one or more road portions along one or more driving routes.

The module 412 may include a static features module 414 that monitors one or more static features included in one or more road segments as the vehicle navigates through the one or more road segments. Such monitoring may include determining the geographic location of the static features, identifying information presented by the static features, classifying the static features, and the like. For example, the module 414 may identify road edges, lane boundaries, road signs, and the like based on monitoring image data generated by a camera device monitoring the environment external to the vehicle.

In some embodiments, the module 414 monitors the physical location of the vehicle in which ANS 400 is located (also referred to herein as the "geographic location," "geographic position," etc.) as the vehicle navigates through one or more road portions. Such monitoring may include determining a geographic location of the vehicle in which ANS 400 is located based at least in part on input data received from global navigation satellite system devices. Such physical location data may be used to form a static characterization of the road portion through which the vehicle in which ANS 400 is located is navigating, including the physical location of the road portion, a driving rule characterization of the road portion, including the speed of driving through the road portion, some combination thereof, and so forth.

The module 412 may include a dynamic features module 416 that monitors one or more dynamic features encountered by the vehicle as it navigates through one or more road segments, including other vehicles navigating through a road segment, vehicles parked in a road segment, emergency vehicles, vehicle accidents, pedestrians, ambient conditions, visibility, etc. Based on the dynamic features encountered in one or more road portions, the module 416 may form one or more driving rule representations, static feature representations, etc. associated with the road portions.

Module 412 may include a driving characteristics module 418 that may monitor driving characteristics of one or more external vehicles that are navigating proximate to the vehicle in which ANS 400 is included as that vehicle navigates through one or more road segments. Such driving characteristics may include driving speed, acceleration rate, spacing relative to road boundaries, lane boundaries, and other vehicles, physical location, and the like. Based on the monitored driving characteristics of the one or more external vehicles in the one or more road portions, the module 418 may form one or more driving rule representations, static feature representations, or the like associated with the road portions.

The input data module 410 includes an interior sensor module 422 configured to receive input data from one or more interior sensors of the vehicle, wherein the input data may be generated by the one or more interior sensors while the vehicle in which the ANS 400 is disposed navigates through one or more road portions along one or more driving routes.

The module 422 may include a control element module 426 that monitors one or more instances of input data associated with one or more control elements of the vehicle in which the ANS 400 is disposed as the vehicle navigates through one or more road segments. Such input data may include throttle position data, steering element position, brake device status, wheel speeds, commands to such elements from one or more user interfaces, some combination thereof, and so forth. Based on the control element input data, module 426 may form one or more driving rule representations associated with the road portion through which the vehicle in which ANS 400 is located navigates, one or more static feature representations of the road portion through which the vehicle in which ANS 400 is located navigates, and the like.

The module 422 may include a local driving characteristics module 428 that may monitor driving characteristics of the vehicle in which the ANS 400 is located while the vehicle is navigating through one or more road segments, including driving characteristics of a user of the vehicle as the user manually navigates the vehicle through one or more road segments. Such driving characteristics may include driving speed, acceleration rate, spacing relative to road boundaries, lane boundaries, and other vehicles, physical location, and so forth. Based on the monitored driving characteristics of the vehicle in which ANS 400 is located on one or more road portions, module 428 may form one or more driving rule representations, static feature representations, etc. associated with the one or more road portions.

ANS 400 includes a processing module 430 configured to process input data received at module 410 to form one or more virtual representations of one or more road portions. In some embodiments, module 430 is configured to process at least some of the input data and determine a virtual representation of the road portion, the virtual representation including a representation of a static feature included in the road portion, a representation of driving rules for navigating through the road portion, some combination thereof, and/or the like.

The module 430 may include a static feature characterization module 432 configured to form a virtual characterization of the static features of a particular road portion based on one or more instances of input data associated with the road portion received at the module 410. The module 430 may include a driving rule characterization module 434 configured to form a virtual characterization of the driving rules associated with a particular road portion based on one or more instances of input data associated with the road portion received at the module 410. The modules 432, 434 are configured to generate virtual representations associated with one or more road portions based on the sensor data generated and received at the module 410 as the vehicle in which ANS 400 is located navigates through the one or more road portions. In some embodiments, module 430 is configured to generate one or more virtual road portion representations of one or more road portions, wherein the representations include various driving rule representations and static feature representations associated with the road portions. In some embodiments, module 430 is configured to generate one or more virtual route representations of one or more driving routes, wherein the generated virtual route representations include at least one set of virtual road portion representations of respective road portions included in the route.

In some embodiments, module 430 is configured to update a previously formed virtual representation of a road portion based at least in part on an additional input data set received at module 410 when the vehicle in which ANS 400 is located subsequently navigates through the road portion at least once more. In some embodiments, one or more of the modules 432, 434 may update one or more portions of the virtual representation of the road portion based at least in part on determining differences in one or more static features, driving features, etc. associated with subsequently navigating through the road portion. For example, where the initially formed virtual representation of the road portion comprises a static feature representation of the road portion, and where module 432 determines, based on processing input data generated when the vehicle in which ANS 400 is located subsequently navigates through the same road portion again, that there are additional static features (including road signs) that are not characterized in the initial static feature representation, module 432 may update the static feature representation of the road portion to incorporate the additional static features.

In some embodiments, module 430 is configured to evaluate a virtual representation of a road portion to determine whether to enable autonomous navigation of at least the road portion based on the virtual representation. Such evaluations may include: determining a confidence indicator associated with the virtual representation of the road portion; tracking changes in the confidence indicator based on continued navigation through the road portion by the vehicle in which ANS 400 is located; comparing the confidence indicator to one or more different thresholds; and so on.

The module 430 may include an evaluation module 436 configured to evaluate a virtual representation of a road portion such that the module 436 associates a confidence indicator with the representation. The confidence indicator may indicate a confidence that the virtual representation represents the static features, driving features, etc. associated with the road portion within a certain level of accuracy, precision, some combination thereof, etc. For example, a confidence indicator associated with a virtual representation of a road portion may indicate a confidence that the virtual representation represents all of the static road features (e.g., road edges, lanes, lane boundaries, road signs, etc.) associated with the road portion within a certain level of accuracy.

In some embodiments, the module 436 updates the confidence indicator of the virtual representation of the road portion over time based on successive processing of successively generated input data sets associated with the road portion. For example, when the vehicle in which ANS 400 is located navigates through a given road portion multiple times, and successive processing of successive input data sets associated with the road portion results in little or no additional change in the formed virtual representation of the road portion, module 436 may successively adjust the confidence indicator associated with the virtual representation to reflect an increase in confidence in the accuracy and precision of the virtual representation. When a set of input data, after processing, results in a substantial modification of the virtual representation of the road portion, the module 436 may reduce the confidence indicator associated with the virtual representation.
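A minimal sketch of such a confidence update policy, assuming a confidence value in the range 0 to 1 and illustrative step sizes and thresholds (none of which are specified in this disclosure), might look like the following.

```python
def update_confidence(confidence: float,
                      change_magnitude: float,
                      change_threshold: float = 0.05,
                      step_up: float = 0.02,
                      step_down: float = 0.10) -> float:
    """Adjust the confidence indicator of a road-portion representation after
    processing a new input data set: little or no change to the representation
    increases confidence, while a substantial modification reduces it.
    The numeric policy here is illustrative only."""
    if change_magnitude <= change_threshold:
        confidence = min(1.0, confidence + step_up)
    else:
        confidence = max(0.0, confidence - step_down)
    return confidence

# Example: repeated passes with small changes raise confidence; a pass that
# substantially modifies the representation lowers it.
c = 0.80
c = update_confidence(c, change_magnitude=0.01)   # small change -> 0.82
c = update_confidence(c, change_magnitude=0.30)   # large change -> 0.72
```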

In some embodiments, evaluation module 436 evaluates one or more portions of a driving route and determines whether to enable autonomous navigation of the vehicle in which ANS 400 is located through the one or more portions of the driving route based at least in part on determining whether the confidence indicators associated with a set of road portion virtual representations spanning at least some continuous distance threshold satisfy at least some threshold level. For example, the evaluation module 436 may determine that a consecutive group of twelve (12) road portions included in a particular driving route has associated confidence indicators that exceed a threshold confidence indication (including a particular level of 90%), and based at least in part on that determination the module 436 may enable availability of autonomous navigation of at least a portion of the twelve road portions. Such enablement may include establishing one or more "transition" route portions in which transitions between manual navigation and autonomous navigation occur. Such transitions may include: an autonomous switching portion in which a user is instructed to release manual control of one or more control elements of the vehicle in which ANS 400 is located; a manual switching portion in which a user is alerted to take manual control of one or more control elements of the vehicle in which ANS 400 is located; some combination thereof, and the like.
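The enablement check above can be restated as a sketch that scans per-portion confidence indicators for a sufficiently long consecutive run above the threshold. The 90% threshold and the run of twelve portions mirror the example in the text; counting the continuous-distance requirement in portions rather than meters is a simplifying assumption.

```python
from typing import List

def enable_autonomous_spans(confidences: List[float],
                            threshold: float = 0.90,
                            min_run: int = 12) -> List[range]:
    """Return index ranges of consecutive road portions whose confidence
    indicators all exceed the threshold and whose run length meets the
    minimum continuous requirement (counted here in portions)."""
    spans, start = [], None
    for i, c in enumerate(confidences + [0.0]):   # sentinel ends the last run
        if c > threshold and start is None:
            start = i
        elif c <= threshold and start is not None:
            if i - start >= min_run:
                spans.append(range(start, i))
            start = None
    return spans

# Example: a run of twelve qualifying portions is enabled; the first and last
# portions of the span could be designated as transition portions.
spans = enable_autonomous_spans([0.95] * 12 + [0.50])
```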

Module 430 may include a management module 438 configured to monitor the characterization of the one or more road portions to determine whether additional processing is required to enable autonomous navigation of the one or more road portions. Such additional processing may include implementing one or more processing operations at one or more computer systems implementing ANS 400. Monitoring at module 438 may include monitoring a continuous change in the confidence indicator associated with the virtual representation over time and determining whether additional processing is required based on the time change in the confidence indicator. For example, module 438 may monitor the rate of change of the confidence indicators associated with the virtual representations over time.

In some embodiments, the module 438 determines that additional processing of the virtual representation of the road portion, input data associated with the road portion, some combination thereof, or the like is required based on determining that the rate of change of the associated confidence indicator does not satisfy a threshold rate. For example, if the confidence indicator associated with the virtual representation of a particular road portion fluctuates over time and does not increase at greater than a particular rate, the module 438 may determine that additional processing associated with the road portion is required. Such additional processing may include evaluating multiple sets of input data generated during multiple separate navigations through the road portion, evaluating one or more portions of the virtual representation that are determined to vary iteratively with successive sets of input data, and so forth.

In some embodiments, the module 438 is configured to determine whether to upload one or more of a virtual representation of a road portion, one or more sets of input data associated with the road portion, etc. to one or more remote systems, services, etc. for additional management, processing, etc. For example, if the confidence indicator associated with the virtual representation does not at least meet a threshold level after additional processing by a computer system implementing ANS 400, module 438 may determine to upload the virtual representation and the sets of input data associated with the road portion to a remote service, which may include a cloud service.
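A hedged sketch combining the two decisions described in this and the preceding paragraphs: flag a representation for additional local processing when its confidence indicator is not rising fast enough, and upload it to a remote service when confidence still falls short after reprocessing. The rate and threshold values are placeholders, not values taken from the disclosure.

```python
from typing import List

def needs_additional_processing(confidence_history: List[float],
                                min_rate_per_pass: float = 0.01) -> bool:
    """Flag a representation for additional local processing when its
    confidence indicator is not rising by at least the threshold rate per
    navigation pass, averaged over the recorded history."""
    if len(confidence_history) < 2:
        return False
    rate = (confidence_history[-1] - confidence_history[0]) / (len(confidence_history) - 1)
    return rate < min_rate_per_pass

def should_upload(confidence_after_reprocessing: float,
                  threshold: float = 0.90) -> bool:
    """After additional local processing, upload the representation and its
    input data sets to a remote service if confidence still falls short."""
    return confidence_after_reprocessing < threshold

# Example: a fluctuating confidence history triggers reprocessing, and a
# still-low confidence afterward triggers the upload decision.
if needs_additional_processing([0.62, 0.60, 0.63, 0.61]) and should_upload(0.70):
    pass  # hand the representation and input data sets to the upload path
```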

ANS 400 includes an interface module 450 configured to present information associated with autonomous navigation to a user of the vehicle in which ANS 400 is located via one or more user interfaces of that vehicle, to receive user-initiated commands from the user via the one or more user interfaces, and/or the like. For example, based on determining at module 430 that autonomous navigation of a portion of a driving route including a set of road portions is enabled, module 450 may present a representation of the driving route, including a representation of the driving route portion for which autonomous navigation is enabled, and invite the user to indicate whether to participate in autonomous navigation of that driving route portion. Interface module 450 may receive a user-initiated command to engage in autonomous driving of one or more portions of a driving route. In some embodiments, ANS 400 may engage in autonomous navigation of the autonomous navigation-enabled driving route portion independent of user interaction with the ANS via one or more user interfaces. For example, after enabling autonomous navigation of a road portion, the ANS may automatically engage in autonomous navigation without user intervention once the vehicle in which the ANS is located encounters the road portion. Such automatic engagement of autonomous navigation may be selectively enabled based on user interaction with the ANS via one or more user interfaces included in the vehicle.

ANS 400 includes a communication module 460 configured to be communicatively coupled to one or more remote services, systems, etc. via one or more communication networks. For example, module 460 may be communicatively coupled with a remote service, system, etc., via a wireless communication network, a cellular communication network, a satellite communication network, etc. Module 460 may communicate data with one or more remote services, systems, etc., including uploading virtual representations, input data sets, etc. to a remote service, system, etc., receiving one or more virtual representations from a remote service, system, etc., some combination thereof, and so forth.

ANS 400 includes a database module 440 configured to store one or more virtual representations 442. Such representations may include one or more virtual road portion representations, one or more virtual route representations including one or more sets of virtual road portion representations, some combination thereof, and so forth. As described above, virtual representations 442 may be formed at module 430 based on one or more sets of input data generated based on monitoring one or more of external data, vehicle data, etc., as the vehicle in which ANS 400 is located navigates through a particular road portion. A virtual route representation may include representations of the various road portions included in the route, including indications of a start location and a destination location of the route. For example, when a vehicle is navigated along a route along a particular set of road portions between two locations, a virtual representation may be formed for each road portion, and the virtual route representation indicates the individual road portions included in the route. In some embodiments, the virtual route characterization includes an indication of the road portions in the route for which autonomous navigation is enabled.

As shown, the respective virtual representations 442 included in database module 440 may include, for each representation 442, a set of driving rule representations 444 that characterize a set of driving rules that may be used to autonomously navigate the vehicle in which ANS 400 is located through one or more road portions, and a set of static feature representations 446 that characterize respective static features included in the one or more road portions. Further, the virtual representations 442 may include confidence indicators 448 associated with the representations 442.
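The database layout described above suggests a record structure along the following lines. The field names and types are assumptions introduced for illustration; only the general shape (driving rules 444, static features 446, and confidence indicators 448 per representation 442) reflects the description.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class StaticFeature:
    kind: str                  # e.g. "road_edge", "lane_boundary", "road_sign"
    position: Tuple[float, float]                        # geographic location
    info: Dict[str, str] = field(default_factory=dict)   # e.g. {"speed_limit": "55"}

@dataclass
class RoadPortionRepresentation:
    portion_id: str
    driving_rules: Dict[str, float] = field(default_factory=dict)
    static_features: List[StaticFeature] = field(default_factory=list)
    confidence: float = 0.0

@dataclass
class RouteRepresentation:
    start_location: Tuple[float, float]
    destination_location: Tuple[float, float]
    portions: List[RoadPortionRepresentation] = field(default_factory=list)

    def autonomy_enabled_portions(self, threshold: float = 0.90):
        """Road portions of the route for which autonomous navigation is enabled,
        assuming enablement is keyed to a per-portion confidence threshold."""
        return [p for p in self.portions if p.confidence > threshold]
```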

Fig. 5A-5C illustrate user interfaces associated with an autonomous navigation system, according to some embodiments. The user interface may be generated by any embodiment of the ANS.

The user interface 500 is a display interface that presents a Graphical User Interface (GUI) 502 on a display screen. The illustrated GUI 502 shows a map representation that includes a set of roads 510A-510E in a particular geographic area. The set of roads 510A-510E may be referred to as at least a portion of a road network.

In some embodiments, the user interface presented to a user of a vehicle comprising the ANS includes a representation of a route that the vehicle can navigate between one or more locations. A representation of the route may be presented at the interface based on one or more user-initiated commands to display a particular represented route on the map presented in GUI 502. Each route may be associated with a particular heading (e.g., "work route"), and the user may interact with one or more user interfaces to select a particular route based on identifying the particular heading associated with the route along which the user desires to navigate the vehicle. In some embodiments, the user interface presents a representation of a particular route based at least on an expectation that a user of the vehicle will desire to navigate the vehicle along the particular route. Such anticipation may be based, at least in part, on the vehicle currently being located at a physical location corresponding to a starting location of one or more particular routes, on the current time of day falling within a range of times during which the particular route has previously been navigated from the starting location, some combination thereof, and so forth.

In some embodiments, the GUI presents an interface element (e.g., one or more icons, message prompts, etc.) that includes one or more interactive elements, each of which represents a separate route with which the user can interact to command the interface to present a representation of a particular route. Each route may be a route for which a particular virtual representation is stored at the ANS and may be associated with a particular route title. The route title may be specified by the user, the ANS, some combination thereof, and so forth.

In some embodiments, the interface elements of the GUI presentation indicate a limited selection of routes for the ANS to store the virtual representation based at least in part on one or more of a current location of the interface and the vehicle in which the ANS is located, a current time at the location, or some combination thereof. For example, when the vehicle is currently located near one or more locations (which are the starting locations of one or more routes stored in the ANS with virtual representations), the interface may present interactive elements (including one or more presentations of one or more routes) and prompt the user to interact with one or more representations to select one or more routes. Upon receiving an indication of user interaction with the one or more particular representations, the interface may interact with the ANS to present graphical representations of the one or more routes associated with the one or more particular representations.

As shown in FIG. 5A, a plurality of roads 510A-510E are presented on the GUI 502, which also presents a plurality of location icons 520A-520D associated with a plurality of locations. In the illustrated embodiment, the vehicle in which the interface 500 is located may currently be near location 520A, which may be the starting location of several individual routes to separate destination locations. As shown, three locations 520B-520D are presented relative to the illustrated roads 510 at positions that correspond to the physical locations of those locations relative to the roads. Each individual location may be a destination location for one or more driving routes starting at the starting location 520A.

In some implementations, the individual locations 520B-520D can be presented in the GUI in response to: identifying that the vehicle in which interface 500 is located is near location 520A, identifying location 520A as a starting location for a plurality of individual driving routes, and identifying locations 520B-520D as destination locations for one or more of the identified individual driving routes. For example, in response to detecting that the user has occupied the vehicle, the ANS and the interface may interoperate to identify a current location of the vehicle based on input data received at the ANS from one or more sensor devices. The ANS may identify one or more starting locations of one or more driving routes whose virtual characterizations are stored at the ANS, where the starting locations are near the current location of the vehicle, and further identify one or more destination locations of the one or more driving routes. The interface may present a graphical representation of the identified starting and destination locations to the user, and may also present one or more interactive interface elements with which the user may interact to select one or more driving routes. As shown, the GUI 502 includes an interface element 580 that includes three separate representations 590A-590C of three separate driving routes. Each driving route may have the starting location 520A and a separate one of the illustrated destination locations 520B-520D. As shown, each representation 590A-590C includes a route heading associated with a respective driving route. Each representation may be interactive such that a user may interact with one or more of the representations 590 to select one or more driving routes associated with the representations. In response to a user-initiated interaction with one or more of the particular representations 590A-590C, one or more of the ANS and the interface may identify that the user has selected a particular driving route and present a representation of the driving route on the GUI.
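A simplified sketch of identifying stored driving routes whose starting locations are near the vehicle's current location, so that their headings and destination locations can be offered for selection, is shown below. The route dictionary keys ("title", "start", "destination") and the 200 m radius are hypothetical.

```python
import math
from typing import Iterable, List, Tuple

def routes_near_current_location(current: Tuple[float, float],
                                 stored_routes: Iterable[dict],
                                 radius_m: float = 200.0) -> List[dict]:
    """Return stored driving routes whose starting location lies within a
    given radius of the vehicle's current location."""
    def distance_m(a, b):
        # Equirectangular approximation; adequate for a small radius.
        lat = math.radians((a[0] + b[0]) / 2.0)
        dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6371000.0
        dy = math.radians(b[0] - a[0]) * 6371000.0
        return math.hypot(dx, dy)
    return [r for r in stored_routes if distance_m(current, r["start"]) <= radius_m]

# Example: one stored route whose start is near the vehicle's current position.
stored = [{"title": "work route",
           "start": (37.3300, -122.0100),
           "destination": (37.3900, -122.0800)}]
nearby = routes_near_current_location((37.3301, -122.0102), stored)
```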

Fig. 5B shows GUI 502 presenting a representation of a particular driving route 530 extending between a starting location 520A and a destination location 520B. The driving route 530 includes a set of road portions 532 that extend between the two locations 520A-520B. In some embodiments, the illustrated route 530 does not indicate boundaries between the various road portions 532 included in the route 530.

In some embodiments, the particular represented driving route includes one or more portions that enable autonomous navigation. In the illustrated embodiment, the representation 530 of the driving route includes a representation of a portion 540 of the autonomous navigation-enabled route. The representation may include a message 570 inviting the user to indicate whether to participate in autonomous navigation of the portion 540 of the autonomous navigation-enabled route by interacting with one or more interactive elements 572 of the GUI 502.

In some implementations, when portion 540 of route 530 has autonomous navigation enabled, a part of the portion 540 is associated with a transition between manual navigation and autonomous navigation. For example, transition region 546 of portion 540 is associated with a transition from autonomous navigation of portion 540 to manual navigation of the remaining portion of route 530 to location 520B. In some embodiments, the GUI is configured to present various messages to the user based on the current location of the vehicle in which the interface device 500 is included. For example, when the vehicle is navigating autonomously through portion 540 and crosses boundary 545 into the transition region, GUI 502 may present an alert message alerting the user that a transition to manual navigation is imminent. When the vehicle crosses the boundary 547, autonomous navigation may be disabled and a message may be presented on the GUI 502 alerting the user to that fact. One or more alerts presented on the GUI 502 may be accompanied by other alert signals presented via one or more other user interfaces. For example, presentation of the alert message on the GUI 502 may be accompanied by an audio signal presented via one or more speaker interface devices of the vehicle.

In some embodiments, portion 540 is represented differently than the rest of route 530. For example, portion 540 may be represented in a different color than the rest of route 530. In another example, an animation effect may be presented on portion 540.

Fig. 5C illustrates a user interface associated with an autonomous navigation system, according to some embodiments. The user interface may be generated by any embodiment of the ANS. In some embodiments, as the portion of the route for which autonomous navigation is enabled changes over time, the representation of that portion 540 may change accordingly. For example, as shown, when the autonomous navigation-enabled road portions of the route include an additional portion that extends toward location 520B relative to the portion shown in fig. 5B, a correspondingly extended representation of portion 540 may be shown in GUI 502. When portion 540 has been extended within some recent period of time, the extended elements of portion 540 may be represented differently than the rest of portion 540, including in a different color than the rest of portion 540.

FIG. 6 illustrates a user interface associated with an autonomous navigation system, according to some embodiments. The user interface may be generated by any embodiment of the ANS.

In some embodiments, when one or more alternative routes between one or more particular locations are available relative to a route recently navigated between the one or more particular locations, an indication of such one or more alternative routes may be presented to the user via the user interface, and the user may be invited to engage in navigation of the one or more alternative routes instead of the recently navigated route.

An alternative route may be proposed to the user based on determining that a confidence indicator associated with a virtual representation of one or more road portions included in the driving route is not high enough to enable autonomous navigation of one or more portions of the route. The alternative route may include a route for which autonomous navigation is enabled for one or more portions thereof, such that proposing to the user to participate in the navigation of the alternative route includes inviting participation in the autonomous navigation of the one or more portions of the alternative route. In some embodiments, the alternative route does not include a portion that enables autonomous navigation, and a virtual representation of one or more portions of the alternative route may not currently exist. Thus, proposing to navigate the alternative route may include inviting the user to manually navigate along the route such that a virtual representation of one or more road portions included in the alternative route may be formed and autonomous navigation of the alternative route may then be enabled.

In the illustrated embodiment, the user interface device 600 includes a display interface 602, which may include a GUI 602. The GUI 602 shows one or more representations of one or more roads 610A-610E and a representation of a particular driving route 620 between a starting location 612A and a destination location 612B. As further illustrated, the GUI 602 shows a representation of the alternative route 630 between the two locations 612A-612B and a message 670 prompting the user to selectively engage or decline engagement with autonomous navigation of the alternative route 630, rather than navigating along the route 620 based at least in part on interacting with one or more of the interactive elements 672 and 674 of the GUI 602.

Fig. 7 illustrates forming a virtual representation of one or more road portions to enable autonomous navigation of the one or more road portions, according to some embodiments. The forming process may be implemented by any embodiment of an ANS included in one or more vehicles, and may be implemented by one or more computer systems.

At 702, based on a vehicle being manually navigated through one or more road portions, a set of input data is received from one or more sensor devices of the vehicle. The set of input data may include: external sensor data indicative of respective static features of the road portions; vehicle sensor data indicative of various instances of data associated with the vehicle; driving characteristic data associated with one or more of the vehicle and one or more other external vehicles navigating a portion of a road near the vehicle; some combination thereof, and the like.

At 704, a virtual representation of one or more road portions is formed based at least in part on processing at least some of the received input data sets. The virtual representations may include representations of a set of driving rules associated with navigating through the road portion, representations of static features of the road portion, some combination thereof, and so forth. In some embodiments, forming a virtual representation of one or more road portions includes forming a virtual representation of a driving route including one or more sets of road portions through which the vehicle navigates between one or more starting locations, destination locations, and/or the like.

At 706, a confidence indicator associated with the formed virtual representation of the one or more road portions is determined. The confidence indicator may indicate a confidence associated with one or more of the accuracy, precision, etc. of the virtual representations of the one or more road portions. At 708, it is determined whether the confidence indicator associated with the one or more virtual representations at least meets a confidence threshold level. The threshold level may be associated with a confidence indicator that is sufficiently high that the virtual representation of the one or more road portions may be used to safely engage in autonomous navigation through the one or more road portions. If so, at 709, autonomous navigation of the one or more road portions is enabled such that autonomous navigation of the one or more road portions may be engaged. If not, at 710, it is determined whether the rate of change of the confidence indicator for the one or more virtual representations is greater than a threshold rate level, based on successive changes of the virtual representations resulting from successive input data sets generated by continued manual navigation of the one or more road portions. If so, the process from 702 to 710 is iteratively repeated: successive manual navigations along the one or more road portions generate successive input data sets used to update the virtual representation and confidence indicator of the one or more road portions, until the confidence indicator increases above the confidence threshold, changes at a rate less than the threshold rate, and so on.

As shown at 712, if it is determined at 710 that the rate of change of the confidence indicator of one or more virtual representations of one or more road portions is less than the threshold rate, one or more of the virtual representations, the input data sets, etc. may be uploaded to a remote service, system, etc. for additional processing to modify the virtual representations such that the confidence indicator associated with the virtual representations exceeds the confidence threshold. At 714, it is determined whether an alternative route is available with respect to the driving route in which the one or more road portions are included. In some embodiments, the alternative route includes one or more portions for which autonomous navigation is enabled. At 716, if an alternative route is available, the alternative route may be proposed to the user of the vehicle, via one or more user interfaces of the vehicle, as an option for navigating between the start location and the destination location in place of the driving route most recently navigated between the start location and the destination location.

Autonomous navigation network

In some embodiments, a plurality of ANS are installed in a plurality of individual vehicles, and each individual ANS may form a virtual representation of one or more driving routes navigated by the respective vehicle in which the respective ANS is installed. In some embodiments, multiple separate ANS may be communicatively coupled with and communicate data with remote systems, services, and the like. The remote systems, services, etc. may include a navigation monitoring system implemented on one or more computer systems external to the plurality of vehicles and communicatively coupled to the one or more vehicles via one or more communication networks. One or more monitoring systems may be communicatively coupled via one or more communication networks, and the ANS in a given vehicle may be communicatively coupled with one or more navigation monitoring systems.

Data communications between the plurality of ANS and the one or more monitoring systems may include each ANS "uploading" one or more sets of virtual route representations, virtual road portion representations, input data received from sensors of the vehicle in which the ANS is installed, and the like. In some embodiments, the ANS uploads the virtual representations to a remote system, service, or the like, and the uploaded representations and input data are incorporated into a database of representations and input data at the navigation monitoring system. In some embodiments, the ANS uploads one or more virtual representations, input data sets, etc. that are processed by the navigation monitoring system to refine the one or more virtual representations such that autonomous navigation may be enabled based on the one or more refined representations.

Data communication between the plurality of ANS and the one or more monitoring systems may include the navigation monitoring system distributing or "downloading" one or more virtual representations of one or more driving routes, road portions, etc. to one or more ANS installed in one or more vehicles. The virtual representations distributed from the navigation monitoring system to the ANS may include virtual representations formed at least in part at the navigation monitoring system based on data received from one or more ANS, virtual representations formed at a separate ANS and uploaded to the navigation monitoring system, some combination thereof, or the like.

Fig. 8 illustrates a schematic diagram of an autonomous navigation network 800 that includes a plurality of ANS 804A-804F located in individual vehicles 802A-802F that are communicatively coupled to a navigation monitoring system 810 via one or more communication links 820 over one or more communication networks, according to some embodiments. Each ANS 804 shown may include any of the ANS shown in any of the embodiments described above.

In some embodiments, navigation monitoring system 810 implemented on one or more computer systems external to each vehicle 802 in network 800 includes a processing module 812 implemented by one or more instances of processing circuitry included in the navigation monitoring system that can process one or more sets of input data associated with one or more road portions, one or more virtual representations of one or more driving routes, some combination thereof, or the like. In some embodiments, the navigation monitoring system 810 includes a database 814 in which a plurality of different virtual representations 816 of one or more driving routes, one or more road portions, and the like are stored.

In some embodiments, the navigation monitoring system 810 communicates with each ANS 804A-804F via one or more communication links 820. Such communications may include exchanging virtual representations between one or more ANS 804 and navigation monitoring system 810, exchanging input data sets associated with one or more road portions, and the like. For example, each ANS 804A-804F includes at least one database 806A-806F in which one or more sets of input data, one or more virtual representations, and the like can be stored. ANS 804 may upload virtual representations formed by ANS 804 and stored in respective databases 806 to monitoring system 810 for one or more of processing, storage, etc. at database 814. Monitoring system 810 can distribute one or more virtual representations 816 stored at database 814 (including virtual representations received from one or more ANS 804, virtual representations formed at least in part at navigation monitoring system 810 via processing module 812, etc.) to one or more ANS 804, for storage in one or more databases 806 of respective one or more ANS 804. For example, virtual route representations formed at ANS 804E can be uploaded to system 810 and distributed to ANS 804A-804D, 804F. In some embodiments, data including some or all of the route representations may be continuously uploaded from one or more ANS to the system 810. For example, as the vehicle 802A continues to navigate one or more road portions while the vehicle 802A autonomously navigates the route, the ANS 804A may process input data from various sensor devices of the vehicle 802A and continuously upload the input data, virtual representations, and/or the like to the system 810 based at least in part on such input data, some combination thereof, and/or the like.

Fig. 9A-9B illustrate schematic diagrams of an autonomous navigation network 900, which includes a plurality of ANS 904A-904D located in independent vehicles 902A-902D that are communicatively coupled to a navigation monitoring system 910 via one or more communication links 920A-920D over one or more communication networks, according to some embodiments. Each ANS 904 shown can include any ANS shown in any of the above embodiments.

In some embodiments, a plurality of individual ANS located in individual vehicles form virtual representations of various individual groups of road segments, driving routes, and the like. The independent ANS may communicate one or more of these locally formed virtual representations to the navigation monitoring system, where various features from the various ANS may be incorporated into the set of virtual representations at the navigation monitoring system. In some embodiments, one or more ANS located in one or more independent vehicles simultaneously autonomously navigate one or more road portions and continuously upload virtual representations formed based on sensor data generated during autonomous navigation of the one or more road portions.

Fig. 9A shows each of the individual ANS 904A-904D of the individual vehicles 902A-902D transmitting individual groups 909A-909D of virtual road section representations to the monitoring system 910 via individual communication links 920A-920D. Each individual group 909A-909D of virtual representations is illustrated in FIG. 9A as an individual map representation 908A-908D, wherein the individual group 909A-909D is shown to include the geographic location and roads of the virtual representation. As shown, each map 908A-908D is an illustrative representation of a common geographic area, and each individual group 909A-909D of virtual representations includes a plurality of virtual road segment representations of an individual group of road segments. In some embodiments, the independent set of virtual representations includes virtual representations of a common road segment. For example, as shown in FIG. 9A, the set of virtual representations 909A-909B includes virtual representations of the road portion 911.

Various ANS may communicate the virtual representations to the navigation monitoring system based on various triggers. For example, in response to the formation, updating, etc. of the representations, in response to a timestamp trigger, in response to a query from the navigation monitoring system 910, the various ANS 904 can transmit at least some partially formed virtual representations, locally stored virtual representations, etc. to the navigation monitoring system 910 intermittently, continuously, periodically, in some combination thereof, etc.

Upon receiving the virtual representation from the ANS, the navigation monitoring system may implement processing of the virtual representation, which may include automatically modifying various elements of the virtual representation such that a confidence indicator associated with the virtual representation is improved. The processing that may be implemented by the one or more processing modules 912 of the system 910 may include: processing the virtual representation in response to determining that the confidence indicator associated with the received virtual representation is less than the threshold confidence indication, processing the virtual representation in response to identifying a management token associated with the received virtual representation, and so on.

In some embodiments, the navigation monitoring system 910 processes the received virtual representations, input data sets, etc. associated with the one or more road portions relative to the stored virtual representations of the one or more road portions. Such processing may include comparing two independent virtual representations of the road portion and, in response to determining that the confidence indicator associated with one virtual representation is better than the confidence indicator associated with the other, storing the preferred virtual representation and discarding the other. In some embodiments, such processing may include forming a "composite" virtual representation of one or more road portions based, at least in part, on data combined from two or more virtual representations of the one or more road portions. For example, a composite virtual representation of a road portion may be formed based on at least some of the static feature representations included in one virtual representation of the road portion, at least some of the other static feature representations included in another virtual representation of the road portion, and at least some of the driving rule representations combined from yet another virtual representation of the road portion. Such incorporation of multiple elements from various virtual representations may be based at least in part on determining that a confidence indicator associated with a given element of a given virtual representation is superior to the confidence indicators associated with the corresponding elements of the other virtual representations.
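
The per-element merging just described can be sketched as follows. This is a minimal sketch assuming, purely for illustration, that each element of a representation (a static feature or driving rule) carries its own confidence value; the element keys, the (value, confidence) pairing, and the merge_representations helper are hypothetical and are not drawn from the embodiments above.

```python
# Illustrative sketch: build a "composite" representation by keeping, for each
# element key, the version whose per-element confidence indicator is best.
def merge_representations(reps):
    """reps: list of dicts mapping element_key -> (element_value, confidence)."""
    composite = {}
    for rep in reps:
        for key, (value, confidence) in rep.items():
            if key not in composite or confidence > composite[key][1]:
                composite[key] = (value, confidence)
    return composite

# Example: two representations of the same road portion from different ANS.
rep_a = {"lane_count": (2, 0.9), "speed_limit": (50, 0.6)}
rep_b = {"lane_count": (2, 0.7), "speed_limit": (60, 0.95),
         "stop_line_offset": (3.5, 0.8)}
composite = merge_representations([rep_a, rep_b])
# composite keeps lane_count from rep_a; speed_limit and stop_line_offset from rep_b
```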

Fig. 9B shows a graphical representation of a group 919 of virtual road portion representations stored in database 914 of monitoring system 910, where the various representations in the group 919 may be formed based at least in part on representations 909A-909D received from one or more ANS 904. As shown, the group 919 of virtual representations is illustrated in map representation 918, which shows the geographic locations and roads of the virtual representations included in group 919. Group 919 includes a virtual representation for each road portion for which a virtual representation is received in one or more of the groups 909 received from one or more ANS 904. Where multiple virtual representations of a road portion are received in various groups 909, the respective virtual representation in group 919 can include a composite representation formed from the multiple received representations, a selected one of the received virtual representations, some combination thereof, or the like.

In some embodiments, the navigation monitoring system distributes at least a portion of the virtual representations stored at the navigation monitoring system to one or more ANS via one or more communication links. The navigation monitoring system may distribute one or more virtual representations to an ANS intermittently, continuously, at periodic intervals, some combination thereof, or the like, in response to receiving a request for such distribution from the ANS, in response to an update to a virtual representation stored at the navigation monitoring system, in response to a timestamp trigger, and so on. As shown in Fig. 9B, monitoring system 910 can distribute one or more virtual representations included in the stored group 919 of virtual representations to one or more ANS 904A-904D via one or more communication links 920A-920D. In some embodiments, the navigation monitoring system distributes to a given ANS only a limited selection of virtual representations of one or more road portions, namely those road portions for which the given ANS does not currently store a virtual representation associated with a greater confidence indicator than the confidence indicator associated with the distributed virtual representation.
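
A minimal sketch of this limited-selection distribution is shown below, assuming, purely for illustration, that each side can be summarized as a mapping from road portion identifiers to confidence indicator values; the function name, identifiers, and values are hypothetical.

```python
# Illustrative sketch of "limited selection" distribution: a representation is
# sent to an ANS only if the ANS does not already hold one with a better
# confidence indicator for that road portion.
def select_for_distribution(server_reps, ans_reps):
    """server_reps / ans_reps: dicts mapping road_portion_id -> confidence."""
    to_send = []
    for portion_id, server_conf in server_reps.items():
        local_conf = ans_reps.get(portion_id)
        if local_conf is None or server_conf > local_conf:
            to_send.append(portion_id)
    return to_send

server = {"segment_17": 0.93, "segment_42": 0.71}
vehicle = {"segment_17": 0.95}                      # vehicle already has a better copy
print(select_for_distribution(server, vehicle))     # -> ['segment_42']
```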

In some embodiments, the ANS may communicate with the navigation monitoring system in order to form a virtual representation of one or more road portions with a sufficiently high confidence indicator to enable autonomous navigation of the one or more road portions via the virtual representation. The communication may include uploading one or more sets of input data associated with the road portion, for formation into one or more virtual representations, uploading one or more already-formed virtual representations for additional processing, and so on.

In some embodiments, the ANS may upload the virtual representations, one or more sets of input data, some combination thereof, and/or the like to the navigation monitoring system based at least in part on determining that the confidence indicator associated with one or more virtual representations is changing at a rate less than a threshold over a number of updates of the one or more virtual representations. Such uploading may include transmitting a request to the navigation monitoring system to apply additional processing to the virtual representation, input data, etc., to generate a modified virtual representation. Once the virtual representations are received at the navigation monitoring system, a processing module of the navigation monitoring system may process one or more of the representations included in the virtual representations to generate modified virtual representations having improved associated confidence indicators. Such processing may include applying processing capabilities that are not available in the ANS. If such a modified virtual representation cannot be generated based on processing by the navigation monitoring system, the navigation monitoring system may mark one or more portions of the virtual representation for "manual management," whereby the one or more portions of the virtual representation may be modified based on user-initiated modifications of those portions.
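
The rate-of-change condition can be illustrated with the following sketch, which assumes a hypothetical per-update history of confidence indicator values and an illustrative threshold rate and window size; none of these specific values are specified by the embodiments above.

```python
# Illustrative sketch of the plateau check: if the confidence indicator has
# stopped improving faster than a threshold rate over recent updates, the ANS
# requests remote processing by the navigation monitoring system.
def should_request_remote_processing(confidence_history, min_rate=0.01, window=5):
    """confidence_history: confidence indicator value after each successive update."""
    if len(confidence_history) < window + 1:
        return False                       # not enough updates to judge the rate
    recent = confidence_history[-(window + 1):]
    rate_per_update = (recent[-1] - recent[0]) / window
    return rate_per_update < min_rate      # improvement has stalled

history = [0.40, 0.62, 0.66, 0.675, 0.68, 0.682, 0.683, 0.684]
print(should_request_remote_processing(history))   # -> True: upload for remote management
```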

The navigation monitoring system may alert a user, operator, etc. that the virtual representations need to be manually modified to improve their associated confidence indicators. If manual management does not result in a modified virtual representation having an associated confidence indicator that at least meets a threshold level, the navigation monitoring system may generate a scheduling command to dispatch a dedicated sensor suite, which may be included in a dedicated sensor vehicle, to one or more road portions characterized by the virtual representation, where the sensor suite is commanded to collect additional input data sets associated with the road portions. Once such input data sets are received at the navigation monitoring system from the dedicated sensor suite, the data may be processed to effect additional modifications of the virtual representation of the road portion.

If such additional processing does not result in a modified virtual representation having an associated confidence indicator that at least meets the threshold level, the road portion may be flagged with a warning, and an alternative driving route may be associated with any driving route that includes the flagged road portion. Representations of such alternative driving routes may be distributed to one or more ANS, including the ANS from which the virtual representation was originally received.

Fig. 10 illustrates a "management spectrum" 1000 that may be used in the process of generating one or more virtual road portion representations, according to some embodiments. Such a management spectrum 1000 can include one or more ANS, monitoring systems, etc., which can include any embodiment of the above ANS, monitoring systems, etc.

In some embodiments, an ANS included in the vehicle implements "local management" 1001 of a virtual representation of a road portion, wherein the ANS processes one or more sets of input data associated with the road portion, together with the virtual representation of the road portion, to update the virtual representation. Such processing may occur in response to receiving a set of input data, and may occur repeatedly in response to input data sets that are continuously generated as the vehicle continues to navigate along the road portion. Such continuous processing may cause the confidence indicator associated with the virtual representation to vary over time. For example, with each update to a virtual representation, the confidence indicator may change based at least in part on the changes that occur in the virtual representation with each successive update.
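
One way to picture such local management is the following sketch, which assumes a single observed feature (a speed limit) and a hypothetical agreement-based confidence heuristic; the heuristic, class name, and factor of ten traversals are illustrative stand-ins, not the confidence indicator computation of the embodiments above.

```python
# Illustrative sketch of "local management": each new input data set from a
# traversal of the road portion is folded into the virtual representation, and
# the confidence indicator is re-derived from observation consistency.
class LocalRepresentation:
    def __init__(self):
        self.observations = []        # one entry per processed input data set
        self.confidence = 0.0

    def update(self, observed_speed_limit):
        # Fold the newest observation in and recompute the confidence
        # indicator from how consistently the feature has been observed.
        self.observations.append(observed_speed_limit)
        most_common = max(set(self.observations), key=self.observations.count)
        agreement = self.observations.count(most_common) / len(self.observations)
        sample_factor = min(1.0, len(self.observations) / 10)  # more traversals, more trust
        self.confidence = agreement * sample_factor
        return self.confidence

rep = LocalRepresentation()
for speed in (50, 50, 50, 60, 50):    # five traversals of the same road portion
    rep.update(speed)
print(round(rep.confidence, 2))       # agreement 0.8 * sample_factor 0.5 -> 0.4
```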

As shown in fig. 10, vehicle 1002 may include an ANS 1004 that includes a processing module 1006 that may implement the update. Processing module 1006 may compare the virtual representations and associated confidence indicators to one or more threshold confidence indicators, threshold rates, and the like. ANS 1004 may selectively enable autonomous navigation of the road portion based at least in part on determining that the associated confidence indicator satisfies at least a confidence indication threshold, which may also be interchangeably referred to as a "threshold confidence indication," "threshold," and so forth.

In the event that the confidence indicator does not meet the threshold, ANS 1004 can determine whether the confidence indicator is changing at at least a minimum threshold rate over time as the associated virtual representation is continuously updated. If not, ANS 1004 can upload some or all of the virtual representations to navigation monitoring system 1010 for the next level of "management". Such management may include "remote automated management" 1003, in which one or more processing modules 1012 included in the navigation monitoring system 1010 process one or more virtual representations of one or more road portions to form one or more modified virtual representations. Such processing can be implemented automatically, using processing capabilities not available locally at ANS 1004, without requiring manual input from one or more users. For example, the processing module 1012 of the navigation monitoring system 1010 may include a processing system, processing circuitry, or the like configured to implement additional processing algorithms, modification processes, or the like that can modify the received virtual representations. Automated management 1003 is implemented to generate a modified virtual representation of the road portion with an associated confidence indicator that is better than the confidence indicator associated with the virtual representation of the road portion received from ANS 1004. In some embodiments, the automated management 1003 may yield a modified virtual representation of the road portion associated with a confidence indicator that at least satisfies a threshold confidence indication associated with enabling autonomous navigation of the road portion. Where the automated management 1003 results in such a modified virtual representation, the modified virtual representation may be stored at the navigation monitoring system 1010, distributed to the ANS 1004, and the like.

Where the automated management 1003 results in a modified virtual representation associated with a confidence indicator that does not at least meet a threshold, the navigation monitoring system 1010 is configured to implement a next level of "management," which may include "manual management" 1005, in which the processing module 1012 of the navigation monitoring system 1010 implements additional processing of the virtual representation based on user-initiated manual input. Such manual management may be implemented by the processing module 1012 in response to determining that the confidence indicator of the modified virtual representation of the road portion formed via the automated management 1003 does not at least satisfy a threshold. The threshold may be the confidence indication threshold associated with enabling autonomous navigation, the confidence indicator associated with the virtual representation initially received at the navigation monitoring system 1010 from ANS 1004, some combination thereof, or the like.

Responding to such a determination may include generating a warning message that may be transmitted to one or more human operators supported by one or more computer systems, which identifies the virtual representation and requests "manual management" of one or more portions of the identified virtual representation. In response, the navigation monitoring system may receive one or more operator-initiated manually-input commands to effect particular modifications to one or more elements of the virtual representation. The navigation monitoring system may modify the virtual representation based on the received one or more operator-initiated manually-input commands to generate a modified virtual representation of the road portion having an associated confidence indicator that is better than the confidence indicator associated with the virtual representation of the road portion received from ANS 1004. In some embodiments, the manual management 1005 may result in a modified virtual representation of the road portion associated with a confidence indicator that at least satisfies the threshold confidence indication associated with enabling autonomous navigation of the road portion. Where manual management 1005 results in such a modified virtual representation, the modified virtual representation can be stored at the navigation monitoring system 1010, distributed to the ANS 1004, and the like.

Where manual management 1005 results in a modified virtual representation associated with a confidence indicator that does not at least meet a threshold, the navigation monitoring system 1010 is configured to implement a next level of "management," which may include "additional data management" 1007, in which the processing module 1012 of the navigation monitoring system 1010 implements additional processing of the virtual representation of one or more road portions based on input data associated with the road portions received from one or more dedicated sensor suites that are dispatched to generate data associated with the road portions. Such processing may be implemented by the processing module 1012 in response to determining that the confidence indicator of the modified virtual representation of the road portion formed via manual management 1005 does not at least meet a threshold. The threshold may be the confidence indication threshold associated with enabling autonomous navigation, the confidence indicator associated with the virtual representation initially received at the navigation monitoring system 1010 from ANS 1004, some combination thereof, or the like.

Responding to the determination may include generating scheduling commands to one or more sensor suites 1022, which may be included in the one or more dedicated sensor vehicles 1020, to proceed to a road portion characterized by the virtual representation and to generate additional input data associated with the road portion via various sensors included in the sensor suites 1022. The processing module 1012 of the navigation monitoring system 1010 may communicate with the sensor suite 1022 via the communication module 1016 of the navigation monitoring system 1010. The navigation monitoring system 1010 may receive additional sets of input data from the sensor suite 1022 and implement additional processing of the virtual representation of the road portion based at least in part on the additional input data via the processing module 1012. The navigation monitoring system may modify the virtual representation based on the received one or more sets of additional input data to generate a modified virtual representation of the road portion having an associated confidence indicator that is better than the confidence indicator associated with the virtual representation of the road portion received from ANS 1004. In some embodiments, the management 1007 may result in a modified virtual representation of the road portion associated with a confidence indicator that at least satisfies a threshold confidence indication associated with enabling autonomous navigation of the road portion. Where management 1007 results in the modified virtual representation, the modified virtual representation can be stored at navigation monitoring system 1010, distributed to ANS 1004, and the like.
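
The dispatch-and-reprocess step can be sketched as follows, with hypothetical function names and an illustrative assumption that each additional input data set raises the confidence indicator by a fixed amount; a real system would derive the indicator from the content of the data itself, so this is a sketch rather than the described implementation.

```python
# Illustrative sketch of "additional data management": when manual management
# still leaves the confidence indicator below threshold, a scheduling command
# is issued for a dedicated sensor suite, and the returned input data sets are
# used to reprocess the representation. All names and values are hypothetical.
def schedule_sensor_suite(road_portion_id):
    # In a real system this would be a message to a dedicated sensor vehicle;
    # here it simply records the dispatch request.
    return {"command": "dispatch", "road_portion": road_portion_id}

def reprocess_with_additional_data(representation, additional_data_sets):
    # Hypothetical reprocessing: each extra high-fidelity data set nudges the
    # confidence indicator upward, capped at 1.0.
    for _ in additional_data_sets:
        representation["confidence"] = min(1.0, representation["confidence"] + 0.1)
    return representation

rep = {"road_portion": "segment_42", "confidence": 0.62}
THRESHOLD = 0.9
if rep["confidence"] < THRESHOLD:
    command = schedule_sensor_suite(rep["road_portion"])
    extra_data = [object(), object(), object()]    # stand-in for received data sets
    rep = reprocess_with_additional_data(rep, extra_data)
print(rep["confidence"] >= THRESHOLD)              # -> True: now meets the threshold
```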

Fig. 11 illustrates the receipt and processing of a virtual representation of one or more road portions, according to some embodiments. The receiving and processing may be implemented on one or more computer systems, including one or more computer systems implementing one or more monitoring systems, ANS, etc.

At 1102, one or more virtual representations are received. One or more virtual representations may be received from one or more ANS, monitoring systems, or the like. Such virtual representations may be received at a navigation monitoring system, ANS, or the like. The virtual representations may include virtual representations of road portions, virtual representations of driving routes, some combination thereof, and the like.

At 1104, it is determined whether the plurality of received virtual representations are virtual representations of a common road portion, driving route, and the like. For example, two separate virtual representations of a common road portion may be received from two separate ANS. Such a determination of commonality can be made based at least in part on comparing one or more static feature representations, driving rule representations, and the like included in the various virtual representations. For example, such commonality may be determined based at least in part on determining that two independent virtual road portion representations include a common set of geo-location coordinates in their static feature representations.
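
The geo-location based commonality check at 1104 might look like the following sketch, assuming, hypothetically, that each static feature representation can be reduced to a set of coordinate tuples and that a simple overlap fraction decides commonality; the function name and overlap criterion are illustrative assumptions.

```python
# Illustrative sketch of the commonality check: two received virtual
# representations are treated as representations of a common road portion if
# their static-feature geolocation coordinates substantially overlap.
def represent_common_portion(rep_a_coords, rep_b_coords, min_overlap=0.5):
    """Each argument is a set of (lat, lon) tuples from a static feature representation."""
    shared = rep_a_coords & rep_b_coords
    smaller = min(len(rep_a_coords), len(rep_b_coords))
    return smaller > 0 and len(shared) / smaller >= min_overlap

a = {(37.3318, -122.0312), (37.3319, -122.0315), (37.3320, -122.0318)}
b = {(37.3319, -122.0315), (37.3320, -122.0318), (37.3321, -122.0321)}
print(represent_common_portion(a, b))    # -> True: treat as a common road portion
```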

If it is determined that at least two virtual representations of a common road portion, driving route, etc. exist, then at 1106 a composite virtual representation is formed based at least in part on each of the at least two virtual representations. Such forming may include incorporating at least some elements of the various virtual representations into a common composite virtual representation. For example, at least some of the static feature representations of one virtual representation and at least some of the static feature representations of another independent virtual representation may be incorporated into the composite virtual representation.

At 1108, the virtual representation, whether it is the only virtual representation of a road portion, driving route, etc., a composite virtual representation thereof, or some combination thereof, is provided to one or more recipients. Such a recipient may include a local database, such that the virtual representation is stored in the database. Such recipients may include one or more remotely located ANS, services, systems, etc., such that the virtual representation is communicated to them via one or more communication links. The database may be included in a navigation monitoring system that is communicatively coupled to a plurality of ANS, other monitoring systems, some combination thereof, and the like. The navigation monitoring system may distribute one or more stored virtual representations to one or more ANS, monitoring systems, or the like.

Fig. 12 illustrates implementing at least a portion of a management spectrum on one or more virtual representations of one or more road portions, according to some embodiments. The implementation may be carried out on one or more computer systems, including one or more computer systems implementing one or more monitoring systems, ANS, etc.

At 1202, one or more virtual representations are received. One or more virtual representations may be received from one or more ANS, monitoring systems, or the like. Such virtual representations may be received at a navigation monitoring system, ANS, or the like. The virtual representations may include virtual representations of road portions, virtual representations of driving routes, some combination thereof, and the like.

At 1204, for one or more received virtual representations, it is determined whether the respective virtual representation has an associated confidence indicator that at least meets a threshold, which may be a threshold associated with enabling autonomous navigation of the road portion characterized by the virtual representation. If so, at 1206, the virtual representations are stored in one or more databases.

As shown at 1208, if the confidence indicator associated with the virtual representation is determined to be less than the threshold, automatic management of the virtual representation is implemented. Such implementations may include processing one or more elements of the virtual representation, including one or more static feature representations, driving rule representations, and the like, to generate a modified virtual representation. In some embodiments, automatic management is carried out without any manual input by any human operator. In some embodiments, generating the modified virtual representation includes establishing a confidence indicator associated with the modified virtual representation.

At 1210, it is determined whether a confidence indicator associated with the modified virtual representation has an improved value. The improved values may include a confidence indicator that is better than an associated confidence indicator of the unmodified virtual representation, a confidence indicator that at least meets a threshold associated with enabling autonomous navigation, some combination thereof, and so forth. If so, the modified virtual representation is stored in a database, as shown at 1222. The modified virtual representation can be distributed to one or more ANS, monitoring systems, and the like.

If not, manual management of the virtual representation is implemented, as shown at 1212. Such implementations may include tagging the virtual representation for manual management. Such tagging may include generating a warning message to one or more human operators supported by one or more computer systems, wherein the warning message instructs the one or more human operators to provide one or more manual input commands to modify one or more elements of the virtual representation to generate a modified virtual representation. The message may include an instruction to modify one or more particular elements of the virtual representation. For example, where the confidence indicator being below the threshold is determined to be due to one or more particular elements of the virtual representation, including one or more particular static feature representations included therein, the message may include an indication to modify at least one or more of those particular static feature representations. Manual management may include implementing, based on manual input received from an operator, specific modifications to the various representations included in the virtual representation.

At 1214, it is determined whether the confidence indicator associated with the modified virtual representation has an improved value. The improved values may include a confidence indicator that is better than an associated confidence indicator of the unmodified virtual representation, a confidence indicator that at least meets a threshold associated with enabling autonomous navigation, some combination thereof, and so forth. If so, the modified virtual representation is stored in a database, as shown at 1222. The modified virtual representation can be distributed to one or more ANS, monitoring systems, and the like.

If not, the virtual representation is tagged for additional data management, as shown at 1216. Such tagging may include generating a message to one or more sensor suites, a human operator of the one or more sensor suites, one or more vehicles including the one or more sensor suites, or the like, to deploy the one or more sensor suites to the one or more road portions characterized by the virtual representation in order to generate additional input data sets associated with the one or more road portions.

At 1218, one or more sets of additional input data are received and the virtual representation is processed based on the additional input data to modify one or more portions of the virtual representation. Such modification results in the generation of a modified virtual representation, which may include an updated associated confidence indicator.

At 1220, it is determined whether the confidence indicator associated with the modified virtual representation has an improved value. The improved values may include a confidence indicator that is better than an associated confidence indicator of the unmodified virtual representation, a confidence indicator that at least meets a threshold associated with enabling autonomous navigation, some combination thereof, and so forth. If so, the modified virtual representation is stored in a database, as shown at 1222. The modified virtual representation can be distributed to one or more ANS, monitoring systems, and the like.

If not, as shown at 1224, a determination is made as to whether an alternative route is available for a driving route that includes the road portion characterized by the virtual representation. If so, as shown at 1226, the alternative route is identified and associated with the virtual representation, such that identifying a driving route that includes the road portion characterized by the virtual representation also identifies the alternative route.
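
Taken together, the Fig. 12 flow can be summarized in the following sketch; the stage callables, the THRESHOLD value, and the fixed confidence increments are hypothetical placeholders standing in for automatic management at 1208, manual management at 1212, additional data management at 1216-1218, and the alternative-route handling at 1224-1226, rather than the described implementation.

```python
# Illustrative sketch of the Fig. 12 flow as a whole: escalate through
# automatic, manual, and additional-data management until the confidence
# indicator meets the threshold, otherwise fall back to an alternative route.
THRESHOLD = 0.9

def run_management_spectrum(rep, automatic, manual, additional_data, find_alternative):
    """rep: dict with a 'confidence' key; the remaining arguments are callables
    standing in for the three management stages and the alternative-route lookup."""
    if rep["confidence"] >= THRESHOLD:
        return rep                                      # 1204 -> 1206: store as-is
    for stage in (automatic, manual, additional_data):  # 1208, 1212, 1216-1218
        rep = stage(rep)
        if rep["confidence"] >= THRESHOLD:              # 1210 / 1214 / 1220: improved?
            return rep                                  # 1222: store modified representation
    alt = find_alternative(rep)                         # 1224
    if alt is not None:
        rep["alternative_route"] = alt                  # 1226: associate alternative route
    return rep

# Example usage with trivial stand-in stages.
bump = lambda amount: (lambda r: {**r, "confidence": min(1.0, r["confidence"] + amount)})
result = run_management_spectrum(
    {"road_portion": "segment_42", "confidence": 0.55},
    automatic=bump(0.1), manual=bump(0.1), additional_data=bump(0.1),
    find_alternative=lambda r: "route_via_segment_43",
)
print(round(result["confidence"], 2), result.get("alternative_route"))  # 0.85 route_via_segment_43
```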

Exemplary Computer System

Fig. 13 illustrates an exemplary computer system 1300 that may be configured to include or perform any or all of the embodiments described above. In different embodiments, computer system 1300 may be any of various types of devices, including but not limited to: personal computer systems, desktop computers, laptop computers, notebook computers, tablet computers, all-in-one computers, tablet or netbook computers, mobile phones, smart phones, personal digital assistants, portable media devices, mainframe computer systems, handheld computers, workstations, network computers, cameras or camcorders, set-top boxes, mobile devices, consumer devices, video game machines, handheld video game devices, application servers, storage devices, televisions, video recording devices, peripheral devices (such as switches, modems, routers), or generally any type of computing or electronic device.

Various embodiments of an Autonomous Navigation System (ANS) as described herein may be executed in one or more computer systems 1300, which may interact with various other devices. It is noted that any of the components, acts, or functionality described above with respect to fig. 1-13 may be implemented on one or more computers configured as the computer system 1300 of fig. 13, according to various embodiments. In the illustrated embodiment, computer system 1300 includes one or more processors 1310 coupled to a system memory 1320 via an input/output (I/O) interface 1330. Computer system 1300 also includes a network interface 1340 coupled to I/O interface 1330, as well as one or more input/output devices, which may include one or more user interface devices. In some cases, it is contemplated that embodiments may be implemented using a single instance of computer system 1300, while in other embodiments multiple such systems or multiple nodes making up computer system 1300 may be configured to host different portions or instances of an embodiment. For example, in one embodiment, some elements may be implemented via one or more nodes of computer system 1300 that are different from those implementing other elements.

In various embodiments, the computer system 1300 may be a single-processor system including one processor 1310, or a multi-processor system including several processors 1310 (e.g., two, four, eight, or another suitable number). Processor 1310 may be any suitable processor capable of executing instructions. For example, in various embodiments, processor 1310 may be a general-purpose or embedded processor implementing any of a variety of Instruction Set Architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. In a multi-processor system, each processor 1310 may typically, but need not necessarily, implement the same ISA.

The system memory 1320 may be configured to store program instructions 1325, data 1326, etc. that are accessible by the processor 1310. In various embodiments, system memory 1320 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), non-volatile/flash memory, or any other type of memory. In the illustrated embodiment, the program instructions included in the memory 1320 can be configured to implement some or all of an autonomous navigation system incorporating any of the functionality described above. Further, the data of memory 1320 may include any of the information or data structures described above. In some embodiments, program instructions and/or data may be received, transmitted, or stored on a different type of computer-accessible medium or similar medium, independent of system memory 1320 or computer system 1300. Although computer system 1300 is described as implementing the functionality of the functional blocks of the previous figures, any of the functionality described herein may be implemented by such a computer system.

In one embodiment, I/O interface 1330 may be configured to coordinate I/O communications between processor 1310, system memory 1320, and any peripheral devices in the device, including network interface 1340 or other peripheral device interfaces, such as input/output devices 1350. In some embodiments, the I/O interface 1330 may perform any necessary protocol, timing, or other data transformations to convert data signals from one component (e.g., the system memory 1320) into a format that may be suitable for use by another component (e.g., the processor 1310). In some embodiments, I/O interface 1330 may include, for example, support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard. In some embodiments, the functionality of I/O interface 1330 may be divided into two or more separate components, such as a north bridge and a south bridge, for example. Further, in some embodiments, some or all of the functionality of I/O interface 1330, such as an interface to system memory 1320, may be incorporated directly into processor 1310.

Network interface 1340 may be configured to allow data to be exchanged between computer system 1300 and other devices 1360 (e.g., carriers or proxy devices) attached to network 1350, or between nodes of computer system 1300. In various embodiments, network 1350 may include one or more networks, including but not limited to: a Local Area Network (LAN) (e.g., ethernet or an enterprise network), a Wide Area Network (WAN) (e.g., the internet), a wireless data network, some other electronic data network, or some combination thereof. In various embodiments, network interface 1340 may support communication via a wired or wireless general purpose data network, such as any suitable type of ethernet network, for example; communication via a telecommunications/telephony network such as an analog voice network or a digital optical fiber communication network; communication via a storage area network such as a fibre channel SAN or via any other suitable type of network and/or protocol.

Input/output devices may include, in some embodiments, one or more display terminals, keyboards, keypads, touch pads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 1300. Multiple input/output devices may be present in computer system 1300 or may be distributed across various nodes of computer system 1300. In some embodiments, similar input/output devices may be separate from computer system 1300 and may interact with one or more nodes of computer system 1300 via a wired or wireless connection, such as through network interface 1340.

As shown in fig. 13, memory 1320 may include program instructions 1325 that may be executed by a processor to implement any of the elements or actions described above. In one embodiment, the program instructions may implement the methods described above. In other embodiments, different elements and data may be included. Note that the data may include any of the data or information described above.

Those skilled in the art will appreciate that computer system 1300 is merely illustrative and is not intended to limit the scope of embodiments. In particular, the computer systems and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, internet appliances, personal digital assistants, wireless telephones, pagers, and the like. Computer system 1300 may also be connected to other devices not shown or otherwise operate as a standalone system. Further, the functionality provided by the illustrated components may be combined in fewer components or distributed in additional components in some embodiments. Similarly, in some embodiments, the functionality of some of the illustrated components may not be provided, and/or other additional functionality may be available.

Those skilled in the art will also recognize that while various items are illustrated as being stored in memory or on storage during use, these items, or portions thereof, may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments, some or all of the software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or a portable article to be read by a suitable drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 1300 may be transmitted to computer system 1300 via a transmission medium or signal, such as an electrical, electromagnetic, or digital signal, communicated via a communication medium, such as a network and/or a wireless link. Various embodiments may further include receiving, transmitting or storing instructions and/or data implemented in accordance with the above description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include a non-transitory computer-readable storage medium or memory medium such as a magnetic or optical medium, e.g., disk or DVD/CD-ROM, a volatile or non-volatile medium such as RAM (e.g., SDRAM, DDR, RDRAM, SRAM, etc.), ROM, or the like. In some embodiments, a computer-accessible medium may include transmission media or signals such as electrical, electromagnetic, or digital signals, communicated via a communication medium such as a network and/or a wireless link.

In various embodiments, the methods described herein may be implemented in software, hardware, or a combination thereof. Additionally, the order of the blocks of a method may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes will become apparent to those skilled in the art having the benefit of this disclosure. The various embodiments described herein are intended to be illustrative and not restrictive. Many variations, modifications, additions, and improvements are possible. Thus, multiple instances may be provided for components described herein as a single instance. The boundaries between the various components, operations and data storage devices are somewhat arbitrary, with particular operations being illustrated in the context of specific exemplary configurations. Other allocations of functionality are contemplated and may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the exemplary configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of the embodiments as defined in the claims that follow.
