Remote operation to extend an existing route to a destination

Document No.: 991329 Published: 2020-10-20

Reading note: This technology, "Remote operation to extend an existing route to a destination," was created by S·塔库尔, A·盖琳, A·柯巴施, J·S·吉奥菲, and M·B·艾伦 on 2019-02-21. Its main content: An apparatus for remote support of autonomous operation of a vehicle, comprising a processor configured to perform a method comprising: receiving, from a vehicle traveling a driving route from a starting point to an end point at a destination, an assistance request signal identifying that the vehicle cannot reach the end point at the destination; generating a first map display comprising a geographic area and a representation of the vehicle within the geographic area; receiving, from the vehicle, sensor data from one or more sensing devices of the vehicle; generating a remote support interface comprising the first map display and the sensor data; and transmitting, to the vehicle, instruction data including a selectable end point at the destination in response to an input signal provided to the remote support interface.

1. An apparatus for remote support of autonomous operation of a vehicle, the apparatus comprising:

a memory; and

a processor configured to execute instructions stored in the memory to:

receive, from a vehicle traveling a driving route from a starting point to an end point at a destination, an assistance request signal identifying that the vehicle cannot reach the end point at the destination;

generate a first map display comprising a geographic area and a representation of the vehicle within the geographic area;

receive, from the vehicle, sensor data from one or more sensing devices of the vehicle;

generate a remote support interface comprising the first map display and the sensor data; and

transmit, to the vehicle, instruction data including a selectable end point at the destination in response to an input signal provided to the remote support interface.
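The receive/generate/transmit flow of claim 1 can be sketched in code. This is an illustrative assumption, not the patent's actual implementation: all class, field, and function names (`AssistanceRequest`, `handle_assistance_request`, etc.) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of the claim-1 processing flow at the remote support
# apparatus. All names and data shapes are illustrative assumptions.

@dataclass
class AssistanceRequest:
    vehicle_id: str
    destination: tuple   # (lat, lon) of the destination
    end_point: tuple     # (lat, lon) of the end point the vehicle cannot reach
    reason: str          # e.g. "entrance_closed"

@dataclass
class RemoteSupportInterface:
    map_display: dict    # geographic area plus a representation of the vehicle
    sensor_data: dict    # e.g. image data, object detection information

def handle_assistance_request(request, sensor_data, operator_input):
    """Build the remote support interface and return instruction data."""
    map_display = {
        "area": request.destination,
        "vehicles": [request.vehicle_id],
    }
    interface = RemoteSupportInterface(map_display, sensor_data)
    # The operator's input selects an alternative end point at the destination.
    selected = operator_input(interface)
    return {"vehicle_id": request.vehicle_id, "selectable_end_point": selected}

req = AssistanceRequest("AV-1", (37.0, -122.0), (37.0001, -122.0001),
                        "entrance_closed")
instruction = handle_assistance_request(
    req, {"camera": "frame-0"}, lambda ui: (37.0002, -122.0002))
print(instruction["selectable_end_point"])  # the operator-chosen end point
```

The operator input is modeled here as a callback so the same flow works whether the selection comes from a map click or a stored list of end points.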

2. The apparatus of claim 1, wherein the sensor data comprises at least one of image data from an image capture device of the vehicle and object detection information.

3. The apparatus of claim 1 or 2, wherein the memory comprises storage for at least two end points at the destination.

4. The apparatus of claim 3, wherein the input signal comprises a selection of the selectable end point from the at least two end points.

5. The apparatus of claim 4, wherein the selectable end point is within an unmapped portion of the geographic area, and the input signal further comprises an extended driving route from a mapped portion of the geographic area, through the unmapped portion of the geographic area, to the selectable end point.
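The extended driving route of claim 5 amounts to appending an operator-drawn path through the unmapped portion onto the vehicle's mapped route. A minimal sketch, with hypothetical names and made-up waypoints:

```python
# Illustrative sketch of claim 5: extend the mapped route with a path through
# an unmapped portion of the geographic area, ending at the selectable end
# point. Function and variable names are assumptions, not the patent's API.
def extend_route(mapped_route, unmapped_path, selectable_end_point):
    """Concatenate the mapped route with the drawn path, ensuring the
    extended route terminates at the selected end point."""
    extended = list(mapped_route) + list(unmapped_path)
    if extended[-1] != selectable_end_point:
        extended.append(selectable_end_point)
    return extended

route = extend_route(
    mapped_route=[(0, 0), (4, 0)],        # portion on the existing map
    unmapped_path=[(4, 1), (4, 2)],       # operator-drawn segment
    selectable_end_point=(4, 3))
print(route[-1])  # (4, 3)
```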

6. The apparatus of claim 1 or 2, wherein the selectable end point is a new end point within a mapped portion of the geographic area.

7. The apparatus of claim 6, wherein the processor is configured to execute instructions stored in the memory to:

store the new end point in a set of possible end points for the destination.

8. The apparatus of claim 1 or 2, wherein the selectable end point is a new end point in an unmapped portion of the geographic area, and the instruction data comprises a selectable driving route including a path through the unmapped portion to the new end point at the destination.

9. The apparatus of claim 8, wherein the processor is configured to execute instructions stored in the memory to:

store the new end point in a set of possible end points for the destination; and

store a path through the unmapped portion to the new end point.
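The storage described in claims 7 and 9 can be sketched as a small per-destination registry: an operator-selected new end point is remembered (and, for unmapped areas, the path to it), so later trips to the same destination can reuse them. The class and method names are illustrative assumptions.

```python
# Hypothetical registry for claims 7 and 9: persist operator-selected end
# points, plus paths through unmapped portions, keyed by destination.
from collections import defaultdict

class DestinationRegistry:
    def __init__(self):
        self._end_points = defaultdict(set)  # destination -> possible end points
        self._paths = {}                     # (destination, end point) -> path

    def store_end_point(self, destination, end_point, path=None):
        """Record a new end point; an optional path covers unmapped areas."""
        self._end_points[destination].add(end_point)
        if path is not None:
            self._paths[(destination, end_point)] = list(path)

    def end_points(self, destination):
        return self._end_points[destination]

    def path_to(self, destination, end_point):
        return self._paths.get((destination, end_point))

reg = DestinationRegistry()
reg.store_end_point("depot-7", (1.0, 2.0), path=[(0.9, 1.9), (1.0, 2.0)])
print((1.0, 2.0) in reg.end_points("depot-7"))  # True
```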

10. The apparatus of claim 1 or 2, wherein the instruction data comprises a transition point between a current position of the vehicle and the selectable end point at which the vehicle is to stop.

11. The apparatus of claim 1 or 2, wherein the assistance request signal comprises an automated signal from the vehicle in response to the vehicle staying within a defined distance of the destination for a defined time.
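Claim 11's automated trigger — the vehicle raising an assistance request after remaining near the destination for a defined time — can be sketched as a check over a position trace. The thresholds (50 m, 120 s) and function names are illustrative assumptions.

```python
import math

# Sketch of the claim-11 trigger: request assistance once the vehicle has
# stayed within a defined distance of the destination for a defined time.

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))

def should_request_assistance(samples, destination,
                              max_dist_m=50.0, dwell_s=120.0):
    """samples: list of (timestamp_s, (lat, lon)) pairs, oldest first."""
    near = [t for t, pos in samples
            if haversine_m(pos, destination) <= max_dist_m]
    return bool(near) and (near[-1] - near[0]) >= dwell_s

dest = (37.0, -122.0)
trace = [(t, (37.0, -122.0)) for t in range(0, 180, 10)]  # ~170 s near dest
print(should_request_assistance(trace, dest))  # True
```

A production check would also require the near-destination samples to be contiguous; this sketch keeps only the defined-distance and defined-time ingredients of the claim.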

12. The apparatus of claim 1 or 2, wherein the processor is further configured to execute instructions stored in the memory to:

receive an input signal on the first map display representing a desired stop location near the selectable end point, the input signal forming a line of intersection on the first map display that is perpendicular to a path of the vehicle from a current location of the vehicle to the selectable end point, and the input signal being responsive to the sensor data, and

wherein the instruction data comprises the desired stop location sent to the vehicle as the selectable end point.
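The perpendicular stop line of claim 12 is a simple geometric construction: a short segment through the desired stop location, orthogonal to the vehicle's direction of travel toward the end point. The following planar sketch is illustrative only; names and the 2 m half-width are assumptions.

```python
import math

# Hypothetical construction of the claim-12 stop line: a segment through the
# desired stop location, perpendicular to the path from the vehicle's current
# location toward the selectable end point (planar approximation).
def stop_line(current, end_point, stop_point, half_width=2.0):
    """Return the two endpoints of the perpendicular stop line."""
    dx, dy = end_point[0] - current[0], end_point[1] - current[1]
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm   # unit vector along the path
    px, py = -uy, ux                # unit vector perpendicular to the path
    return ((stop_point[0] - px * half_width, stop_point[1] - py * half_width),
            (stop_point[0] + px * half_width, stop_point[1] + py * half_width))

a, b = stop_line(current=(0.0, 0.0), end_point=(10.0, 0.0),
                 stop_point=(6.0, 0.0), half_width=2.0)
print(a, b)  # a line crossing the path at x = 6
```

On a real map display the same construction would be applied in a projected coordinate frame rather than raw latitude/longitude.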

13. The apparatus of claim 1 or 2, wherein the assistance request signal is responsive to at least one of a closed entrance between a current location of the vehicle at the destination and the end point, and a closed sidewalk between the current location of the vehicle at the destination and the end point.

14. A method for providing remote support for autonomous operation of a vehicle, the method comprising:

receiving, from a vehicle traveling a driving route from a starting point to an end point at a destination, an assistance request signal identifying that the vehicle cannot reach the end point at the destination;

generating a first map display comprising a geographic area and a representation of the vehicle within the geographic area;

receiving, from the vehicle, sensor data from one or more sensing devices of the vehicle;

generating a remote support interface comprising the first map display and the sensor data; and

transmitting, to the vehicle, instruction data including a selectable end point at the destination in response to an input signal provided to the remote support interface.

15. The method of claim 14, wherein the instruction data includes a path to the selectable end point and at least one of a stop position along the path and one or more speed limits along the path.
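The instruction data of claim 15 is essentially a path annotated with an optional stop (dwell) position and per-segment speed limits. A minimal sketch of such a payload; the field names are assumptions, not the patent's wire format:

```python
# Hypothetical shape of the claim-15 instruction data: a path to the
# selectable end point, an optional stop position along the path, and
# per-segment speed limits.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ExtendedRouteInstruction:
    path: list                               # waypoints to the selectable end point
    stop_position: Optional[tuple] = None    # dwell point along the path
    speed_limits: dict = field(default_factory=dict)  # segment index -> m/s

inst = ExtendedRouteInstruction(
    path=[(0, 0), (5, 0), (5, 3)],
    stop_position=(5, 0),
    speed_limits={0: 8.0, 1: 3.0},  # e.g. slow down on the unmapped segment
)
print(inst.stop_position)  # (5, 0)
```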

16. The method of claim 15, wherein the path to the selectable end point is based on at least one of a service context, a customer location, an end point at the destination, a local traffic rule or regulation, and a stored history of the destination.

17. The method of claim 14 or 15, wherein the remote support interface comprises at least one of data from an infrastructure sensor device, a local rule or regulation overlaid on the first map display, and a geo-referenced high-resolution satellite image.

Technical Field

The present application relates to a vehicle interface for autonomous vehicle monitoring, including methods, devices, systems, and non-transitory computer-readable media for remote monitoring and teleoperation of an autonomous vehicle.

Background

The increasing use of autonomous vehicles creates the potential for more efficient movement of passengers and cargo through a transportation network. Furthermore, the use of autonomous vehicles may result in improved vehicle safety and more efficient communication between vehicles. However, autonomous vehicles often encounter situations in which they have reached a destination, but a defined end point is not available. This may limit the utility of the autonomous vehicle.

Disclosure of Invention

Aspects, features, elements, and implementations for remote support of autonomous operation of a vehicle are disclosed herein. These implementations support a remote operation that extends an existing route to an alternative end point at a destination.

Aspects of the disclosed implementations include an apparatus for remote support of autonomous operation of a vehicle. The apparatus includes a memory and a processor. The processor may be configured to execute instructions stored in the memory to: receive, from a vehicle traveling a driving route from a starting point to an end point at a destination, an assistance request signal identifying that the vehicle cannot reach the end point at the destination; generate a first map display comprising a geographic area and a representation of the vehicle within the geographic area; receive, from the vehicle, sensor data from one or more sensing devices of the vehicle; generate a remote support interface comprising the first map display and the sensor data; and transmit, to the vehicle, instruction data including a selectable end point at the destination in response to an input signal provided to the remote support interface.

Aspects of the disclosed implementations include a method for providing remote support of autonomous operation of a vehicle. The method may include: receiving, from a vehicle traveling a driving route from a starting point to an end point at a destination, an assistance request signal identifying that the vehicle cannot reach the end point at the destination; generating a first map display comprising a geographic area and a representation of the vehicle within the geographic area; receiving, from the vehicle, sensor data from one or more sensing devices of the vehicle; generating a remote support interface comprising the first map display and the sensor data; and transmitting, to the vehicle, instruction data including a selectable end point at the destination in response to an input signal provided to the remote support interface.

These and other aspects of the invention are disclosed in the following detailed description of the embodiments, the appended claims and the accompanying drawings.

Drawings

The disclosed technology is best understood from the following detailed description when read with the accompanying drawing figures. It should be emphasized that, according to common practice, the various features of the drawings may not be to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity. Moreover, unless otherwise specified, like reference numerals refer to like elements throughout the figures.

Fig. 1 is a diagram of an example of a vehicle in which aspects, features, and elements disclosed herein may be implemented.

Fig. 2 is a diagram of an example of a portion of a vehicle transport and communication system in which aspects, features, and elements disclosed herein may be implemented.

Fig. 3 is a flow chart of a method for remote support of autonomous operation of a vehicle according to the present invention.

Fig. 4 is a diagram of a portion of a vehicle transport network according to the present invention.

Fig. 5A to 5G are diagrams of a first example of a remote operation of extending an existing route to a destination.

Fig. 6A to 6H are diagrams of a second example of a remote operation of extending an existing route to a destination.

Detailed Description

An autonomous vehicle may be performing a service that includes arriving at a destination, with or without passengers who may intervene in the operation. The service may be a hire or shuttle operation (such as the pick-up or drop-off of passengers) or a delivery operation (such as the pick-up or drop-off of packages). The route to the destination is associated with an end point at the destination.

For example, the end point may be a set of Global Positioning Satellite (GPS) coordinates associated with an address of the destination, such as a main entrance, a secondary entrance, a defined pick-up or drop-off area, a particular loading dock, and so on. Various situations may arise when the vehicle reaches the destination (e.g., comes within a defined range of the destination) such that the vehicle cannot reach the end point. For example, the entrance may be closed, the sidewalk may be closed, the door may not be accessible, and so on.
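The end points described above can be modeled as a small mapping from a destination address to several labeled GPS candidates, any of which a remote operator could offer when the default is blocked. The addresses, labels, and coordinates below are made-up examples.

```python
# Illustrative mapping from a destination address to candidate end points
# (main entrance, secondary entrance, loading dock), each a (lat, lon) pair.
# All data here is fabricated for the sketch.
DESTINATION_END_POINTS = {
    "100 Example St": {
        "main_entrance":      (37.79010, -122.40010),
        "secondary_entrance": (37.79035, -122.40060),
        "loading_dock":       (37.78990, -122.40085),
    },
}

def candidate_end_points(address):
    """End points a remote operator could offer when the default is blocked."""
    return DESTINATION_END_POINTS.get(address, {})

print(sorted(candidate_end_points("100 Example St")))
# ['loading_dock', 'main_entrance', 'secondary_entrance']
```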

The utility of an autonomous vehicle may be increased by using a remote operation to extend an existing route to the destination, more particularly to an alternative end point at the destination. Such assistance may enable the vehicle to achieve its service goals. Further, by tracking information related to alternative end points for different destinations, the overall efficiency of the transport network may be improved.

To describe some implementations of the teachings herein in more detail, reference is first made to an environment in which the invention can be implemented.

Fig. 1 is a diagram of an example of a vehicle 1000 in which aspects, features, and elements disclosed herein may be implemented. The vehicle 1000 includes a chassis 1100, a powertrain 1200, a controller 1300, wheels 1400/1410/1420/1430, or any other element or combination of elements of a vehicle. Although the vehicle 1000 is shown as including four wheels 1400/1410/1420/1430 for simplicity, any other propulsion device, such as a propeller or tread, may be used. In fig. 1, the lines interconnecting elements such as the powertrain 1200, the controller 1300, and the wheels 1400/1410/1420/1430 indicate that information, such as data or control signals; power, such as electrical power or torque; or both information and power may be communicated between the respective elements. For example, the controller 1300 may receive power from the powertrain 1200 and communicate with the powertrain 1200, the wheels 1400/1410/1420/1430, or both to control the vehicle 1000, which may include accelerating, decelerating, steering, or otherwise controlling the vehicle 1000.

The powertrain 1200 includes a power source 1210, a transmission 1220, a steering unit 1230, a vehicle actuator 1240, or any other element or combination of elements of a powertrain (such as a suspension, a drive shaft, axles, or an exhaust system). Although shown separately, the wheels 1400/1410/1420/1430 may be included in the powertrain 1200.

Power source 1210 may be any device or combination of devices operable to provide energy such as electrical, thermal, or kinetic energy. For example, power source 1210 includes an engine, such as an internal combustion engine, an electric motor, or a combination of an internal combustion engine and an electric motor, and is operable to provide kinetic energy to one or more of wheels 1400/1410/1420/1430 as motive force. In some embodiments, power source 1210 includes potential energy units, such as: one or more dry cell batteries such as nickel cadmium (NiCd) batteries, nickel zinc (NiZn) batteries, nickel metal hydride (NiMH) batteries, lithium ion (Li-ion) batteries, and the like; a solar cell; a fuel cell; or any other device capable of providing energy.

The transmission 1220 receives energy, such as kinetic energy, from the power source 1210 and transmits the energy to the wheels 1400/1410/1420/1430 to provide motive force. The transmission 1220 may be controlled by the controller 1300, the vehicle actuator 1240, or both. The steering unit 1230 may be controlled by the controller 1300, the vehicle actuator 1240, or both, and controls the wheels 1400/1410/1420/1430 to steer the vehicle. The vehicle actuator 1240 may receive signals from the controller 1300 and may actuate or control the power source 1210, the transmission 1220, the steering unit 1230, or any combination thereof to operate the vehicle 1000.

In the illustrated embodiment, the controller 1300 includes a positioning unit 1310, an electronic communication unit 1320, a processor 1330, a memory 1340, a user interface 1350, sensors 1360, and an electronic communication interface 1370. Although shown as a single unit, any one or more elements of the controller 1300 may be integrated into any number of separate physical units. For example, the user interface 1350 and processor 1330 may be integrated in a first physical unit, and the memory 1340 may be integrated in a second physical unit. Although not shown in fig. 1, the controller 1300 may include a power source such as a battery or the like. Although shown as separate elements, the positioning unit 1310, the electronic communication unit 1320, the processor 1330, the memory 1340, the user interface 1350, the sensor 1360, the electronic communication interface 1370, or any combination thereof, may be integrated in one or more electronic units, circuits, or chips.

In some embodiments, processor 1330 includes any device or combination of devices, now known or later developed, that is capable of manipulating or processing signals or other information, including an optical processor, a quantum processor, a molecular processor, or a combination thereof. For example, processor 1330 may include one or more special-purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more integrated circuits, one or more application-specific integrated circuits, one or more field programmable gate arrays, one or more programmable logic controllers, one or more state machines, or any combination thereof. The processor 1330 may be operably coupled with the positioning unit 1310, the memory 1340, the electronic communication interface 1370, the electronic communication unit 1320, the user interface 1350, the sensors 1360, the powertrain 1200, or any combination thereof. For example, the processor may be operatively coupled to the memory 1340 via a communication bus 1380.

The processor 1330 may be configured to execute instructions, including remote operation instructions, that may be used to operate the vehicle 1000 from a remote location including an operation center. The remote operation instructions may be stored in the vehicle 1000 or received from an external source such as a traffic management center or a server computing device that may include a cloud-based server computing device or the like.

Memory 1340 may include any tangible, non-transitory computer-usable or computer-readable medium capable of containing, storing, communicating, or transporting machine-readable instructions or any information associated therewith for use by or in connection with processor 1330. For example, the memory 1340 is one or more solid state drives, one or more memory cards, one or more removable media, one or more Read Only Memories (ROMs), one or more Random Access Memories (RAMs), one or more registers, a Low Power Double Data Rate (LPDDR) memory, one or more cache memories, one or more disks (including hard disks, floppy disks, optical disks, magnetic or optical cards), or any type of non-transitory medium suitable for storing electronic information, or any combination thereof.

The electronic communication interface 1370 may be a wireless antenna (as shown), a wired communication port, an optical communication port, or any other wired or wireless unit capable of interacting with the wired or wireless electronic communication medium 1500.

The electronic communication unit 1320 may be configured to transmit or receive signals via a wired or wireless electronic communication medium 1500, such as via the electronic communication interface 1370. Although not explicitly shown in fig. 1, the electronic communication unit 1320 is configured to transmit, receive, or both via any wired or wireless communication medium, such as radio frequency (RF), ultraviolet (UV), visible light, fiber optic, wire line, or a combination thereof. Although fig. 1 shows a single electronic communication unit 1320 and a single electronic communication interface 1370, any number of communication units and any number of communication interfaces may be used. In some embodiments, the electronic communication unit 1320 may include a dedicated short-range communication (DSRC) unit, a wireless safety unit (WSU), IEEE 802.11p (WiFi-P), or a combination thereof.

The positioning unit 1310 may determine geographic location information of the vehicle 1000, including but not limited to longitude, latitude, altitude, direction of travel, or speed. For example, the positioning unit includes a Global Positioning System (GPS) unit, such as a Wide Area Augmentation System (WAAS)-enabled National Marine Electronics Association (NMEA) unit, a radio triangulation unit, or a combination thereof. The positioning unit 1310 may be used to obtain information representing, for example, a current heading of the vehicle 1000, a current position of the vehicle 1000 in two or three dimensions, a current angular orientation of the vehicle 1000, or a combination thereof.

The user interface 1350 may include any unit that can be used as an interface by a human, including any of a virtual keyboard, a physical keyboard, a touch pad, a display, a touch screen, a speaker, a microphone, a camera, a sensor, and a printer. As shown, the user interface 1350 may be operatively coupled with the processor 1330, or with any other element of the controller 1300. Although shown as a single unit, user interface 1350 may comprise one or more physical units. For example, user interface 1350 includes an audio interface for audio communication with a person, and a touch display for visual and touch-based communication with a person.

The sensors 1360 may include one or more sensors (such as a sensor array or the like) that may be operable to provide information that may be used to control the vehicle. The sensors 1360 may provide information related to the current operating characteristics of the vehicle or its surroundings. The sensors 1360 include, for example, velocity sensors, acceleration sensors, steering angle sensors, traction-related sensors, braking-related sensors, or any sensor or combination of sensors operable to report information related to certain aspects of the current dynamic conditions of the vehicle 1000.

In some embodiments, the sensors 1360 include sensors operable to obtain information related to the physical environment surrounding the vehicle 1000. For example, one or more sensors detect road geometry and obstacles (such as fixed obstacles, vehicles, riders and pedestrians, etc.). The sensors 1360 may be or include one or more cameras, laser sensing systems, infrared sensing systems, acoustic sensing systems, or any other suitable type of on-board environmental sensing device or combination of devices now known or later developed. The sensor 1360 and the positioning unit 1310 may be combined.

Although not separately shown, the vehicle 1000 may include a trajectory controller. For example, the controller 1300 may include the trajectory controller. The trajectory controller may be operable to obtain information describing the current state of the vehicle 1000 and a route planned for the vehicle 1000, and to determine and optimize a trajectory for the vehicle 1000 based on this information. In some embodiments, the trajectory controller outputs signals operable to control the vehicle 1000 such that the vehicle 1000 follows the trajectory determined by the trajectory controller. For example, the output of the trajectory controller may be an optimized trajectory that may be supplied to the powertrain 1200, the wheels 1400/1410/1420/1430, or both. The optimized trajectory may be a control input, such as a set of steering angles, with each steering angle corresponding to a point in time or a position. The optimized trajectory may be one or more paths, lines, curves, or a combination thereof.
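The "set of steering angles, each corresponding to a point in time" described above can be sketched as a schedule built from route waypoints. The control law below is a toy proportional heading correction, an assumption for illustration, not the trajectory controller's actual optimizer.

```python
import math

# Illustrative sketch of a trajectory-controller output: (time, steering
# angle) pairs derived from route waypoints. The proportional heading
# correction and all parameter values are assumptions.
def steering_schedule(waypoints, dt=0.1, gain=1.0, current_heading=0.0):
    """Return (time_s, steering_angle_rad) pairs steering toward each
    successive waypoint."""
    schedule, heading, t = [], current_heading, 0.0
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        desired = math.atan2(y1 - y0, x1 - x0)   # heading of this segment
        angle = gain * (desired - heading)       # proportional correction
        schedule.append((round(t, 1), angle))
        heading, t = desired, t + dt
    return schedule

sched = steering_schedule([(0, 0), (1, 0), (2, 1)])
print(sched[0])  # (0.0, 0.0): already heading along the first segment
```

A real controller would close the loop on the vehicle's measured state rather than assuming each desired heading is achieved, but the output shape — angles indexed by time — matches the description above.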

One or more of the wheels 1400/1410/1420/1430 may be a steered wheel that pivots to a steering angle under the control of the steering unit 1230, a propelled wheel that is torqued under the control of the transmission 1220 to propel the vehicle 1000, or a steered and propelled wheel that steers and propels the vehicle 1000.

The vehicle may comprise units or elements not shown in fig. 1, such as a housing, a Bluetooth® module, a frequency modulated (FM) radio unit, a near field communication (NFC) module, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a speaker, or any combination thereof.

Fig. 2 is a diagram of an example of a portion of a vehicle transport and communication system 2000 in which aspects, features, and elements disclosed herein may be implemented. The vehicle transport and communication system 2000 includes a vehicle 2100 (such as the vehicle 1000 shown in fig. 1, etc.) and one or more external objects (such as the external objects 2110, etc.), which may include any form of transport (such as the vehicle 1000 shown in fig. 1, pedestrians, riders, etc.) and any form of structure (such as a building, etc.). The vehicle 2100 may travel via one or more portions of the transport network 2200 and may communicate with the external objects 2110 via one or more of the electronic communication networks 2300. Although not explicitly shown in fig. 2, the vehicle may traverse areas that are not explicitly or fully included in the transport network (such as off-road areas, etc.). In some embodiments, the transport network 2200 may include one or more vehicle detection sensors 2202 (such as inductive loop sensors, etc.), which vehicle detection sensors 2202 may be used to detect movement of vehicles on the transport network 2200.

The electronic communication network 2300 may be a multiple-access system that provides communication, such as voice communication, data communication, video communication, messaging communication, or a combination thereof, between the vehicle 2100, the external object 2110, and the operations center 2400. For example, the vehicle 2100 or the external object 2110 may receive information such as information representing the transportation network 2200 from the operation center 2400 via the electronic communication network 2300.

The operations center 2400 includes a controller device 2410, which may include some or all of the features of the controller 1300 shown in fig. 1. The controller device 2410 may monitor and coordinate the movement of vehicles, including autonomous vehicles. The controller device 2410 may monitor the state or condition of vehicles, such as the vehicle 2100, and external objects, such as the external object 2110. The controller device 2410 may receive vehicle data and infrastructure data, including any of: vehicle velocity; vehicle location; vehicle operational state; vehicle destination; vehicle route; vehicle sensor data; external object velocity; external object location; external object operational state; external object destination; external object route; and external object sensor data.

Further, the controller device 2410 may establish remote control of one or more vehicles, such as vehicle 2100, or external objects, such as external object 2110. In this manner, the controller device 2410 may teleoperate the vehicle or external object from a remote location. Controller apparatus 2410 may exchange (send or receive) status data with a vehicle, external object, or computing device, such as vehicle 2100, external object 2110, or server computing device 2500, via a wireless communication link, such as wireless communication link 2380, or a wired communication link, such as wired communication link 2390.

The server computing device 2500 may include one or more server computing devices that may exchange (send or receive) status signal data with one or more vehicles or computing devices including the vehicle 2100, the external object 2110, or the operation center 2400 via the electronic communication network 2300.

In some embodiments, vehicle 2100 or external object 2110 communicates via wired communication link 2390, wireless communication link 2310/2320/2370, or a combination of any number or type of wired or wireless communication links. For example, as shown, the vehicle 2100 or the external object 2110 communicates via a terrestrial wireless communication link 2310, via a non-terrestrial wireless communication link 2320, or via a combination thereof. In some implementations, the terrestrial wireless communication link 2310 includes an ethernet link, a serial link, a bluetooth link, an Infrared (IR) link, an Ultraviolet (UV) link, or any link capable of electronic communication.

A vehicle such as vehicle 2100 or an external object such as external object 2110 may communicate with another vehicle, external object, or operation center 2400. For example, the host or subject vehicle 2100 may receive one or more automated inter-vehicle messages, such as Basic Safety Messages (BSMs), from the operation center 2400 via the direct communication link 2370 or via the electronic communication network 2300. For example, the operations center 2400 may broadcast the message to host vehicles within a defined broadcast range, such as 300 meters, or to a defined geographic area. In some embodiments, the vehicle 2100 receives the message via a third party, such as a signal repeater (not shown) or another remote vehicle (not shown). In some embodiments, the vehicle 2100 or the external object 2110 periodically transmits one or more automated inter-vehicle messages based on a defined interval, such as 100 milliseconds.
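The periodic messaging described above — one automated inter-vehicle message per defined interval, such as 100 milliseconds — can be sketched as a simple broadcast loop. The message fields and function names are illustrative assumptions, not the BSM wire format.

```python
import time

# Sketch of periodic automated inter-vehicle messaging: send one message
# (e.g. a basic safety message) every `interval_s` seconds. Field names
# are assumptions for illustration.
def broadcast_loop(send, get_state, interval_s=0.1, count=3):
    """Send `count` messages, one every `interval_s` seconds."""
    for seq in range(count):
        state = get_state()
        send({"seq": seq, "pos": state["pos"], "speed": state["speed"]})
        time.sleep(interval_s)

sent = []
broadcast_loop(sent.append,
               lambda: {"pos": (37.0, -122.0), "speed": 12.5},
               interval_s=0.0, count=3)   # interval shortened for the demo
print(len(sent))  # 3
```

In practice the loop would run on a timer in the vehicle's communication unit and populate the message from live sensor and positioning data.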

The vehicle 2100 may communicate with the electronic communication network 2300 via an access point 2330. The access point 2330, which may include a computing device, is configured to communicate with the vehicle 2100, with the electronic communications network 2300, with the operations center 2400, or a combination thereof, via a wired or wireless communication link 2310/2340. For example, the access point 2330 is a base station, Base Transceiver Station (BTS), node B, enhanced node B (eNode-B), home node B (HNode-B), wireless router, wired router, hub, repeater, switch, or any similar wired or wireless device. Although shown as a single unit, the access point may include any number of interconnected elements.

The vehicle 2100 may communicate with the electronic communication network 2300 via a satellite 2350 or other non-terrestrial communication device. The satellite 2350, which may include a computing device, may be configured to communicate with the vehicle 2100, with the electronic communication network 2300, with the operation center 2400, or a combination thereof, via one or more communication links 2320/2360. Although shown as a single unit, the satellite may include any number of interconnected elements.

The electronic communications network 2300 may be any type of network configured to provide voice, data, or any other type of electronic communication. For example, the electronic communication network 2300 includes a Local Area Network (LAN), a Wide Area Network (WAN), a Virtual Private Network (VPN), a mobile or cellular telephone network, the internet, or any other electronic communication system. The electronic communications network 2300 may use communication protocols such as Transmission Control Protocol (TCP), User Datagram Protocol (UDP), Internet Protocol (IP), real-time transport protocol (RTP), hypertext transport protocol (HTTP), or a combination thereof. Although shown as a single unit, the electronic communication network may include any number of interconnected elements.

In some embodiments, the vehicle 2100 communicates with the operation center 2400 via an electronic communication network 2300, an access point 2330, or a satellite 2350. The operations center 2400 may include one or more computing devices capable of exchanging (sending or receiving) data from a vehicle, such as vehicle 2100, an external object including external object 2110, or a computing device, such as server computing device 2500.

In some embodiments, the vehicle 2100 identifies a portion or condition of the transportation network 2200. For example, the vehicle 2100 may include one or more on-board sensors 2102 (such as the sensor 1360 shown in fig. 1), including a speed sensor, wheel speed sensors, a camera, a gyroscope, an optical sensor, a laser sensor, a radar sensor, an acoustic sensor, or any other sensor or device or combination thereof capable of determining or identifying a portion or condition of the transportation network 2200.

The vehicle 2100 may traverse one or more portions of the transportation network 2200 using information communicated via the electronic communication network 2300, such as information representative of the transportation network 2200, information identified by one or more onboard sensors 2102, or a combination thereof. External object 2110 may be capable of all or some of the communications and actions described above with respect to vehicle 2100.

For simplicity, fig. 2 shows a vehicle 2100 as the host vehicle, external objects 2110, a transport network 2200, an electronic communications network 2300, and an operations center 2400. However, any number of vehicles, networks, or computing devices may be used. In some embodiments, the vehicle transport and communication system 2000 includes devices, units, or elements not shown in fig. 2.

Although the vehicle 2100 is shown as communicating with the operation center 2400 via the electronic communication network 2300, the vehicle 2100 (and the external object 2110) may communicate with the operation center 2400 via any number of direct or indirect communication links. For example, the vehicle 2100 or the external object 2110 may communicate with the operation center 2400 via a direct communication link, such as a bluetooth communication link. Although fig. 2 shows one transport network 2200 and one electronic communication network 2300 for simplicity, any number of networks or communication devices may be used.

The external object 2110 is shown in fig. 2 as a second remote vehicle. The external object is not limited to other vehicles. The external object may be any infrastructure element (e.g., fence, sign, building, etc.) that has the ability to send data to the operations center 2400. The data may be, for example, sensor data from infrastructure elements.

Fig. 3 is a flow chart of a method 3000 for remote support of vehicle autonomous operation according to the present invention. The method 3000 may be utilized by a remote support system, such as a fleet manager or a vehicle manager implemented at the operations center 2400. Some or all aspects of method 3000 may be implemented in a vehicle (including vehicle 1000 shown in fig. 1, vehicle 2100 shown in fig. 2) or a computing device (including controller device 2410 shown in fig. 2). In an implementation, some or all aspects of method 3000 may be implemented in a system incorporating some or all of the features described in this disclosure.

At operation 3010, an assistance request signal is received from a vehicle. A vehicle may include a device or apparatus (e.g., a transport mechanism) for transporting an object, including any of one or more passengers and cargo. The vehicle may be an autonomous vehicle, a semi-autonomous vehicle, or a vehicle driven by a human driver. The vehicle is traveling a driving route from a start point to an end point at a destination, and the assistance request signal identifies that the vehicle at the destination cannot reach the end point.
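The assistance request signal of operation 3010 can be pictured as a small data structure. The following is a minimal sketch only; the field names, valid-reason set, and function names are illustrative assumptions and are not defined by this disclosure:

```python
from dataclasses import dataclass


@dataclass
class AssistanceRequest:
    """Hypothetical payload for the assistance request signal of operation 3010."""
    vehicle_id: str             # identifier of the requesting vehicle
    destination: tuple          # (latitude, longitude) of the destination
    end_point: tuple            # (latitude, longitude) of the unreachable end point
    reason: str                 # e.g. "blocked", "closed_entrance", "passenger_initiated"
    dwell_seconds: float = 0.0  # how long the vehicle has stayed near the destination


# Conditions this disclosure mentions for a vehicle being unable to reach its end point.
VALID_REASONS = {"blocked", "closed_entrance", "closed_sidewalk", "passenger_initiated"}


def accept_request(req: AssistanceRequest) -> bool:
    """Accept the request if it identifies a recognized reason the end point is unreachable."""
    return req.reason in VALID_REASONS


req = AssistanceRequest("veh-5000", (40.740, -74.000), (40.741, -74.001), "blocked", 45.0)
assert accept_request(req)
```

A real system would carry this payload over the electronic communication network 2300 described above; the sketch only captures which facts the operations center needs before generating a map display.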

The destination may be a street address, a building name, or some other identifier of a location within a geographic area. The end point at the destination may be, for example, GPS coordinates or map coordinates. The end point may identify a particular entrance to the destination or a particular point within the destination, such as a structure within a larger campus or a parking location within the destination. These terms may be further explained with reference to fig. 4.

Fig. 4 is a diagram of a portion of a vehicle transport network 4000 according to the present invention. The vehicle transport network 4000 as shown includes one or more non-navigable areas, such as buildings 4100, one or more partially navigable areas, such as parking areas 4200, one or more navigable areas, such as roads 4300/4400, or a combination thereof. In some embodiments, autonomous vehicles (such as autonomous vehicle 1000 shown in fig. 1 or autonomous vehicle 2100 shown in fig. 2, etc.) traverse one or more portions of a vehicle transport network 4000.

The vehicle transport network may include one or more intersections 4210 between one or more navigable or partially navigable areas 4200/4300/4400. For example, the portion of the vehicle transport network shown in fig. 4 includes an entrance or intersection 4210 between the parking area 4200 and the roadway 4400. In some embodiments, parking area 4200 may include parking spaces 4220.

A portion of a vehicle transport network, such as a roadway 4300/4400, may include one or more lanes 4320/4340/4360/4420/4440 and may be associated with one or more directions of travel represented by arrows in fig. 4.

In some embodiments, the vehicle transport network or a portion thereof (such as the portion of the vehicle transport network shown in fig. 4) may be represented as vehicle transport network information. For example, the vehicle transport network information may be represented as a hierarchy of elements, such as markup language elements, that may be stored in a database or file. For simplicity, the figures herein depict vehicle transport network information representing a portion of a vehicle transport network as a graph or map; however, the vehicle transport network information may be represented in any computer-usable form capable of representing the vehicle transport network or a portion thereof. The vehicle transport network information may include vehicle transport network control information such as direction-of-travel information, speed limit information, toll information, grade information (such as inclination or angle information), surface material information, aesthetic information, or combinations thereof.
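The hierarchy of markup-language elements mentioned above can be sketched as follows. The element and attribute names are illustrative assumptions, not a format defined by this disclosure:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup for a fragment of the vehicle transport network of fig. 4.
network_xml = """
<transportNetwork>
  <road id="4400">
    <lane id="4420" direction="east" speedLimit="40"/>
    <lane id="4440" direction="west" speedLimit="40"/>
  </road>
  <parkingArea id="4200" mapped="false">
    <intersection id="4210" connects="4400"/>
  </parkingArea>
</transportNetwork>
"""

root = ET.fromstring(network_xml)
lanes = root.findall("./road/lane")
assert len(lanes) == 2                                    # road 4400 has two lanes
assert root.find("parkingArea").get("mapped") == "false"  # parking area 4200 is unmapped
```

Stored this way, direction-of-travel and speed limit control information attach naturally as attributes, and an unmapped parking area can be flagged with little or no internal detail, consistent with the description of parking area 4200 below.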

A portion of the vehicle transport network or a combination of portions may be identified as a point of interest or a destination. For example, the vehicle transport network information may identify building 4100 as a point of interest, the autonomous vehicle may identify the point of interest as a destination, and the autonomous vehicle may travel from the origin to the destination by traversing the vehicle transport network.

In some embodiments, identifying the destination may include identifying a location of the destination, which may be a discrete uniquely identifiable geographic location, such as geographic location 4500 of building 4100. For example, the vehicle transport network may include a defined location of the destination, such as a street address, postal address, vehicle transport network address, longitude and latitude, or GPS address, among others.

In some embodiments, the destination may be associated with one or more end points (such as the entrance 4600A shown in fig. 4). In some embodiments, the vehicle transport network information may include defined or predicted end point location information, such as information identifying a geographic location of an end point associated with the destination. For example, an end point may be a street parking location adjacent to the destination, or the like. The end point in this example may be the selectable entrance 4600B.

The vehicle transportation network may be associated with or may include a pedestrian transportation network. For example, fig. 4 includes a portion 4700 of a pedestrian transportation network, which portion 4700 can be a pedestrian walkway. The pedestrian transportation network or a portion thereof (such as the portion 4700 of the pedestrian transportation network shown in fig. 4) may be represented as pedestrian transportation network information. In some embodiments, the vehicle transportation network information may include pedestrian transportation network information. The pedestrian transportation network may include a pedestrian navigable area. A pedestrian navigable area, such as a pedestrian walkway or sidewalk, may correspond to an un-navigable area of the vehicle transport network. Although not separately shown in fig. 4, the pedestrian navigable area, such as a pedestrian aisle, may correspond to a navigable area or portion of a navigable area of the vehicle transport network.

In some embodiments, a parking area, such as the parking area 4200, is associated with a destination, such as the building 4100. For example, the vehicle transport network information may include defined parking area information indicating that one or more parking areas are associated with a destination. In some embodiments, the vehicle transport network information may omit information identifying the parking area 4200 or information associating the parking area 4200 with a destination. The parking area 4200 may be an unmapped portion of the geographic area represented by a map. An unmapped portion of the geographic area may be considered a portion in which no vehicle paths are defined and/or a portion that has little or no internal data, such that it is defined only or almost only by its external boundaries.

Referring again to fig. 3, the assistance request signal received at operation 3010 may include an automated signal from the vehicle in response to the vehicle staying within a defined distance of the destination for a defined time. The assistance request signal may be in response to a closed entrance between a current location of the vehicle at the destination and the end point, a closed sidewalk between the current location of the vehicle at the destination and the end point, or both. The assistance request signal may be generated in response to a blockage of the end point. These conditions for generating the assistance request signal may be determined from sensor data of the vehicle and/or from sensor data from infrastructure near the end point at the destination. The assistance request signal may also be generated or initiated by a passenger of the vehicle, for example, when a drop-off is scheduled. The assistance request signal may also be generated due to a service workflow anomaly (such as a missed connection).
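The automated trigger described above — the vehicle staying within a defined distance of the destination for a defined time — might be implemented roughly as follows. The radius and timeout values are assumptions chosen for illustration; the disclosure leaves both unspecified:

```python
import math

DWELL_RADIUS_M = 25.0    # assumed "defined distance" from the destination
DWELL_TIMEOUT_S = 60.0   # assumed "defined time" before requesting assistance


def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))


def should_request_assistance(position, destination, entered_at, now):
    """True when the vehicle has stayed within the defined distance of the
    destination for at least the defined time without reaching the end point."""
    near = haversine_m(position, destination) <= DWELL_RADIUS_M
    return near and (now - entered_at) >= DWELL_TIMEOUT_S
```

In the fig. 5A scenario, for example, the vehicle 5000 would enter the dwell radius around the destination 5004, fail to reach the end point 5002 before the timeout, and emit the automated signal.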

Fig. 5A to 5G and 6A to 6H are diagrams of examples of a remote operation of extending an existing route to a destination. A first example of the assistance request signal received at operation 3010 may be illustrated by referring to fig. 5A and 5B. In fig. 5A, the vehicle 5000 is traversing a driving route to an end point 5002 at a destination 5004 to pick up a passenger. The vehicle 5000 is blocked from reaching the end point 5002 by vehicles 5006A, 5006B, and 5006C, each of which is ahead of or alongside the vehicle 5000. As can be seen in fig. 5B, the vehicle 5000 is able to recognize, using its image sensors, that it is obstructed. In this example, a pop-up notification 5008 appears on the display 5010 of the vehicle 5000. The recognition that the vehicle 5000 is blocked from reaching the end point 5002 may generate an assistance request signal identifying that the vehicle at the destination 5004 cannot reach the end point 5002.

A second example of the assistance request signal received at operation 3010 may be illustrated by referring to fig. 6A and 6B. Fig. 6A is a drop-off scenario in which the vehicle 6000 is turning into an entrance 6002 at a destination 6004 to reach an end point 6006 (see fig. 6C). As can be seen in fig. 6B, the vehicle 6000 is blocked from reaching the end point 6006. Specifically, the gate 6008 blocks the vehicle 6000 from reaching the end point 6006 at the destination 6004. An assistance request signal identifying that the vehicle 6000 at the destination 6004 cannot reach the end point 6006 may be received by the remote support. For example, the assistance request signal may be generated by an operator of the vehicle 6000. The vehicle 6000 may generate the assistance request signal in response to a sensor indicating that the gate 6008 is blocking the vehicle 6000 from reaching the end point 6006. The vehicle 6000 may also generate the assistance request signal after it has stayed at the gate 6008 for a defined time.

Referring again to fig. 3, a first map display is generated at operation 3020. The first map display may include a geographic area and a representation of the vehicle within the geographic area, as described with respect to the example of fig. 4. Fig. 5C shows an example of a map display 5012, which includes the end point 5002 at the destination 5004. The vehicle 5000 is represented within the geographic area encompassed by the map display 5012. A timer 5014 on the display 5012 indicates how long the vehicle 5000 has stayed at the destination 5004. Fig. 6C shows another example of a map display 6010, which includes the end point 6006 at the destination 6004. The vehicle 6000 is represented within the geographic area encompassed by the map display 6010. A timer 6012 on the display 6010 indicates how long the vehicle 6000 has stayed at the destination 6004 due to the gate 6008. The timers 5014, 6012 may each optionally indicate how much time has elapsed since the remote support received the assistance request signal from the vehicles 5000, 6000, respectively. Fig. 6C also shows a pop-up notification 6014 indicating that the assistance request signal has been received.

At operation 3030, sensor data from one or more sensing devices of the vehicle may be received from the vehicle. The sensor data may include image data from an image capture device of the vehicle, object detection information from the vehicle, or both. The sensor data may include any other information available from a sensing device of the vehicle or from a sensor (e.g., a stationary sensor) associated with the vehicle within the geographic area.

At operation 3040, a remote support interface is generated that includes one or more map displays and the sensor data. Fig. 5C and 5D collectively illustrate one example of a remote support interface, which includes displays 5012 and 5016. Fig. 5C, described above, is an example of the map display 5012. Fig. 5D is a display 5016 showing sensor data (in this example, multiple camera views 5018, 5020 from the vehicle 5000). The remote support interface may also include data from one or more infrastructure sensor devices, local rules or regulations overlaying any of the displays, geo-referenced high-resolution satellite images, or a combination of this information. Fig. 5D shows, for example, three satellite images 5022, 5024, 5026. The local rules or regulations may include, for example, speed limits or areas where the vehicle should not travel. In fig. 5D, for example, the cross-hatched area 5028A overlays the satellite image 5022 to either side of the destination 5004, indicating areas where the vehicle 5000 should not travel. Another area 5028B where the vehicle 5000 should not travel is also shown at the exit of the destination 5004 (see fig. 5E).

Using the remote support interface, instruction data for deviating the vehicle from the existing route may be generated. More specifically, one or more input signals may be provided to the remote support interface to generate the instruction data for the vehicle. This can be illustrated by fig. 5E and 5F and fig. 6D to 6H. In fig. 5E and 5F, geo-referenced high-resolution satellite images 5030, 5042 are shown with the vehicle 5000 and an icon 5032, representing the passenger to be picked up, superimposed on the images 5030, 5042. In this example, after the assistance request signal was received, the vehicle 5006B located directly in front of the vehicle 5000 requesting support (i.e., the vehicle 5006B shown in the camera view 5018 in fig. 5D) has moved. The vehicle 5000 is still blocked from following the existing route by the vehicle 5006A. An extended route can be generated by an operator providing an input signal in the form of new navigation points 5036 that define a selectable end point 5034 at the destination 5004. Fig. 5E and 5F illustrate at least first and second navigation points 5036 of an extended route that modifies the path of the existing route shown in fig. 5D. The end point can be represented by a desired stop location near the selectable end point 5034 that is generated as part of the extended route 5038. For example, and as shown in each of these figures, the input signal forms a desired stop location on the display as a crossbar 5040 perpendicular to the path of the vehicle 5000 from the current location of the vehicle 5000 to the selectable end point 5034. Each navigation point 5036 can also be defined by a crossbar marking a stopping point along the extended route 5038.
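The route-extension step described above amounts to appending the operator's navigation points and the selectable end point to the route already being traversed. A minimal sketch follows, treating a route as a list of (lat, lon) waypoints — an assumption for illustration, since this disclosure does not fix a route representation:

```python
def extend_route(existing_route, navigation_points, selectable_end_point):
    """Return a new route: the existing route, then the operator-supplied
    navigation points (e.g. points 5036A and 5036B), then the selectable end point."""
    extended = list(existing_route)   # copy, so the original route is untouched
    extended.extend(navigation_points)
    extended.append(selectable_end_point)
    return extended


existing = [(40.7400, -74.0000), (40.7405, -74.0003)]  # route up to the blockage
points = [(40.7406, -74.0005), (40.7407, -74.0006)]    # operator navigation points
new_route = extend_route(existing, points, (40.7408, -74.0007))
assert new_route[-1] == (40.7408, -74.0007)            # ends at the selectable end point
assert len(new_route) == 5
```

The vehicle would then resume autonomous operation along the extended waypoint list, stopping at the final entry, which corresponds to the desired stop location near the selectable end point 5034.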

Fig. 6D-6H illustrate alternative aspects of using one or more input signals to generate an extended route. In fig. 6D-6G, the operator of the remote support interface adds street view data (such as publicly available street view data) to the remote support interface. The street view data is shown in images 6016, 6018, and 6020 of fig. 6D, 6E, and 6F, respectively. The images 6016, 6018, and 6020 are used to determine a selectable end point near the side entrance 6026 (see fig. 6G) and an extended route to the selectable end point. The image 6016 in fig. 6D shows the road 6022 that the vehicle 6000 will have to traverse after backing out of the entrance 6002. The image 6018 in fig. 6E reflects further travel from the entrance 6002 along the road 6022. The image 6020 in fig. 6F shows a path 6024 extending from the road 6022.

Once the remote support has the images 6016, 6018, and 6020, an extended route may be generated. As seen in fig. 6H, the input signal may correspond to points 6028 defined by the remote support that are located along an extended route 6030 to a selectable end point 6032 near the side entrance 6026. As can also be seen in fig. 6H, a portion of the extended route 6030 is located within a mapped portion of the geographic area (i.e., the road 6022), while another portion of the extended route is located within an unmapped portion of the geographic area (i.e., the path 6024). Because the end point of the extended route is a selectable end point, the extended route may also be referred to herein as a selectable driving route.

In response to the input signal provided to the remote support interface, instruction data including the selectable end point at the destination is transmitted to the vehicle. The vehicle may use the instruction data to perform autonomous operation. For example, fig. 5G shows the vehicle 5000 following an extended route that bypasses the vehicle 5006A, using the navigation points 5036A, 5036B, to reach the selectable end point 5034. In some examples, the operator selects a selectable end point, and the vehicle determines an extended route to the selectable end point without further assistance. The instruction data may include a transition point, between the current position of the vehicle and the selectable end point, at which the vehicle is to stop.

In the examples of fig. 5A-5G and 6A-6H, the selectable end point is a new end point. The new end point may be stored in the memory in a set of possible end points for the destination. In some implementations, the input signal may be a selection of a previously stored end point from among the possible end points. Where the selectable end point is within an unmapped portion of the geographic area, the instruction data includes a selectable driving route comprising a path through the unmapped portion to the selectable end point at the destination. Where the selectable end point is a new end point, the path may be stored in association with the new end point. The path to the selectable end point may be based on at least one of a service context, a customer location, the end point at the destination, local traffic rules or regulations, a stored history of end points, or any combination thereof.
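Storing new end points per destination, as described above, could look like the following sketch. The class and method names are hypothetical; only the behavior (a set of possible end points per destination, with an optionally associated path) comes from the description:

```python
from collections import defaultdict


class EndPointStore:
    """Hypothetical store of selectable end points, keyed by destination."""

    def __init__(self):
        # destination id -> list of (end_point, path) pairs
        self._endpoints = defaultdict(list)

    def add(self, destination_id, end_point, path=None):
        """Store a new end point and, optionally, the path used to reach it."""
        self._endpoints[destination_id].append((end_point, path))

    def candidates(self, destination_id):
        """Previously stored end points an operator may select for this destination."""
        return [ep for ep, _ in self._endpoints[destination_id]]


store = EndPointStore()
store.add("building-4100", (40.7408, -74.0007), path=[(40.7406, -74.0005)])
assert store.candidates("building-4100") == [(40.7408, -74.0007)]
```

On a later assistance request at the same destination, the operator's input signal could simply select one of the stored candidates rather than defining a new extended route.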

The disclosed technology provides support for remote operation of a vehicle in the presence of an obstruction to reaching a desired endpoint. The flow of autonomous vehicles through a particular transportation network is enhanced, thereby improving utilization of available transportation resources, increasing passenger safety, and improving on-time arrival of passengers and cargo.

As used herein, the terms "driver" or "operator" may be used interchangeably. As used herein, the terms "braking" or "decelerating" may be used interchangeably. As used herein, the term "computer" or "computing device" includes any unit or combination of units capable of performing any of the methods disclosed herein or any portion thereof.

As used herein, the term "instructions" may include directions or representations for performing any of the methods disclosed herein or any portion thereof, and may be implemented in hardware, software, or any combination thereof. For example, the instructions may be implemented as information, such as a computer program, stored in a memory that is executable by a processor to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, the instructions, or portions thereof, may be implemented as a special purpose processor or circuitry that may include dedicated hardware for performing any one of the methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, a portion of the instructions may be distributed across multiple processors on a single device, on multiple devices that may communicate directly or across a network such as a local area network, a wide area network, the internet, or a combination thereof.

As used herein, the terms "example," "embodiment," "implementation," "aspect," "feature," or "element" mean serving as an example, instance, or illustration. Unless expressly indicated otherwise, any example, embodiment, implementation, aspect, feature, or element is independent of every other example, embodiment, implementation, aspect, feature, or element and may be used in combination with any other example, embodiment, implementation, aspect, feature, or element.

As used herein, the terms "determine" and "identify," or any variations thereof, include selecting, ascertaining, calculating, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever, using one or more of the devices shown and described herein.

As used herein, the term "or" means an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise or clear from context, "X comprises a or B" is intended to indicate either of the natural inclusive permutations. If X comprises A; x comprises B; or X includes both A and B, then "X includes A or B" is satisfied under any of the foregoing circumstances. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.

Moreover, for simplicity of explanation, the elements of the methods disclosed herein can occur in different orders or concurrently, although the figures and descriptions herein may include a sequence of steps or phases or a series of steps or phases. In addition, elements of the methods disclosed herein may occur in conjunction with other elements not expressly shown or described herein. Moreover, not all elements of a method described herein may be required to implement a methodology in accordance with the present invention. Although aspects, features, and elements are described herein in particular combinations, the aspects, features, or elements can be used alone or in various combinations with or without other aspects, features, and elements.

While the disclosed technology has been described in connection with certain embodiments, it is to be understood that the disclosed technology is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
