Vehicle technology for automatic turn signaling

Document No.: 1785512 | Publication date: 2019-12-06

This technology, "Vehicle technology for automatic turn signaling," was created by S·海格特 and filed on 2018-02-22. Abstract: Various vehicle technologies are disclosed for automatically activating a turn signal when a vehicle turns or changes lanes based on data analysis, which may include data from an image capture device, radar, a steering angle sensor, and ultrasonic sensors. The present disclosure also contemplates that the decision to automatically activate a turn signal in a vehicle that is turning or changing lanes may take into account whether other vehicles or pedestrians are in its vicinity and whether those other vehicles or pedestrians would benefit from the turn signal being activated.

1. A method of automatically activating a turn signal source in a vehicle, the method comprising:

determining, via a processor, that a first vehicle is about to turn or leave a lane based on data from a first data source of the first vehicle;

determining, via the processor, that a driver of the first vehicle is applying a steering action to the first vehicle based on data from a second data source of the first vehicle;

determining, via the processor, an approximate position of a second vehicle relative to the first vehicle based on data from a third data source of the first vehicle and data from a fourth data source of the first vehicle; and

activating, via the processor, a turn signal source of the first vehicle.

2. The method of claim 1, wherein the first data source comprises an image capture device.

3. The method of claim 2, wherein the second data source comprises a steering angle sensor.

4. The method of claim 3, wherein the third data source comprises radar.

5. The method of claim 4, wherein the fourth data source comprises an ultrasound sensor.

6. The method of claim 1, wherein the first data source comprises a side camera.

7. The method of claim 1, further comprising:

prioritizing analysis of data from the first data source over analysis of data from at least one of the second data source, the third data source, or the fourth data source in response to identifying, via the processor, a conflict between the analysis of data from the first data source and the analysis of data from at least one of the second data source, the third data source, or the fourth data source.

8. The method of claim 7, wherein:

the first data source comprises an image capture device;

the second data source comprises a steering angle sensor;

the third data source comprises radar; and

the fourth data source comprises an ultrasonic sensor.

9. The method of claim 1, further comprising:

prioritizing analysis of data from the third data source over analysis of data from at least one of the first data source, the second data source, or the fourth data source in response to identifying, via the processor, a conflict between the analysis of data from the third data source and the analysis of data from at least one of the first data source, the second data source, or the fourth data source.

10. The method of claim 9, wherein:

the first data source comprises an image capture device;

the second data source comprises a steering angle sensor;

the third data source comprises radar; and

the fourth data source comprises an ultrasonic sensor.

11. A method of automated turn signaling, the method comprising:

determining, via a processor, that a vehicle will cross a lane line or make a turn based on a measured steering angle value that is within a range of values stored in a memory or that is greater than a threshold value stored in the memory, wherein the vehicle includes the processor, the memory, and a turn signal source; and

activating, via the processor, the turn signal source based on the determination.

12. The method of claim 11, wherein the activating comprises activating immediately before the vehicle crosses the lane line or makes the turn.

13. The method of claim 11, wherein the activating comprises activating while the vehicle is crossing the lane line or making the turn.

14. The method of claim 11, wherein the activating comprises activating immediately after the vehicle crosses the lane line or makes the turn.

15. The method of claim 11, wherein the vehicle is a first vehicle, and the method further comprises:

determining, via the processor, that a second vehicle is present within a predetermined distance from the first vehicle, wherein the activating is based on the second vehicle being present within the predetermined distance from the first vehicle.

16. A storage device having stored therein a set of processor-executable instructions that, when executed by an electronic processing system, cause the electronic processing system to:

determine a path of travel of a first vehicle relative to a lane line based on a first set of data received from an image capture device of the first vehicle;

determine that a second vehicle is present within a predetermined distance from the first vehicle based on a second set of data received from a reflected wave detector; and

activate a turn signal source of the first vehicle when (a) the first vehicle has a travel path and a steering angle such that the first vehicle will cross the lane line or make a turn, and (b) the second vehicle is present within the predetermined distance from the first vehicle.

17. A motor vehicle comprising:

a processor;

a camera;

a steering angle sensor; and

a turn signal source,

wherein the processor is programmed to determine when the vehicle is about to turn or leave a lane based on data from at least one of the camera or the steering angle sensor, and

wherein the processor is programmed to activate the turn signal source when the processor determines that the vehicle is about to turn or leave a lane.

18. The motor vehicle of claim 17, wherein the processor is programmed to determine an approximate position of a second vehicle relative to the motor vehicle based on data from at least one of an ultrasonic sensor or a radar, and wherein the processor is programmed to activate the turn signal source when the processor determines that the motor vehicle is about to turn or leave the lane in the vicinity of the second vehicle.

19. The motor vehicle of claim 17, wherein a memory stores a range of values, wherein the steering angle sensor is configured to output a steering angle value while the vehicle is in motion, wherein the processor is programmed to determine that the vehicle will cross a lane line or make a turn when the steering angle value is within the range of values, and wherein the processor is programmed to activate the turn signal source based on determining that the vehicle will cross a lane line or make a turn.

20. The motor vehicle of claim 17, wherein a memory stores a threshold value for the steering angle, wherein the steering angle sensor is configured to output a steering angle value while the vehicle is in motion, wherein the processor is programmed to determine that the vehicle will cross a lane line or make a turn when the steering angle value is greater than the threshold value stored in the memory, and wherein the processor is programmed to activate the turn signal source based on determining that the vehicle will cross a lane line or make a turn.

Technical Field

The present disclosure relates to automatic turn signal activation in a vehicle.

Background

Vehicles such as automobiles carry turn signal light sources, such as light bulbs, that are manually activated by the driver, for example when turning or changing lanes. When activated, the turn signal light source visually informs others (such as pedestrians and drivers of other vehicles) that the vehicle may change its direction of travel, for example by turning onto another road or switching lanes. However, because the turn signal light source is manually activated, drivers often neglect to activate it before changing direction of travel. This situation can be dangerous, especially on highways, and is a frequent cause of vehicle accidents and road rage.

Although various techniques exist to mitigate a driver's failure to use the vehicle's turn signals, these techniques are inadequate for various technical reasons, such as false positives and slow response times. For example, some previous attempts to improve this situation include smart turn signals that remain silent in the background but alert the driver if the vehicle drifts out of its lane, and turn signal assistance programs that generate dashboard messages reminding the driver to use the turn signal after repeated failures. These prior techniques still rely on the driver of the vehicle to manually activate the turn signal light source when appropriate.

Disclosure of Invention

The present disclosure describes one or more inventions that provide an automatic signal that a vehicle is turning, is about to turn, or is leaving its current lane. The departure may be crossing a lane line into an adjacent lane, turning the vehicle onto a path that crosses the current lane, or otherwise moving into another lane (e.g., a turn ramp).

Typical vehicle laws require vehicle lights to signal such lane departures, as seen in the flashing turn signals found on most cars and trucks. Most vehicles have turn signals at the rear of the vehicle. Some vehicles have additional turn signal lights toward the front, such as on the side-view mirrors. With the advent of vehicle-to-vehicle communication, vehicles may also provide telemetry communication to notify other vehicles when leaving a lane or making a turn.

One embodiment includes a method of automatically activating a turn signal source in a vehicle, the method comprising: determining, via a processor, that a first vehicle is about to turn or leave a lane based on data from a first data source of the first vehicle; determining, via the processor, that a driver of the first vehicle is applying a steering action to the first vehicle based on data from a second data source of the first vehicle; determining, via the processor, an approximate position of a second vehicle relative to the first vehicle based on data from a third data source of the first vehicle and data from a fourth data source of the first vehicle; and activating, via the processor, a turn signal source of the first vehicle.

In one embodiment, the turn signal source is a turn signal voltage source.

In one embodiment, the source of the turn signal is a transmitter that transmits a telemetry signal that informs others of the turn.

One embodiment includes a method of automatic turn signaling, the method comprising: determining, via a processor, that a vehicle will cross a lane line or make a turn based on a measured steering angle value being within a range of values stored in a memory, wherein the vehicle comprises the processor, the memory, and a turn signal source; and activating, via the processor, the turn signal source based on determining that the vehicle will cross a lane line or make a turn.

One embodiment includes a storage device having stored therein a set of processor-executable instructions that, when executed by an electronic processing system, cause the electronic processing system to: determine a path of travel of a first vehicle relative to a lane line based on a first set of data received from an image capture device of the first vehicle; determine that a second vehicle is present within a predetermined distance from the first vehicle based on a second set of data received from a reflected wave detector; and activate a turn signal source of the first vehicle when (a) the first vehicle has a travel path and a steering angle such that the first vehicle will cross the lane line or make a turn, and (b) the second vehicle is present within the predetermined distance from the first vehicle.

One embodiment includes an apparatus for automatic turn signal source activation, the apparatus comprising a vehicle that includes a processor, a camera, a steering angle sensor, and a turn signal source, wherein the processor is programmed to determine when the vehicle is about to turn or leave a lane based on data from at least one of the camera or the steering angle sensor, and wherein the processor is programmed to activate the turn signal source when the processor determines that the vehicle is about to turn or leave a lane.

One embodiment includes an apparatus for automatic turn signal source activation, the apparatus comprising a first vehicle that includes a processor, a camera, a steering angle sensor, one or more ultrasonic sensors, a radar, and a turn signal source, wherein the processor is programmed to determine when the first vehicle is about to turn or leave a lane based on data from at least one of the camera or the steering angle sensor, wherein the processor is programmed to determine an approximate position of a second vehicle relative to the first vehicle based on data from at least one of the ultrasonic sensors or the radar, and wherein the processor is programmed to activate the turn signal source when the processor determines that the first vehicle is about to turn or leave a lane near the second vehicle.

One embodiment includes an apparatus for automatic turn signal activation, the apparatus comprising a vehicle that includes a processor, a memory, a steering angle sensor, and a turn signal source, wherein the memory stores a range of values for a steering angle, wherein the steering angle sensor is configured to output a steering angle value when the vehicle is in motion, wherein the processor is programmed to determine that the vehicle is about to cross a lane line or make a turn when the steering angle value is within the range of values stored in the memory, and wherein the processor is programmed to activate the turn signal source based on determining that the vehicle is about to cross a lane line or make a turn.

These and other embodiments and/or aspects of the invention are discussed in more detail below with reference to the figures.

Drawings

FIG. 1 shows a schematic view of an exemplary embodiment of a vehicle according to the present disclosure.

FIG. 2 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with multiple devices monitoring multiple zones according to the present disclosure.

Fig. 3a, 3b show schematic diagrams of an exemplary embodiment of a vehicle equipped with a plurality of front cameras monitoring a plurality of areas and an exemplary embodiment of a front camera module according to the present disclosure.

Fig. 4a, 4b show schematic diagrams of an exemplary embodiment of a vehicle equipped with multiple side cameras monitoring multiple zones and an exemplary embodiment of a side repeater camera according to the present disclosure.

Fig. 5a, 5b show schematic diagrams of an exemplary embodiment of a vehicle equipped with a plurality of rear cameras monitoring a plurality of areas and an exemplary embodiment of a rear camera module according to the present disclosure.

FIG. 6 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with a radar to monitor an area according to the present disclosure.

FIG. 7 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with multiple ultrasonic sensors monitoring multiple zones according to the present disclosure.

Fig. 8a, 8b show an exemplary embodiment of a B-pillar of a vehicle (where the B-pillar carries a camera), an exemplary embodiment of a side perspective view of the B-pillar carrying the camera, and an exploded view of an exemplary embodiment of the camera, according to the present disclosure.

Fig. 9a-9d show a number of schematic views of an exemplary embodiment of a B-pillar of a vehicle according to the present disclosure, wherein the B-pillar carries a camera.

FIG. 10 shows a schematic diagram of an exemplary embodiment of a first vehicle following a second vehicle on a roadway having a lane line according to the present disclosure.

Fig. 11 shows a schematic view of an exemplary embodiment of a first vehicle driving on a roadway having a plurality of second vehicles, wherein the roadway includes a plurality of parallel lane lines, according to the present disclosure.

FIG. 12 shows a flowchart of an exemplary embodiment of a first method for automatic turn signal activation according to the present disclosure.

FIG. 13 illustrates a flowchart of an exemplary embodiment of a second method for automatic turn signal activation according to the present disclosure.

FIG. 14 shows a flowchart of an exemplary embodiment of a third method for automatic turn signal activation according to the present disclosure.

Detailed Description

In general, the present disclosure describes a technique for automatically activating a turn signal source of a first vehicle when the first vehicle is about to leave its current lane by crossing a lane line or making a turn, whether or not a second vehicle or another observer is in the vicinity of the first vehicle. For example, such proximity may be within 20 feet of the first vehicle, within 40 feet of the first vehicle, within 60 feet of the first vehicle, or some other distance from the first vehicle. Such automatic activation may occur when the first vehicle processes data from a plurality of devices on the first vehicle to determine a trajectory of the first vehicle and, without manual signal activation, determines whether the first vehicle will cross a lane line or make a turn. If it is computationally determined that the first vehicle is leaving a lane or turning, the first vehicle activates its turn signal source.

Alternatively, automatic activation may occur when the first vehicle processes a plurality of data from a plurality of devices on the first vehicle to determine a trajectory of the first vehicle and, without manual signal activation, determines whether the first vehicle will cross a lane line or make a turn. Further, the first vehicle processes the data to determine whether there are one or more surrounding objects/vehicles that would benefit from the automatic activation. If so, the turn signal source will be activated.
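For illustration only, the following short Python sketch condenses this decision flow into a single check; the function name, parameters, and threshold values are hypothetical placeholders and are not taken from the disclosure.

def should_activate_turn_signal(lane_departure_predicted: bool,
                                steering_angle_deg: float,
                                nearest_object_distance_ft: float,
                                steering_threshold_deg: float = 5.0,
                                proximity_threshold_ft: float = 40.0) -> bool:
    # Combine the three determinations described above: a camera-based
    # lane-departure prediction, a steering input above a threshold, and a
    # nearby road user who would benefit from the signal.
    steering_applied = abs(steering_angle_deg) >= steering_threshold_deg
    nearby_road_user = nearest_object_distance_ft <= proximity_threshold_ft
    return lane_departure_predicted and steering_applied and nearby_road_user

# Example: lane departure predicted, 12 degrees of steering, object 25 ft away.
print(should_activate_turn_signal(True, 12.0, 25.0))  # True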

FIG. 1 shows a schematic view of an exemplary embodiment of a vehicle according to the present disclosure. The vehicle 100 includes (at least one of each) a chassis, a power source, a drive source, a set of wheels 102, a processor 104, a memory 106, an ultrasonic sensor 108, a radar 110, a camera 112, a transceiver 114, a steering angle sensor 116, and a turn signal source 118. The vehicle 100 may be a land vehicle, whether manned or unmanned, whether non-autonomous, semi-autonomous, or fully autonomous, such as an automobile/motor vehicle, sport utility vehicle (SUV), van, minivan, limousine, bus, truck, trailer, tank, tractor, motorcycle, bicycle, heavy equipment vehicle, or the like. Note that the vehicle 100 may be front-wheel drive, rear-wheel drive, four-wheel drive, or all-wheel drive. For a wheeled vehicle, the turn may be made by the front wheels, the rear wheels, or both. Tracked vehicles achieve turning through differential drive of the tracks. For example, the vehicle 100 may be a Tesla Model S (or any other Tesla model) equipped with the Tesla Autopilot (Enhanced Autopilot) driver assistance functionality and having the Hardware 2 component set (November 2016). In some embodiments, the vehicle may also be equipped with a forward-looking infrared (FLIR) camera, which may be in communication with the processor 104.

The chassis securely carries the power source, drive source and the set of wheels 102. The power source comprises a battery, which is preferably rechargeable. The drive source preferably comprises an electric motor, whether brush or brushless. However, within the scope of the invention, an internal combustion engine is envisaged, in which case the power source comprises a fuel tank carried via the chassis and coupled to the internal combustion engine. The power source is coupled to the drive source such that the drive source is powered thereby. The set of wheels 102 includes at least one wheel that may include an inflatable tire that may include a run flat tire. The set of wheels 102 is driven via a drive source.

The processor 104 is a hardware processor, such as a single-core or multi-core processor. For example, the processor 104 includes a Central Processing Unit (CPU) that may include multiple cores for parallel processing/concurrent independent processing. In some embodiments, processor 104 includes a Graphics Processing Unit (GPU). The processor 104 is powered via a power supply and is coupled to the chassis.

The memory 106 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. Memory 106 includes a computer-readable storage medium, which may be non-transitory. The storage medium stores a plurality of computer-readable instructions for execution via the processor 104. The instructions instruct the processor 104 to facilitate performance of a method for automatic turn signal activation, as disclosed herein. For example, the instructions may include an operating system of the vehicle or an application running on the operating system of the vehicle. For example, the processor 104 and memory 106 may enable various file or data input/output operations, whether synchronous or asynchronous, including any of the following: read, write, edit, modify, delete, update, search, select, merge, sort, encrypt, deduplicate, and the like. The memory 106 may include at least one of volatile memory units (such as random access memory (RAM) units) or non-volatile memory units (such as electrically addressed memory units or mechanically addressed memory units). For example, an electrically addressed memory includes flash memory cells. For example, a mechanically addressed memory unit includes a hard disk drive. The memory 106 may include storage media, such as at least one of a data store or a data mart. For example, the storage medium may include a database, including a distributed database, such as a relational database, a non-relational database, an in-memory database, or other suitable database, that may store data and allow access to such data via a storage controller, whether directly and/or indirectly, in a raw state, a formatted state, an organized state, or any other accessible state. Memory 106 may include any type of memory, such as primary memory, secondary memory, tertiary memory, offline memory, volatile memory, non-volatile memory, semiconductor memory, magnetic memory, optical memory, flash memory, hard drive memory, floppy drive, tape, or other suitable data storage medium. The memory 106 is powered via the power source and is coupled to the chassis.

The ultrasonic sensor 108 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The ultrasonic sensor 108 includes a transducer that converts electrical signals to ultrasonic waves for output, such as via a transmitter or transceiver, and that converts reflected ultrasonic waves to electrical signals for input, such as via a receiver or transceiver. The ultrasonic sensor 108 evaluates the properties of the target by interpreting acoustic echoes from the sound waves reflected from the target. Such interpretation may include measuring the time interval between sending the sound wave and receiving the echo to determine the distance to the target. The ultrasonic sensor 108 is preferably powered via a power source and coupled to the chassis. In a preferred embodiment, there are multiple ultrasonic sensors 108.
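As a simple illustration of the time-of-flight interpretation described above, the distance to a target can be estimated from the echo round-trip time; this is a minimal sketch, and the speed of sound and example timing are nominal values rather than figures from the disclosure.

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at 20 degrees C

def echo_distance_m(round_trip_time_s: float) -> float:
    # Half the round trip: the pulse travels to the target and back.
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# Example: an echo received 11.7 ms after transmission corresponds to roughly 2 m.
print(round(echo_distance_m(0.0117), 2))  # 2.01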

The radar 110 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The radar 110 includes a transmitter, a transmitting antenna, a receiving antenna, a receiver, and a processor (which may be the same as the processor 104) that generate electromagnetic waves, such as in the radio or microwave spectrum, to determine properties of the target. As is common in the art, the same antenna may be used for both transmission and reception. The transmitter antenna radiates radio waves (pulsed or continuous) from the transmitter to reflect off the target and return to the receiver via the receiving antenna to provide information to the processor regarding the position, velocity, angle, and other characteristics of the target. The processor may be programmed to apply Digital Signal Processing (DSP), machine learning, and other related techniques that are capable of extracting useful information from various noise levels, such as via the use of code stored in memory 106. In some embodiments, radar 110 includes a lidar that uses ultraviolet, visible, or near infrared light from a laser in addition to, or instead of, radio waves. Radar 110 is preferably powered via a power source and coupled to the chassis.

The camera 112 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The camera 112 includes an image capture device or optical instrument for capturing or recording images, which may be stored locally (whether temporary or permanent), transmitted to another location, or both. The camera 112 may capture images to enable the processor 104 to perform various image processing techniques, such as compression, image and video analysis, telemetry, and so forth. For example, image and video analysis may include object recognition, object tracking, any known computer vision or machine vision analysis, or other analysis. The images may be individual still photographs or a sequence of images constituting a video. The camera 112 may include an image sensor, such as a semiconductor Charge Coupled Device (CCD) or an active pixel sensor in a Complementary Metal Oxide Semiconductor (CMOS) or N-type metal oxide semiconductor (NMOS), and a lens, such as a linear lens, a concave lens, a convex lens, a wide-angle lens, a fisheye lens, or any other lens. The camera 112 may be analog or digital. The camera 112 may include any focal length, such as wide angle or standard. The camera 112 may include a flash illumination output device. The camera 112 may include an infrared illumination output device. The camera 112 is preferably powered via a power source and coupled to the chassis.

The transceiver 114 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The transceiver 114 includes a transmitter and a receiver configured for wireless network communications, such as over a satellite network, a vehicle-to-vehicle (V2V) network, a cellular network, or any other wireless network, to receive updates to the instruction set stored in the memory 106, such as over the air. The transceiver 114 is preferably powered via the power source and coupled to the chassis. The transceiver 114 may also enable Global Positioning System (GPS) geolocation (or another geolocation system). The vehicle 100 may include multiple transceivers configured for wireless communication over the same or different networks. For example, when the vehicle 100 includes multiple transceivers 114, some transceivers may operate on a cellular network while other transceivers operate on a satellite network. Further, the transceiver 114 may communicate with a transceiver of another vehicle through a vehicular ad hoc network (VANET).

The steering angle sensor 116 is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The steering angle sensor 116 may sense the steering wheel position angle (the angle between the front of the vehicle 100 and the direction of the steered wheels 102) and the rate of turn. The steering angle sensor 116 may be analog or digital. The steering angle sensor 116 is powered via the power source and is coupled to the chassis.

The turn signal source 118 is preferably a turn signal voltage source and is in communication with the processor 104, such as in any known wired, wireless, or waveguide manner. The turn signal source 118 (also referred to as a direction indicator/signal or flasher) includes light bulbs mounted in lamps near the left and right front and rear corners of the vehicle 100, such as on the chassis, including on the sides/side rearview mirrors/fenders/tail of the vehicle 100, that are activated to notify others in the vicinity (whether pedestrians or vehicles) that the vehicle 100 may turn toward the respective side or change lanes. As disclosed herein, the turn signal source 118 may be manually activated by the driver of the vehicle, or may be automatically activated. The turn signal source 118 preferably generates a voltage suitable for a light bulb, which may be a light emitting diode (LED) bulb, a fluorescent bulb, an incandescent bulb, a halogen bulb, or any other bulb type held in a turn signal lamp. The turn signal lamp may emit light of any color, such as red, yellow, white, orange, or green, and the turn signal lamp may be covered with a colored transparent/translucent pane (such as plastic or glass) to change the color as desired, such as when the bulb emits white or yellow light. As known to those skilled in the art, the color/illumination intensity/repetition frequency (blinking) of the turn signal light may be fixed, or it may vary based on various factors. For example, the turn signal lights may preferably blink at a rate of about 60 blinks per minute to about 120 blinks per minute. Note that opposing flashers may flash at different rates, whether on the same side or on opposite sides of the vehicle. The turn signal source 118 may operate when the vehicle is moving forward or backward. The turn signal source 118 is powered via the power source and is coupled to the chassis.

FIG. 2 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with multiple devices monitoring multiple zones according to the present disclosure. The vehicle 100 is equipped with one or more ultrasonic sensors 108, a radar 110, a transceiver 114, a steering angle sensor 116, a turn signal source 118, and a set of cameras 112, including a narrow front camera 112a, a main front camera 112b, a wide front camera 112c, a front view side camera 112d, a rear view side camera 112e, a rear view camera 112f, and a side repeater camera 112g, each of which is in communication with the processor 104, is powered via the power source, and operates based on instructions stored in the memory 106. The instructions instruct the processor 104 to interface with the one or more ultrasonic sensors 108, radar 110, transceiver 114, steering angle sensor 116, turn signal source 118, narrow front camera 112a, main front camera 112b, wide front camera 112c, front view side camera 112d, rear view side camera 112e, rear view camera 112f, and side repeater camera 112g in order to perform a method for automatic turn signal activation, as disclosed herein. Note that this configuration provides a 360 degree monitoring area around the vehicle 100. It should also be noted that the various maximum distances listed in FIG. 2 are illustrative and may be adjusted higher or lower as desired, such as via the use of other devices, device types, or adjusted ranges, whether manually or automatically, including in real time. For example, if the vehicle 100 is a Tesla Model S (or any other Tesla model) equipped with the Tesla Autopilot (Enhanced Autopilot) driver assistance functionality and having the Hardware 2 component set (November 2016), such sensors are components of the Hardware 2 component set.

Fig. 3a, 3b show schematic diagrams of an exemplary embodiment of a vehicle equipped with a plurality of front cameras monitoring a plurality of areas and an exemplary embodiment of a front camera module according to the present disclosure. As shown in fig. 3a, the narrow front camera 112a, the main front camera 112b, and the wide front camera 112c capture various frontal fields of view, with the narrow front camera 112a providing a focused perspective of distant features, which is useful in high-speed operation. The narrow front camera 112a, the main front camera 112b, and the wide front camera 112c are mounted behind the windshield of the vehicle 100 so as to provide wide visibility and focused long-range detection of distant objects in front of the vehicle 100. In some embodiments, the narrow front camera 112a, the main front camera 112b, or the wide front camera 112c is mounted in another location, including above the windshield (any portion thereof), by the headlights or front fenders (including the bumper), or on the underside or roof (any portion thereof) of the vehicle 100.

The main front camera 112b provides a field of view that is wider than that of the narrow front camera 112a but narrower than that of the wide front camera 112c. The main front camera 112b covers a broad range of computer and machine vision use cases when the vehicle 100 is stationary or moving. The wide front camera 112c provides a wider field of view than the main front camera 112b. The wide front camera 112c may include a 120 degree fisheye lens to capture traffic lights, objects cutting into the travel path of the vehicle 100, or obstacles in close proximity, whether the vehicle is stationary or moving. The wide front camera 112c is useful during low-speed urban maneuvering of the vehicle 100.

As shown in fig. 3b, the front camera module of the vehicle 100 carries the cameras 112a, b, c near the front windshield of the vehicle 100. Although the front camera module is shown above the front mirror, the front camera module may be positioned below or to the side of the mirror, or on the front dashboard or outside the passenger compartment of the vehicle 100, such as by being positioned on the hood or roof or pillars of the vehicle. Note that while the front camera module is depicted as an elongated and securable camera module, the front camera module may be implemented in other configurations.

Fig. 4a, 4b show schematic diagrams of an exemplary embodiment of a vehicle equipped with multiple side cameras monitoring multiple zones and an exemplary embodiment of a side repeater camera according to an exemplary embodiment of the present disclosure. As shown in fig. 4a, the forward looking side camera 112d may provide redundancy/backup functionality by locating vehicles/objects that may accidentally enter the lane in which the vehicle 100 is traveling, and additional safety when entering an intersection with limited visibility. When the vehicle 100 includes a B-pillar, then the vehicle 100 may carry the front view side camera 112d in the B-pillar.

As shown in fig. 4b, the side repeater camera 112g is mounted in a panel of the vehicle 100, such as on a wheel well. The side repeater camera 112g may provide a set of image data, such as a data feed, to the processor 104, where the set of image data may include a front camera view, a lateral camera view, or a rear camera view, which may be helpful for object recognition, such as recognition of nearby vehicles or pedestrians. Also, fig. 4b shows the side repeater camera 112g before it is installed in the vehicle 100. Note that although the camera 112g is depicted as an elongated and securable camera module, the camera 112g may be implemented in other configurations.

Fig. 5a, 5b show schematic diagrams of an exemplary embodiment of a vehicle equipped with a plurality of rear cameras monitoring a plurality of areas and an exemplary embodiment of a rear camera module according to an exemplary embodiment of the present disclosure. As shown in fig. 5a, the rear-view side camera 112e may be installed near the front or rear wheel well of the vehicle 100, and may monitor the rear blind spots on both sides of the vehicle 100, which is important for safely changing lanes and merging into traffic. In some embodiments, the rear-view side camera 112e is mounted in the B-pillar of the vehicle. In some embodiments, the rear-view camera 112f may help monitor the rear blind spot.

As shown in fig. 5b, the module carrying the camera 112f is shown mounted in the trunk of the vehicle 100. However, other mounting locations are also possible, such as a bumper of the vehicle 100, a rear pillar of the vehicle 100, the rear windshield of the vehicle 100, or the roof of the vehicle 100. Note that although the camera 112f is depicted as a rectangular parallelepiped and securable camera module, the camera 112f may be implemented in other configurations.

FIG. 6 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with a radar to monitor an area according to the present disclosure. The radar 110 may be mounted to the front of the vehicle, such as on or behind the hood of the vehicle, the grille of the vehicle, the chassis, a fender of the vehicle, the underside of the vehicle, or any other part of the vehicle. The radar 110 preferably employs radio waves having a wavelength that passes through fog, dust, rain, snow, and other vehicles. The radar 110 communicates with the processor 104 to help detect and respond to objects ahead, whether pedestrians or vehicles.

FIG. 7 shows a schematic diagram of an exemplary embodiment of a vehicle equipped with multiple ultrasonic sensors monitoring multiple zones according to the present disclosure. The ultrasonic sensor 108 may be mounted to any portion of the vehicle 100, such as to provide 360 degree coverage. The ultrasonic sensor 108 may effectively double the monitoring range with increased sensitivity by using ultrasound, which may be based on suitably encoded electrical signals. The ultrasonic sensor 108 is useful for detecting nearby vehicles or pedestrians, especially when a vehicle or pedestrian encroaches on the lane in which the vehicle 100 is standing or traveling. As known to those skilled in the art, the ultrasonic sensor 108 may also provide parking guidance to the processor 104.

Fig. 8a, 8b show an exemplary embodiment of a B-pillar of a vehicle (where the B-pillar carries a camera), an exemplary embodiment of a side perspective view of the B-pillar carrying the camera, and an exploded view of an exemplary embodiment of the camera, according to the present disclosure. Fig. 9a-9d show a number of schematic views of an exemplary embodiment of a B-pillar of a vehicle according to the present disclosure, wherein the B-pillar carries a camera.

As shown in fig. 8a, the vehicle 100 includes a B-pillar, shown in an exploded view as 200a and in an assembled rear view as 200b. The B-pillar includes a camera module 202, a composite carrier 204, flocking 206, a sealing gasket 208, a ceramic frit 210, tempered glass 212, pressure sensitive adhesive (PSA) tape 214, a vent patch 216, sealing foam 218, a trim clip 220, noise vibration harshness (NVH) foam 222, and camera screws 224. The flocking 206 is mounted over an aperture in the carrier 204. The camera module 202 is secured to the carrier 204 via the screws 224 such that the camera module 202 is captured by the flocking 206. The frit 210 is positioned between the glass 212 and the gasket 208. The gasket 208 is positioned between the frit 210 and the carrier 204. The gasket 208 seals the flocking 206. The PSA tape 214 secures the frit 210 to the carrier 204. The patch 216 is coupled to the carrier 204 near the camera module 202. The foam 218 is coupled to the carrier 204 while enclosing the camera module 202 and the patch 216. The clip 220 is mounted to the carrier 204. The foam 222 is coupled to the carrier 204 to surround the clip 220. As shown in fig. 8b, the glass 212 extends over the camera module 202 in the B-pillar 200b in the assembled front view 200c. Further, fig. 8b shows the camera 202 in a front/side view 200d. Note that although the camera 202 is depicted as an elongated and securable camera module, the camera 202 may be implemented in other configurations. Although fig. 9a-9d schematically show various illustrative dimensions, any suitable dimensions may be used, as known to those skilled in the art.

Fig. 10 shows a schematic diagram of an exemplary embodiment of a first vehicle 100 following a second vehicle 300 on a road 400 with a lane line according to the present disclosure. The first vehicle 100 and the second vehicle 300 travel on a road 400, the road 400 having a solid lane line on the right side of the vehicle 100 and a dashed line on the left side of the vehicle 100. The vehicle 300 is in front of the vehicle 100. As such, the vehicle 100 may detect and monitor the vehicle 300 using one or more ultrasonic sensors 108, radar 110, and a set of cameras 112, such as via the processor 104, as disclosed herein.

Fig. 11 shows a schematic view of an exemplary embodiment of a first vehicle 100 traveling on a roadway 402 with a plurality of second vehicles 300 according to the present disclosure, wherein the roadway 402 includes a plurality of parallel lanes and lane lines. The vehicle 100 travels on the roadway 402 and is able to detect and monitor the various vehicles 300, whether they are located in front of the vehicle 100 or behind the vehicle 100.

FIG. 12 shows a flowchart of an exemplary embodiment of a first method for automatic turn signal activation according to the present disclosure. The vehicle 100 performs the method 400 based on the processor 104 executing a set of instructions stored in the memory 106 and communicatively engaging (whether serially or in parallel) with the ultrasonic sensor 108, the radar 110, the transceiver 114, the steering angle sensor 116, the turn signal source 118, and the set of cameras 112, as disclosed herein. The method 400 includes an input block 402, a plurality of decision blocks 410, 412, 414, an action block 416, and a no-action block 418. For example, the method 400 may be performed while the driver is actively driving the vehicle 100.

The input block 402 includes a first input 404, a second input 406, and a set of third inputs 408. The first input 404 receives data, such as a data feed or data stream, from one or more cameras 112a-f in the set of cameras 112 as a first data source. The second input 406 receives data, such as a data feed or data stream, from the steering angle sensor 116 as a second data source. The third input 408 receives data, such as a data feed or data stream, from the radar 110 as a third data source and data, such as a data feed or data stream, from the one or more ultrasonic sensors 108 as a fourth data source. Each of the first input 404, the second input 406, and the third input 408 (including any sub-feeds) is managed via the processor 104 and may receive input data serially or in parallel, whether synchronously or asynchronously, whether in-phase or out-of-phase.

In block 410, the processor 104 determines whether the vehicle 100 is about to cross a lane line, such as on the road 400, based on the first input 404. For example, based on data from the set of cameras 112 as a first data source, the processor 104 may execute various algorithms, such as object identification/tracking/analysis, and determine whether the vehicle 100 is on a trajectory that crosses a lane line, such as based on the position of a lane line relative to the vehicle or based on the gaps (changes in size, color, frequency, direction) between lane lines, as known in the art. If the processor 104 determines that the vehicle 100 is not about to cross the lane line, the process 400 continues to block 418, where the processor 104 does not activate the turn signal source 118. Otherwise, if the processor 104 determines that the vehicle 100 is about to cross the lane line, the process 400 moves to block 412.
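One plausible way to make the camera-based crossing determination is to combine the lateral offset to the detected lane line with the rate at which that offset is shrinking; the sketch below is an illustrative simplification, and the look-ahead horizon is an assumed parameter rather than a value from the disclosure.

def will_cross_lane_line(lateral_offset_m: float,
                         closing_rate_m_per_s: float,
                         horizon_s: float = 2.0) -> bool:
    # lateral_offset_m: camera-derived distance from the vehicle to the lane line.
    # closing_rate_m_per_s: how fast that distance is shrinking (<= 0 means drifting away).
    if closing_rate_m_per_s <= 0.0:
        return False
    time_to_cross_s = lateral_offset_m / closing_rate_m_per_s
    return time_to_cross_s <= horizon_s

# Example: 0.6 m from the line, closing at 0.5 m/s -> crossing expected in 1.2 s.
print(will_cross_lane_line(0.6, 0.5))  # True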

In block 412, the processor 104 determines whether the driver of the vehicle 100 is applying a steering action based on the second input 406. For example, based on data from the steering angle sensor 116 as a second data source, the processor 104 may determine whether the driver is attempting to steer the vehicle 100 to turn or switch lanes, such as whether the steering angle is within some predefined range of values or above/below some predefined threshold stored in the memory 106. If the processor 104 determines that the driver of the vehicle 100 is not applying a steering action (the vehicle is moving straight ahead), the process 400 continues to block 418, where the processor 104 does not activate the turn signal source 118. Otherwise, if the processor 104 determines that the driver of the vehicle 100 is applying a steering action (changing the travel path from straight to diagonal), the process 400 moves to block 414.
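A minimal sketch of this steering-action check follows; the threshold value is a placeholder for whatever is stored in the memory 106, not a number given in the disclosure.

def steering_action_applied(steering_angle_deg: float,
                            threshold_deg: float = 5.0) -> bool:
    # Treat the driver as applying a deliberate steering action when the
    # measured angle magnitude exceeds the stored threshold.
    return abs(steering_angle_deg) > threshold_deg

print(steering_action_applied(12.0))  # True: deliberate steering input
print(steering_action_applied(1.5))   # False: minor correction, vehicle essentially straight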

In block 414, the processor 104 determines, based on the third input 408, whether another vehicle (such as the vehicle 300 of fig. 10 or 11) is present near the vehicle 100, such as by being within a predetermined distance of the vehicle 100 or within a particular side, position, or orientation relative to the vehicle 100. For example, such proximity may be within 20 feet of the first vehicle, within 40 feet of the first vehicle, within 60 feet of the first vehicle, or some other distance from the first vehicle. In this case, activating the turn signal source to alert the vehicle 300 has significant safety benefits. For example, based on data from the set of cameras 112 as a first data source, data from the radar 110 as a third data source, and data from the ultrasonic sensor 108 as a fourth data source, the processor 104 may determine whether the vehicle 300 is present in the vicinity of the vehicle 100, such as by image-based object recognition/tracking/analysis and by processing signals from captured sound/radio waves bouncing off the vehicle 300. Such presence in the vicinity of the vehicle 100 may be determined based on various parameters about the vehicle 300, whether statically defined or determined in real time/on the fly. For example, some such parameters may be based on or include a set of thresholds or a range of values stored in the memory 106, where the data may reflect a predefined distance, direction, expected path of travel, or other motion-based characteristic of the vehicle 300 relative to the vehicle 100 (or vice versa).
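For illustration, a proximity check of this kind might fuse the range readings from the reflected-wave sensors and compare them against a stored threshold; the sketch below is hypothetical, and the 40 ft default simply mirrors one of the example distances above.

def vehicle_nearby(radar_distances_ft, ultrasonic_distances_ft,
                   proximity_threshold_ft: float = 40.0) -> bool:
    # Fuse radar and ultrasonic range readings and report whether any detected
    # object lies within the configured proximity threshold.
    all_readings = list(radar_distances_ft) + list(ultrasonic_distances_ft)
    return any(d <= proximity_threshold_ft for d in all_readings)

# Example: radar reports objects at 120 ft and 35 ft; ultrasonic reports one at 180 ft.
print(vehicle_nearby([120.0, 35.0], [180.0]))  # True, because of the object at 35 ft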

If the processor 104 determines that the vehicle 100 is near the vehicle 300, as described above, the processor 104 determines whether the turn signal source should be activated in order to improve the safety of the vehicle 100 and the vehicle 300. Such a determination may be based on various parameters about the vehicle 300, whether statically defined or determined in real time/on the fly. For example, some such parameters may be based on or include a set of thresholds or a range of values stored in the memory 106, where the data may reflect a set of criteria applied to a predefined distance, direction, expected path of travel, or other motion-based characteristic of the vehicle 300 relative to the vehicle 100 (or vice versa). Thus, if the vehicle 300 would benefit from the turn signal source 118 being activated, the process 400 continues to block 416, where the processor 104 activates the turn signal source 118. Otherwise, the processor 104 does not activate the turn signal source 118. Note that in block 416, the processor 104 may activate the turn signal source 118 immediately before the vehicle 100 crosses the lane line, while the vehicle 100 is crossing the lane line, or immediately after the vehicle 100 crosses the lane line. In some embodiments, before, during, or after the processor 104 determines that the turn signal source 118 is to be activated or not activated, such as for turning or lane switching, the processor 104 may send the action or non-action information to another vehicle (e.g., the vehicle 300), such as over a V2V network via the transceiver 114. In some embodiments, a signal may be sent to a mobile device, whether handheld or wearable, in the vicinity of the vehicle 100, such as one operated by a pedestrian, to alert or notify the pedestrian (such as visually, vibrationally, or audibly) that the vehicle 100 is in the vicinity. The signal may be transmitted over a short-range wireless network, such as Bluetooth. In some embodiments, the vehicle 300 may react to the signal, such as by slowing down, changing its travel path, turning, forwarding the signal to others, activating a device on the vehicle 300 (such as its own turn signal source), or taking no action.
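As a purely illustrative sketch of the notification step, a turn-intent payload could be assembled and handed to whatever V2V or short-range link is available; the message fields below are hypothetical and do not reflect any defined V2V message standard.

import json
import time

def build_turn_intent_message(vehicle_id: str, direction: str, activated: bool) -> str:
    # Assemble a simple turn-intent payload that could be broadcast over a V2V
    # link or a short-range wireless network to nearby vehicles or mobile devices.
    return json.dumps({
        "vehicle_id": vehicle_id,
        "event": "turn_signal",
        "direction": direction,      # "left" or "right"
        "activated": activated,
        "timestamp": time.time(),
    })

print(build_turn_intent_message("vehicle-100", "left", True))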

In one implementation of the process 400, in response to determining, via the processor 104, based on data from a first data source (e.g., a camera), that the vehicle 100 will cross a lane line on the road 400, the processor 104 determines that the driver of the vehicle 100 is applying a steering action to the vehicle 100 based on data from a second data source (e.g., a steering angle sensor). Likewise, in response to determining, via the processor 104, based on data from the first data source (e.g., the camera), data from a third data source (e.g., radar), and data from a fourth data source (e.g., one or more ultrasonic sensors), that the vehicle 300 in the vicinity of the vehicle 100 would benefit from the turn signal source 118 being activated, the processor 104 activates the turn signal source 118 when the vehicle 100 crosses the lane line near the vehicle 300. Note that if the processor 104 identifies a conflict between information originating from the first data source (at least one camera 112) and information originating from at least one of the second data source (the steering angle sensor 116), the third data source (the radar 110), or the fourth data source (the one or more ultrasonic sensors 108), the processor 104 may prioritize the first data source over at least one of the second data source, the third data source, or the fourth data source. Likewise, if the processor 104 identifies a conflict between the third data source (the radar 110) and at least one of the first data source (at least one camera 112), the second data source (the steering angle sensor 116), or the fourth data source (the one or more ultrasonic sensors 108), the processor 104 may prioritize the third data source over at least one of the first data source, the second data source, or the fourth data source.
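One simple way to express such prioritization is an ordered list of sources, with a conflict resolved in favor of the highest-priority source that produced a conclusion; the ordering below is only an illustration of the camera-first case described above.

# Illustrative priority order for resolving conflicting per-source conclusions
# (camera-first here; a radar-first ordering works the same way).
PRIORITY = ["camera", "radar", "steering_angle", "ultrasonic"]

def resolve_conflict(conclusions: dict) -> bool:
    # conclusions maps a source name to its boolean conclusion, e.g. whether
    # the vehicle will cross the lane line.
    for source in PRIORITY:
        if source in conclusions:
            return conclusions[source]
    raise ValueError("no data source produced a conclusion")

# Example: the camera predicts a crossing while the ultrasonic data disagrees;
# the camera's conclusion takes priority.
print(resolve_conflict({"camera": True, "ultrasonic": False}))  # True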

While fig. 12 and the accompanying description illustrate the analysis of the first, second, third, and fourth data sources in a particular order, it should be noted that these data sources may be analyzed in any order, and one or more data sources may be analyzed simultaneously. Further, where the processor identifies a conflict between information originating from any of the different data sources, information originating from any of the data sources may be prioritized over information originating from any of the other data sources to resolve the conflict.

FIG. 13 illustrates a flowchart of an exemplary embodiment of a second method for automatic turn signal activation according to the present disclosure. The vehicle 100 performs the method 500 based on the processor 104 executing a set of instructions stored in the memory 106 and communicatively engaging (whether serially or in parallel) with the ultrasonic sensor 108, the radar 110, the transceiver 114, the steering angle sensor 116, the turn signal source 118, and the set of cameras 112, as disclosed herein. The method 500 includes an input block 502, a plurality of decision blocks 510, 512, 514, a plurality of action blocks 518, 522, and a plurality of no-action blocks 516, 520. For example, the method 500 may be performed while the driver is driving the vehicle 100 at least semi-actively or passively.

The input block 502 includes a first input 504, a second input 506, and a set of third inputs 508. The first input 504 receives data, such as a data feed or data stream, from a first data source, such as one or more of the cameras 112a-f. The second input 506 receives data, such as a data feed or data stream, from a second data source, such as the steering angle sensor 116. The third input 508 receives data, such as a data feed or data stream, from a third data source, such as the radar 110, and receives data, such as a data feed or data stream, from a fourth data source, such as the one or more ultrasonic sensors 108. Each of the first input 504, the second input 506, and the third input 508 (including any sub-feeds) is managed via the processor 104 and may receive input data serially or in parallel, whether synchronously or asynchronously, whether in-phase or out-of-phase.

In block 510, the processor 104 determines whether the vehicle 100 is about to cross a lane line, such as on the road 400, based on the first input 504. For example, based on data from the first data source (including one or more cameras of the set of cameras 112), the processor 104 may perform various algorithms, such as object recognition/tracking/analysis, and determine whether the vehicle 100 is on a trajectory that crosses a lane line, such as based on the position of a lane line relative to the vehicle or based on the gaps (changes in size, color, frequency, direction) between lane lines, as known in the art. If the processor 104 determines that the vehicle 100 is about to cross the lane line, the process 500 continues to block 518, where the processor 104 activates the turn signal source 118. Otherwise, if the processor 104 determines that the vehicle 100 is not about to cross the lane line, the process 500 moves to block 512.

In block 512, the processor 104 determines whether the steering angle value received from the second input 506 (such as from the steering angle sensor 116) satisfies a first threshold, such as by being equal to or greater than the first threshold stored in the memory 106. Satisfying the first threshold would indicate that the vehicle 100 or the driver of the vehicle 100 intends to cross the lane lines of the road 400. If the processor 104 determines that the first threshold is not satisfied, such as by the steering angle value being less than the first threshold, the process 500 continues to block 520, where the processor 104 does not activate the turn signal source 118. Otherwise, the process 500 continues to block 514.

In block 514, the processor 104 determines whether the steering angle value satisfies a second threshold, such as by being equal to or less than the second threshold stored in the memory 106, and determines, based on the third input 508, whether another vehicle (such as the vehicle 300 of fig. 10 or 11) in the vicinity of the vehicle 100 would benefit from the turn signal source 118 being activated. For example, based on data from the set of cameras 112 as a first data source, data from the radar 110 as a third data source, and data from the ultrasonic sensor 108 as a fourth data source, the processor 104 may determine whether the vehicle 300 is present near the vehicle 100, such as by image-based object recognition/tracking/analysis and by processing signals from captured sound/radio waves bouncing off the vehicle 300, for example by being within a predetermined distance of the vehicle 100 or within a particular side, position, or direction relative to the vehicle 100. Such presence in the vicinity of the vehicle 100 may be determined based on various parameters about the vehicle 300, whether statically defined or determined in real time/on the fly. For example, some such parameters may be based on or include a set of thresholds or a range of values stored in the memory 106, where the data may reflect a predefined distance, direction, expected path of travel, or other motion-based characteristic of the vehicle 300 relative to the vehicle 100 (or vice versa). For example, such proximity may be within 20 feet of the first vehicle, within 40 feet of the first vehicle, within 60 feet of the first vehicle, or some other distance from the first vehicle. If the processor 104 determines that the vehicle 100 is near the vehicle 300, as described above, the processor 104 determines whether the turn signal source should be activated in order to improve the safety of the vehicle 100 and the vehicle 300. Such a determination may be based on various parameters about the vehicle 300, whether statically defined or determined in real time/on the fly. For example, some such parameters may be based on or include a set of thresholds or a range of values stored in the memory 106, where the data may reflect a set of criteria applied to a predefined distance, direction, expected path of travel, or other motion-based characteristic of the vehicle 300 relative to the vehicle 100 (or vice versa). Thus, if the vehicle 300 would not benefit from the turn signal source 118 being activated, the process 500 continues to block 516, where the processor 104 does not activate the turn signal source 118. Otherwise, the process 500 continues to block 522, where the processor 104 activates the turn signal source 118. Note that in either of blocks 518, 522, the processor 104 may activate the turn signal source 118 immediately before the vehicle 100 crosses the lane line, while the vehicle 100 is crossing the lane line, or immediately after the vehicle 100 crosses the lane line.
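Blocks 510 through 522 can be condensed into the short sketch below; the two threshold values are placeholders for the values stored in the memory 106 and are not numbers given in the disclosure.

def method_500_decision(crossing_predicted: bool,
                        steering_angle_deg: float,
                        nearby_vehicle_benefits: bool,
                        first_threshold_deg: float = 5.0,
                        second_threshold_deg: float = 45.0) -> bool:
    # Block 510: a camera-predicted crossing activates the signal outright (block 518).
    if crossing_predicted:
        return True
    # Block 512: too little steering input means no activation (block 520).
    if abs(steering_angle_deg) < first_threshold_deg:
        return False
    # Block 514: steering within the stored range plus a nearby vehicle that
    # would benefit leads to activation (block 522); otherwise none (block 516).
    return abs(steering_angle_deg) <= second_threshold_deg and nearby_vehicle_benefits

print(method_500_decision(False, 12.0, True))   # True: activate
print(method_500_decision(False, 12.0, False))  # False: no nearby beneficiary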

In one implementation of the process 500, in response to determining, via the processor 104, that the steering angle value is within a range of values stored in the memory 106 (illustratively, between the first threshold and the second threshold), that the vehicle 100 will cross a lane line on the road 400, and that the vehicle 300 in the vicinity of the vehicle 100 will benefit from the turn signal source 118 being activated, the processor 104 activates the turn signal source 118 when the vehicle 100 crosses the lane line in the vicinity of the vehicle 300. Note that determining, via the processor 104, whether the vehicle 300 is near the vehicle 100 and would benefit from the turn signal source 118 being activated is based on receiving data from the one or more cameras 112a-f as a first data source, data from the radar 110 as a third data source, and data from the ultrasonic sensor 108 as a fourth data source. Note that if the processor 104 identifies a conflict between data from the first data source (the cameras 112) and data from at least one of the third data source (the radar 110) or the fourth data source (the one or more ultrasonic sensors 108), the processor 104 prioritizes the data from the first data source over the data from at least one of the third data source or the fourth data source. Likewise, if the processor 104 identifies a conflict between data from the third data source (the radar 110) and data from at least one of the first data source (the cameras 112) or the fourth data source (the one or more ultrasonic sensors 108), the processor 104 prioritizes the third data source over at least one of the first data source or the fourth data source.
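
For illustration, a minimal Python sketch of the prioritization rule described above follows; the per-source boolean flags and the prefer parameter are assumptions of this sketch rather than elements recited in the disclosure.

def resolve_presence_conflict(camera_sees_vehicle: bool,
                              radar_sees_vehicle: bool,
                              ultrasonic_sees_vehicle: bool,
                              prefer: str = "camera") -> bool:
    """Resolve disagreement between data sources about whether another vehicle
    is present, preferring the camera (first data source) or the radar (third
    data source) as configured."""
    readings = {
        "camera": camera_sees_vehicle,          # first data source
        "radar": radar_sees_vehicle,            # third data source
        "ultrasonic": ultrasonic_sees_vehicle,  # fourth data source
    }
    if len(set(readings.values())) > 1:  # at least two sources disagree
        return readings[prefer]          # the prioritized source wins
    return camera_sees_vehicle           # no conflict: all sources agree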

In some embodiments, a storage device (such as the memory 106) has stored therein a set of processor-executable instructions that, when executed by an electronic processing system such as the processor 104, cause the electronic processing system to: determine a path of travel of the first vehicle 100 relative to a lane line based on a first set of data (such as data from a first data source) received from an image capture device of the vehicle 100 (such as the cameras 112a-f); determine that the vehicle 300 is present within a predetermined distance of the vehicle 100 based on a second set of data (such as data from a third data source or a fourth data source) received from a reflected wave detector (such as the radar 110 or the one or more ultrasonic sensors 108); and activate a turn signal source of the vehicle 100, such as the turn signal source 118, when (a) the vehicle 100 has a travel path and a steering angle such that the vehicle 100 will cross the lane line or turn, and (b) the vehicle 300 is present within the predetermined distance of the vehicle 100.
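
By way of a non-limiting illustration, the conjunction of conditions (a) and (b) above can be sketched in Python as follows; the function and parameter names, and the example distance, are assumptions of this sketch.

def should_activate(will_cross_or_turn: bool,
                    reflected_wave_ranges_m: list,
                    predetermined_distance_m: float = 12.0) -> bool:
    """Activate the turn signal source only when (a) the travel path and steering
    angle indicate a lane crossing or turn and (b) a reflected-wave return places
    another vehicle within the predetermined distance."""
    other_vehicle_near = any(r <= predetermined_distance_m for r in reflected_wave_ranges_m)
    return will_cross_or_turn and other_vehicle_near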

In some embodiments, the vehicle 100 may be configured to automatically activate the turn signal source 118 when the vehicle 100 is leaving its current lane, when it is about to turn onto another road/street/segment, during such a turn, or immediately after such a turn. For example, this functionality may be enabled by the processor 104 determining the geographic location of the vehicle 100 (such as via GPS signals received through the transceiver 114) and determining whether the vehicle 100 is approaching or is about to turn onto another road/street/segment, or is being automatically guided to make such a turn. If the processor 104 determines that such an action is safe, such as by ensuring via the cameras 112, the radar 110, and the ultrasonic sensors 108 that no vehicle or pedestrian is in the turning path of the vehicle 100, or if the processor 104 determines that the steering angle value satisfies a threshold indicating that such an action is being taken, the processor 104 may activate the turn signal source 118. This functionality may be enhanced by the processor 104 communicating with traffic lights (such as through the transceiver 114) and activating the turn signal source 118 when a traffic light is displaying green or when the steering angle value meets a threshold indicating that such an action is being taken in the vicinity of the traffic light. Note that the processor 104 may distinguish between a lane switch and a turn based on the steering angle value, such as by the steering angle value falling within different ranges of values for the lane switch and for the turn.
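
The distinction between a lane switch and a turn based on steering angle ranges can be illustrated with the following Python sketch; the range boundaries shown are assumed values for illustration and are not taken from the disclosure.

def classify_maneuver(steering_angle_deg: float) -> str:
    """Classify the current maneuver from the steering angle value alone."""
    angle = abs(steering_angle_deg)
    if angle < 2.0:    # assumed small-angle band: normal lane keeping
        return "lane_keeping"
    if angle < 15.0:   # assumed mid-range band: likely a lane switch
        return "lane_change"
    return "turn"      # assumed large-angle band: likely a turn onto another road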

As described above, leaving a lane may also include using an exit ramp or lane when leaving a highway, or using a merge lane or an entry ramp when entering a highway.

Although fig. 13 and the accompanying description illustrate the analysis of the first, second, third, and fourth data sources in a particular order, it should be noted that these data sources may be analyzed in any order, and one or more data sources may be analyzed simultaneously. Further, where the processor identifies a conflict between information originating from any of the different data sources, information originating from any of the data sources may be prioritized over information originating from any of the other data sources to resolve the conflict.

Algorithms for detecting lane changes, turns, and the like are well known and are incorporated into various driver-assistance and/or autonomous vehicle systems, such as those used in Tesla vehicle models that include the Tesla Autopilot (Enhanced Autopilot) driver assistance functionality and the Hardware 2 component set (November 2016). For example, a lane change may be detected via a machine vision algorithm executed by the processor 104, which analyzes in real time a set of images from the cameras 112a-c depicting a set of road markings or road boundaries of the road on which the vehicle 100 is traveling. Key improvements according to the present disclosure include using the Autopilot cameras, and possibly other information such as GPS information, to determine the vehicle's trajectory and whether it is about to cross lane lines without manual signal activation, and using the steering angle sensor input and the Autopilot proximity sensors (such as the ultrasonic sensors and radar feedback) to detect the presence or absence of other vehicles that may benefit from a turn signal, according to the algorithms disclosed herein.
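
One way such a machine vision result could feed a crossing prediction is sketched below in Python; the lateral offset/velocity inputs, the half-lane width, and the look-ahead horizon are assumptions of this sketch rather than details of any particular Autopilot implementation.

def will_cross_lane_line(lateral_offset_m: float,
                         lateral_velocity_mps: float,
                         half_lane_width_m: float = 1.8,
                         horizon_s: float = 2.0) -> bool:
    """Extrapolate the vehicle's lateral motion (as estimated from lane-marking
    images) and report whether it reaches a lane boundary within the horizon."""
    if lateral_velocity_mps == 0.0:
        return False  # no lateral motion, no predicted crossing
    boundary_m = half_lane_width_m if lateral_velocity_mps > 0 else -half_lane_width_m
    time_to_boundary_s = (boundary_m - lateral_offset_m) / lateral_velocity_mps
    return 0.0 <= time_to_boundary_s <= horizon_s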

FIG. 14 shows a flowchart of an exemplary embodiment of a third method for automatic turn signal activation according to the present disclosure. The vehicle 100 performs the method 600 based on the processor 104 executing a set of instructions stored in the memory 106 and being communicatively engaged (whether serially or in parallel) with the ultrasonic sensors 108, the radar 110, the transceiver 114, the steering angle sensor 116, the turn signal source 118, and the set of cameras 112, as disclosed herein. The method 600 includes an input block 602, a decision block 604, an activation block 606, and a non-activation block 608. For example, the method 600 may be performed while the driver is driving the vehicle 100 at least semi-actively or passively.

Input block 602 includes input from one or more data sources, whether local to the vehicle 100 or remote from the vehicle 100. The data sources may include any data source disclosed herein, such as a camera, a steering angle sensor, a radar, an ultrasonic sensor, a FLIR camera, or others.
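
A minimal Python sketch of one possible input bundle for block 602 follows; the field names are assumptions of this sketch, and any subset of these sources may be present on a given vehicle.

from dataclasses import dataclass, field

@dataclass
class TurnSignalInputs:
    camera_frames: list = field(default_factory=list)        # images, e.g., from cameras 112a-f
    steering_angle_deg: float = 0.0                           # e.g., from steering angle sensor 116
    radar_returns: list = field(default_factory=list)        # e.g., from radar 110
    ultrasonic_ranges_m: list = field(default_factory=list)  # e.g., from ultrasonic sensors 108
    flir_frames: list = field(default_factory=list)          # optional FLIR camera input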

In block 604, the processor 104 determines whether the vehicle 100 is about to cross a lane line or make a turn. This determination may be made by any of the methods disclosed herein. For example, the processor 104 may determine whether the vehicle 100 is about to cross a lane line or make a turn based on a set of data received from the cameras 112a-c, where the processor 104 executes various algorithms, such as object recognition/tracking/analysis, and determines whether the vehicle 100 is on a trajectory that crosses a lane line or turns, such as based on the lane lines positioned relative to the vehicle 100 or based on gaps (changes in size, color, frequency, direction) between lane lines, as known in the art. Likewise, for example, the processor 104 may determine whether the vehicle 100 is about to cross a lane line or make a turn based on a set of data received from the steering angle sensor 116, where the processor 104 determines whether the steering angle value received from the steering angle sensor 116 meets a threshold or falls within a predetermined range of values stored in the memory 106. Thus, if the processor 104 determines that the vehicle 100 is about to cross a lane line or make a turn, the processor 104 activates the turn signal source 118, as per block 606. Otherwise, the processor 104 does not activate the turn signal source 118, as per block 608.
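
For illustration only, the block 604 decision can be sketched in Python as follows; the boolean vision result, the steering threshold value, and the function name are assumptions of this sketch.

def block_604_decision(crossing_from_vision: bool,
                       steering_angle_deg: float,
                       steering_threshold_deg: float = 10.0) -> bool:
    """Return True (activate the turn signal source, block 606) when either the
    vision pipeline or the steering angle indicates an imminent lane crossing or
    turn; otherwise return False (block 608)."""
    steering_indicates = abs(steering_angle_deg) >= steering_threshold_deg
    return crossing_from_vision or steering_indicates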

The computer-readable program instructions for carrying out operations of the present disclosure may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or any source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, including, for example, programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), may execute computer-readable program instructions to perform aspects of the present disclosure by personalizing the electronic circuitry with state information of the computer-readable program instructions.

Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions. The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Words such as "then," "then," etc. are not intended to limit the order of the steps; these words are merely used to guide the reader through the description of the methods. Although a process flow diagram may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a procedure corresponds to a function, its termination may correspond to a return of the function to the calling function or the main function.

Features or functions described with respect to certain exemplary embodiments may be combined and sub-combined in and/or with various other exemplary embodiments. Moreover, the different aspects and/or elements of the exemplary embodiments as disclosed herein may also be combined and sub-combined in a similar manner. Moreover, some example embodiments (whether individually and/or collectively) may be components of a larger system, where other processes may take precedence over and/or otherwise modify their application. Additionally, as disclosed herein, many steps may be required before, after, and/or while the exemplary embodiments are in progress. Note that at least any and/or all of the methods and/or processes disclosed herein may be performed, at least in part, via at least one entity or participant in any manner.

The terms used herein may imply direct or indirect, complete or partial, temporary or permanent action or inaction. For example, when an element is referred to as being "on," "connected to," or "coupled to" another element, the element can be directly on, connected to, or coupled to the other element, and/or intervening elements may be present, including indirect and/or direct variants. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present.

Although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not necessarily be limited by such terms. These terms are used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present disclosure.

The terminology used herein is for the purpose of describing particular example embodiments and is not intended to necessarily limit the disclosure. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, as used herein, the terms "a" and/or "an" shall mean "one or more," even though the phrase "one or more" is also used herein. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence and/or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Further, when the present disclosure states herein that something is "based on" something else, then such a statement refers to a basis that may also be based on one or more other things. In other words, "based on," as used herein, inclusively means "based at least in part on," unless explicitly stated otherwise.

As used herein, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing circumstances.

Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized and/or overly formal sense unless expressly so defined herein.

the detailed description has been presented for purposes of illustration and description, but is not intended to be exhaustive and/or limited to the disclosure in the form disclosed. Many modifications and variations in techniques and structures will be apparent to those of skill in the art without departing from the scope and spirit of the disclosure as set forth in the following claims. Accordingly, such modifications and variations are considered a part of this disclosure. The scope of the disclosure is defined by the various claims, including known equivalents and unforeseeable equivalents at the time of filing this disclosure.
