Vehicle assist feature control


Reading note: This technology, "Vehicle assist feature control", was designed and created by K·特耶达 and 尼尔·詹姆斯·威克斯 on 2021-05-20. Abstract: The present disclosure provides "vehicle assist feature control". The off-road operating mode is enabled to an enabled state when the vehicle is determined to be in the off-road region based on the sensor data. One or more assist features are then presented on a display of the vehicle upon receiving a first user input selecting the off-road operating mode. At least one of the assist features is then selected based on a second user input. The selected assist feature is then deactivated to a deactivated state after a user-initiated key cycle that engages the vehicle from an off state to an on state.

1. A method, comprising:

actuating an off-road operating mode of a vehicle to an enabled state when the vehicle is determined to be in an off-road region based on sensor data;

then representing one or more assist features on a display of the vehicle upon receiving a first user input selecting the off-road operating mode;

then selecting at least one of the assist features based on a second user input;

then deactivating the selected assist feature to a deactivated state after a user-initiated key cycle that engages the vehicle from an off state to an on state.

2. The method of claim 1, further comprising: upon determining that the vehicle has moved from the off-road region to an on-road area, actuating the off-road operating mode to a disabled state and the deactivated assist feature to an activated state.

3. The method of claim 1, further comprising: upon receiving a third user input to deselect the off-road operating mode, actuating the off-road operating mode to a disabled state and the deactivated assist feature to an activated state.

4. The method of claim 1, further comprising: upon receiving a third user input selecting at least one deactivated assist feature, actuating the selected assist feature to an active state.

5. The method of claim 1, further comprising: selecting the at least one assist feature further based on a previous selection.

6. The method of claim 1, further comprising: verifying that the vehicle is in the off-road region based on sensor data after the key cycle.

7. The method of claim 1, further comprising: preventing actuation of the off-road operating mode to the enabled state based on a determination that the vehicle is operating in a road region.

8. The method of claim 1, further comprising: maintaining the disabled assist feature in the disabled state after another key cycle based on determining, via vehicle sensor data, that the vehicle is in the off-road region.

9. The method of claim 1, further comprising: preventing diagnostic testing of the disabled assist feature.

10. The method of claim 9, further comprising: upon actuation of the deactivated assist feature to an activated state, performing a diagnostic test on the activated assist feature.

11. A vehicle comprising a computer programmed to perform the method of any one of claims 1 to 10.

12. A computer programmed to perform the method of any one of claims 1 to 10.

13. A computer program product comprising instructions for performing the method of any one of claims 1 to 10.

Technical Field

The present disclosure relates generally to vehicle assist features.

Background

A vehicle may include assist features such as blind spot monitoring, adaptive cruise control, lane departure warning, lane centering, etc., to assist a user in operating the vehicle. An assist feature may be an adaptive feature that actuates one or more vehicle components based on vehicle data (e.g., a sensed position, sensed environmental conditions, etc.).

Disclosure of Invention

A system includes a computer including a processor and a memory storing instructions executable by the processor to enable an off-road operating mode of a vehicle to an enabled state when the vehicle is determined to be in an off-road area based on sensor data. The instructions further include instructions for: one or more assist features are then presented on a display of the vehicle upon receiving a first user input selecting the off-road operating mode. The instructions further include instructions for: at least one of the assist features is then selected based on a second user input. The instructions further include instructions for: the selected assist feature is then deactivated to a deactivated state after a user-initiated key cycle that engages the vehicle from an off state to an on state.

The instructions may also include instructions for: upon determining that the vehicle has moved from the off-road area to an on-road area, disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state.

The instructions may also include instructions for: upon receiving a third user input to deselect the off-road operating mode, disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state.

The instructions may also include instructions for: upon receiving a third user input selecting at least one deactivated assist feature, activating the selected assist feature to an activated state.

The instructions may also include instructions for: selecting the at least one assist feature further based on a previous selection.

The instructions may also include instructions for: after the key cycle, verifying that the vehicle is in the off-road area based on sensor data.

The instructions may also include instructions for: preventing the off-road operating mode from transitioning to the enabled state based on determining that the vehicle is operating in a road region.

The instructions may also include instructions for: maintaining the disabled assist feature in the disabled state after another key cycle based on determining, via vehicle sensor data, that the vehicle is in the off-road region.

The instructions may also include instructions for: preventing diagnostic testing of the disabled assist feature.

The instructions may also include instructions for: upon activation of the deactivated assist feature to an active state, performing a diagnostic test on the activated assist feature.

A method includes enabling an off-road operating mode of a vehicle to an enabled state when the vehicle is determined to be in an off-road region based on sensor data. The method further includes: one or more assist features are then presented on a display of the vehicle upon receiving a first user input selecting the off-road operating mode. The method further includes: at least one of the assist features is then selected based on a second user input. The method further includes: the selected assist feature is then deactivated to a deactivated state after a user-initiated key cycle that engages the vehicle from an off state to an on state.

The method may further comprise: disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state upon determining that the vehicle has moved from the off-road area to an on-road area.

The method may further comprise: disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state upon receiving a third user input to deselect the off-road operating mode.

The method may further comprise: upon receiving a third user input selecting at least one deactivated assist feature, activating the selected assist feature to an activated state.

The method may further comprise: selecting the at least one assist feature further based on a previous selection.

The method may further comprise: verifying that the vehicle is in the off-road region based on sensor data after the key cycle.

The method may further comprise: preventing the off-road operating mode from transitioning to the enabled state based on determining that the vehicle is operating in a road region.

The method may further comprise: maintaining the disabled assist feature in the disabled state after another key cycle based on determining, via vehicle sensor data, that the vehicle is in the off-road region.

The method may further comprise: preventing diagnostic testing of the disabled assist feature.

The method may further comprise: upon activation of the deactivated assist feature to an active state, performing a diagnostic test on the activated assist feature.

Also disclosed herein is a computing device programmed to perform any of the above method steps. Also disclosed herein is a computer program product comprising a computer readable medium storing instructions executable by a computer processor to perform any of the above-described method steps.

Drawings

FIG. 1 is a block diagram illustrating an exemplary vehicle control system of a vehicle.

FIGS. 2A-2B are diagrams of an exemplary HMI display based on the vehicle being in an on-road or off-road area, respectively.

FIG. 2C is a diagram of one or more assist features represented on the HMI.

FIG. 3A is a first portion of a flowchart of an exemplary process for controlling an off-road operating mode in a vehicle.

FIG. 3B is a second portion of the flowchart of FIG. 3A.

Detailed Description

FIG. 1 is a block diagram illustrating an exemplary vehicle system 100. The vehicle 105 includes a vehicle computer 110 that receives data from sensors 115. The vehicle computer 110 is programmed to enable an off-road operating mode of the vehicle 105 to an enabled state when the vehicle 105 is determined to be in an off-road region based on the sensor 115 data. The vehicle computer 110 is also programmed to then present one or more assist features on a display of the vehicle 105 upon receiving a first user input selecting the off-road operating mode. The vehicle computer 110 is also programmed to then select at least one of the assist features based on a second user input. The vehicle computer 110 is also programmed to then deactivate the selected assist feature to a deactivated state after a key cycle initiated by a user to engage the vehicle 105 from an off state to an on state.

The vehicle 105 includes one or more assist features. An assist feature is an operation of the vehicle that actuates one or more vehicle components 125 to assist or supplement user operation of the vehicle. For example, the vehicle computer 110 may control the vehicle 105 based at least in part on the assist features. An exemplary assist feature is lane keeping, where the vehicle computer 110 controls the actuator 120 and/or the component 125 to maintain the vehicle 105 in a lane of the road region. The vehicle computer 110 may receive sensor 115 data, e.g., indicative of road markings, signs, other vehicles, etc., and may initiate an assist feature that includes actuating one or more vehicle components 125 based on the sensor 115 data. However, when the vehicle 105 is operating in an off-road area, based on the received sensor 115 data, the vehicle computer 110 may initiate one or more assist features that are only applicable or desirable for use in on-road areas and not in off-road areas. Advantageously, upon determining that the vehicle 105 is in an off-road area based on the sensor 115 data, the vehicle computer 110 may enable an off-road operating mode that allows a user to selectively deactivate one or more assist features to prevent undesired actuation of selected assist features, thereby improving vehicle operation in the off-road area.
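
The states described above can be captured in a small data model. The following is a minimal sketch in Python; the names (ModeState, FeatureState, AssistFeature) are illustrative assumptions for exposition and are not taken from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ModeState(Enum):
    DISABLED = auto()   # vehicle in a road area; off-road mode not selectable
    ENABLED = auto()    # vehicle in an off-road area; off-road mode selectable


class FeatureState(Enum):
    ACTIVATED = auto()    # the computer may initiate the feature from sensor data
    DEACTIVATED = auto()  # the computer suppresses the feature


@dataclass
class AssistFeature:
    name: str                                # e.g. "lane_keeping", "adaptive_cruise_control"
    state: FeatureState = FeatureState.ACTIVATED
    selected_for_deactivation: bool = False  # set by the second user input
```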

The vehicle 105 includes a vehicle computer 110, sensors 115, actuators 120 for actuating various vehicle components 125, and a vehicle communication module 130. The communication module 130 allows the vehicle computer 110 to communicate with the server 140 and/or another vehicle, for example, via a messaging or broadcast protocol, such as Dedicated Short Range Communication (DSRC), cellular, and/or other protocols that may support vehicle-to-vehicle, vehicle-to-infrastructure, vehicle-to-cloud communications, and/or the like, and/or via the packet network 135.

The vehicle computer 110 includes a processor and memory such as is known. The memory includes one or more forms of computer-readable media and stores instructions executable by the vehicle computer 110 for performing various operations, including as disclosed herein. The vehicle computer 110 may also include two or more computing devices that cooperate to perform operations of the vehicle 105, including the operations described herein. Further, the vehicle computer 110 may be a general purpose computer having a processor and memory as described above, and/or may include special purpose electronic circuitry including an ASIC fabricated for specific operations, e.g., an ASIC for processing sensor data and/or communicating sensor data. In another example, the vehicle computer 110 may include an FPGA (field programmable gate array), which is an integrated circuit manufactured to be configurable by a user. Generally, a hardware description language such as VHDL (very high speed integrated circuit hardware description language) is used in electronic design automation to describe digital and mixed signal systems such as FPGAs and ASICs. For example, ASICs are manufactured based on VHDL programming provided before manufacture, while logic components internal to the FPGA may be configured based on VHDL programming stored, for example, in memory electrically connected to the FPGA circuitry. In some examples, a combination of processors, ASICs, and/or FPGA circuits may be included in the vehicle computer 110.

The vehicle computer 110 may operate the vehicle 105 in an autonomous mode, a semi-autonomous mode, or a non-autonomous (or manual) mode. For purposes of this disclosure, an autonomous mode is defined as a mode in which each of propulsion, braking, and steering of the vehicle 105 is controlled by the vehicle computer 110; in semi-autonomous mode, the vehicle computer 110 controls one or two of propulsion, braking, and steering of the vehicle 105; in the non-autonomous mode, the human operator controls each of propulsion, braking, and steering of the vehicle 105.

The vehicle computer 110 may include programming to operate one or more of the vehicle 105 braking, propulsion (e.g., controlling acceleration of the vehicle 105 by controlling one or more of an internal combustion engine, an electric motor, a hybrid engine, etc.), steering, a transmission, climate control, interior and/or exterior lights, a horn, vehicle doors, etc., and to determine whether and when the vehicle computer 110 (rather than a human operator) controls such operations.

The vehicle computer 110 may include, or be communicatively coupled to (e.g., via a vehicle communication network such as a communication bus, as described further below), one or more processors, e.g., included in electronic control units (ECUs) or the like included in the vehicle 105 for monitoring and/or controlling various vehicle components 125, such as a transmission controller, a brake controller, a steering controller, etc. The vehicle computer 110 is typically arranged for communication over a vehicle communication network, which may include a bus in the vehicle 105, such as a Controller Area Network (CAN), etc., and/or other wired and/or wireless mechanisms.

Via the vehicle 105 network, the vehicle computer 110 may transmit messages to and/or receive messages (e.g., CAN messages) from various devices (e.g., sensors 115, actuators 120, ECUs, etc.) in the vehicle 105. Alternatively or additionally, where the vehicle computer 110 actually includes multiple devices, a vehicle communication network may be used for communication between the devices, represented in this disclosure as the vehicle computer 110. Further, as mentioned below, various controllers and/or sensors 115 may provide data to the vehicle computer 110 via a vehicle communication network.

The vehicle 105 sensors 115 may include a variety of devices such as are known for providing data to the vehicle computer 110. For example, the sensors 115 may include light detection and ranging (lidar) sensors 115 or the like disposed on the top of the vehicle 105, behind the front windshield of the vehicle 105, around the vehicle 105, or the like, that provide the relative position, size, and shape of objects around the vehicle 105. As another example, one or more radar sensors 115 secured to a bumper of the vehicle 105 may provide data to provide a location of objects, other vehicles, etc. relative to the location of the vehicle 105. Alternatively or additionally, the sensors 115 may also include, for example, camera sensors 115 (e.g., forward looking, side looking, etc.) that provide images from an area surrounding the vehicle 105. In the context of the present disclosure, an object is a physical (i.e., substance) item that has mass and can be represented by a physical phenomenon (e.g., light or other electromagnetic waves or sound, etc.) that can be detected by the sensor 115. Accordingly, the vehicle 105, as well as other items including those discussed below, fall within the definition of "object" herein.

The vehicle computer 110 is programmed to receive data from the one or more sensors 115 substantially continuously, periodically, and/or upon direction by the server 140, etc. The data may include, for example, the location of the vehicle 105. The location data specifies one or more points on the ground and may be in conventional form, such as geographic coordinates, such as latitude and longitude coordinates, obtained via a known navigation system using the Global Positioning System (GPS). Additionally or alternatively, the data may include a location of an object (e.g., another vehicle, a sign, a tree, a bush, etc.) relative to the vehicle 105. As one example, the data may be image data of the environment surrounding the vehicle 105. In this example, the image data may include one or more objects and/or markings, e.g., painted lines, symbols, text, etc., on the ground (e.g., on the ground on which the vehicle 105 is operating). Image data herein refers to digital image data that may be acquired by the camera sensor 115, e.g., including pixels having intensity values and color values. The sensors 115 may be mounted to any suitable location in or on the vehicle 105, e.g., on a bumper of the vehicle 105, on a roof of the vehicle 105, etc., to collect images of the environment surrounding the vehicle 105.

The actuators 120 of the vehicle 105 are implemented via circuits, chips, or other electronic and/or mechanical components that can actuate various vehicle subsystems in accordance with appropriate control signals, as is known. The actuators 120 may be used to control components 125, including braking, acceleration, and steering of the vehicle 105.

In the context of the present disclosure, the vehicle component 125 is one or more hardware components adapted to perform a mechanical or electromechanical function or operation, such as moving the vehicle 105, decelerating or stopping the vehicle 105, steering the vehicle 105, or the like. Non-limiting examples of components 125 include propulsion components (including, for example, an internal combustion engine and/or an electric motor, etc.), transmission components, steering components (e.g., which may include one or more of a steering wheel, a steering rack, etc.), suspension components 125 (e.g., which may include one or more of a damper, such as a shock absorber or strut, a bushing, a spring, a control arm, a ball joint, a linkage, etc.), braking components, parking assist components, adaptive cruise control components, adaptive steering components, one or more passive restraint systems (e.g., airbags), a movable seat, etc.

The vehicle 105 also includes a Human Machine Interface (HMI) 118. The HMI 118 includes user input devices such as knobs, buttons, switches, pedals, joysticks, touch screens, and/or microphones. The input devices may include sensors 115 to detect user inputs and provide user input data to the vehicle computer 110. That is, the vehicle computer 110 may be programmed to receive user input from the HMI 118. The user may provide each user input via the HMI 118, for example, by pressing a virtual button on a touch screen display, by providing a voice command, and so forth. For example, a touch screen display included in the HMI 118 may include a sensor 115 to detect a user pressing a virtual button on the touch screen display to, for example, select or deselect the off-road operating mode, select or deselect at least one assist feature, etc., which input may be received by the vehicle computer 110 and used to determine the user's selection.

The HMI 118 also includes output devices such as a display (including a touch screen display), speakers, and/or lights, etc., that output signals or data to a user. The HMI 118 is coupled to the vehicle communication network and can send and/or receive messages to/from the vehicle computer 110 and other vehicle subsystems.

Additionally, the vehicle computer 110 may be configured to communicate with devices external to the vehicle 105 via the vehicle-to-vehicle communication module 130 or interface, e.g., with another vehicle and/or the server 140 (typically via direct radio frequency communication) by vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2X) wireless communication (cellular and/or DSRC, etc.). The communication module 130 can include one or more mechanisms by which the computer 110 of the vehicle 105 can communicate, such as a transceiver, including any desired combination of wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms, as well as any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communications provided via the communication module 130 include cellular, Bluetooth, IEEE 802.11, Dedicated Short Range Communications (DSRC), and/or Wide Area Networks (WANs), including the internet, which provide data communication services.

The network 135 represents one or more mechanisms by which the vehicle computer 110 may communicate with a remote computing device (e.g., the server 140, another vehicle computer, etc.). Thus, the network 135 may be one or more of a variety of wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms, as well as any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communication networks include wireless communication networks providing data communication services (e.g., using Bluetooth Low Energy (BLE), IEEE 802.11, vehicle-to-vehicle (V2V) such as Dedicated Short Range Communication (DSRC), etc.), Local Area Networks (LAN), and/or Wide Area Networks (WAN), including the internet.

The server 140 may be a conventional computing device programmed to provide operations such as those disclosed herein, i.e., including one or more processors and one or more memories. Further, server 140 may be accessed via network 135 (e.g., the internet or some other wide area network).

The vehicle computer 110 is programmed to determine whether the vehicle 105 is in an on-road or off-road area. A road area is an area of ground that includes any paved or finished surface provided for land vehicle travel. An off-road area is an area of ground that includes any surface that has not been prepared or improved for vehicle travel.

The vehicle computer 110 may identify whether the vehicle 105 is in an on-road or off-road area, for example, based on data (e.g., map data) received from a remote computer (e.g., the server 140). For example, the vehicle computer 110 may receive the location of the vehicle 105, e.g., from sensors 115, a navigation system, a remote computer, etc. The vehicle computer 110 may compare the location of the vehicle 105 to the map data, for example, to determine whether the vehicle 105 is in a road area or an off-road area specified in the map data. As another example, the vehicle computer 110 may determine that the vehicle 105 is in a road region based on a GPS-based geofence. In this example, the GPS geofence specifies the perimeter of the road area. The vehicle computer 110 may determine that the vehicle 105 is in the road region based on location data of the vehicle 105 indicating that the vehicle 105 is within the geofence specifying the road area. Conversely, the vehicle computer 110 may determine that the vehicle 105 is in an off-road area based on location data of the vehicle 105 indicating that the vehicle 105 is not within the geofence specifying the road area.
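
As an illustration of the geofence check described above, the sketch below tests a latitude/longitude position against a road-area perimeter using a standard ray-casting point-in-polygon routine. The function names and the example coordinates are assumptions for exposition, not the patent's specific implementation.

```python
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude)


def point_in_geofence(point: LatLon, fence: List[LatLon]) -> bool:
    """Ray-casting test: True if the point lies inside the polygon."""
    lat, lon = point
    inside = False
    j = len(fence) - 1
    for i in range(len(fence)):
        lat_i, lon_i = fence[i]
        lat_j, lon_j = fence[j]
        crosses = (lon_i > lon) != (lon_j > lon)
        if crosses and lat < (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i:
            inside = not inside
        j = i
    return inside


def vehicle_region(position: LatLon, road_fences: List[List[LatLon]]) -> str:
    """'on_road' if the position lies inside any road-area geofence, else 'off_road'."""
    if any(point_in_geofence(position, fence) for fence in road_fences):
        return "on_road"
    return "off_road"


# Example: a rectangular road-area geofence and two vehicle positions.
road = [(42.30, -83.20), (42.31, -83.20), (42.31, -83.19), (42.30, -83.19)]
print(vehicle_region((42.305, -83.195), [road]))  # -> "on_road"
print(vehicle_region((42.50, -83.50), [road]))    # -> "off_road"
```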

Alternatively, the vehicle computer 110 may receive image data from one or more sensors 115 and analyze it to determine an on-road or off-road area. In this example, the image data may include the environment surrounding the vehicle 105. The vehicle computer 110 may determine that the vehicle 105 is in an on-road or off-road area based on, for example, identifying objects and/or markings in the image data using image recognition techniques. For example, the vehicle computer 110 may determine that the vehicle 105 is in the road region based on identifying lane markings in the image data (i.e., painted lines in the road region that define one or more lanes in the road region). As another example, the vehicle computer 110 may determine that the vehicle 105 is in an off-road area based on identifying the terrain on which the vehicle is operating.

The vehicle computer 110 is programmed to transition the off-road operating mode between the disabled state and the enabled state based on the position of the vehicle 105. For example, upon determining that the vehicle 105 has moved from an on-road area to an off-road area, the vehicle computer 110 enables the off-road operating mode from the disabled state to the enabled state. As another example, upon determining that the vehicle 105 has moved from an off-road area to an on-road area, the vehicle computer 110 disables the off-road operating mode from the enabled state to the disabled state. That is, the off-road operating mode is enabled when the vehicle 105 is in an off-road region, and the off-road operating mode is disabled when the vehicle 105 is in a road region.

Additionally, the vehicle computer 110 is programmed to maintain the off-road operating mode in one of the enabled state or the disabled state based on the vehicle 105 remaining in the off-road area or the on-road area, respectively. For example, after each key cycle, the vehicle computer 110 verifies whether the vehicle 105 is in an on-road or off-road area based on the sensor 115 data. For example, the vehicle computer 110 may verify whether the vehicle 105 is in an on-road area or an off-road area based on the position data and/or image data of the vehicle 105 as described above. The vehicle computer 110 then compares the position of the vehicle 105 after the key cycle with the position of the vehicle 105 before the key cycle. If the vehicle 105 is in the road region before and after the key cycle, the vehicle computer 110 maintains the off-road operating mode in the disabled state. Similarly, if the vehicle 105 is in an off-road region before and after the key cycle, the vehicle computer 110 maintains the off-road operating mode in the enabled state.
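
A minimal sketch of that comparison is shown below; the string-valued regions and the function name are illustrative assumptions, and the region values would come from the geofence or image checks described above.

```python
from typing import Tuple


def off_road_mode_after_key_cycle(region_before: str, region_after: str) -> Tuple[str, str]:
    """Off-road operating mode state after a key cycle, plus whether it was
    maintained or transitioned. Regions are "on_road" or "off_road"."""
    state = "enabled" if region_after == "off_road" else "disabled"
    action = "maintained" if region_before == region_after else "transitioned"
    return state, action


# Vehicle verified off-road before and after the key cycle: mode stays enabled.
assert off_road_mode_after_key_cycle("off_road", "off_road") == ("enabled", "maintained")
# Vehicle moved onto a road while the ignition was off: mode is disabled at key-on.
assert off_road_mode_after_key_cycle("off_road", "on_road") == ("disabled", "transitioned")
```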

The key cycle engages the vehicle 105 between an on state (i.e., the engine is operating) and an off state (i.e., the engine is not operating). Specifically, the key cycle engages the vehicle 105 from an on state to an off state and back to the on state. Each key cycle may be initiated by a user, for example, by turning a key in an ignition switch, by pressing a button, or the like.
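
The key cycle can be detected by watching the ignition state for an on-to-off-to-on transition. The following sketch is illustrative; a production implementation would read the ignition state from the vehicle network rather than from a boolean argument.

```python
class KeyCycleDetector:
    """Flags a key cycle: on -> off -> on, as described above (illustrative sketch)."""

    def __init__(self) -> None:
        self._was_on = True    # assume the vehicle starts in the on state
        self._saw_off = False

    def update(self, ignition_on: bool) -> bool:
        """Feed the current ignition state; returns True once a full key cycle completes."""
        if self._was_on and not ignition_on:
            self._saw_off = True                                         # on -> off
        cycled = self._saw_off and ignition_on and not self._was_on      # back to on
        if cycled:
            self._saw_off = False
        self._was_on = ignition_on
        return cycled


detector = KeyCycleDetector()
print([detector.update(s) for s in (True, False, True)])  # -> [False, False, True]
```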

The off-road operating mode allows assist features selected based on user input to be deactivated. When the off-road operating mode is in the enabled state, the vehicle computer 110 enables a user to select the off-road operating mode. For example, the vehicle computer 110 may actuate the HMI 118 to detect a first user input selecting the off-road operating mode. For example, the HMI 118 may be programmed to display a virtual button on a touch screen display that a user can press to select the off-road operating mode (see FIG. 2B). As another example, the HMI 118 may be programmed to make the virtual button non-selectable when the off-road operating mode is in the disabled state and selectable via the touch screen display when the off-road operating mode is in the enabled state. In other words, the HMI 118 may activate the sensor 115 that can detect a user pressing the virtual button to select the off-road operating mode. Upon detecting the first user input, the HMI 118 can then provide the first user input to the vehicle computer 110, and the vehicle computer 110 can select the off-road operating mode based on the first user input.

Additionally, in the enabled state, the vehicle computer 110 may enable a user to select one or more assist features (as discussed below) based on the selection of the off-road operating mode, i.e., the assist features to be deactivated in the off-road operating mode. Further, in the enabled state, the vehicle computer 110 may set a flag to detect a key cycle after selection of the off-road operating mode. Upon detecting a key cycle, the vehicle computer 110 may deactivate the selected assist features (as discussed below).

When the off-road operating mode is in the disabled state, the vehicle computer 110 may actuate the HMI 118 to disable detection of the first user input and the second user input. In other words, in the disabled state (i.e., when the vehicle 105 is in a road area), the vehicle computer 110 prevents the user from selecting the off-road operating mode and/or prevents the one or more assist features from being deactivated. For example, the HMI 118 may be programmed to remove the virtual button from the touch screen display (see FIG. 2A). As another example, the HMI 118 can be programmed to make the virtual button non-selectable. In other words, the HMI 118 may deactivate the sensor 115 that can detect a user pressing the virtual button to select the off-road operating mode. Additionally, the vehicle computer 110 may remove the flag for detecting key cycles when the off-road operating mode is in the disabled state.
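
The way the HMI 118 exposes or hides the first user input in the two mode states can be sketched as follows; the OffRoadModeButton class and its methods are illustrative assumptions rather than an actual HMI API.

```python
class OffRoadModeButton:
    """Illustrative virtual button whose selectability tracks the off-road mode state."""

    def __init__(self) -> None:
        self.visible = False
        self.selectable = False

    def apply_mode_state(self, mode_enabled: bool) -> None:
        # Enabled state: show the button and let its touch sensor report presses.
        # Disabled state: remove the button (or make it non-selectable).
        self.visible = mode_enabled
        self.selectable = mode_enabled

    def on_press(self) -> bool:
        """Returns True (counts as a first user input) only when the mode is enabled."""
        return self.selectable


button = OffRoadModeButton()
button.apply_mode_state(mode_enabled=True)   # vehicle entered an off-road area
print(button.on_press())                     # True: selection is possible
button.apply_mode_state(mode_enabled=False)  # vehicle back in a road area
print(button.on_press())                     # False: selection is prevented
```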

Upon receiving the first user input, the vehicle computer 110 is programmed to represent one or more assist features via the HMI 118, for example, on a touch screen display. For example, the vehicle computer 110 may actuate the HMI 118 to display a respective virtual button for each assist feature on the touch screen display (see FIG. 2C). Non-limiting examples of assist features include: blind spot monitoring, lane departure warning, lane keeping assist, lane centering, adaptive cruise control, forward collision warning, etc. Additionally or alternatively, the vehicle computer 110 may represent one or more vehicle components 125, such as a passive restraint system, on the touch screen display based on the received first user input. In this example, the vehicle computer 110 may actuate the HMI 118 to display one respective virtual button for each vehicle component 125. The assist features and/or vehicle components 125 may be specified by the vehicle and/or component manufacturer and stored in the memory of the vehicle computer 110.

The vehicle computer 110 is programmed to select one or more assist features based on the second user input. The HMI 118 can detect a second user input selecting the at least one assist feature and can provide the second user input to the vehicle computer 110. For example, the sensor 115 in the HMI 118 may detect a user pressing a virtual button that selects an assist feature (see FIG. 2C). The vehicle computer 110 may then select an assist feature based on the second user input. Additionally or alternatively, the vehicle computer 110 may select one or more assist features based on the previous selection. For example, the vehicle computer 110 may store the second user input in memory. That is, the vehicle computer 110 may store one or more assist features selected by the second user input. Upon receiving the subsequent first user input, the vehicle computer 110 may then select the same assist feature as the stored second user input.
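
A sketch of selecting assist features from the second user input, with the previous selection persisted for reuse, is shown below; the file-based store and the function name are illustrative assumptions.

```python
import json
from pathlib import Path
from typing import List, Optional

SELECTION_FILE = Path("offroad_selection.json")  # illustrative non-volatile store


def select_features(second_user_input: Optional[List[str]]) -> List[str]:
    """Return the assist features selected for deactivation.

    A new second user input replaces and persists the selection; with no new
    input, the previously stored selection is reused, as described above.
    """
    if second_user_input:
        SELECTION_FILE.write_text(json.dumps(second_user_input))
        return second_user_input
    if SELECTION_FILE.exists():
        return json.loads(SELECTION_FILE.read_text())
    return []


print(select_features(["lane_keeping", "lane_centering"]))  # stores the selection
print(select_features(None))                                # reuses the stored selection
```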

Upon selection of the assist feature, the vehicle computer 110 may be programmed to output a message to the user. For example, the vehicle computer 110 may actuate the HMI 118 to display a message via a touch screen display. As another example, the vehicle computer 110 may actuate the HMI 118 to provide an audio message via a speaker in the vehicle 105. The message may instruct the user to perform a key cycle, i.e., engage the vehicle 105 from an on state to an off state and back to the on state, to confirm the selected assist feature.

As described above, when the vehicle 105 is verified to be in the off-road area after the key cycle, the vehicle computer 110 deactivates the selected assist feature, i.e., transitions the selected assist feature from the activated state to the deactivated state. That is, after the second user input, the selected assist feature remains in the activated state until the user initiates a key cycle. In the deactivated state, the vehicle computer 110 suppresses the assist feature. That is, the vehicle computer 110 does not initiate the deactivated assist feature to supplement or assist the user in operating the vehicle 105 in the off-road area.

In the activated state, the vehicle computer 110 operates the vehicle 105 based at least in part on the assist features. For example, the vehicle computer 110 may initiate one or more assist features based on the sensor 115 data to supplement or assist a user in operating the vehicle 105 in a road region. That is, the vehicle computer 110 may actuate one or more vehicle components 125 to adjust operation of the vehicle 105 based on the environment surrounding the vehicle 105. For example, the vehicle computer 110 may adjust the speed of the vehicle 105 based on the adaptive cruise control assist feature, for example, according to the speed of other vehicles operating in the road region. As another example, the vehicle computer 110 may operate the vehicle 105 to maintain at least a minimum distance from vehicles ahead of the vehicle 105 in the same lane on the road region. As yet another example, the vehicle computer 110 may adjust lateral movement of the vehicle 105 within a lane of the road region according to a lane keeping assist feature.
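
As a worked example of the adaptive cruise control behavior described above, the sketch below computes a target speed that follows the lead vehicle and backs off when the measured gap falls below the minimum following distance; the gain value and the function signature are illustrative assumptions, not the patent's control law.

```python
def acc_target_speed(lead_speed: float, gap: float, min_gap: float,
                     set_speed: float) -> float:
    """Target speed (m/s): the driver's set speed, capped by the lead vehicle's
    speed, and reduced further when the gap (m) is below the minimum gap (m)."""
    target = min(set_speed, lead_speed)
    if gap < min_gap:
        target -= 0.5 * (min_gap - gap)  # example gain; slow down to reopen the gap
    return max(target, 0.0)


# Lead vehicle at 25 m/s, gap 30 m with a 40 m minimum, set speed 30 m/s -> 20 m/s.
print(acc_target_speed(lead_speed=25.0, gap=30.0, min_gap=40.0, set_speed=30.0))
```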

Additionally, after each key cycle, the vehicle computer 110 may receive diagnostic data from one or more ECUs (e.g., a restraint control module, a body control module, etc.). For example, one or more ECUs may be programmed to perform diagnostic tests to determine whether the assist features are operable, i.e., operable within parameters specified, for example, by the vehicle and/or component manufacturer. That is, one or more of the ECUs may perform conventional self-diagnostic tests to detect faults in the assist features to confirm that the assist features are operational. If the assist feature is inoperable, the diagnostic test may output a fault and the vehicle computer 110 may identify the fault in the assist feature. That is, if the vehicle computer 110 identifies a fault in the assist feature, the diagnostic data indicates that the assist feature is inoperable and needs to be repaired or replaced.

The vehicle computer 110 deactivates the selected assist features after the key cycle to prevent misidentifying a fault in the selected assist features. For example, the vehicle computer 110 may receive diagnostic data for each assist feature in the activated state and may suppress diagnostic data for deactivated assist features. That is, the vehicle computer 110 may prevent one or more ECUs from performing diagnostic tests on the deactivated assist features. Inhibiting the diagnostic data for the deactivated assist feature may prevent the vehicle computer 110 from identifying a fault in the assist feature that was deactivated based on the second user input.
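
The diagnostic gating can be sketched as follows: self-tests run only for activated assist features, so a user-selected deactivation is never reported as a fault. The dictionary-based feature states and the run_test callable are illustrative assumptions.

```python
from typing import Callable, Dict


def run_diagnostics(feature_states: Dict[str, str],
                    run_test: Callable[[str], bool]) -> Dict[str, str]:
    """Run the per-feature self-test only for activated features; deactivated
    features are skipped instead of being flagged as faults."""
    results: Dict[str, str] = {}
    for feature, state in feature_states.items():
        if state != "activated":
            results[feature] = "skipped (deactivated by user)"
            continue
        results[feature] = "ok" if run_test(feature) else "fault"
    return results


# Example with a stand-in self-test that always passes.
states = {"lane_keeping": "deactivated", "blind_spot_monitoring": "activated"}
print(run_diagnostics(states, run_test=lambda feature: True))
```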

Additionally, the vehicle computer 110 may be programmed to record the deactivation of selected assist features. For example, when the vehicle computer 110 deactivates the selected assist feature to the deactivated state, the vehicle computer 110 may store data in memory indicating that the assist feature is deactivated, including, for example, the time of day of the second user input, the time of day of the key cycle, the location of the vehicle 105, the deactivated assist feature, and the like.
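
A sketch of such a log record is shown below; the field names and the JSON-lines file are illustrative assumptions for what the stored data could look like.

```python
import json
import time
from typing import Iterable, Tuple


def log_deactivation(features: Iterable[str], vehicle_position: Tuple[float, float],
                     input_time: float, key_cycle_time: float) -> None:
    """Append a record of a user-selected assist-feature deactivation to storage."""
    record = {
        "deactivated_features": list(features),
        "second_user_input_time": input_time,  # time of day of the second user input
        "key_cycle_time": key_cycle_time,      # time of day of the confirming key cycle
        "vehicle_position": vehicle_position,  # e.g. (latitude, longitude)
        "logged_at": time.time(),
    }
    with open("deactivation_log.jsonl", "a") as log:
        log.write(json.dumps(record) + "\n")
```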

The vehicle computer 110 is programmed to activate each deactivated assist feature to an activated state upon disabling the off-road operating mode to the disabled state. That is, each assist feature is in the activated state when the vehicle 105 is operating in a road region. When an assist feature is activated from the deactivated state to the activated state, the vehicle computer 110 may instruct the corresponding ECU to perform a diagnostic test on the activated assist feature (as described above). That is, based on the diagnostic data, the vehicle computer 110 may determine that the assist feature is operational when activated to the activated state.

Additionally or alternatively, the vehicle computer 110 may be programmed to activate the at least one assist feature from the deactivated state to the activated state based on a third user input. For example, the HMI 118 may detect a third user input deselecting the off-road operating mode, e.g., the user pressing a virtual button to deselect the off-road operating mode, and provide the third user input to the vehicle computer 110. In this example, the vehicle computer 110 may then activate each deactivated assist feature to an activated state based on the third user input. As another example, the HMI 118 may detect a third user input deselecting one or more deactivated assist features, e.g., the user pressing a virtual button to deselect a deactivated assist feature, and provide the third user input to the vehicle computer 110. In this example, the vehicle computer 110 may then activate the selected assist feature to the activated state based on the third user input.

FIG. 3A is a first portion of a flowchart of an exemplary process 300 for controlling an off-road operating mode in the vehicle 105 (the second portion is shown in FIG. 3B because the entire flowchart does not fit on a single sheet). The process 300 begins at block 305.

In block 305, the vehicle computer 110 receives data from one or more sensors 115 and/or from a remote computer (e.g., server 140). For example, the data may be map data, e.g., from server 140 via network 135, including the location of vehicle 105 and the location of road regions, e.g., each specified in geographic coordinates. Additionally or alternatively, the data may be image data from one or more sensors 115, such as via a vehicle network, including the environment surrounding the vehicle 105, such as the terrain over which the vehicle is operating, one or more objects, and so forth. The process 300 continues in block 310.

In block 310, the vehicle computer 110 determines whether the vehicle 105 is operating in a road region based on the received data (e.g., image data and/or map data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 to the location of a road area specified by the map data (e.g., to determine whether the vehicle 105 is within a geofence of the road area). Additionally or alternatively, as discussed above, the vehicle computer 110 may analyze the image data, for example, using image processing techniques, to identify the terrain in which the vehicle is operating, one or more objects surrounding the vehicle 105, and/or the like. If the vehicle computer 110 determines that the vehicle 105 is not in a road area, i.e., is in an off-road area, the process 300 continues in block 320. Otherwise, process 300 continues in block 315.

In block 315, the vehicle computer 110 maintains the off-road operating mode in the disabled state. For example, the vehicle computer 110 may maintain the off-road operating mode in the disabled state when it is determined that the vehicle 105 remains in the road region. In the disabled state, the vehicle computer 110 prevents the user from selecting the off-road operating mode. In addition, each of the assist features is in an active state when the off-road operating mode is in a disabled state. That is, the vehicle computer 110 may initiate any of the assist features to supplement or assist user operations of the vehicle 105 in the road region, e.g., based on the sensor 115 data. The process 300 returns to block 305.

In block 320, the vehicle computer 110 enables the off-road operating mode in the enabled state. For example, upon determining that the vehicle 105 has moved from an on-road area to an off-road area, the vehicle computer 110 enables the off-road operating mode from the disabled state to the enabled state. Additionally, the vehicle computer 110 may maintain the off-road operating mode in an enabled state when it is determined that the vehicle 105 remains in the off-road area. In the enabled state, the vehicle computer 110 enables a user to select an off-road operating mode. The process 300 continues in block 325.

In block 325, the vehicle computer 110 determines whether the off-road operating mode is selected. For example, in the enabled state, as discussed above, the vehicle computer 110 may actuate the HMI 118 to detect a first user input selecting the off-road operating mode. In other words, the HMI 118 may activate the sensor 115, which may detect a first user input, e.g., a user pressing a virtual button on a touch screen display to select the off-road operating mode. Upon detecting the first user input, the HMI 118 can then provide the first user input to the vehicle computer 110, and the vehicle computer 110 can select the off-road operating mode based on the first user input. If the vehicle computer 110 receives a first user input selecting the off-road operating mode, the process 300 continues in block 335. Otherwise, process 300 continues in block 330.

In block 330, the vehicle computer 110 determines whether the vehicle 105 is operating in a road region based on the data (e.g., image data and/or map data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 to the location of a road area specified by the map data. Additionally or alternatively, as discussed above, the vehicle computer 110 may analyze the image data, for example, using image processing techniques, to identify the terrain in which the vehicle is operating, one or more objects surrounding the vehicle 105, and/or the like. If the vehicle computer 110 determines that the vehicle 105 is operating in a road region, the process 300 continues in block 385. Otherwise, process 300 returns to block 325.

In block 335, the vehicle computer 110 represents one or more assist features. For example, the vehicle computer 110 may actuate the HMI 118 to display a respective virtual button for each assist feature on the touch screen display. Additionally, the vehicle computer 110 may represent one or more vehicle components 125, such as a passive restraint system, via the HMI 118. The displayed assist features and/or vehicle components 125 may be specified by the vehicle and/or component manufacturer, as discussed above. The process 300 continues in block 340.

In block 340, the vehicle computer 110 determines whether to select at least one assist feature based on the second user input. For example, the sensor 115 of the HMI 118 may detect a second user input, e.g., the user pressing a virtual button that selects an assist feature, as discussed above. The HMI 118 can then provide the second user input to the vehicle computer 110, and the vehicle computer 110 can then select an assist feature based on the second user input. Additionally or alternatively, as discussed above, the vehicle computer 110 may select one or more assist features based on the previous selection. If the vehicle computer 110 determines that at least one assist feature is selected, the process 300 continues in block 350. Otherwise, process 300 continues in block 345.

In block 345, the vehicle computer 110 determines whether the vehicle 105 is operating in the road region based on the data (e.g., image data and/or map data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 to the location of a road area specified by the map data. Additionally or alternatively, as discussed above, the vehicle computer 110 may analyze the image data, for example, using image processing techniques, to identify the terrain in which the vehicle is operating, one or more objects surrounding the vehicle 105, and/or the like. If the vehicle computer 110 determines that the vehicle 105 is operating in a road region, the process 300 continues in block 385. Otherwise, process 300 returns to block 340.

In block 350, the vehicle computer 110 determines whether a key cycle has occurred after the second user input. As discussed above, the vehicle computer 110 detects a key cycle upon determining that the vehicle 105 has transitioned from the on state to the off state and has returned to the on state. The key cycle may be initiated by a user. If the vehicle computer 110 determines that a key cycle has occurred, the process 300 continues in block 360. Otherwise, process 300 continues in block 355.

In block 355, the vehicle computer 110 maintains the selected assist feature in the activated state. In other words, as discussed above, the vehicle computer 110 continues to operate the vehicle 105 based at least in part on the selected assist feature. That is, the vehicle computer 110 does not deactivate the selected assist feature to the deactivated state until after the key cycle. The process 300 returns to block 345.

In block 360, the vehicle computer 110 determines whether the vehicle 105 is operating in a road region based on data (e.g., image data and/or map data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 to the location of a road area specified by the map data. Additionally or alternatively, as discussed above, the vehicle computer 110 may analyze the image data, for example, using image processing techniques, to identify the terrain in which the vehicle is operating, one or more objects surrounding the vehicle 105, and/or the like. If the vehicle computer 110 determines that the vehicle 105 is operating in a road region, the process 300 continues in block 385. Otherwise, process 300 continues to block 365.

Turning now to FIG. 3B, after block 360 shown in FIG. 3A, in block 365 the vehicle computer 110 deactivates the selected assist feature to a deactivated state. In the deactivated state, the vehicle computer 110 suppresses the assist feature. That is, the vehicle computer 110 does not initiate the deactivated assist feature to supplement or assist user operation of the vehicle 105 in the off-road area. As discussed above, the vehicle computer 110 may record the deactivation of the selected assist feature, for example, in memory. Additionally, as discussed above, the vehicle computer 110 may suppress diagnostic data from the deactivated assist feature. The process 300 continues in block 370.

In block 370, the vehicle computer 110 determines whether the vehicle 105 is operating in a road region based on data (e.g., image data and/or map data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 to the location of a road area specified by the map data. Additionally or alternatively, as discussed above, the vehicle computer 110 may analyze the image data, for example, using image processing techniques, to identify the terrain in which the vehicle is operating, one or more objects surrounding the vehicle 105, and/or the like. If the vehicle computer 110 determines that the vehicle 105 is operating in a road region, the process 300 continues in block 385. Otherwise, process 300 continues to block 375.

In block 375, the vehicle computer 110 determines whether to deselect at least one assist feature. For example, the sensor 115 of the HMI 118 may detect a third user input, e.g., the user pressing one or more virtual buttons that deselect a respective assist feature, as discussed above. In other words, the third user input may deselect one or more assist features. As another example, the HMI 118 may detect a third user input deselecting the off-road operating mode, e.g., the user pressing a virtual button on a touch screen display to deselect the off-road operating mode, as discussed above. In this example, the third user input deselects each of the deactivated assist features. The HMI 118 can then provide the third user input to the vehicle computer 110, and the vehicle computer 110 can then deselect the assist feature based on the third user input. If the vehicle computer 110 determines that at least one assist feature is deselected, the process 300 continues in block 380. Otherwise, process 300 returns to block 370.

In block 380, the vehicle computer 110 activates the deselected assist feature to an active state. That is, the vehicle computer 110 may operate the vehicle 105 based at least in part on the activated assist feature. For example, the vehicle computer 110 may initiate one or more activated assist features to supplement or assist user operations of the vehicle 105 in, for example, a roadway area. Additionally, as discussed above, the vehicle computer 110 may determine that the activated assist feature is operational. The process 300 returns to block 325.

In block 385, the vehicle computer 110 disables the off-road operating mode to the disabled state based on the vehicle 105 operating in the road region. For example, the vehicle computer 110 may actuate the HMI 118 to prevent a user from selecting the off-road operating mode and/or to prevent deactivation of one or more assist features while the vehicle 105 is in a road region. In addition, when the off-road operating mode is disabled from the enabled state to the disabled state, the vehicle computer 110 activates each deactivated assist feature to the activated state, as discussed above. The process 300 ends after block 385.
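
Tying the flowchart together, the blocks of process 300 can be sketched as a single loop. The helper callables (get_region, hmi, key_cycled) and the feature-state dictionary are illustrative stand-ins for the sensor, HMI 118, and ignition interfaces described above, and the loop simplifies some of the region re-checks in blocks 330, 345, and 370.

```python
def process_300(get_region, hmi, key_cycled, features):
    """Simplified loop over blocks 305-385 of process 300.

    get_region() -> "on_road" | "off_road"; hmi exposes the user inputs;
    key_cycled() -> True once an off-to-on key cycle completes; features maps
    feature name -> "activated" | "deactivated". All are illustrative stubs.
    """
    deactivated = []

    while True:
        if get_region() == "on_road":               # blocks 310/315 and 385
            for name in deactivated:                # mode disabled; reactivate features
                features[name] = "activated"
            deactivated.clear()
            continue

        # Block 320: off-road area, so the off-road operating mode is enabled.
        if not hmi.off_road_mode_selected():        # block 325: first user input?
            continue

        selected = hmi.selected_assist_features()   # blocks 335/340: second user input
        if not selected:
            continue

        if not key_cycled():                        # blocks 350/355: await key cycle
            continue

        if get_region() == "off_road":              # block 360: verify after key cycle
            for name in selected:                   # block 365: deactivate selection
                features[name] = "deactivated"
            deactivated = list(selected)

        for name in hmi.deselected_assist_features():  # blocks 375/380: third input
            if name in deactivated:
                features[name] = "activated"
                deactivated.remove(name)
```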

As used herein, the adverb "substantially" means that shapes, structures, measurements, quantities, times, etc. may deviate from the precisely described geometries, distances, measurements, quantities, times, etc. due to imperfections in materials, machining, manufacturing, data transmission, computational speed, etc.

In general, the described computing systems and/or devices may employ any of a number of computer operating systems, including, but by no means limited to, versions and/or varieties of the Ford SYNC® application, AppLink/Smart Device Link middleware, the Microsoft® Automotive operating system, the Microsoft Windows® operating system, the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, California), the AIX UNIX operating system distributed by International Business Machines Corporation of Armonk, New York, the Linux operating system, the Mac OSX and iOS operating systems, the BlackBerry OS distributed by BlackBerry, Ltd. of Waterloo, Canada, the Android operating system developed by Google and the Open Handset Alliance, or the QNX® CAR infotainment platform offered by QNX Software Systems. Examples of computing devices include, without limitation, an on-board vehicle computer, a computer workstation, a server, a desktop, notebook, laptop, or handheld computer, or some other computing system and/or device.

Computers and computing devices generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Matlab, Simulink, Stateflow, Visual Basic, Java Script, Perl, HTML, etc. Some of these applications may be compiled and executed on a virtual machine, such as the Java Virtual Machine, the Dalvik virtual machine, or the like. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media. A file in a computing device is generally a collection of data stored on a computer-readable medium, such as a storage medium, a random access memory, etc.

The memory may include a computer-readable medium (also referred to as a processor-readable medium) including any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, Dynamic Random Access Memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor of the ECU. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.

A database, data store, or other data storage described herein may include various mechanisms for storing, accessing, and retrieving various data, including a hierarchical database, a set of files in a file system, an application database in a proprietary format, a relational database management system (RDBMS), and so forth. Each such data storage device is typically included within a computing device employing a computer operating system, such as one of those mentioned above, and is accessed via a network in any one or more of a number of ways. The file system may be accessed from a computer operating system and may include files stored in various formats. An RDBMS also typically employs the Structured Query Language (SQL) in addition to a language for creating, storing, editing, and executing stored procedures, such as the PL/SQL language.

In some examples, system elements may be embodied as computer readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, etc.), stored on computer readable media (e.g., disks, memory, etc.) associated therewith. A computer program product may comprise such instructions stored on a computer-readable medium for performing the functions described herein.

With respect to the media, processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes may be practiced with the steps described as occurring in a different order than that described herein. It is also understood that certain steps may be performed simultaneously, that other steps may be added, or that certain steps described herein may be omitted. In other words, the description of processes herein is provided for the purpose of illustrating certain embodiments and should in no way be construed as limiting the claims.

Accordingly, it is to be understood that the above description is intended to be illustrative, and not restrictive. The adjectives "first," "second," "third," and "fourth" are used throughout this document as identifiers, and are not intended to denote importance or order. Many embodiments and applications other than the examples provided would be apparent to those of skill in the art upon reading the above description. The scope of the invention should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is contemplated and intended that future developments will occur in the arts discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the invention is capable of modification and variation and is limited only by the following claims.

Unless expressly indicated to the contrary herein, all terms used in the claims are intended to be given their ordinary and customary meaning as understood by those skilled in the art. In particular, the use of singular articles such as "a," "the," "said," etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.

According to the invention, there is provided a system having a computer including a processor and a memory, the memory storing instructions executable by the processor to: enable an off-road operating mode of a vehicle to an enabled state when the vehicle is determined to be in an off-road area based on sensor data; then, upon receiving a first user input selecting the off-road operating mode, present one or more assist features on a display of the vehicle; then select at least one of the assist features based on a second user input; and then deactivate the selected assist feature to a deactivated state after a key cycle initiated by a user to engage the vehicle from an off state to an on state.

According to one embodiment, the instructions further comprise instructions for: upon determining that the vehicle has moved from the off-road area to an on-road area, disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state.

According to one embodiment, the instructions further comprise instructions for: upon receiving a third user input to deselect the off-road operating mode, disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state.

According to one embodiment, the instructions further comprise instructions for: upon receiving a third user input selecting at least one deactivated assist feature, activating the selected assist feature to an activated state.

According to one embodiment, the instructions further comprise instructions for: selecting the at least one assist feature further based on a previous selection.

According to one embodiment, the instructions further comprise instructions for: after the key cycle, verifying that the vehicle is in the off-road area based on sensor data.

According to one embodiment, the instructions further comprise instructions for: preventing the off-road operating mode from transitioning to the enabled state based on determining that the vehicle is operating in a road region.

According to one embodiment, the instructions further comprise instructions for: maintaining the disabled assist feature in the disabled state after another key cycle based on determining, via vehicle sensor data, that the vehicle is in the off-road region.

According to one embodiment, the instructions further comprise instructions for: preventing diagnostic testing of the disabled assist feature.

According to one embodiment, the instructions further comprise instructions for: upon activation of the deactivated assist feature to an active state, performing a diagnostic test on the activated assist feature.

According to the invention, a method comprises: enabling an off-road operating mode of a vehicle to an enabled state when the vehicle is determined to be in an off-road area based on sensor data; then, upon receiving a first user input selecting the off-road operating mode, presenting one or more assist features on a display of the vehicle; then selecting at least one of the assist features based on a second user input; and then deactivating the selected assist feature to a deactivated state after a key cycle initiated by a user to engage the vehicle from an off state to an on state.

In one aspect of the invention, the method comprises: disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state upon determining that the vehicle has moved from the off-road area to an on-road area.

In one aspect of the invention, the method comprises: disabling the off-road operating mode to a disabled state and activating the deactivated assist feature to an activated state upon receiving a third user input to deselect the off-road operating mode.

In one aspect of the invention, the method comprises: upon receiving a third user input selecting at least one deactivated assist feature, activating the selected assist feature to an activated state.

In one aspect of the invention, the method comprises: selecting the at least one assist feature further based on a previous selection.

In one aspect of the invention, the method comprises: verifying that the vehicle is in the off-road region based on sensor data after the key cycle.

In one aspect of the invention, the method comprises: preventing the off-road operating mode from transitioning to the enabled state based on determining that the vehicle is operating in a road region.

In one aspect of the invention, the method comprises: maintaining the disabled assist feature in the disabled state after another key cycle based on determining, via vehicle sensor data, that the vehicle is in the off-road region.

In one aspect of the invention, the method comprises: preventing diagnostic testing of the disabled assist feature.

In one aspect of the invention, the method comprises: upon activation of the deactivated assist feature to an active state, performing a diagnostic test on the activated assist feature.
