Driving assistance system and driving assistance method

Document No.: 1607088  Publication date: 2020-01-10

Reading note: This driving assistance system and driving assistance method were designed and created by 唐帅 (Tang Shuai) and 孙铎 (Sun Duo) on 2018-07-03. Abstract: The invention relates to a driving assistance system and a driving assistance method. The driving assistance system includes an image receiving device configured to receive an eye image of a driver of the vehicle; an eye parameter extraction device configured to extract an eye parameter of the driver from the eye image; an eye motion determination device configured to determine an eye motion of the driver from the eye parameter; and an eye simulation control device configured to control an eye simulation device of the vehicle such that the eye simulation device simulates the eye movement of the driver. The driving assistance system and driving assistance method according to the present invention can directly and intuitively present the real eye movements of a driver seated inside the vehicle cabin to an external target.

1. A driving assistance system for a vehicle, comprising:

an image receiving device configured to receive an eye image of a driver of the vehicle;

an eye parameter extraction device configured to extract an eye parameter of the driver from the eye image;

an eye motion determination device configured to determine an eye motion of the driver from the eye parameter; and

an eye simulation control device configured to control the eye simulation device of the vehicle such that the eye simulation device simulates the eye movement of the driver.

2. The driving assistance system according to claim 1,

the ocular parameter comprises at least one of the following parameters: orbital contour, pupil contour, orbital position, pupil position, eyelid position, change in orbital position, change in pupil position, change in eyelid position.

3. The driving assistance system according to claim 1,

the eye action comprises at least one of the following actions: a blinking action, an eye-closing action, a saccadic action, and a fixation action.

4. The driving assistance system according to claim 1,

the eye movement determination means is configured to determine the eye movement of the driver according to a preset model.

5. The driving assistance system according to claim 1, further comprising:

a triggering device configured to allow at least one of the image receiving device, the ocular parameter extraction device, the ocular motion determination device, and the ocular simulation control device to start operating when at least one of the following triggering conditions is detected:

the speed of the vehicle is greater than a preset speed; and

an external target is present within a predetermined distance of the vehicle periphery.

6. The driving assistance system according to claim 1,

the eye simulation device is a display device mounted outside the vehicle.

7. The driving assistance system according to claim 1,

the eye simulation device is an illumination device mounted outside the vehicle, the illumination device being formed of an M × N photodiode array.

8. A vehicle comprising the driving assistance system according to any one of claims 1 to 7.

9. A driving assistance method for a vehicle, comprising the steps of:

receiving an eye image of a driver of the vehicle;

extracting eye parameters of the driver according to the eye image;

determining an eye movement of the driver according to the eye parameter; and

controlling an eye simulation device of the vehicle so that the eye simulation device simulates the eye movement of the driver.

10. The driving assistance method according to claim 9, wherein,

the ocular parameter comprises at least one of the following parameters: orbital contour, pupil contour, orbital position, pupil position, eyelid position, change in orbital position, change in pupil position, change in eyelid position.

11. The driving assistance method according to claim 9, wherein,

the eye action comprises at least one of the following actions: a blinking action, an eye-closing action, a saccadic action, and a fixation action.

12. The driving assistance method according to claim 9, wherein,

the step of determining the eye movement comprises determining the eye movement of the driver according to a preset model.

13. The driving assistance method according to claim 9, further comprising detecting the following conditions:

whether the vehicle speed is greater than a predetermined vehicle speed; and

whether an external object is present within a predetermined distance of the vehicle periphery.

14. The driving assistance method according to claim 9, wherein,

the eye simulation device is a display device mounted outside the vehicle.

15. The driving assistance method according to claim 9, wherein,

the eye simulation device is an illumination device mounted outside the vehicle, the illumination device being formed of an M × N photodiode array.

Technical Field

The invention relates to the technical field of vehicle assistance. More specifically, the invention relates to a driving assistance system and a driving assistance method for a vehicle.

Background

It is known that a driver's eye movements often reflect his or her driving intention, mental state (e.g., fatigue), point of attention, and the like. However, while the vehicle is being driven, traffic participants outside it (such as pedestrians or the drivers of other vehicles) may be too far away, or the driver's face and even head may be shielded by the vehicle body or window film, so that the driver's eye movements cannot be seen. The resulting lack of eye contact between traffic participants and the driver can create a driving safety hazard.

Therefore, in order to make a target outside the vehicle aware of the driving state of the driver, it is necessary to provide a driving assistance system that can clearly reflect the eye movement of the driver.

Disclosure of Invention

An object of the present invention is to provide a driving assistance system and a driving assistance method capable of detecting an eye movement of a driver of a vehicle. Another object of the present invention is to provide a driving assistance system and a driving assistance method capable of simulating a real eye movement of a driver via a display device or an illumination device of a vehicle.

An aspect of the present invention provides a driving assistance system for a vehicle, including an image receiving device configured to receive an eye image of a driver of the vehicle; an eye parameter extraction device configured to extract an eye parameter of the driver from the eye image; an eye motion determination device configured to determine an eye motion of the driver from the eye parameter; and an eye simulation control device configured to control the eye simulation device of the vehicle such that the eye simulation device simulates the eye movement of the driver.

According to an embodiment of the invention, the ocular parameter comprises at least one of the following parameters: orbital contour, pupil contour, orbital position, pupil position, eyelid position, change in orbital position, change in pupil position, change in eyelid position.

According to an embodiment of the invention, the eye action comprises at least one of the following actions: a blinking action, an eye-closing action, a saccadic action, and a fixation action.

According to an embodiment of the invention, the eye movement determination means is configured to determine the eye movement of the driver according to a preset model.

According to an embodiment of the present invention, the driving assistance system further includes a triggering device configured to allow at least one of the image receiving device, the eye parameter extraction device, the eye motion determination device, and the eye simulation control device to start operating when at least one of the following triggering conditions is detected: the speed of the vehicle is greater than a preset speed; and an external target is present within a predetermined distance of the vehicle periphery.

According to an embodiment of the present invention, the eye simulation device is a display device mounted on the outside of the vehicle.

According to an embodiment of the present invention, the eye simulation device is an illumination device mounted outside the vehicle, the illumination device being formed of an M × N photodiode array.

Another aspect of the invention provides a vehicle including the above-described driving assistance system.

Yet another aspect of the present invention provides a driving assistance method for a vehicle, including the steps of: receiving an eye image of a driver of the vehicle; extracting eye parameters of the driver according to the eye image; determining an eye movement of the driver according to the eye parameter; and controlling an eye simulation device of the vehicle so that the eye simulation device simulates the eye movement of the driver.

According to an embodiment of the invention, the ocular parameter comprises at least one of the following parameters: orbital contour, pupil contour, orbital position, pupil position, eyelid position, change in orbital position, change in pupil position, change in eyelid position.

According to an embodiment of the invention, the eye action comprises at least one of the following actions: a blinking action, an eye-closing action, a saccadic action, and a fixation action.

According to an embodiment of the invention, the step of determining the eye movements comprises determining the eye movements of the driver according to a preset model.

According to an embodiment of the present invention, the driving assistance method further includes detecting the following conditions: whether the vehicle speed is greater than a predetermined vehicle speed; and whether an external object is present within a predetermined distance of the vehicle periphery.

According to an embodiment of the present invention, the eye simulation device is a display device mounted on the outside of the vehicle.

According to an embodiment of the present invention, the eye simulation device is an illumination device mounted outside the vehicle, the illumination device being formed of an M × N photodiode array.

Therefore, compared to the related art, the driving assistance system and the driving assistance method according to the present invention can directly and intuitively simulate the real eye movements of the driver seated in the vehicle cabin to the external target via the display device or the illumination device of the vehicle.

Drawings

The present invention may be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like reference numerals identify identical or functionally similar elements.

Fig. 1 shows a schematic view of a driving assistance system according to an embodiment of the invention.

Fig. 2 shows an application scenario of the eye simulation apparatus according to an embodiment of the present invention.

Fig. 3 shows a block flow diagram of a driving assistance method according to an embodiment of the invention.

Detailed Description

Hereinafter, embodiments of the present invention are described with reference to the drawings. The following detailed description and drawings are illustrative of the principles of the invention, which is not limited to the preferred embodiments described, but is defined by the claims. The invention will now be described in detail with reference to exemplary embodiments thereof, some of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings, in which like reference numerals refer to the same or similar elements throughout the different views unless otherwise indicated. The aspects described in the following exemplary embodiments do not represent all aspects of the present invention. Rather, these aspects are merely exemplary of the systems and methods according to the various aspects of the present invention as recited in the appended claims.

The driving assist system according to the embodiment of the invention may be mounted on or applied to a vehicle. The vehicle may be an internal combustion engine vehicle using an internal combustion engine as a drive source, an electric vehicle or a fuel cell vehicle using an electric motor as a drive source, a hybrid vehicle using both of the above as drive sources, or a vehicle having another drive source.

Fig. 1 is a schematic diagram of a driving assistance system according to an embodiment of the invention. As shown in Fig. 1, the vehicle 10 includes a driving assistance system 100. The driving assistance system 100 may be coupled to and/or in communication with other components of the vehicle 10. For example, the driving assistance system 100 may be connected via a Controller Area Network (CAN) bus or another in-vehicle network to the camera device 200, the display device 300, the illumination device 400, and the like of the vehicle 10. Well-known power and steering devices, drive trains, and like components of the vehicle 10 are not shown in Fig. 1 for the sake of clarity.

With continued reference to fig. 1, the driving assistance system 100 may include an image receiving means 110, an eye parameter extraction means 120, an eye motion determination means 130, and an eye simulation control means 140. According to other embodiments of the present invention, the driving assistance system 100 optionally includes a triggering device 150. These devices may be in wired or wireless communication with each other. These means may be implemented by hardware circuits, by software modules, or by a combination of hardware and software.

The image receiving device 110 may receive an eye image of the driver of the vehicle 10. In an exemplary embodiment, the image receiving device 110 may receive the eye image of the driver from the camera device 200 of the vehicle 10. The camera device 200 includes one or more in-vehicle cameras. An in-vehicle camera may be an RGB camera, or an infrared camera or the like capable of capturing images under low light. The in-vehicle camera is mounted inside the cabin of the vehicle 10 to capture an eye image of the driver. It may be mounted at any suitable position inside the cabin, such as at the upper center of the front windshield, at any or all of the four corners of the front windshield, or at the steering wheel or the instrument panel. In some embodiments, the in-vehicle camera may be adjustable; for example, it may rotate or translate relative to the body of the vehicle 10 (e.g., the front windshield, steering wheel, or dashboard) so as to capture unobstructed eye images with a suitable viewing angle and/or field of view. The eye image may be any one of a still image, a moving image, and a stereoscopic image.

The eye parameter extraction means 120 may extract the eye parameter of the driver from the eye image of the driver received by the image reception means 110. Herein, "ocular parameter" means a parameter related to the ocular characteristics of the driver of the vehicle 10. The ocular features include the orbit, eyelids, and pupil, with the eyelids including the upper and lower eyelids. Ocular parameters include, but are not limited to, the following: eye feature contours (including orbital contours and pupil contours, etc.), eye feature locations (including orbital position, pupil position, and eyelid position, etc.), changes in eye feature locations (changes in orbital position, changes in pupil position, changes in eyelid position, etc.). Here, the term "eye feature position" includes not only an absolute position of the eye feature itself with respect to the center of the driver's eye, but also a relative position between any two eye features.

According to the embodiment of the present invention, the eye parameter extraction means 120 may first determine the eye features of the driver by analyzing light-and-shade variations in the eye image, i.e., luminance variations between different regions of the image. Since different regions of the eye image have different luminance distributions, the eye parameter extraction device 120 can identify each eye feature accordingly. The device 120 may then obtain some of the related parameters of the eye features, such as the orbit contour, the pupil contour, the orbit position, the pupil position, the upper eyelid position, and the lower eyelid position, by directly measuring the detected eye features. The orbit contour is obtained by delineating the boundary of the orbit, and the pupil contour by delineating the boundary of the pupil. The orbit position refers to the set of positions of all points that make up the boundary of the orbit; it may, however, also refer to the position of the center point of the orbit, which is obtained as the midpoint between the two eye corners once the orbit contour has been detected. The pupil position refers to the position of the center point of the pupil, obtained as the center of the detected pupil contour (i.e., half the pupil diameter from its boundary). The upper eyelid position refers to the position of the highest point of the upper eyelid, though it may also be a set of positions of several feature points of the upper eyelid; the lower eyelid position refers to the position of the lowest point of the lower eyelid and, similarly, may also be a set of positions of several feature points of the lower eyelid. After obtaining these absolute parameters of each eye feature, the eye parameter extraction device 120 may derive relative parameters between eye features by further computation on the directly measured parameters. For example, after obtaining the center point of the pupil and the center point of the orbit, the device 120 may calculate the distance between them, thereby obtaining the relative position of the pupil with respect to the orbit.
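As a rough illustration of the relative-parameter computation just described, the sketch below derives a pupil center, an orbit center, and their offset from detected boundary points. The function names and the landmark representation (lists of (x, y) tuples) are assumptions of this sketch, not part of the patent.

```python
def pupil_center(pupil_contour):
    """Center point of the pupil: mean of the detected boundary points."""
    xs = [p[0] for p in pupil_contour]
    ys = [p[1] for p in pupil_contour]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def orbit_center(left_corner, right_corner):
    """Center point of the orbit: midpoint between the two eye corners."""
    return ((left_corner[0] + right_corner[0]) / 2.0,
            (left_corner[1] + right_corner[1]) / 2.0)

def pupil_orbit_offset(pupil_contour, left_corner, right_corner):
    """Relative position of the pupil center with respect to the orbit center."""
    px, py = pupil_center(pupil_contour)
    ox, oy = orbit_center(left_corner, right_corner)
    return (px - ox, py - oy)
```

A nonzero offset indicates the gaze direction relative to the eye's resting center, which is exactly the relative parameter used later to drive the eye simulation device.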

The above describes how the eye parameter extraction device 120 analyzes a single image frame of the eye image to obtain some of the eye parameters. According to other embodiments of the present invention, however, the device 120 may also extract changes in the position of an eye feature by analyzing a plurality of consecutive image frames, whether from a moving image or from a sequence of still images. For example, the device 120 may extract a number of consecutive still image frames captured within a predetermined time period (for example, 2 s) from the eye image and obtain the change in position of a given eye feature within that period by detecting the feature's position in each successive frame. Similarly, by sequentially analyzing consecutive frames captured within the predetermined time period, the device 120 may obtain changes in relative position, such as the relative position of the upper and lower eyelids or of the pupil and the orbit.
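The frame-by-frame analysis described above can be sketched as follows. The `detect_upper` and `detect_lower` callables stand in for whatever per-frame eyelid detector is actually used and are assumptions of this sketch, as is representing the result as a series of eyelid-gap values.

```python
def eyelid_gap_series(frames, detect_upper, detect_lower):
    """Relative upper/lower eyelid distance in each consecutive frame.

    frames: consecutive image frames captured within the predetermined
    time period; the two detectors return the y-coordinate of the upper
    and lower eyelid in a single frame.
    """
    gaps = []
    for frame in frames:
        upper_y = detect_upper(frame)  # highest point of the upper eyelid
        lower_y = detect_lower(frame)  # lowest point of the lower eyelid
        gaps.append(abs(lower_y - upper_y))
    return gaps
```

A series such as max → 0 → max then signals a blink, while a series that stays at 0 signals closed eyes, matching the rules given below in the text.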

The eye motion determination means 130 may determine the eye motion of the driver according to the eye parameters extracted by the eye parameter extraction means 120. Herein, "eye movements" refer to movements related to the eyes of the driver of the vehicle 10, including, but not limited to, blinking movements, eye-closing movements, saccadic movements, and fixation movements.

According to an embodiment of the present invention, the eye movement determination device 130 may determine the eye movement of the driver according to a preset model. The eye movement determination device 130 may pre-store or retrieve one or more pre-set models from an external device. In some embodiments, the eye motion determination device 130 may: predetermining one or more motion categories (e.g., blink, eye closure, saccade, and gaze motions, etc.); selecting a plurality of identification features (such as eye feature outline, eye feature position change, duration and the like) for each action category and establishing a model according to a preset rule; then, the eye parameters obtained by the eye parameter extraction device 120 are input into the corresponding model, and whether the current eye action of the driver belongs to the predetermined action category is determined.

In one example, the eye movement determination device 130 may determine that the current eye movement of the driver is a fixation movement when it detects that none of the orbit contour, orbit position, pupil position, and eyelid position has changed within the predetermined time period. In another example, the device 130 may determine that the current eye movement is a blinking movement when, within the predetermined time period, the orbit contour and orbit position change and the relative position of the upper and lower eyelids goes from its maximum value to 0 and then back to the maximum value. In yet another example, the device 130 may determine that the current eye movement is an eye-closing movement when the orbit contour and orbit position do not change and the relative position of the upper and lower eyelids remains 0 throughout the predetermined time period. In a further example, the device 130 may determine that the current eye movement is a saccade movement when the orbit contour, orbit position, and eyelid position do not change within the predetermined time period but the pupil position does.
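A minimal rule-based sketch of these four example rules follows, assuming the eyelid gap has already been sampled over the predetermined time window; all names and the boolean-input representation are illustrative, not the patent's preset model.

```python
def classify_eye_action(orbit_changed, eyelid_gaps, pupil_moved):
    """Classify an eye action from features sampled over the time window.

    orbit_changed: whether the orbit contour/position changed
    eyelid_gaps:   relative upper/lower eyelid distances per frame
    pupil_moved:   whether the pupil position changed
    """
    closed_throughout = all(g == 0 for g in eyelid_gaps)
    min_gap, max_gap = min(eyelid_gaps), max(eyelid_gaps)
    # Blink: gap goes from its maximum to 0 and back to the maximum.
    blink_profile = (eyelid_gaps[0] == max_gap and min_gap == 0
                     and eyelid_gaps[-1] == max_gap)
    if orbit_changed and blink_profile:
        return "blink"
    if not orbit_changed and closed_throughout:
        return "eye-closing"
    if not orbit_changed and pupil_moved:
        return "saccade"
    if not orbit_changed and not pupil_moved:
        return "fixation"
    return "unknown"
```

Each branch mirrors one of the four textual rules above; a real preset model would of course tolerate noise rather than demand exact zeros.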

The eye simulation control device 140 may control the eye simulation device of the vehicle 10 such that the eye simulation device simulates the eye movement of the driver determined by the eye movement determination device 130. Specifically, after the eye movement determination means 130 determines the current eye movement of the driver, the eye simulation control means 140 may change the display state or presentation state of the eye simulation means so as to simulate the current eye movement of the driver.

In one embodiment, the eye simulation device may be a display device 300 mounted on the exterior of the vehicle 10. The display device 300 may be mounted at any location on the top, front, side, or rear of the vehicle 10; in some embodiments, it may be mounted at the lower part of the front windshield or on a door or window of the vehicle 10. The display device 300 may be any one of an LCD, LED, or OLED display screen. In an exemplary embodiment, the display device 300 may be a flat display screen mounted on the roof of the vehicle 10 and arranged to rotate with respect to the vehicle body so that its front faces an external target, giving the external target an optimal viewing angle. Alternatively, the display device 300 may be a curved display screen visible from any angle around the vehicle 10. The display device 300 may present virtual eyes similar to the driver's real eyes to mimic the driver's real eye movements. For example, when the eye movement determination device 130 determines that the current eye movement of the driver is a fixation movement, the eye simulation control device 140 may control the display device 300 according to the eye parameters extracted by the eye parameter extraction device 120 (such as the orbit contour and pupil position) so that the display device 300 displays a virtual eye movement corresponding to those parameters, ensuring that the driver's real gaze line is consistent with the virtual gaze line shown on the display device 300. Fig. 2 shows a display device 300 mounted outside the front window of the vehicle 10 simulating a fixation movement in which the driver is looking to the left.

Here, it should be noted that the position of the virtual gaze line should also take into account installation parameters of the display device 300, which are well known to those skilled in the art and will not be described herein.

In another embodiment, the eye simulation apparatus may be an illumination apparatus 400 mounted outside the vehicle 10. The lighting device 400 may be mounted at any location on the top, front, side, or rear of the vehicle 10. In some embodiments, the lighting device 400 may be mounted on the upper portions of the front and rear bumpers of the vehicle 10. The illumination device 400 may be formed of an array of M × N photodiodes. M may or may not be equal to N. Thus, the eye simulation control device 140 can cause the illumination device 400 to display a virtual eye movement corresponding to the real eye movement of the driver by controlling the on/off of the M rows × N columns of photodiodes.
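One possible way to drive an M × N on/off array so that the lit region tracks the pupil position is sketched below. The rendering scheme (lighting the pupil's column and its immediate neighbors across all rows) is an assumption for illustration, not the patent's control method.

```python
def render_pupil(m, n, pupil_col):
    """On/off states for an m x n diode array indicating gaze direction.

    Returns an m x n matrix of 0/1 values in which column pupil_col and
    its immediate neighbors are lit, so the bright band shifts left or
    right as the (scaled) pupil position moves.
    """
    matrix = [[0] * n for _ in range(m)]
    for row in range(m):
        # Clamp the lit band to the array bounds.
        for col in range(max(0, pupil_col - 1), min(n, pupil_col + 2)):
            matrix[row][col] = 1
    return matrix
```

Turning every element off (an all-zero matrix) could then represent a closed eye, and alternating between the two states a blink.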

It is described above that the eye simulation control device 140 controls the display device 300 and the illumination device 400 as the eye simulation device to simulate the real eye movements of the driver. However, it will be understood by those skilled in the art that the eye simulation control device 140 may also control the eye simulation device to simulate the shape of the eyes of the driver according to the eye parameters in the process of simulating the eye movements of the driver.

Thus, the driving assistance system according to the embodiment of the invention can determine the actual eye movements of the driver riding inside the cabin of the vehicle, and can directly and intuitively simulate the actual eye movements of the driver to an external target via a display device or an illumination device outside the vehicle.

According to other embodiments of the present invention, the driver of the vehicle 10 may selectively activate the driving assistance system 100 through the triggering device 150. Specifically, the triggering device 150 may allow at least one of the image receiving device 110, the eye parameter extraction device 120, the eye movement determination device 130, and the eye simulation control device 140 to start operating when the triggering condition is satisfied.

In some embodiments, the trigger condition may be that the speed of the vehicle 10 is greater than a predetermined vehicle speed, which may be obtained, for example, from a vehicle speed sensor of the vehicle 10. The predetermined vehicle speed may be, for example, 0. The driving assistance system 100 according to the present invention thus simulates the driver's eye movements while the vehicle is traveling but stops the simulation while the vehicle is stationary, so as not to waste driving resources.

In still other embodiments, the trigger condition may be the presence of an external target within a predetermined distance of the vehicle periphery, which may be detected, for example, by an infrared sensor or an ultrasonic sensor of the vehicle 10. The external target may be a pedestrian, a cyclist, a motorcyclist, or the driver of another vehicle. The predetermined distance may be, for example, 15 m. It should be understood, however, that the predetermined distance should not be set too large; otherwise the vehicle 10 may simulate the driver's eye movement even though, at such a distance, the simulated movement is not clearly visible to the external target. The driving assistance system 100 according to the present invention thus simulates the driver's eye movements when an external target is present around the vehicle but stops the simulation when none is present, so as not to waste driving resources.

As an alternative embodiment, the trigger condition may also include a manual trigger, which may be obtained, for example, by detecting a trigger instruction input by the driver of the vehicle through a button, touch, voice, or the like. For example, an operator interface may be provided at a dashboard or other operative location of the vehicle 10 to enable a driver of the vehicle to select whether to manually input a trigger instruction during vehicle travel.
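The three trigger conditions above (any one of which suffices) can be sketched as a simple predicate; the parameter names are illustrative, and `target_distance` is `None` when no external target has been detected.

```python
def should_activate(speed, min_speed, target_distance, max_distance,
                    manual_trigger=False):
    """True if at least one trigger condition for the system holds."""
    speed_ok = speed > min_speed                       # vehicle is moving
    target_ok = (target_distance is not None
                 and target_distance <= max_distance)  # target nearby
    return speed_ok or target_ok or manual_trigger
```

With `min_speed = 0` and `max_distance = 15`, this reproduces the example values given in the text.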

A driving assist method according to an embodiment of the invention will be described below with reference to the drawings. Fig. 3 is a flowchart illustrating the driving assistance method 30 according to the embodiment of the invention. The driving assistance method 30 is executed by the driving assistance system 100 described above.

As shown in Fig. 3, in step S32, the vehicle speed of the vehicle 10 and external targets within a predetermined distance around the vehicle 10 are detected. As mentioned above, an external target may be a pedestrian, a cyclist, a motorcyclist, or the driver of another vehicle. The method 30 proceeds to step S34 only when it is detected that the vehicle speed of the vehicle 10 is greater than the predetermined vehicle speed and/or that an external target is present within the predetermined distance of the vehicle periphery.

In step S34, an eye image of the driver of the vehicle 10 is received. According to an embodiment of the present invention, the eye image of the driver may be acquired from the camera device 200 of the vehicle 10. The eye image may include any one of a still image, a moving image, and a stereoscopic image of the eye. The method then proceeds to step S36.

In step S36, the eye parameters of the driver are extracted from the eye image received in step S34. Herein, "ocular parameter" means a parameter related to the ocular characteristics of the driver of the vehicle 10. According to the embodiment of the invention, the eye feature of the driver can be determined by analyzing the light and shadow change of the eye image, and then the absolute parameter of the eye feature can be obtained by directly detecting the determined eye feature. Furthermore, relative parameters between ocular features may be obtained by further calculating the directly measured portions of absolute ocular parameters. According to other embodiments of the present invention, the change in the position of the eye feature may also be extracted by analyzing a plurality of consecutive image frames in a dynamic image or a static image in the eye image. For a detailed description of the extraction of the ocular parameters related to the ocular features, reference is made to the above description, which is not repeated herein. The method then proceeds to step S38.

In step S38, the eye movement of the driver is determined from the eye parameters extracted in step S36. Herein, "eye movements" refer to movements related to the eyes of the driver of the vehicle 10, including, but not limited to, blinking movements, eye-closing movements, saccadic movements, and fixation movements. According to an embodiment of the present invention, the eye movement of the driver may be determined according to a preset model. Specifically, one or more motion categories (e.g., blink, eye closure, saccade, and gaze motions, etc.) may be predetermined; for each action category, selecting a number of recognition features (e.g., eye feature position change, duration, etc.) and modeling in a predetermined rule; then, the eye parameters obtained in step S36 are input into the corresponding model, and it is determined whether the current eye movement of the driver belongs to a predetermined movement category. For the detailed description of the eye movement judgment, refer to the above, and are not repeated herein. The method then proceeds to step S40.

In step S40, the eye simulation apparatus of the vehicle is controlled to simulate the eye movement of the driver determined in step S38. According to the embodiment of the invention, the current eye action of the driver can be simulated by changing the display state or the presentation state of the eye simulation device. The eye simulation apparatus may be the display apparatus 300 or the illumination apparatus 400 mounted outside the vehicle 10. For a detailed description of the display device or the illumination device, reference is made to the above description and no further description is made here.
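Steps S34 through S40 can be sketched as one processing cycle with the four stages injected as callables; all names here are hypothetical stand-ins for the devices described above.

```python
def driving_assistance_step(receive_image, extract_params,
                            determine_action, simulate):
    """One cycle of the driving assistance method (steps S34-S40)."""
    image = receive_image()            # S34: receive the driver's eye image
    params = extract_params(image)     # S36: extract eye parameters
    action = determine_action(params)  # S38: determine the eye action
    simulate(action, params)           # S40: drive the eye simulation device
    return action
```

In a real system each callable would wrap the corresponding device (camera 200, extraction device 120, determination device 130, simulation control device 140), and the cycle would repeat while the trigger condition of step S32 holds.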

Thus, the driving assistance method according to the embodiment of the invention can determine the actual eye movement of the driver riding inside the cabin of the vehicle, and can directly and intuitively simulate the actual eye movement of the driver to an external target via a display device or an illumination device outside the vehicle.

While the invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the construction and methods of the embodiments described above. On the contrary, the invention is intended to cover various modifications and equivalent arrangements. In addition, while the various components and method steps of the disclosed invention are shown in various example combinations and configurations, other combinations, including more, less or all, of the components or methods are also within the scope of the invention.
