Vehicle parking position notification system

Document No.: 1870308    Publication date: 2021-11-23

Reader's note: This invention, "Vehicle parking position notification system", was created by 桥口拓允, 岩濑贵志, 安竹昭, and 有永刚 on 2021-03-10. Abstract: In the vehicle parking position notification system of the present invention, when the vehicle is parked and a predetermined start condition is satisfied, the control unit controls the flying object as follows: the flying object takes off from the vehicle, the imaging unit captures a position notification image that includes at least the vehicle and serves to notify the parking position of the vehicle, and the position notification image is transmitted to a mobile terminal, having a display unit, carried by a user outside the vehicle. Thus, even when the vehicle is parked in a large parking lot, the user can be reliably notified of its parking position.

1. A vehicle parking position notification system characterized by comprising:

a flying object which is mounted on a vehicle, has an imaging unit, and can fly outside the vehicle; and

a control unit that controls the flying object; wherein

when the vehicle is parked and a predetermined start condition is satisfied, the control unit performs control such that: the flying object takes off from the vehicle, the imaging unit captures a position notification image that includes at least the vehicle and serves to notify the parking position of the vehicle, and the position notification image is transmitted to a mobile terminal, having a display unit, carried by a user outside the vehicle.

2. The vehicle parking position notification system according to claim 1, characterized in that:

the control unit performs control such that the imaging unit captures, as the position notification image, an image that includes both the vehicle and the user outside the vehicle, based on the position information of the vehicle and the position information of the mobile terminal.

3. The vehicle parking position notification system according to claim 1 or 2, characterized in that:

the position notification image displays a direction indication graphic indicating the direction of the parking position of the vehicle relative to the current position of the user.

4. The vehicle parking position notification system according to any one of claims 1 to 3, characterized in that:

the position notification image displays a route indication graphic indicating a recommended route from the current position of the user to the parking position of the vehicle.

5. The vehicle parking position notification system according to any one of claims 1 to 4, characterized in that:

the flying object further has an illumination unit that emits light, and

the control unit controls the illumination unit so that the light is directed toward the user.

6. The vehicle parking position notification system according to any one of claims 1 to 5, characterized in that:

the control unit uses detection of the user approaching the parked vehicle as the start condition, and controls the flying object so that the flying object takes off from the vehicle, causes the imaging unit to capture the position notification image, and causes the position notification image to be transmitted to the mobile terminal.

Technical Field

The present invention relates to a vehicle parking position notification system.

Background

When a vehicle is parked in a large parking lot, such as that of a large commercial facility, the user may forget the exact parking position of his or her vehicle between leaving it and returning to it.

Japanese patent laid-open publication No. 2016-138853 discloses the following technique: when the door lock of a vehicle parked in a large parking lot is released by a keyless entry system, a flying object mounted on the vehicle takes off so as to serve as a marker of the vehicle's position.

However, since a flying object mounted on a vehicle is generally small, merely flying it in a large parking lot does not allow a user located far from the vehicle to visually spot it easily.

Disclosure of Invention

The present invention has been made in view of the above circumstances, and an object thereof is to provide a vehicle parking position notification system that can reliably notify the user of the parking position of a vehicle when the vehicle is parked in a large parking lot.

A vehicle parking position notification system according to an aspect of the present invention includes: a flying object which is mounted on a vehicle, has an imaging unit, and can fly outside the vehicle; and a control unit that controls the flying object; wherein, when the vehicle is parked and a predetermined start condition is satisfied, the control unit performs control such that: the flying object takes off from the vehicle, the imaging unit captures a position notification image that includes at least the vehicle and serves to notify the parking position of the vehicle, and the position notification image is transmitted to a mobile terminal, having a display unit, carried by a user outside the vehicle.

According to the present invention, in a situation where a vehicle is parked in a large parking lot, the parking position of the vehicle can be reliably notified to the user.

Drawings

Fig. 1 is a diagram showing an application example of a parking position notification system according to an embodiment of the present invention.

Fig. 2 is a diagram schematically showing the appearance of the flying object.

Fig. 3 is a block diagram showing a functional configuration of the flying object.

Fig. 4 is a block diagram showing a functional structure of the vehicle.

Fig. 5 is a block diagram showing a functional configuration of the flying object control unit.

Fig. 6 is a diagram showing a functional configuration of the vehicle information detection unit.

Fig. 7 is a flowchart showing a control flow of the parking position notification control executed by the control unit of the vehicle.

Fig. 8 is a diagram showing an example of a captured image displayed on the display unit of the mobile terminal.

Detailed Description

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Elements that are labeled the same in different figures are assumed to be the same or corresponding elements.

Fig. 1 is a diagram showing an application example of a parking position notification system according to an embodiment of the present invention. A vehicle 1 carries a flying object 2 capable of flying outside the vehicle 1. A landing platform 3 for the flying object 2 is disposed at a predetermined position of the vehicle 1 (in this example, on the rear window). The landing platform 3 has a horizontal landing surface on which the flying object 2 lands. Below the landing surface, a take-up winder 19 (not shown in fig. 1) is arranged. The take-up winder 19 has a rotating shaft (not shown) around which the tether 4, which serves as a power feeding cord, is wound. A through hole is provided in a substantially central portion of the landing surface, and the tether 4 wound around the rotating shaft of the take-up winder 19 is drawn out above the landing surface through the through hole and connected to the flying object 2.

Fig. 2 is a diagram schematically showing the external appearance of the flying object 2. The flying body 2 is configured as a so-called quad-rotor type drone. The flying object 2 includes a main body 111, a camera 112 disposed on the front side surface of the main body 111, propellers 113 disposed at the four corners of the main body 111, shafts 114 extending perpendicularly from the left and right side surfaces of the main body 111, and a spherical mesh-like cushion member 115 enclosing the main body 111. The body 111 and the cushion member 115 are fixed to each other by a shaft 114. The tie member 4 is connected to the bottom surface of the body portion 111.

Fig. 3 is a block diagram showing a functional configuration of the flying object 2. As shown in fig. 3, the flying object 2 includes a communication unit 21, an imaging unit 22, a driving unit 23, a position detection unit 24, an attitude detection unit 25, and an illumination unit 26. The flying object 2 also has a control unit 20 that controls the operations of these processing units and governs the overall control of the flying object 2. The flying object 2 further includes a power supply unit 29 that supplies driving power to the processing units and the control unit 20. The power supply unit 29 is connected to the tether 4. Because the driving power of the flying object 2 is supplied from the vehicle 1 side via the tether 4, no battery need be mounted on the flying object 2, and as a result the weight of the flying object 2 can be reduced. The total weight of the flying object 2 is kept below the limit weight (for example, 200 g) above which weight-based flight restrictions apply.

The communication unit 21 performs bidirectional data communication with the communication unit 122 on the vehicle 1 side, described later, by a short-range wireless communication method such as Bluetooth (registered trademark). Alternatively, a data communication line may be added to the tether 4 so that the communication unit 21 and the communication unit 122 perform wired communication via that line.

The imaging unit 22 includes the camera 112 shown in fig. 2. The images captured by the imaging unit 22 include both still images (photographs) and moving images (video); in the following description, the image captured by the imaging unit 22 is referred to as the captured image. The imaging unit 22 stores image data of the images captured by the camera 112 in a recording medium such as a flash memory, and also outputs the image data in real time. The image data may instead be stored on the vehicle 1 side.

The driving unit 23 includes motors that rotationally drive the shafts of the propellers 113 shown in fig. 2. The driving unit 23 individually controls the rotational direction and rotational speed of each of the four propellers 113. This enables the flying object 2 to perform arbitrary flight operations such as moving forward, moving backward, ascending, descending, turning, and hovering.
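The per-rotor control of direction and speed described above can be illustrated with a minimal mixer sketch (illustrative Python only, not part of the disclosed embodiment; the X-configuration sign convention and the function name are assumptions):

```python
def mix_quadrotor(thrust, roll, pitch, yaw):
    """Map thrust/roll/pitch/yaw commands to the four rotor speeds of an
    X-configuration quadrotor (front-left, front-right, rear-left,
    rear-right). Diagonal rotor pairs spin in opposite directions, so the
    yaw command enters with alternating signs; with zero roll/pitch/yaw
    all four rotors run at the same speed and the craft hovers."""
    return (
        thrust + roll + pitch - yaw,  # front-left
        thrust - roll + pitch + yaw,  # front-right
        thrust + roll - pitch + yaw,  # rear-left
        thrust - roll - pitch - yaw,  # rear-right
    )
```

Raising `thrust` equally on all rotors makes the craft ascend; biasing `roll` or `pitch` tilts it for lateral motion, corresponding to the forward, backward, turning, and hovering operations listed above.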

The position detection unit 24 includes a GPS receiver, an altitude sensor, and the like, and detects the position of the flying object 2 in real time to output position data indicating the detected position.

The attitude detection unit 25 includes an acceleration sensor, a gyro sensor, an orientation sensor, and the like, and detects the attitude of the flying object 2 in real time to output attitude data indicating the detected attitude.

The illumination unit 26 includes any illumination device such as an LED disposed on one or more surfaces including the front surface of the main body 111 of the flying object 2.

The control unit 20 transmits the image data output from the imaging unit 22, the position data output from the position detection unit 24, and the attitude data output from the attitude detection unit 25 from the communication unit 21 to the vehicle 1 in real time.

Fig. 4 is a block diagram showing a functional configuration of the vehicle 1. As shown in fig. 4, the vehicle 1 includes a battery control unit 11, a flying object control unit 12, a vehicle information detection unit 14, a communication unit 15, an image processing unit 51, and an approach detection unit 52. The vehicle 1 further includes a control unit 10 that controls the operations of these processing units and governs the overall control of the vehicle 1, and a battery 18 that supplies driving power to the processing units and the control unit 10. The tether 4 is connected to the battery 18. Furthermore, the vehicle 1 has the take-up winder 19, whose driving power is supplied from the battery 18. The take-up winder 19 includes a rotating shaft (not shown) around which the tether 4 is wound and a motor (not shown) that rotationally drives the shaft. By driving the rotating shaft with the motor, the take-up winder 19 controls the feeding and recovery of the tether 4 so that an appropriate length of the tether 4, corresponding to the flight state of the flying object 2, is paid out from the rotating shaft.

The battery control unit 11 controls the charging and discharging operations of the battery 18. The battery control unit 11 also controls the start and stop of the supply of electric power from the battery 18 to the tether 4.

The flying object control unit 12 controls the flight of the flying object 2. Its details will be described later.

The vehicle information detection unit 14 detects various information of the vehicle 1. The details of the vehicle information detection unit 14 will be described later.

The communication unit 15 performs bidirectional data communication with the mobile terminal 50 of a user registered in advance, using an arbitrary wireless communication method such as Bluetooth (registered trademark), wireless LAN, or a public telephone network. The mobile terminal 50 is a smartphone, mobile phone, tablet computer, notebook computer, smart key, or the like, and has a display unit (LCD, organic EL, or the like) and a position detection unit (GPS receiver or the like).

The image processing unit 51 acquires the position information of the vehicle 1 from the GPS receiver 148 (fig. 6) of the vehicle information detection unit 14, the position information of the flying object 2 from the position detection unit 24 via the flying object control unit 12, and the position information of the mobile terminal 50 from the mobile terminal 50 via the communication unit 15. The image processing unit 51 also acquires the attitude information of the flying object 2 from the attitude detection unit 25 via the flying object control unit 12. Based on the position information of the vehicle 1, the flying object 2, and the mobile terminal 50 and the attitude information of the flying object 2, the image processing unit 51 identifies the vehicle 1 and the user 70 (the user carrying the mobile terminal 50) in the captured image 60 (fig. 8) captured by the imaging unit 22 of the flying object 2. The image processing unit 51 also analyzes the captured image 60 to identify sidewalks, crosswalks, lanes, parking spaces, and the like in the captured image 60. The image processing unit 51 then derives a recommended route for the user 70 to walk from the current position to the parking position of the vehicle 1, giving priority to sidewalks and crosswalks. Using AR (Augmented Reality) technology, the image processing unit 51 superimposes on the captured image 60 a route indication graphic 63 that indicates the recommended route.

The approach detection unit 52 detects the approach of the user 70 to the vehicle 1 by observing the time-series change in the position of the mobile terminal 50 relative to the position of the vehicle 1, based on the position information of the vehicle 1 and the mobile terminal 50. The approach detection unit 52 determines that the user 70 is approaching the vehicle 1 when the distance between the vehicle 1 and the mobile terminal 50 falls below a predetermined value (several tens of meters; for example, 30 m).
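The distance-threshold approach detection described above can be sketched as follows (illustrative Python only; the haversine distance formula, the 30 m threshold value, and all names are assumptions, not part of the disclosure):

```python
import math

APPROACH_THRESHOLD_M = 30.0  # assumed value from the "several tens of meters" range

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def user_is_approaching(vehicle_fix, terminal_fixes):
    """True when the newest terminal fix is inside the threshold and the
    distance has shrunk over the time series of fixes."""
    dists = [haversine_m(*vehicle_fix, *fix) for fix in terminal_fixes]
    return dists[-1] < APPROACH_THRESHOLD_M and dists[-1] < dists[0]
```

Observing the whole time series, rather than a single fix, distinguishes a user walking toward the vehicle from one merely standing near it.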

Fig. 5 is a block diagram showing a functional configuration of the flying object control unit 12. As shown in fig. 5, the flying object control unit 12 includes a communication unit 122, a flight path generation unit 123, a steering signal generation unit 124, an imaging signal generation unit 125, a winding control unit 126, and an illumination signal generation unit 127. The flying object control unit 12 also includes a control unit 121 that controls the operations of these processing units.

The communication unit 122 performs bidirectional data communication with the above-described communication unit 21 on the flying object 2 side by a short-range wireless communication method such as Bluetooth (registered trademark). The wireless communication method is not, however, limited to short-range communication, and may be mobile communication. The communication unit 122 receives the image data, position data, and attitude data transmitted from the communication unit 21 on the flying object 2 side.

The flight path generation unit 123 acquires the position information of the vehicle 1 from the GPS receiver 148 of the vehicle information detection unit 14, and acquires the position information of the mobile terminal 50 from the mobile terminal 50 via the communication unit 15. Based on the position information of the vehicle 1 and the mobile terminal 50, the flight path generation unit 123 derives an imaging position from which an image containing both the vehicle 1 and the user 70 in the same frame can be captured. The flight path generation unit 123 then generates, as the flight path, the shortest path along which the flying object 2 moves from its current position on the vehicle 1 to the imaging position.
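One simple way to derive an imaging position that frames both the vehicle and the user, as described above, is to hover above their midpoint at an altitude set by the camera's field of view (an illustrative sketch only; the field-of-view value, the safety margin, and the minimum-altitude floor are assumptions):

```python
import math

def imaging_position(vehicle_xy, user_xy, fov_deg=60.0, margin=1.5):
    """Return an (east, north, altitude) camera position above the midpoint
    of the vehicle and the user, with the altitude chosen so that their
    separation (times a safety margin) fits inside the camera's horizontal
    field of view. Coordinates are planar metres."""
    mx = (vehicle_xy[0] + user_xy[0]) / 2
    my = (vehicle_xy[1] + user_xy[1]) / 2
    separation = math.hypot(user_xy[0] - vehicle_xy[0], user_xy[1] - vehicle_xy[1])
    altitude = (separation * margin / 2) / math.tan(math.radians(fov_deg / 2))
    return (mx, my, max(altitude, 5.0))  # enforce a minimum-altitude floor
```

A wider separation between the user and the vehicle simply pushes the hover point higher, so both always fit in one frame.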

The steering signal generation unit 124 generates a steering signal for flying the flying object 2 along the flight path generated by the flight path generation unit 123, based on the image data, position data, and attitude data received by the communication unit 122.

The imaging signal generation unit 125 generates an imaging start signal for starting imaging by the imaging unit 22 of the flying object 2 and an imaging stop signal for stopping imaging, based on the flying state of the flying object 2.

The winding control unit 126 controls the take-up winder 19. Specifically, the winding control unit 126 drives the rotating shaft of the take-up winder 19 with the motor so that an appropriate length of the tether 4, corresponding to the flight state of the flying object 2, is paid out from the rotating shaft, thereby controlling the feeding and recovery of the tether 4.

The illumination signal generation unit 127 generates an illumination signal for causing the illumination unit 26 of the flying object 2 to emit predetermined notification illumination (for example, a continuous high-luminance flashing light).

The control unit 121 transmits the steering signal generated by the steering signal generation unit 124, the imaging start and stop signals generated by the imaging signal generation unit 125, and the illumination signal generated by the illumination signal generation unit 127 from the communication unit 122 to the flying object 2 in real time.

Fig. 6 is a diagram showing a functional configuration of the vehicle information detection unit 14. As shown in fig. 6, the vehicle information detection unit 14 includes a door lock sensor 141, a door open/close sensor 142, a seat sensor 143, a start detection sensor 144, an intrusion detection sensor 146, and a GPS receiver 148.

The door lock sensor 141 detects the locked or unlocked state of the doors of the vehicle 1. The door opening/closing sensor 142 detects the open or closed state of the doors of the vehicle 1. The seat sensor 143 detects the seating state of an occupant on a seat of the vehicle 1. The start detection sensor 144 detects the running or stopped state of the driving force generation device (engine, travel motor, or the like) of the vehicle 1 by means of an ignition switch sensor, or by detecting the on or off state of a relay switch connecting the battery and the travel motor. The intrusion detection sensor 146 detects the presence or absence of an intruder in the cabin of the parked vehicle 1, which serves as the monitoring area. The GPS receiver 148 determines the current position of the vehicle 1 from GPS signals received from GPS satellites, and outputs position information indicating the current position.

Fig. 7 is a flowchart showing the control flow of the parking position notification control executed by the control unit 10 of the vehicle 1. The control unit 10 executes this control flow when the vehicle 1 is parked in a large parking lot having at least a predetermined area, such as the parking lot of a large commercial facility. The control unit 10 detects that the current position of the vehicle 1 is within such a parking lot based on the position information output from the GPS receiver 148. The control unit 10 detects that the vehicle 1 is parked by detecting, via the start detection sensor 144, that the driving force generation device of the vehicle 1 is stopped, via the intrusion detection sensor 146, that the interior of the vehicle 1 is unoccupied, and, via the door lock sensor 141, that all the doors of the vehicle 1 are locked.
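The combined parked-state condition described above amounts to a simple predicate over the sensor readings (an illustrative sketch only; the field and function names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Snapshot of the sensor readings described above (illustrative names)."""
    drive_unit_running: bool   # start detection sensor 144
    cabin_occupied: bool       # intrusion detection sensor 146
    all_doors_locked: bool     # door lock sensor 141
    inside_parking_lot: bool   # GPS position checked against the parking-lot area

def is_parked_in_lot(s: VehicleState) -> bool:
    """Parking position notification control runs only when every
    condition holds simultaneously."""
    return (not s.drive_unit_running
            and not s.cabin_occupied
            and s.all_doors_locked
            and s.inside_parking_lot)
```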

When the user 70 carrying the mobile terminal 50 has forgotten the parking position of the vehicle 1 in the large parking lot, the user can request the vehicle 1 to transmit the position notification image by an operation input on the touch panel or the like of the mobile terminal 50. In response to the transmission request made by the user 70, a request signal is transmitted from the mobile terminal 50 to the vehicle 1.

In step SP11, the control unit 10 determines whether the communication unit 15 has received the request signal from the mobile terminal 50.

When the request signal is not received by the communication unit 15 (no in step SP 11), the control unit 10 repeatedly executes the process of step SP 11.

When the communication unit 15 receives the request signal (yes in step SP 11), the control unit 10 causes the flying object control unit 12 to execute the takeoff control of the flying object 2 in step SP 12. First, the battery control unit 11 starts the supply of electric power from the battery 18 to the flying object 2 via the tether 4. Next, the steering signal generation unit 124 generates a steering signal for causing the flying object 2 to take off from the landing platform 3, and the communication unit 122 transmits the steering signal to the flying object 2. The communication unit 21 on the flying object 2 side receives the steering signal transmitted from the communication unit 122 on the vehicle 1 side, and the driving unit 23 drives the propellers 113 in accordance with the steering signal. Thereby, the flying object 2 takes off from the landing platform 3.

Next, in step SP13, the control unit 10 causes the flying object control unit 12 to execute the flight control and imaging control of the flying object 2. The flight path generation unit 123 derives, based on the position information of the vehicle 1 and the mobile terminal 50, an imaging position from which an image containing both the vehicle 1 and the user 70 in the same frame can be captured, and generates a flight path for moving the flying object 2 to the imaging position. The steering signal generation unit 124 then generates a steering signal for moving the flying object 2 to the imaging position along the flight path, and the communication unit 122 transmits it to the flying object 2. The communication unit 21 on the flying object 2 side receives the steering signal, and the driving unit 23 drives the propellers 113 in accordance with it. Thereby, the flying object 2 flies along the flight path to the imaging position and thereafter hovers to maintain that position. The imaging signal generation unit 125 generates an imaging start signal for starting imaging by the imaging unit 22 of the flying object 2. The imaging start signal is transmitted from the communication unit 122 to the flying object 2, whereupon the imaging unit 22 starts imaging with the camera 112. The image data of the captured image 60 captured by the camera 112 is transmitted from the communication unit 21 to the vehicle 1.

In step SP13, the control unit 10 may cause the illumination signal generation unit 127 to generate an illumination signal for executing the illumination of the notification illumination. The illumination signal is transmitted from the communication unit 122 to the flying object 2, and notification illumination (light) is thereby emitted from the illumination unit 26 of the flying object 2 toward the user 70. This allows the user 70 to easily visually recognize the flying object 2 even from a distance.

Next, in step SP14, the control unit 10 executes image display control for displaying the captured image 60 on the mobile terminal 50. The control unit 10 passes the image data received from the flying object 2 to the communication unit 15, and the communication unit 15 transmits the image data to the mobile terminal 50. Thereby, the captured image 60 captured by the camera 112 of the flying object 2 is displayed on the display unit 55 of the mobile terminal 50. The image data output from the imaging unit 22 may instead be transmitted directly from the flying object 2 to the mobile terminal 50 without passing through the vehicle 1.

Fig. 8 is a diagram showing an example of the captured image 60 displayed on the display unit 55 of the mobile terminal 50. The camera 112 of the flying object 2 captures the vehicle 1 and the user 70 from an imaging position obliquely above the vehicle 1, so that both appear in the captured image 60. The image processing unit 51 may highlight the vehicle 1 and the user 70 in the captured image 60, for example by coloring or labeling them, so that they are easily distinguished from other vehicles and persons. Further, the image processing unit 51 superimposes on the captured image 60 a route indication graphic 63 indicating a recommended route for the user 70 to walk from the current position to the parking position of the vehicle 1.

Further, a direction indication graphic 61 indicating the direction of the parking position of the vehicle 1 relative to the current position of the user 70 is displayed on the display unit 55. The direction indication graphic 61 shown in fig. 8 indicates that the vehicle 1 is diagonally ahead to the right as seen from the user 70 holding the mobile terminal 50 upright in front of his or her body. The direction indication graphic 61 is generated by the mobile terminal 50 based on the position information of the vehicle 1 and the mobile terminal 50 and the attitude information of the mobile terminal 50.
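The direction indication graphic 61 reduces to a relative-bearing computation from the two positions and the terminal's heading (an illustrative sketch on planar coordinates; the names and sign conventions are assumptions):

```python
import math

def relative_bearing_deg(user_xy, user_heading_deg, vehicle_xy):
    """Angle of the vehicle relative to the direction the terminal is
    pointed, in degrees clockwise: 0 = straight ahead, 90 = to the right.
    Planar east/north coordinates; heading measured clockwise from north."""
    dx = vehicle_xy[0] - user_xy[0]  # east offset
    dy = vehicle_xy[1] - user_xy[1]  # north offset
    absolute = math.degrees(math.atan2(dx, dy)) % 360  # bearing from north
    return (absolute - user_heading_deg) % 360
```

With the user facing north and the vehicle to the north-east, the function returns 45 degrees, i.e. diagonally ahead to the right, as in the example of fig. 8.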

Further, the display unit 55 displays a confirmation graphic 62 to be tapped by the user 70 when the user 70 finds the vehicle 1. The confirmation graphic 62 is generated by the image processing unit 51 or the mobile terminal 50. When the confirmation graphic 62 is tapped, an end signal for the parking position notification control, including a return command for the flying object 2, is transmitted from the mobile terminal 50 to the vehicle 1.

In step SP15, the control unit 10 determines whether the communication unit 15 has received the end signal for the parking position notification control from the mobile terminal 50.

When the communication unit 15 has not received the end signal (no in step SP 15), next, in step SP16, the control unit 10 determines whether boarding of the user 70 onto the vehicle 1 has been detected. Specifically, the control unit 10 detects that the user 70 has boarded the vehicle 1 by detecting, via the door opening/closing sensor 142, that the driver's door of the vehicle 1 has been opened and closed, or by detecting, via the seat sensor 143, that an occupant is seated in the driver's seat of the vehicle 1 after that opening/closing operation.

If the boarding of the user 70 on the vehicle 1 is not detected (no in step SP 16), the control unit 10 repeatedly executes the processing of steps SP13 to SP 16.

When the communication unit 15 receives the end signal (yes in step SP 15), or when the boarding of the user 70 onto the vehicle 1 is detected (yes in step SP 16), next, in step SP17, the control unit 10 causes the steering signal generation unit 124 to generate a steering signal for flying the flying object 2 from its current position toward the landing platform 3. The flight of the flying object 2 is controlled based on the steering signal transmitted to it, and the flying object 2 returns to the landing platform 3.

In addition, when the flying object 2 is returned because the boarding of the user 70 onto the vehicle 1 has been detected, the control unit 10 may forcibly retrieve the flying object 2 by having the winding control unit 126 rotate the rotating shaft of the take-up winder 19 at high speed, instead of having the steering signal generation unit 124 generate the return steering signal. This enables the flying object 2 to be retrieved quickly.

Next, the imaging signal generation unit 125 generates an imaging stop signal for stopping the imaging by the imaging unit 22 of the flying object 2. The imaging stop signal is transmitted to the flying object 2, and the imaging unit 22 stops the imaging by the camera 112. Thereafter, the battery control unit 11 stops the supply of electric power from the battery 18 to the flying object 2.
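The flow of steps SP11 through SP17 can be compressed into a small state machine (an illustrative sketch only; the event strings stand in for the request signal, end signal, and boarding detection described above):

```python
def notification_control_loop(events):
    """Compressed sketch of steps SP11-SP17: wait for a request signal,
    take off, stay in the capture/display loop until an end signal or
    boarding is detected, then return the flying object and power down.
    `events` is an iterable of strings standing in for the inputs."""
    state = "WAIT_REQUEST"                  # step SP11
    actions = []
    for event in events:
        if state == "WAIT_REQUEST" and event == "request":
            actions.append("takeoff")       # step SP12
            state = "CAPTURING"             # steps SP13-SP16 loop
        elif state == "CAPTURING" and event in ("end_signal", "boarding"):
            actions += ["return", "stop_capture", "power_off"]  # step SP17 onwards
            state = "DONE"
    return actions, state
```

Either terminating event (the tap on the confirmation graphic or the detection of boarding) leads to the same return-and-shutdown sequence, matching the two "yes" branches of the flowchart.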

According to the parking position notification system of the present embodiment, in a situation where the vehicle 1 is parked in the parking lot, the control unit 10 causes the flying object 2 to fly from the vehicle 1, causes the imaging unit 22 of the flying object 2 to capture the captured image 60 (position notification image), and causes the display unit 55 of the mobile terminal 50 to display the captured image 60. The user 70 carrying the mobile terminal 50 can determine the parking position of the vehicle 1 by checking the captured image 60 displayed on the display unit 55 of the mobile terminal 50 and identifying landmarks, such as buildings, around the vehicle 1. As a result, even when the vehicle 1 is parked in a large parking lot, the parking position of the vehicle 1 can be reliably notified to the user 70.

Both the vehicle 1 and the user 70 appear in the frame of the captured image 60. Therefore, the user 70 carrying the mobile terminal 50 can easily determine the parking position of the vehicle 1 by checking the captured image 60 displayed on the display unit 55 of the mobile terminal 50 and grasping the relative positional relationship between himself or herself and the vehicle 1.

Further, the direction indication graphic 61, indicating the direction of the parking position of the vehicle 1 relative to the current position of the user 70, is attached to the captured image 60. Therefore, the user 70 carrying the mobile terminal 50 can easily determine the parking position of the vehicle 1 by checking the captured image 60 and the direction indication graphic 61 displayed on the display unit 55 of the mobile terminal 50.

Further, the route indication graphic 63, indicating a recommended route from the current position of the user 70 to the parking position of the vehicle 1, is attached to the captured image 60. Therefore, the user 70 carrying the mobile terminal 50 can easily reach the parking position of the vehicle 1 by checking the captured image 60 and the route indication graphic 63 displayed on the display unit 55 of the mobile terminal 50.

Further, the control unit 10 executes the parking position notification control on the condition that a transmission request for the position notification image has been received from the mobile terminal 50. Therefore, unnecessary execution of the parking position notification control can be avoided in cases where the user 70 remembers the parking position of the vehicle 1.

Further, when boarding of an occupant into the vehicle 1 is detected, the control unit 10 returns the flying object 2 to the vehicle 1. The flying object 2 can thus be retrieved before the vehicle 1 starts traveling, which avoids a situation in which the flying object 2 contacts a surrounding obstacle after travel has begun.

(modification)

In the above-described embodiment, the control unit 10 executes the takeoff control of the flying object 2 on the condition that the communication unit 15 receives the request signal from the mobile terminal 50 (one example of the start condition), but the present invention is not limited to this example. In a situation where the vehicle 1 is parked, the control unit 10 may instead execute the takeoff control of the flying object 2 on the condition that the approach detection unit 52 detects that the user 70 is approaching the vehicle 1 (another example of the start condition).

The approach detection unit 52 detects that the user 70 is approaching the parked vehicle 1 when, based on the positional information of the vehicle 1 and the mobile terminal 50, the distance between the vehicle 1 and the mobile terminal 50 falls below a predetermined value (several tens of meters; e.g., 30 m). When the approach detection unit 52 detects the approach of the user 70, the control unit 10 executes the takeoff control of the flying object 2 regardless of whether the communication unit 15 has received the request signal from the mobile terminal 50 (step SP12).
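The distance check above can be sketched with the haversine great-circle formula. The 30 m threshold comes from the text; the function name, signature, and use of haversine are illustrative assumptions:

```python
import math

def user_approaching(veh_lat, veh_lon, term_lat, term_lon, threshold_m=30.0):
    """True when the mobile terminal is within threshold_m of the parked
    vehicle (30 m in the modification described above), based on the
    haversine great-circle distance between the two positions.
    """
    r = 6_371_000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(veh_lat), math.radians(term_lat)
    dphi = math.radians(term_lat - veh_lat)
    dlam = math.radians(term_lon - veh_lon)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    distance = 2 * r * math.asin(math.sqrt(a))
    return distance < threshold_m
```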

According to this modification, the control unit 10 executes the parking position notification control on the condition that the approach of the user 70 to the parked vehicle 1 is detected. Therefore, the user 70 does not need to perform a manual operation input on the mobile terminal 50 to start the parking position notification control, which improves convenience for the user 70.

< summary >

A vehicle parking position notification system according to an aspect of the present invention includes: a flying object which is mounted on a vehicle, has an imaging unit, and can fly outside the vehicle; and a control unit that controls the flying object; wherein, when the vehicle is parked and a predetermined start condition is satisfied, the control unit performs control so as to: cause the flying object to take off from the vehicle, cause the imaging unit to capture a position notification image that includes at least the vehicle and serves to notify the user of the parking position of the vehicle, and cause the position notification image to be transmitted to a mobile terminal having a display unit and carried by a user outside the vehicle.

According to this aspect, when the predetermined start condition is satisfied while the vehicle is parked, the control unit controls the flying object so that the flying object takes off from the vehicle, the imaging unit captures the position notification image, and the captured position notification image is transmitted to the mobile terminal carried by the user outside the vehicle. The user carrying the mobile terminal can determine the parking position of the vehicle by checking the position notification image displayed on the display unit of the mobile terminal and identifying landmarks such as buildings around the vehicle. As a result, even when the vehicle is parked in a large parking lot, the parking position of the vehicle can be reliably notified to the user.

In the above aspect, the control unit preferably performs control so as to cause the imaging unit to capture, as the position notification image, an image including both the vehicle and the user outside the vehicle, based on the positional information of the vehicle and the positional information of the mobile terminal.

According to this aspect, both the vehicle and the user outside the vehicle are included in the position notification image. Therefore, the user carrying the mobile terminal can easily determine the parking position of the vehicle by checking the position notification image displayed on the display unit of the mobile terminal and grasping the relative positional relationship between the user and the vehicle.

In the above aspect, it is preferable that a direction indication graphic, which indicates the direction of the parking position of the vehicle relative to the current position of the user, is displayed on the position notification image.

According to this aspect, the direction indication graphic indicating the direction of the parking position of the vehicle relative to the current position of the user is displayed on the position notification image. Therefore, the user carrying the mobile terminal can easily determine the parking position of the vehicle by checking the position notification image and the direction indication graphic displayed on the display unit of the mobile terminal.

In the above aspect, it is preferable that a route indication graphic, which indicates a recommended route from the current position of the user to the parking position of the vehicle, is displayed on the position notification image.

According to this aspect, the route indication graphic indicating the recommended route from the current position of the user to the parking position of the vehicle is displayed on the position notification image. Therefore, the user carrying the mobile terminal can easily reach the parking position of the vehicle by checking the position notification image and the route indication graphic displayed on the display unit of the mobile terminal.

In the above aspect, it is preferable that the flying object further includes an illumination unit that emits light, and that the control unit controls the illumination unit so as to irradiate the user with light.

According to this aspect, the control unit controls the illumination unit of the flying object so as to emit light toward the user. This allows the user to easily visually recognize the flying object even from a distance.

In the above aspect, it is preferable that, with the detection of the user approaching the parked vehicle as the start condition, the control unit controls the flying object so that the flying object takes off from the vehicle, causes the imaging unit to capture the position notification image, and transmits the position notification image to the mobile terminal.

According to this configuration, with the detection of the user approaching the parked vehicle as the start condition, the control unit controls the flying object so that the flying object takes off from the vehicle, the imaging unit captures the position notification image, and the position notification image is transmitted to the mobile terminal. Therefore, the user does not need to perform a manual operation input to start the control, which improves convenience for the user.
