Information processing apparatus, information processing method, information processing program, image processing apparatus, and image processing system

Document No.: 1776839. Publication date: 2019-12-03.

Abstract: This technology (information processing apparatus, information processing method, information processing program, image processing apparatus, and image processing system) was created by 森浩史, 佐野洋之, 宫牧秀宇, 辻永雅人, and 佐藤洋一郎 on 2018-03-27. Its main content is as follows: the information processing apparatus acquires an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object; and, in response to a signal transmitted from the imaging device in accordance with the imaging instruction, the information processing apparatus acquires, from a sensor device, sensor information including position information of the mobile object, and associates the acquired sensor information with the acquired image.

1. An information processing apparatus configured to:

acquire an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object;

acquire, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and

associate the acquired sensor information with the acquired image.

2. The information processing apparatus according to claim 1, wherein

the sensor information further includes altitude information of the mobile object.

3. The information processing apparatus according to claim 1, wherein

the sensor information further includes inclination information of the mobile object.

4. The information processing apparatus according to claim 1, wherein

the signal indicates an imaging timing related to the capturing of the image by the imaging device.

5. The information processing apparatus according to claim 1, wherein

the sensor information is the most recent sensor information as of the point in time of the imaging timing.

6. The information processing apparatus according to claim 1, wherein

the information processing apparatus calculates new sensor information from a plurality of pieces of sensor information, and associates the new sensor information with the image.

7. The information processing apparatus according to claim 4, wherein

the imaging timing is a timing of an end of an exposure executed by the imaging device to capture the image.

8. The information processing apparatus according to claim 4, wherein

the imaging timing is a timing of a beginning of an exposure executed by the imaging device to capture the image.

9. The information processing apparatus according to claim 4, wherein

the imaging timing is a timing at which the imaging instruction for causing the imaging device to capture the image is issued.

10. The information processing apparatus according to claim 1, wherein

the information processing apparatus creates a composite image by arranging a plurality of images on the basis of the sensor information.

11. The information processing apparatus according to claim 10, wherein

the information processing apparatus arranges the plurality of images while correcting positions of the plurality of images on the basis of the sensor information.

12. The information processing apparatus according to claim 10, wherein

the sensor information further includes altitude information of the mobile object, and

the information processing apparatus arranges the images while enlarging or reducing the images on the basis of the altitude information.

13. The information processing apparatus according to claim 10, wherein

the sensor information further includes inclination information of the mobile object, and

the information processing apparatus arranges the images while performing keystone correction and/or trimming processing on the basis of the inclination information.

14. The information processing apparatus according to claim 1, wherein

the information processing apparatus acquires elevation data indicating the elevation above sea level of the ground surface captured by the imaging device, and associates the elevation data with information included in the sensor information.

15. The information processing apparatus according to claim 14, wherein

the information processing apparatus associates the elevation data with the image.

16. The information processing apparatus according to claim 14, wherein

the elevation data is acquired, on the basis of the position information of the mobile object, from map data containing information on positions and elevations.

17. The information processing apparatus according to claim 14, wherein

the elevation data is acquired on the basis of distance information obtained by a ranging sensor mounted on the mobile object.

18. The information processing apparatus according to claim 17, wherein

the information processing apparatus acquires the elevation data from the ranging sensor in response to the signal transmitted from the imaging device in accordance with the imaging instruction, and associates the acquired elevation data with information included in the sensor information.

19. An information processing method comprising:

acquiring an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object;

acquiring, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and

associating the acquired sensor information with the acquired image.

20. An information processing program,

the information processing program causing a computer to execute an information processing method comprising:

acquiring an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object;

acquiring, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and

associating the acquired sensor information with the acquired image.

21. An image processing apparatus configured to

receive a plurality of images associated with sensor information including position information of a mobile object, the sensor information having been acquired in response to a signal transmitted from an imaging device in accordance with an imaging instruction, and create a composite image by arranging the plurality of images on the basis of the sensor information.

22. The image processing apparatus according to claim 21, further configured to

create the composite image by arranging the plurality of images on the basis of elevation data indicating the elevation of the ground captured by the imaging device.

23. The image processing apparatus according to claim 22, wherein the image processing apparatus

defines a first coordinate system corresponding to the imaging range of the imaging device associated with the movement of the mobile object, and a second coordinate system corresponding to the plane on which the images are arranged;

calculates, on the basis of the sensor information and the elevation data, a projected position of an image onto the ground surface in the first coordinate system; and

arranges the image by converting the projected position into the second coordinate system.

24. The image processing apparatus according to claim 23, wherein

the projected position of the image is calculated for each of the four corners of the image, which has a quadrangular shape.

25. The image processing apparatus according to claim 24, wherein

the image processing apparatus arranges the image by converting the projected positions of the four corners of the image into the second coordinate system.

26. The image processing apparatus according to claim 22, wherein

the elevation data is acquired from map data containing information on positions and elevations.

27. The image processing apparatus according to claim 26, wherein

the elevation data is acquired, on the basis of the position information of the mobile object, by referring to the map data containing information on positions and elevations.

28. The image processing apparatus according to claim 22, wherein

the elevation data is acquired by a ranging sensor mounted on the mobile object.

29. An image processing system comprising:

a mobile object;

an imaging device mounted on the mobile object;

a sensor device mounted on the mobile object and configured to detect sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with an imaging instruction; and

an information processing apparatus configured to associate the sensor information with an image captured by the imaging device.

30. The image processing system according to claim 29, wherein

the information processing apparatus creates a composite image by arranging a plurality of images on the basis of the sensor information.

31. The image processing system according to claim 29, further comprising

an image processing apparatus configured to receive a plurality of images captured by the imaging device and associated with the sensor information, and to create a composite image by arranging the plurality of images on the basis of the sensor information.

Technical Field

This technology relates to an information processing apparatus, an information processing method, an information processing program, an image processing apparatus, and an image processing system.

Background Art

In the related art, there is a method of capturing images at fixed intervals with a camera mounted on a mobile object, and obtaining a composite image covering a large area by extracting feature points shared by the overlapping portions of images adjacent to each other front-to-back or left-to-right, and stitching the images together while cross-checking those feature points (Patent Document 1).

Citation List

Patent Document

Patent Document 1: Japanese Unexamined Patent Application Publication No. 08-159762

Summary of the Invention

Problems to Be Solved by the Invention

With this method, a large amount of computational processing power is required to extract feature points from each individual image and to superimpose the images while cross-checking the feature points shared in the overlapping portions of adjacent images.

In addition, greater overlap between adjacent images is preferable in order to extract feature points consistently, and since a larger number of images is needed to form a composite image of a given area, the cost of data storage also increases. There is a further disadvantage concerning the movement path of the mobile object: to ensure greater image overlap between adjacent paths, the mobile object must travel back and forth at narrow intervals, which, in view of the limits of battery capacity, makes it difficult to extend the area that can be imaged in a single flight.

This technology has been devised in view of the above problems, and an object of this technology is to provide an information processing apparatus, an information processing method, an information processing program, an image processing apparatus, and an image processing system capable of associating position information of a mobile object with an image captured by an imaging device mounted on the mobile object, for use in image processing.

Solutions to Problems

A first technology for achieving the above object relates to an information processing apparatus configured to: acquire an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object; acquire, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and associate the acquired sensor information with the acquired image.

Furthermore, a second technology relates to an information processing method comprising: acquiring an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object; acquiring, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and associating the acquired sensor information with the acquired image.

Furthermore, a third technology relates to an information processing program that causes a computer to execute an information processing method comprising: acquiring an image captured, in accordance with an imaging instruction, by an imaging device mounted on a mobile object; acquiring, from a sensor device, sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with the imaging instruction; and associating the acquired sensor information with the acquired image.

Furthermore, a fourth technology relates to an image processing apparatus configured to receive a plurality of images associated with sensor information including position information of a mobile object, the sensor information having been acquired in response to a signal transmitted from an imaging device in accordance with an imaging instruction, and to create a composite image by arranging the plurality of images on the basis of the sensor information.

Furthermore, a fifth technology relates to an image processing system comprising: a mobile object; an imaging device mounted on the mobile object; a sensor device mounted on the mobile object and configured to detect sensor information including position information of the mobile object in response to a signal transmitted from the imaging device in accordance with an imaging instruction; and an information processing apparatus configured to associate the sensor information with an image captured by the imaging device.

Effects of the Invention

According to this technology, position information of a mobile object can be associated with an image captured by an imaging device mounted on the mobile object, for use in image processing. Note that the effects described herein are not necessarily limited, and may be any of the effects described in this specification.

Brief Description of the Drawings

Fig. 1 is a diagram showing the configuration of an image processing system.

Fig. 2A is a plan view showing the external configuration of a mobile object, and Fig. 2B is a side view showing the external configuration of the mobile object.

Fig. 3 is a block diagram showing the configuration of the mobile object.

Fig. 4 is a block diagram showing the configuration of an imaging device.

Fig. 5 is a block diagram showing the configuration of a sensor device.

Fig. 6 is a flowchart showing the flow of processing executed by the image processing system.

Fig. 7 is a diagram for explaining a flight plan and an imaging plan.

Fig. 8 is a sequence diagram of processing executed by the image processing system.

Fig. 9 is a diagram for explaining the association of images and sensor information.

Fig. 10 is a diagram for explaining the association of images and sensor information.

Fig. 11 is a flowchart showing the flow of composite image creation processing.

Fig. 12 is a diagram showing an example of a composite image created by the composite image creation processing.

Fig. 13 is a diagram for explaining processing for correcting images on the basis of sensor information.

Fig. 14 is a diagram for explaining an overview of composite image creation without the use of elevation.

Fig. 15 is a diagram for explaining an overview of composite image creation according to a second embodiment.

Fig. 16 is a flowchart showing overall processing according to the second embodiment.

Fig. 17 is a diagram showing the configuration of an elevation database.

Fig. 18A is a block diagram showing the configuration of a sensor device provided with a LiDAR sensor, and Fig. 18B is a side view showing the external configuration of a mobile object on which a LiDAR sensor is mounted.

Fig. 19 is a diagram for explaining the association of images, GPS data, and elevation data.

Fig. 20 is an explanatory diagram of a method of calculating ground surface elevation using LiDAR.

Fig. 21 is a flowchart showing composite image creation processing.

Fig. 22 is an explanatory diagram of a ground orthogonal coordinate system and a composite image coordinate system.

Fig. 23 is an explanatory diagram of the conversion of latitude and longitude into the ground orthogonal coordinate system.

Fig. 24 is an explanatory diagram of a mesh.

Fig. 25 is a diagram showing the vectors at the four corners of an image and the mesh.

Fig. 26 is a diagram for explaining a method of calculating the projected positions of the corners of an image.

Fig. 27 is a diagram for explaining a method of calculating the projected positions of the corners of an image.

Fig. 28 is a diagram for explaining the conversion of the projected positions of the four corners of an image into the composite image coordinate system.

Description of Embodiments

Hereinafter, embodiments of this technology will be described with reference to the drawings. The description will proceed in the following order.

<1. First Embodiment>

[1-1. Configuration of the image processing system]

[1-2. Configuration of the mobile object]

[1-3. Configuration of the imaging device]

[1-4. Configuration of the sensor device]

[1-5. Configuration of the terminal device and the cloud]

[1-6. Processing executed by the image processing system]

[1-7. Association processing of images and sensor information]

[1-8. Composite image creation processing]

<2. Second Embodiment>

[2-1. Overview of composite image creation according to the second embodiment]

[2-2. Overall processing according to the second embodiment]

[2-3. Composite image creation processing]

<3. Modifications>

<1. First Embodiment>

[1-1. Configuration of the image processing system]

First, the configuration of an image processing system 1000 will be described with reference to Fig. 1. The image processing system 1000 includes a mobile object 100, an imaging device 200, a sensor device 300, a terminal device 400, and a cloud 500.

In the present embodiment, the mobile object 100 is a small electric aircraft (unmanned aerial vehicle) called a drone. The imaging device 200 is mounted on the mobile object 100 and captures images of the ground surface while the mobile object 100 is in flight. The sensor device 300 is provided with various sensors, is mounted on the mobile object 100, and supplies the sensor information obtained by the various sensors to the imaging device 200. Note that "imaging of the ground surface" is assumed to include not only the earth's surface and the ground surface itself, but also artificial objects (such as buildings constructed on the ground and roads laid over the ground surface) and natural objects.

The terminal device 400 is a computer or the like used on the ground by a user of the image processing system 1000 or the like, and executes composite image creation processing using the images captured by the imaging device 200.

The cloud 500 is configured by the servers of a cloud service provider or the like, and executes composite image creation processing using the images captured by the imaging device 200.

The composite image creation processing creates a single large composite image by arranging the multiple images captured by the imaging device 200 mounted on the mobile object 100 while the mobile object 100 moves, so that adjacent images partially overlap. The images captured by the imaging device 200 are therefore supplied to the terminal device 400 and/or the cloud 500.

Note that in a case where the composite image creation processing is executed by the terminal device 400, the processing does not have to be executed by the cloud 500, and in a case where the composite image creation processing is executed by the cloud 500, the processing does not have to be executed by the terminal device 400. The composite image creation processing may also be executed by both the terminal device 400 and the cloud 500. Alternatively, the cloud 500 may hold composite images created by the terminal device 400 without executing the composite image creation processing itself, and provide a composite image in response to a request from a user.

In the present embodiment, an imaging plan is determined in advance. The imaging plan indicates at which positions (imaging positions) and over which ranges (imaging ranges) imaging is to be executed at fixed intervals, in combination with a flight plan (altitude, speed, and path) of the mobile object 100 that takes into account the specifications of the imaging device 200 (horizontal/vertical angle of view of the lens) and its operation settings (shutter cycle). The mobile object 100 flies in accordance with the flight plan, and the imaging device 200 executes imaging in accordance with the imaging plan.
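Although the disclosure does not give formulas for deriving such a plan, the relationship between these quantities can be illustrated with a short sketch (hypothetical function names and example values, assuming a nadir-pointing camera over flat ground):

    import math

    def ground_footprint(altitude_m, h_fov_deg, v_fov_deg):
        # Ground area covered by a single image for a nadir-pointing camera.
        width = 2 * altitude_m * math.tan(math.radians(h_fov_deg) / 2)
        height = 2 * altitude_m * math.tan(math.radians(v_fov_deg) / 2)
        return width, height

    def trigger_spacing(footprint_along_track_m, overlap_ratio):
        # Distance between imaging positions that leaves the desired overlap.
        return footprint_along_track_m * (1 - overlap_ratio)

    w, h = ground_footprint(altitude_m=50.0, h_fov_deg=73.7, v_fov_deg=53.1)
    print(trigger_spacing(h, overlap_ratio=0.2))  # imaging interval in meters

Dividing this interval by the planned flight speed likewise yields a shutter cycle for a fixed-time-interval plan.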

[1-2. Configuration of the mobile object]

The configuration of the mobile object 100 will be described with reference to Figs. 2 and 3. Fig. 2A is a plan view of the mobile object 100, and Fig. 2B is a front view of the mobile object 100. The body includes, for example, a cylindrical or polygonal-cylindrical fuselage 1 as a central part, and support shafts 2a to 2f fixed to the upper portion of the fuselage 1. As one example, the fuselage 1 has a hexagonal cylindrical shape, and the six support shafts 2a to 2f extend radially outward from the fuselage 1 at equiangular intervals. The fuselage 1 and the support shafts 2a to 2f include lightweight, high-strength materials.

Furthermore, the shapes and arrangement of the component parts of the body, including the fuselage 1 and the support shafts 2a to 2f, are designed so that the center of gravity lies on a vertical line passing through the center of the support shafts 2a to 2f. Moreover, a circuit unit 5 and a battery 6 are provided inside the fuselage 1 so that the center of gravity lies on this vertical line.

In the example of Fig. 2, there are six rotors and motors. However, a configuration with four rotors and motors, or a configuration with eight or more rotors and motors, is also acceptable.

Motors 3a to 3f serving as the drive sources of the rotors are mounted at the tip ends of the support shafts 2a to 2f, respectively. Rotors 4a to 4f are mounted on the rotation shafts of the motors 3a to 3f. The circuit unit 5, which includes a control unit for controlling each motor, is mounted in the central part where the support shafts 2a to 2f converge.

The motor 3a and rotor 4a form a pair with the motor 3d and rotor 4d. Similarly, (motor 3b, rotor 4b) and (motor 3e, rotor 4e) form a pair, and (motor 3c, rotor 4c) and (motor 3f, rotor 4f) form a pair.

The battery 6 serving as a power source is arranged on the bottom inside the fuselage 1. The battery 6 includes, for example, a lithium-ion secondary battery and a battery control circuit that controls charging and discharging. The battery 6 is removably mounted inside the fuselage 1. Aligning the center of gravity of the battery 6 with the center of gravity of the body improves the stability of the center of gravity.

A small electric aircraft commonly called a drone achieves the desired flight by controlling the outputs of its motors. For example, in a hovering state, remaining stationary in the air, a gyro sensor mounted on the body is used to detect the inclination, and the body is kept level by increasing the motor output on the lowered side of the body and reducing the motor output on the raised side. Furthermore, when advancing forward, reducing the motor output in the direction of travel and increasing the motor output in the opposite direction tilts the body forward and produces thrust in the direction of travel. In such attitude control and propulsion control of a small electric aircraft, the installation position of the battery 6 described above strikes a balance between the stability of the body and the ease of control.
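As a purely illustrative sketch of the differential output adjustment described above (the gain value, duty-ratio representation, and function name are assumptions, not part of this disclosure), one tilt axis can be leveled as follows:

    def level_correction(base_output, roll_deg, gain=0.02):
        # Raise the output of the motor on the lowered side and reduce the
        # output on the raised side; roll_deg > 0 means the right side dropped.
        delta = gain * roll_deg
        left_output = max(0.0, min(1.0, base_output - delta))
        right_output = max(0.0, min(1.0, base_output + delta))
        return left_output, right_output

    # A 5-degree drop on the right side: the right motor is driven harder.
    print(level_correction(0.5, roll_deg=5.0))  # -> (0.4, 0.6)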

Fig. 3 is a block diagram showing the configuration of the mobile object 100. The mobile object 100 is provided with a control unit 101, a Global Positioning System (GPS) module 102, an inertial measurement unit (IMU) module 103, an altimeter 104, an azimuth indicator 105, a communication unit 106, the battery 6, and the motors 3a to 3f. Note that the support shafts, rotors, and the like described above in the external configuration of the mobile object 100 are omitted. The control unit 101, GPS module 102, IMU module 103, altimeter 104, azimuth indicator 105, and communication unit 106 are assumed to be included in the circuit unit 5 shown in the external view of the mobile object 100 in Fig. 2.

The control unit 101 includes a central processing unit (CPU), random access memory (RAM), read-only memory (ROM), and the like. The ROM stores programs that are loaded and run by the CPU. The RAM is used as work memory for the CPU. By executing various processes and issuing commands in accordance with the programs stored in the ROM, the CPU controls the mobile object 100 as a whole, sets the imaging timing of the imaging device 200, and transmits the imaging instruction signal to the imaging device 200, among other tasks.

In addition, the control unit 101 controls the flight of the mobile object 100 by controlling the outputs of the motors 3a to 3f.

The GPS module 102 acquires the current position (latitude and longitude information) and the current time of the mobile object 100, and supplies the acquired information to the control unit 101.

The IMU module 103 is an inertial measurement device. By using acceleration sensors, angular velocity sensors, gyro sensors, and the like to calculate three-dimensional angular velocities and accelerations for two or three axis directions, the IMU module 103 detects attitude information of the mobile object 100, namely its attitude and inclination, the angular velocity when turning, and the angular velocity about the Y axis, and supplies the detected attitude information to the control unit 101.

The altimeter 104 measures the altitude at which the mobile object 100 is located and supplies altitude data to the control unit 101. The altimeter 104 is a barometric altimeter, a radio altimeter, or the like. A barometric altimeter detects atmospheric pressure, whereas a radio altimeter emits radio waves toward the ground surface directly below, measures the reflected waves from the ground surface, and calculates the altitude from the time between the emission of the radio waves and the arrival of the reflected waves.
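Expressed as a formula (a standard time-of-flight relation, not stated explicitly in this disclosure), with c being the propagation speed of the radio wave and Δt the measured time from emission to arrival of the reflected wave, the altitude is:

    h = c · Δt / 2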

The azimuth indicator 105 detects the heading of the mobile object 100 using the action of magnets, and supplies the heading to the control unit 101.

The communication unit 106 is any of various communication terminals or communication modules for transmitting and receiving data to and from the imaging device 200. The imaging instruction signal that triggers imaging is transmitted from the mobile object 100 to the imaging device 200. This communication may be wired communication such as Universal Serial Bus (USB) communication, or wireless communication such as a wireless local area network (LAN) like Wi-Fi, Bluetooth (registered trademark), or ZigBee. In addition, the communication unit 106 also communicates with external equipment (such as a personal computer, tablet terminal, or smartphone) serving as equipment (referred to as a base station) for controlling the mobile object 100 from the ground. The mobile object 100 transmits its in-flight status to the base station through communication by the communication unit 106, and receives instructions and the like from the base station. Because the mobile object 100 is an aircraft flying in the air, the communication with the base station is performed by wireless communication.

In the present embodiment, the imaging device 200 is mounted on the underside of the mobile object 100, and the sensor device 300 is mounted on the top of the mobile object 100. Note that the sensor device 300 is not limited to being mounted on the top of the mobile object 100, but mounting it on top is preferable for easily obtaining information such as GPS.

[1-3. Configuration of the imaging device]

As shown in Fig. 2B, the imaging device 200 is mounted hanging from the bottom of the fuselage 1 of the mobile object 100 via a camera mount 50 functioning as a gimbal. By the driving of the camera mount 50, the imaging device 200 can point its lens in any direction through 360 degrees from the horizontal direction to the vertical direction to capture images. Note that the operation of the camera mount 50 is controlled by the control unit 101 on the basis of the imaging plan.

The configuration of the imaging device 200 will be described with reference to the block diagram in Fig. 4. The imaging device 200 is provided with a control unit 201, an optical imaging system 202, a lens drive driver 203, an image sensor 204, an image signal processing unit 205, image memory 206, a storage unit 207, a communication unit 208, an IMU module 209, an azimuth indicator 210, and an information processing unit 250.

The control unit 201 includes a CPU, RAM, ROM, and the like. The CPU controls the imaging device 200 as a whole by executing various processes and issuing commands in accordance with programs stored in the ROM.

In addition, the control unit 201 also functions as the information processing unit 250. The information processing unit 250 executes processing that stores an image obtained by imaging in the storage unit 207 in association with the sensor information transmitted from the sensor device 300 as metadata. Details of the processing by the information processing unit 250 will be described later.

Note that the information processing unit 250 includes a program, and the program may be preinstalled in the imaging device 200, or may be downloaded, distributed on a storage medium or the like, and installed by the user. Furthermore, the information processing unit 250 may also be configured independently of the control unit 201. Moreover, the information processing unit 250 may be realized not only by a program but also by a combination of a dedicated device, a circuit, or other hardware having the functions of the information processing unit 250.

The optical imaging system 202 includes an imaging lens for converging light from a subject onto the image sensor 204, a drive mechanism for moving the imaging lens to perform focusing and zooming, a shutter mechanism, an iris mechanism, and the like. These are driven on the basis of control signals from the control unit 201 and the lens drive driver 203 of the imaging device 200. An optical image of the subject obtained through the optical imaging system 202 is formed on the image sensor 204 provided in the imaging device 200.

The lens drive driver 203 includes, for example, a microcontroller or the like, and performs autofocus to bring a target subject into focus by moving the imaging lens by a predetermined amount in the optical axis direction in accordance with control by the control unit 201. In addition, in accordance with control by the control unit 201, the lens drive driver 203 controls the operation of the drive mechanism, shutter mechanism, iris mechanism, and other elements of the optical imaging system 202. With this arrangement, the exposure time (shutter speed) and the aperture value (f-number) are adjusted, among other things.

The image sensor 204 photoelectrically converts incident light from the subject into an electric charge and outputs pixel signals. The image sensor 204 then outputs the pixel signals to the image signal processing unit 205. A charge-coupled device (CCD) sensor, a complementary metal-oxide-semiconductor (CMOS) sensor, or the like is used as the image sensor 204.

The image signal processing unit 205 subjects the imaging signal output from the image sensor 204 to sample-and-hold processing for maintaining a good signal-to-noise (S/N) ratio through correlated double sampling (CDS) processing, automatic gain control (AGC) processing, analog/digital (A/D) conversion, and the like, and creates an image signal.

In addition, the image signal processing unit 205 may also subject the image signal to predetermined signal processing such as demosaicing processing, white balance adjustment processing or color correction processing, gamma correction processing, Y/C conversion processing, automatic exposure (AE) processing, and resolution conversion processing.

The image memory 206 is, for example, buffer memory including volatile memory such as dynamic random access memory (DRAM). The image memory 206 temporarily buffers image data that has undergone the predetermined processing by the image signal processing unit 205.

The storage unit 207 is, for example, a large-capacity storage medium such as a hard disk, USB flash memory, or an SD memory card. Captured images are saved in a compressed or uncompressed state based on a standard such as the Joint Photographic Experts Group (JPEG) standard, for example. In addition, exchangeable image file format (EXIF) data containing additional information, such as information related to the saved image, imaging position information indicating the imaging position, and imaging time information indicating the imaging date and time, is also stored in association with the image. Furthermore, the IMU data obtained by the IMU module 209 is associated with the image as Extensible Metadata Platform (XMP) data. The information processing unit 250 stores the image in the storage unit 207 in a state associated with the sensor information as metadata.
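As an illustration only, such an association could be recorded as in the following sketch (the JSON side-file layout and field names are assumptions made here for clarity; the disclosure itself uses EXIF and XMP):

    import json, os

    def save_with_metadata(image_path, gps, imu, altitude=None):
        # Store the sensor information as a metadata record next to the image.
        # gps: (latitude, longitude); imu: (roll, pitch, yaw) in degrees.
        record = {
            "image": os.path.basename(image_path),
            "gps": {"lat": gps[0], "lon": gps[1]},
            "imu": {"roll": imu[0], "pitch": imu[1], "yaw": imu[2]},
            "altitude_m": altitude,
        }
        with open(image_path + ".meta.json", "w") as f:
            json.dump(record, f, indent=2)

    save_with_metadata("IMG_0001.jpg", gps=(35.6586, 139.7454), imu=(0.5, -1.2, 88.0))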

The communication unit 208 is any of various communication terminals or communication modules for transmitting and receiving data to and from the mobile object 100 and the sensor device 300. In communication with the mobile object 100, the communication unit 208 receives the imaging instruction signal from the mobile object 100. In communication with the sensor device 300, the communication unit 208 transmits an exposure notification signal that notifies the sensor device 300 of a predetermined timing between the start and the end of exposure, and additionally receives the sensor information transmitted from the sensor device 300. The communication with the mobile object 100 and the sensor device 300 may be wired communication such as USB communication, or wireless communication such as a wireless LAN like Wi-Fi, Bluetooth (registered trademark), or ZigBee. Note that in a case where the communication between the imaging device 200 and the mobile object 100 is performed by USB communication, the imaging device 200 can receive power supplied from the battery 6 of the mobile object 100.

The IMU module 209 and the azimuth indicator 210 are similar to those provided in the mobile object 100. The IMU module 209 detects the attitude and inclination of the imaging device 200, and the azimuth indicator 210 detects the imaging direction of the imaging device 200.

[1-4. Configuration of the sensor device]

The configuration of the sensor device 300 will be described with reference to the block diagram in Fig. 5. The sensor device 300 is provided with a control unit 301, a GPS module 302, an IMU module 303, an altimeter 304, an azimuth indicator 305, an illuminance sensor 306, and a communication unit 307.

The control unit 301 includes a CPU, RAM, ROM, and the like. The CPU controls the sensor device 300 as a whole by executing various processes and issuing commands in accordance with programs stored in the ROM.

The GPS module 302, IMU module 303, altimeter 304, and azimuth indicator 305 are similar to those provided in the mobile object 100. In this way, the sensor device 300 is provided with a plurality of sensors, and supplies the sensor information obtained by these sensors to the imaging device 200.

The GPS module 302 is for detecting the horizontal position coordinates of the mobile object 100 while the mobile object 100 is in flight. The IMU module 303 is for detecting the inclination of the mobile object 100 while the mobile object 100 is in flight. The altimeter 304 is for detecting the altitude of the mobile object 100 while the mobile object 100 is in flight. The azimuth indicator 305 is for detecting the heading of the mobile object 100 while the mobile object 100 is in flight. Note that the mobile object 100 is also provided with the GPS module 102, IMU module 103, altimeter 104, and azimuth indicator 105, but those provided in the mobile object 100 are for the flight of the mobile object 100 itself, whereas those provided in the sensor device 300 are for obtaining the sensor information to be associated with images.

The illuminance sensor 306 is provided with a photodetector, detects brightness by converting light incident on the photodetector into a current, and supplies the detected brightness to the control unit 301. Since the brightness of the imaging environment and of the images changes depending on the imaging time, the weather, and so on, the brightness is detected by the illuminance sensor 306. Furthermore, by dividing the sensitivity characteristics of the illuminance sensor 306 into a plurality of wavelength bands, it is also possible to identify the spatial distribution of the sunlight falling on the mobile object 100 and the ground at the point in time when an image is captured, or the energy ratios in a plurality of specific wavelength bands.

The communication unit 307 is any of various communication terminals or communication modules for transmitting and receiving data to and from the imaging device 200. The communication unit 307 receives the exposure notification signal from the imaging device 200, and also transmits the sensor information obtained by the sensor device 300 to the imaging device 200. The communication with the imaging device 200 may be wired communication such as USB communication, or wireless communication such as a wireless LAN like Wi-Fi, Bluetooth (registered trademark), or ZigBee. Note that in a case where the communication between the sensor device 300 and the imaging device 200 is performed by USB communication, the sensor device 300 can receive power supplied from the battery 6 of the mobile object 100 through the imaging device 200.

The mobile object 100, the imaging device 200, and the sensor device 300 are configured as described above. The drone, that is, the mobile object 100 with the imaging device 200 mounted on it, is capable not only of being operated manually by an operator, but also of automatic flight and automatic imaging utilizing GPS data and IMU data. In a case where automatic flight and automatic imaging are performed, route information on the flight path and imaging information such as the imaging positions, imaging direction, and imaging times are set in advance, and the control unit 101 of the mobile object 100 controls the flight of the mobile object 100 and instructs the imaging device 200 to perform imaging in accordance with the set content. Alternatively, the route information and imaging information may also be acquired from the base station by wireless communication.

[1-5. Configuration of the terminal device and the cloud]

The terminal device 400 is a personal computer or the like, and executes the composite image creation processing using the images captured by the imaging device 200 and the sensor information. The images captured by the imaging device 200 and the sensor information are transferred from the storage unit 207 of the imaging device 200 to the terminal device 400 via a portable storage medium such as USB flash memory or an SD memory card. Note that the imaging device 200 may also be configured to save the images and sensor information directly to the portable storage medium without going through the storage unit 207. Furthermore, the images may also be transferred through communication between the communication unit 208 of the imaging device 200 and the terminal device 400. The terminal device 400 may be a laptop PC or a desktop PC, and may also be a tablet terminal, a smartphone, a game console, or the like, as long as the device is provided with sufficient processing capability and is capable of executing the composite image creation processing. A device or processing unit that executes the composite image creation processing, such as the terminal device 400, corresponds to the "image processing apparatus" in the claims.

The cloud 500 likewise executes the composite image creation processing using the images captured by the imaging device 200 and the sensor information. The cloud refers to one form of computer usage, and is built on the servers of a cloud service provider. A cloud service is a service provided by servers residing on a network, and is one form of computer usage based on the Internet. Essentially all of the required processing is performed on the server side. Users save data on the servers over the Internet, rather than on their own PC, smartphone, mobile phone, or the like. Users can therefore also use the service and view, edit, or upload data in a variety of environments, such as at home, at the office, in an Internet cafe, at school, or at places they visit. The composite image creation processing according to the embodiments of this technology can be executed by the cloud 500 and provided to consumers as a cloud service.

Images and sensor information are transmitted from the terminal device 400 to the cloud 500 over the Internet. Alternatively, in a case where the imaging device 200 is provided with an Internet connection function, the images and sensor information may also be transmitted from the imaging device 200 to the cloud 500 over the Internet. A composite image created by the cloud 500 may be held only in the cloud 500, or may also be transmitted to the terminal device 400 and/or the imaging device 200.

[1-6. Processing executed by the image processing system]

Next, an overview of the processing executed by the image processing system 1000 will be described with reference to the flowchart in Fig. 6. First, in step S11, a preprogrammed flight plan is uploaded to the control unit 101 of the mobile object 100, and in addition, a preprogrammed imaging plan is uploaded to the control unit 201 of the imaging device 200. The upload is performed, for example, by user input or by an operation of automatic transmission from the base station. Next, in step S12, the mobile object 100 begins its flight. Next, in step S13, imaging is executed by the imaging device 200 while the mobile object 100 is in flight. In addition, in parallel with the imaging by the imaging device 200, sensor information is obtained by the sensor device 300. Next, in step S14, the captured image is stored in the storage unit 207 of the imaging device 200 in association with the sensor information.

Next, the process proceeds to step S15, in which the images and the sensor information associated with them as metadata are transmitted from the storage unit 207 of the imaging device 200 to the terminal device 400 and/or the cloud 500. Next, in step S16, the terminal device 400 determines whether imaging has been performed at all of the imaging positions in the imaging plan, and steps S13 to S16 are repeated until imaging has been performed at all of the imaging positions (step S16, No). Then, in step S17, the composite image creation processing is executed by the terminal device 400 and/or the cloud 500. The details of each process will be described later.

[1-7. Association processing of images and sensor information]

Next, the processing of associating images and sensor information according to the present embodiment will be described with reference to Figs. 7 to 10. Fig. 7 is a diagram for explaining the flight path and imaging positions of the mobile object 100.

The dotted line in Fig. 7 indicates the entire imaging range of the ground surface to be imaged by the imaging device 200 as set in the imaging plan. The composite image eventually created by the composite image creation processing is an image encompassing this entire imaging range.

The solid line in Fig. 7 indicates the flight path of the mobile object 100, which is set so that the entire imaging range is imaged, and the multiple points on the solid line indicate the imaging positions at which imaging is executed by the imaging device 200. Since the mobile object 100 includes the GPS module 102 and the IMU module 103 for obtaining its own position and attitude after takeoff, the mobile object 100 flies along the flight path in accordance with the flight plan, and performs imaging at each imaging position by transmitting the imaging instruction signal to the imaging device 200 on the basis of the imaging plan.

Note that the method of imaging at fixed intervals may be a method of automatically repeating imaging at fixed distance intervals on the basis of GPS data, or a method of automatically repeating imaging at fixed time intervals, and the method is not limited to either one. In the case of imaging at fixed distance intervals, the imaging instruction is transmitted from the mobile object 100 to the imaging device 200. This is because the position of the mobile object 100 and the imaging device 200 is acquired by the GPS module 102 provided in the mobile object 100, and the imaging instruction is issued on the basis of the GPS data indicating the position. On the other hand, in the case of imaging at fixed time intervals, the imaging instruction may be issued by whichever of the mobile object 100 and the imaging device 200 is provided with a timekeeping function.
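For the fixed-distance-interval variant, a minimal trigger check might look like the following sketch (the equirectangular distance approximation and the 30 m interval are illustrative assumptions):

    import math

    def distance_m(p, q):
        # Approximate ground distance between two (lat, lon) points in meters.
        dlat = (q[0] - p[0]) * 111_320.0
        dlon = (q[1] - p[1]) * 111_320.0 * math.cos(math.radians(p[0]))
        return math.hypot(dlat, dlon)

    def should_trigger(last_shot_pos, current_pos, interval_m=30.0):
        # Issue a new imaging instruction every interval_m meters of travel.
        return distance_m(last_shot_pos, current_pos) >= interval_m

    print(should_trigger((35.00000, 139.00000), (35.00030, 139.00000)))  # True (~33 m)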

As shown in the enlarged view of the extracted portion around imaging positions A, B, C, and D in Fig. 7, the flight plan and the imaging plan are set so that the imaging ranges overlap in the direction of travel of the mobile object 100 and in the direction between flight paths. By performing imaging so that overlapping portions appear in the imaging ranges in this way, gaps and missing portions can be prevented in the case of creating a composite image using the multiple images obtained by imaging.

Note that the flight plan and imaging plan shown in Fig. 7 are provided for ease of explanation of this technology, and this technology is not limited to such a flight path and imaging positions.

Next, the flow of processing among the mobile object 100, the imaging device 200, and the sensor device 300 will be described with reference to the sequence diagram in Fig. 8. First, in step S21, the imaging instruction signal based on the imaging plan is transmitted from the mobile object 100 to the imaging device 200.

Next, the imaging device 200 that has received the imaging instruction signal starts exposure in step S22, and after acquiring an image in step S23, ends the exposure in step S24 once the predetermined exposure time has elapsed. Next, in step S25, at the point in time when the exposure time has elapsed and the exposure has ended, the exposure notification signal indicating that the exposure has ended is transmitted from the imaging device 200 to the sensor device 300.

Next, in step S26, sensor information is transmitted to the imaging device 200 from the sensor device 300 that has received the exposure notification signal from the imaging device 200. In addition, in step S27, the imaging device 200 stores the image obtained by imaging in the storage unit 207 in association with the sensor information as metadata.

The above processing is executed for every image obtained by imaging.

Next, the processing of associating the sensor information obtained by the sensor device 300, as metadata, with the images captured by the imaging device 200 will be described with reference to Fig. 9. All images captured on the basis of the flight plan and imaging plan shown in Fig. 7 are stored in the storage unit 207 in association with the sensor information obtained by the sensor device 300 as metadata. Note that the following description referring to Fig. 9 assumes that the sensor information transmitted from the sensor device 300 to the imaging device 200 is the GPS data obtained by the GPS module 302 (corresponding to the "position information" in the claims) and the IMU data obtained by the IMU module 303 (corresponding to the "inclination information" in the claims). Note that the sensor information may also include the altitude data obtained by the altimeter 304 (corresponding to the "altitude information" in the claims) and the like.

When the imaging instruction signal is transmitted from the mobile object 100 to the imaging device 200, the imaging device 200 that has received the signal starts exposure, and after the predetermined exposure time has elapsed, the exposure ends and an image is acquired. Meanwhile, asynchronously with respect to the imaging instructions and imaging timings, the sensor device 300 periodically acquires GPS data with the GPS module 302 at predetermined time intervals, and additionally, periodically acquires IMU data with the IMU module 303 at predetermined time intervals. The timing at which the GPS module 302 acquires GPS data and the timing at which the IMU module 303 acquires IMU data may be asynchronous or synchronous. Note that when new GPS data and IMU data are acquired, it is preferable that all of the old data be retained rather than deleted.

First, when the imaging instruction signal is transmitted from the mobile object 100 to the imaging device 200, the imaging device 200 starts exposure, and when the exposure ends after the predetermined exposure time has elapsed, an image is acquired; in addition, the exposure notification signal indicating the moment of the end of exposure is transmitted from the imaging device 200 to the sensor device 300.

Upon receiving the exposure notification signal, the sensor device 300 transmits to the imaging device 200 the GPS data acquired at the time closest to the time at which the exposure notification signal was received. Likewise, upon receiving the exposure notification signal, the sensor device 300 transmits to the imaging device 200 the IMU data acquired at the time closest to the time at which the exposure notification signal was received.

As described above, the imaging timing of the imaging device 200 and the acquisition of GPS data and IMU data in the sensor device 300 are asynchronous, so the time at which the sensor device 300 receives the exposure notification signal from the imaging device 200 does not necessarily match the times at which the sensor device 300 acquires GPS data and IMU data. Therefore, the sensor device 300 transmits to the imaging device 200 the most recent GPS data and IMU data, acquired at the time closest to the time at which the exposure notification signal was received.
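One way to realize this most-recent-sample selection is sketched below (the class and the timing values are illustrative; the disclosure requires only that the data acquired closest to the notification time be chosen):

    import bisect

    class SensorLog:
        # Time-stamped sensor samples, appended periodically and never deleted.
        def __init__(self):
            self.timestamps = []  # seconds, monotonically increasing
            self.samples = []

        def add(self, t, sample):
            self.timestamps.append(t)
            self.samples.append(sample)

        def closest_to(self, t_notify):
            # Sample whose acquisition time is nearest the exposure notification.
            i = bisect.bisect_left(self.timestamps, t_notify)
            candidates = [j for j in (i - 1, i) if 0 <= j < len(self.samples)]
            best = min(candidates, key=lambda j: abs(self.timestamps[j] - t_notify))
            return self.samples[best]

    gps_log = SensorLog()
    gps_log.add(10.0, (35.6586, 139.7454))
    gps_log.add(10.5, (35.6587, 139.7455))
    print(gps_log.closest_to(10.6))  # -> (35.6587, 139.7455)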

In the case where Fig. 9, since the acquisition of GPS data A is to receive instruction for acquisition figure in sensor device 300 It is executed as the time point of the exposure notification signal of the exposure of data A, therefore when the acquisition of GPS data A is completed, sensor dress It sets 300 and sends imaging device 200 for GPS data A.Similarly, due to receiving instruction for obtaining in sensor device 300 The time point of the exposure notification signal of the exposure of image data A executes the acquisition of IMU data A, therefore works as the acquisition of IMU data A When completion, IMU data A is sent imaging device 200 by sensor device 300.In this way, exposure is received closest GPS data A and IMU the data A obtained at the time of at the time of notification signal is sent to imaging device 200 from sensor device 300.

Then, information process unit 250 is using GPS data A and IMU data A as metadata preservation associated with image A In storage unit 207.

When next imaging instruction signal is sent imaging device 200 by mobile object 100, imaging device 200 is similar Ground executes exposure, and when end exposure, obtains image B by imaging, in addition, sending sensor for exposure notification signal Device 300.In the case where Fig. 9, since the acquisition of GPS data B is to receive instruction for acquisition figure in sensor device 300 It is executed as the time point of the exposure notification signal of the exposure of data B, therefore, when the acquisition of GPS data B is completed, sensor GPS data B is sent to imaging device 200 by device 300.Similarly, since the acquisition of IMU data B is in sensor device 300 The time point execution of the exposure notification signal of exposure of the instruction for obtaining image data B is received, therefore, as IMU data B Acquisition complete when, IMU data B is sent to imaging device 200 by sensor device 300.In this way, it is received closest GPS data B and IMU the data B obtained at the time of at the time of exposing notification signal is sent to imaging dress from sensor device 300 Set 200.Then, information process unit 250 is stored in association using GPS data B and IMU data B as metadata and image B In storage unit 207.

In this way, all captured images are saved in storage unit associated with sensor information simultaneously In 207.

Note that triggering image and sensor information between associated exposure notification signal transmission can also instruction at As sometime execution at the time of device 200, which executes, to be imaged between the end exposure moment.In addition, the hair of exposure notification signal Sending can also sometime execution at the time of imaging device 200 begins preparing exposure between the end exposure moment.This In the case where receiving imaging instruction signal and being exposed between preparation there are a time lag effectively.In addition, exposure notification signal Send can also imaging device 200 exposure start time and the end exposure moment between sometime execution.At this In the case of kind, the GPS information obtained at the time of closer to capture image can be obtained.When exposing start time and end exposure The specific example of period between quarter is at the time of exposure starts or at the time of end exposure.

In this way, by obtaining GPS data so that the phototiming of imaging device 200 is triggering, it is available more Close to the GPS data of phototiming, therefore more accurate GPS data can be assigned to image.For example, the feature with extraction image Point and the case where merging image to create composograph while cross-check characteristic point is compared, due to figure As associated GPS data and IMU data are with the state relation of high time precision, therefore only rely on GPS data and IMU data just Image can be merged.Therefore, faster composograph creation is treated as possibility.In this way, in this technique, Ke Yichuan Composograph is built, without extracting the processing with the characteristic point of cross-check image.

Here, a case of calculating new, more accurate GPS data from GPS data A and GPS data B acquired before and after the imaging timing will be described with reference to Fig. 10. In the example of Fig. 10, new GPS data can be calculated from the GPS data A and GPS data B acquired before and after the time at which the exposure notification signal for acquiring image A is received, and the new GPS data can be treated as the GPS data to be associated with image A.

For example, in a case where the time at which the exposure notification signal is received falls midway (at a ratio of 5:5) between the time at which GPS data A is acquired and the time at which GPS data B is acquired, an intermediate value between GPS data A and GPS data B is calculated by a weighted average or the like, and is treated as the new GPS data (sensor information) to be associated with image A. In Fig. 10, the new GPS data is denoted "GPS data N". Note that in this case, the new GPS data N is transmitted from the sensor device 300 to the imaging device 200 after GPS data B is acquired. Calculating new sensor information from a plurality of pieces of sensor information is not limited to GPS data, and can be applied to all of the sensor information handled by this technology, such as IMU data and altitude data.
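A minimal sketch of this calculation, assuming linear weighting by time (the disclosure says only "a weighted average or the like"), is:

    def interpolate_gps(t_a, gps_a, t_b, gps_b, t_notify):
        # Weighted average of two GPS fixes at the exposure-notification time;
        # t_a <= t_notify <= t_b is assumed.
        w = (t_notify - t_a) / (t_b - t_a)
        lat = gps_a[0] * (1 - w) + gps_b[0] * w
        lon = gps_a[1] * (1 - w) + gps_b[1] * w
        return lat, lon

    # Notification exactly midway (5:5) between the two fixes -> their midpoint.
    print(interpolate_gps(10.0, (35.0000, 139.0000), 11.0, (35.0002, 139.0002), 10.5))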

Be not limited at the time of note that serving as GPS data and IMU data associated with image benchmark end exposure when Between.At the time of reference instant is also possible to the imaging instruction of mobile object 100 or imaging device 200 itself, or exposure is opened At the time of beginning.

[1-8. composograph creation processing]

Next, by description according to the processing of the multiple images creation composograph captured by imaging device 200.In outside Composograph creation processing is executed in terminal installation 400 and/or cloud 500.However, in the information process unit of imaging device 200 In the case that 250 have the enough processing capacities for executing composograph creation processing, composite diagram can be executed by imaging device 200 As creation is handled.

Firstly, the process that composograph creation processing will be described with reference to the flow chart in Figure 11.It is right in step S31 One of image for creating composograph executes correction process.Correction process is based on associated with image as metadata Sensor information corrects the processing of image.The details of correction process will be described later.

Next, in step S32, processing of arranging the corrected image to create the composite image is executed. The details of the arrangement processing will be described later. Next, in step S33, it is determined whether the arrangement processing has been performed on all images. In the case where the arrangement processing has not yet been executed on all images, the processing proceeds to step S31, and steps S31 to S33 are repeated until all images have been arranged (step S33, No).

Then, in the case where it is determined in step S33 that the arrangement processing has been performed on all images, the processing ends and the composite image is complete (step S33, Yes).
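As a rough illustration of the loop in Fig. 11, the sketch below iterates correction (S31) and arrangement (S32) over the image set; correct_image and arrange_image are hypothetical stand-ins for the processing described in the following sections.

```python
def correct_image(img, meta):
    # Stand-in for step S31: apply the metadata-based corrections
    # (translation, enlargement/reduction, keystone, trimming).
    return img

def arrange_image(canvas, img, meta):
    # Stand-in for step S32: place the corrected image on the canvas
    # at the latitude/longitude indicated by its GPS metadata.
    canvas.append((meta, img))

def create_composite(images, metadata):
    """Fig. 11 flow: correct and arrange one image at a time, looping
    until every image has been arranged (the S33 check)."""
    canvas = []
    for img, meta in zip(images, metadata):
        arrange_image(canvas, correct_image(img, meta), meta)
    return canvas
```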

Figure 12 is a diagram of a composite image created using multiple images, in which the multiple images are arranged so that neighboring images partially overlap. Note that the example of Figure 12 assumes that the composite image is a map of a farm field created using multiple images captured by imaging the farm field from the air.

In Figure 12, for ease of description, frames are depicted around some of the images forming the created map image (images A to I), and the shading of these images has been altered. The dots on images A to I are the positions of the latitude and longitude indicated by the GPS data associated with each image, and the dotted lines indicate the latitude and longitude of each image.

In the composite image creation processing, the images are arranged while correcting the positions of the images based on the GPS data associated with each image as metadata. In the flight plan and the imaging plan, as shown in Fig. 7, the imaging positions are arranged on straight lines and the imaging ranges are also set in advance, and ideally imaging would be executed according to plan. However, in actual imaging, the imaging positions and imaging ranges tend to deviate from the imaging plan in many cases due to the influence of wind and the like.

Therefore, in the present technology, the images are arranged with the GPS data treated as the reference. The method of arranging images with the GPS data as the reference may be a method of arranging images by aligning the latitude and longitude of map data associated with latitude and longitude information with the latitude and longitude indicated by the GPS data associated with each image. For the map data, map data associated with latitude and longitude data published by the Geospatial Information Authority of Japan, or map data associated with latitude and longitude data provided by map services on the Internet, can be used.

As described above, the imaging by the imaging device 200 mounted on the mobile object 100 is based in advance on the flight plan and the imaging plan, and at which positions (imaging positions) and over which ranges (imaging ranges) images are captured is predetermined. In the case where an actually captured image deviates from the predetermined imaging position and imaging range, correction is executed using the GPS data.

In the map image of Figure 12, since images A, B, and C have matching longitude, images A, B, and C are arranged along the latitude direction without longitude misalignment. Image D deviates from the imaging position in the imaging plan, and therefore there is a longitude misalignment between image C and image D in the latitude direction. Therefore, as shown in Fig. 13A, when creating the map image, the image is translated in parallel from the imaging position in the imaging plan to the position of the GPS data associated with the image, thereby correcting the arrangement of the image. Because the GPS data is the latitude and longitude information of the imaging position, the image is arranged so that the GPS data serving as the metadata of the image is aligned with the actual latitude and longitude.

In addition, as shown in Fig. 13B, in the case where an image is captured at a position higher than the altitude of the mobile object 100 in the flight plan, the subject becomes smaller and the angle of view becomes wider compared with the case where imaging is executed according to the flight plan and the imaging plan. In this case, correction is performed by executing enlargement processing so that the image matches the size of the subject that would be obtained in the case where imaging is executed according to the flight plan and the imaging plan. Then, the corrected image is arranged to create the composite image.

On the other hand, as shown in Fig. 13C, in the case where an image is captured at a position lower than the altitude of the mobile object 100 in the flight plan, the subject becomes larger and the angle of view becomes narrower compared with the case where imaging is executed according to the flight plan and the imaging plan. In this case, correction is performed by executing reduction processing so that the image matches the size of the subject that would be obtained in the case where imaging is executed according to the flight plan and the imaging plan, and then the corrected image is arranged to create the composite image.
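A minimal sketch of the enlargement/reduction correction of Figs. 13B and 13C, assuming OpenCV is available; the function and parameter names are illustrative. An image taken above the planned altitude is enlarged by the ratio of actual to planned altitude, and one taken below it is reduced by the same ratio.

```python
import cv2

def scale_for_altitude(image, planned_alt_m: float, actual_alt_m: float):
    """Rescale so the subject matches the size it would have if the
    image had been captured at the planned flight altitude."""
    s = actual_alt_m / planned_alt_m  # > 1: flew higher, subject smaller, enlarge
    h, w = image.shape[:2]
    return cv2.resize(image, (round(w * s), round(h * s)))
```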

In addition, in the case of creating a two-dimensional composite image using the images, it is desirable that the imaging device 200 executes imaging substantially parallel to the farm field. However, in the case where the mobile object 100 or the imaging device 200 is inclined during imaging, an error occurs in the imaging range, as shown in Fig. 13D. In this case, correction is performed by executing keystone correction and trimming processing so that the image matches the subject that would be obtained in the case where imaging is executed substantially parallel to the ground according to the flight plan and the imaging plan. Then, the corrected image is arranged to create the composite image.
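A sketch of the keystone correction using a perspective warp, again assuming OpenCV. How the corner correspondences are derived from the IMU inclination data is outside this sketch, so src_corners and dst_corners are taken as given.

```python
import numpy as np
import cv2

def keystone_correct(image, src_corners, dst_corners):
    """Warp the four corners of the tilted shot onto the rectangle they
    would occupy in a ground-parallel shot; trimming corresponds to
    choosing dst_corners inside the output frame."""
    m = cv2.getPerspectiveTransform(np.float32(src_corners),
                                    np.float32(dst_corners))
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, m, (w, h))
```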

In addition, in the case where illuminance sensor data is associated with the image as metadata, correction may also be executed based on the illuminance sensor data to adjust the brightness, color balance, and the like of the image. Note that the correction processing need not be executed on images that do not require correction.

In this way, the composite image is created by arranging all images while also correcting the images based on the sensor information associated with the images as metadata. Note that the metadata associated with each image forming the composite image preferably remains associated with the created composite image. With this arrangement, a position of interest designated by the user in the composite image can be easily specified based on the GPS data and the like.

Note that by treating the latitude and longitude indicated by the GPS data of the image arranged first as the reference for arranging the second and subsequent images, the composite image can also be created without using map data associated with latitude and longitude information.

According to the present technology, the processing of extracting and cross-checking feature points of the images is not needed to create the composite image, and therefore the computational throughput required to create the composite image can be reduced. With this arrangement, the composite image can be created faster than before, and, moreover, the composite image can be created even on inexpensive computers without high processing capability.

For example, in the present technology, because the processing of extracting and cross-checking feature points of the images is not needed to create the composite image, the present technology can be used to create composite images of broad regions, such as farm fields, that lack structures or other salient features.

In addition, because images captured based on the flight plan and the imaging plan are used, the regions where the images overlap each other to create the composite image can be made smaller. With this arrangement, the composite image can be created with fewer images compared with past techniques. Moreover, compared with past techniques, the time required for composite image creation can be shortened. The fact that the regions where the images overlap each other become smaller means that the interval between the back-and-forth flight routes of the mobile object 100 for imaging can be widened, and the imaging efficiency can be enhanced. If the imaging efficiency is enhanced, the region that can be imaged in a single flight can be expanded compared with the past under the same battery conditions. Moreover, being able to create the composite image with fewer images than the prior art means that the battery can be conserved, because less imaging is executed compared with the prior art.

With the configuration in which the imaging device 200 and the sensor device 300 are installed on the mobile object 100, the present technology can be realized by installing the imaging device 200 and the sensor device 300 on a general-purpose drone on the market that is not provided with the specific functions of the present technology. Therefore, introducing products according to the present technology to the market is easy, as is adoption by users.

Because the created composite image is supplied with the metadata associated with the images forming the composite image, in the case of a map image of a farm field, problem locations such as poor growth, disease, and insects can be grasped as position information through the GPS data. Consequently, it is also possible to water, fertilize, apply agricultural chemicals, and the like over specific regions using a tractor equipped with a GPS function.

Note that the present technology does not exclude creating a composite image by extracting and cross-checking feature points of the images so that the images overlap each other. A composite image may also be created by using the method of extracting and cross-checking feature points in combination.

<2. Second embodiment>

[2-1. Overview of composite image creation according to the second embodiment]

Next, the second embodiment of the present technology will be described. The second embodiment executes the creation of a high-precision composite image by using the elevation above sea level of the ground surface in the imaging range. The elevation refers to the height of the ground surface from a datum level, where the mean sea level of the sea serving as the reference (Tokyo Bay in Japan) is treated as the 0 m datum level. First, the overview of composite image creation according to the second embodiment will be described with reference to Figures 14 and 15.

Figure 14 is a diagram of the relationship between the height of the imaging device 200 (the flight altitude of the mobile object 100), the ground surface, the images, and the image arrangement result in the case where variations in the ground elevation are not reflected in the composite image creation. As shown in Figure 14, the composite image is created by projecting and arranging the multiple images obtained by imaging onto a virtual plane different from the ground surface. The plane on which the images are arranged is a plane shared by all images. Note that the height of the imaging device 200 (the flight altitude of the mobile object 100) is assumed to be fixed.

Consider the following case: it is assumed that the elevations of the arrangement destinations of all images obtained by imaging are the same, without reflecting the elevation of the ground surface, and the composite image is created by arranging the images at a predetermined uniform size. In this case, if the elevation of the ground surface varies, errors occur in the sizes of the arranged images, and the images will be arranged so as to appear larger (or smaller) than the actual image size.

As shown in Fig. 14A, in the case where the elevation of the ground surface is lower than the plane on which the images are arranged, the images are composed at a size smaller than the actual image size. In addition, as shown in Fig. 14C, in the case where the elevation of the ground surface is higher than the plane on which the images are arranged, the images are composed at a size larger than the actual image size. In the case where the elevation of the ground surface varies in this way, errors occur in the arrangement of the images in the composite image creation. The error is small in the case where the ground surface and the plane on which the images are arranged are close, but becomes larger as the ground surface and the plane on which the images are arranged become farther apart.

In contrast, in the second embodiment, as shown in Figure 15, by acquiring the elevation of the imaging range of each image and arranging the image on the plane according to that elevation, the composite image is created by arranging the images at the correct size. With this arrangement, a high-precision composite image can be created with higher accuracy. Note that the second embodiment is described with the flight altitude of the mobile object 100 treated as fixed.

Regarding the elevation of the ground surface, map data published by the Geospatial Information Authority of Japan in which elevation is associated with a position indicated by longitude and latitude, or map data associated with elevation provided by map services on the Internet, can be used. Furthermore, the elevation can also be acquired using a ranging sensor.

[2-2. Overall processing according to the second embodiment]

Next, the overview of the overall processing according to the second embodiment, from the flight of the mobile object 100 and the imaging of the imaging device 200 to the image synthesis processing, will be described. The mobile object 100, imaging device 200, sensor device 300, terminal device 400, and cloud 500 forming the image processing system 1000 are similar to those of the first embodiment, and the description is therefore omitted.

Figure 16 is a flowchart showing the entire process flow from the flight of the mobile object 100 to the imaging, the acquisition of the elevation of the ground surface, and the composite image creation processing. Processing identical to the flowchart in Fig. 6 is indicated with the same step numbers, and the detailed description is omitted.

First, in step S11, the pre-programmed flight plan is uploaded to the control unit 101 of the mobile object 100, and, in addition, the pre-programmed imaging plan is uploaded to the control unit 201 of the imaging device 200. Next, in step S12, the flight of the mobile object 100 starts. Note that, in flight, similarly to the description in the first embodiment, the sensor device 300 periodically acquires sensor information such as GPS data and IMU data at predetermined time intervals.

Then, when steps S13 to S16 have been repeated and imaging and sensor information acquisition have been executed at all imaging positions, the processing proceeds to step S41 (step S16, Yes). Note that steps S41 to S43 are processing, including the composite image creation processing, executed by the terminal device 400 and/or the cloud 500 after the flight of the mobile object 100 ends.

Next, in step S41, the terminal device 400 and/or cloud 500 that executes the composite image creation processing acquires, from the GPS module 302 of the sensor device 300, the GPS data indicating the positions of the mobile object 100 in flight. The GPS data can be regarded as indicating the flight path of the mobile object 100. The GPS data is acquired in flight by the GPS module 302 while the mobile object 100 is flying, and can be obtained after the flight by reading the information stored in the memory inside the sensor device 300 or in the storage unit 207 of the imaging device 200. Note that the GPS data may also be GPS data acquired by the GPS module 102 provided in the mobile object 100.

Next, in step S42, the elevation data within the imaging range of the ground surface is acquired. The elevation data is also included in the sensor information. First, the method of acquiring the elevation data using map data or the like in which latitude and longitude are associated with elevation will be described.

In this case, the terminal device 400 and/or cloud 500 that executes the image synthesis processing acquires the elevation data in the imaging range of the imaging device 200 based on the positions indicated by the GPS data acquired in step S41. The above operation can be executed by using map data published by the Geospatial Information Authority of Japan, in which positions given by latitude and longitude are associated with elevation, to confirm the elevation at the latitude and longitude indicated by the GPS data acquired in step S41.

As shown in Figure 17, the acquired elevation data of the ground surface is numbered in the order of the positions of the acquired GPS data, and is associated with the positions indicated by latitude and longitude to create a database. This association of the data acquisition order, the positions indicated by latitude and longitude, and the elevation data will be referred to as the elevation database. Note that, as described with reference to Fig. 9 in the first embodiment, the captured images are saved by the imaging device 200 in association with sensor information such as GPS data and IMU data. Therefore, the elevation data and the images can be associated by referring to the latitude and longitude associated with the elevation data.
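As an illustrative sketch of such an elevation database, the record layout below mirrors Fig. 17 (acquisition order, latitude, longitude, elevation); the names and the nearest-neighbor lookup are hypothetical additions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ElevationRecord:
    index: int          # numbered in GPS data acquisition order
    lat: float
    lon: float
    elevation_m: float  # ground-surface elevation above sea level

altitude_db: dict[tuple[float, float], ElevationRecord] = {}

def add_record(index: int, lat: float, lon: float, elevation_m: float):
    altitude_db[(lat, lon)] = ElevationRecord(index, lat, lon, elevation_m)

def elevation_for_image(image_lat: float, image_lon: float) -> float:
    """Associate an image with elevation data by finding the record
    whose latitude/longitude is nearest the image's GPS position."""
    key = min(altitude_db,
              key=lambda k: (k[0] - image_lat) ** 2 + (k[1] - image_lon) ** 2)
    return altitude_db[key].elevation_m
```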

When the acquisition of the elevation data is complete, the composite image creation processing is next executed in step S43.

Next, the method of acquiring the elevation data within the imaging range of the ground surface by using a ranging sensor in step S42 of the flowchart of Figure 16 will be described. A sensor abbreviated as LiDAR, for "light detection and ranging" or "laser imaging, detection and ranging", is assumed to be used as the ranging sensor. LiDAR refers to a remote sensing technology using light, and is a technology that can measure the distance to a distant target and analyze the characteristics of the target by measuring the scattered light relative to pulsed laser emission.

In the second embodiment, as shown in Fig. 18A, the sensor device 300 is provided with a LiDAR sensor 310. Because LiDAR measures distance by laser emission, as shown in Fig. 18B, the laser emission unit 311 and laser reception unit 312 of the LiDAR sensor 310 are arranged on the bottom of the fuselage 1 of the mobile object 100, so that the laser can be emitted and received pointing straight down at the ground. Note that, as long as the laser can be emitted and received toward the ground, the laser emission unit 311 and laser reception unit 312 of the LiDAR sensor 310 may also be provided on the side of the fuselage 1 of the mobile object 100.

When using LiDAR, it is preferable that, before the ground surface is imaged, the mobile object 100 equipped with the sensor device 300 provided with the LiDAR sensor 310 is flown over the same range as the imaging range to acquire the elevation data of the imaging range. After the flight of the mobile object 100 ends, the elevation data is calculated from the information acquired by LiDAR. However, it is also possible to calculate the elevation data from the information acquired by the LiDAR sensor 310 in parallel with the flight of the mobile object 100 and the imaging of the ground surface. Calculating the elevation data from the information acquired by the LiDAR sensor 310 is described as being executed by the information processing unit 250, but may also be executed by the terminal device 400 or the cloud 500.

Note that the interval between the elevation data acquisition timings of the LiDAR sensor 310 is preferably shorter than the interval between the imaging timings of the imaging device 200. This is because, if the interval between the elevation data acquisition timings is longer than the interval between the imaging timings, it becomes difficult to associate accurate elevation data with each captured image. In addition, this is because, as the interval between the elevation data acquisition timings is shortened, a more accurate elevation of the ground surface can be obtained.

Acquiring the ground surface elevation data by LiDAR and associating the elevation data with the positions indicated by longitude and latitude to create the database is similar to the case of acquiring the elevation data from map data.

Note that the timing at which the LiDAR sensor 310 acquires elevation data may or may not be synchronized with the timing at which the sensor device 300 acquires sensor information such as GPS data. In the case where the acquisition timings are synchronized, the GPS data and the like included in the sensor information acquired at the same timing as the elevation data acquired using the LiDAR sensor 310 can be associated. In the case where the acquisition timings are not synchronized, the image, the sensor information, and the elevation data can be associated by treating a timing related to the image capture (such as the timing of the imaging instruction or the timing when the exposure notification signal is received from the imaging device 200) as the reference. The association is described as being executed by the information processing unit 250, but may also be executed by the terminal device 400 or the cloud 500.

For example, as shown in Figure 19, in the case where elevation data A is acquired at the point in time when the sensor device 300 receives the exposure notification signal indicating the exposure for acquiring image data A, the sensor device 300 transmits elevation data A to the imaging device 200 when the acquisition of elevation data A is complete. In this way, the elevation data A acquired at the timing closest to the timing of receiving the exposure notification signal is transmitted from the sensor device 300 to the imaging device 200. Then, the information processing unit 250 stores GPS data A, IMU data A, and elevation data A as metadata in the storage unit 207 in association with image A. By associating the images, GPS data, IMU data, and elevation data in this way, the elevation data can be used when arranging the images to create the composite image. Note that, in the case where the elevation data is acquired using the LiDAR sensor 310 at predetermined positions, or where the position at which the elevation data is acquired using the LiDAR sensor 310 is obtained using GPS data, the elevation data can also be associated with the GPS data and the like included in the sensor information based on the position information of the elevation data.
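Where the acquisition timings are not synchronized, the reference-timing association described above can be sketched as a nearest-in-time lookup; the record fields and names below are illustrative assumptions.

```python
def nearest_by_time(records, t_ref: float):
    """Pick the record whose acquisition time t is closest to the
    reference timing, e.g. the moment the exposure notification
    signal was received."""
    return min(records, key=lambda r: abs(r.t - t_ref))

# For image A, with hypothetical per-record timestamps:
#   gps_a  = nearest_by_time(gps_records,  t_exposure_a)
#   imu_a  = nearest_by_time(imu_records,  t_exposure_a)
#   elev_a = nearest_by_time(elev_records, t_exposure_a)
```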

Note that, similarly, in the case where the elevation data is acquired by referring to map data, when the elevation data is acquired from the map data in parallel with the flight of the mobile object 100, the imaging of the imaging device 200, and the acquisition of GPS data and IMU data by the sensor device 300, the images, sensor information, and elevation data can be associated in the same way. Note that, in Figure 19, GPS data and elevation data are shown for illustration, but the data to be associated is not limited to these, and may also include the IMU data acquired by the IMU module 303, the altitude data acquired by the altimeter 304, and the like.

Next, the method of calculating the elevation data indicating the elevation of the ground surface using LiDAR will be described with reference to Figure 20. The calculation of the elevation data may be executed by any of the information processing unit 250, the terminal device 400, and the cloud 500, and may also be executed by the LiDAR sensor 310 itself. In the case where the LiDAR ranging direction has an angle θ relative to the gravity direction and the measured distance is d_L, assuming that Z_GPS is the altitude of the mobile object 100 at the time of ranging obtained from the GPS data, the ground surface elevation h_ground is calculated according to the following formula 1.

[formula 1]

h_ground = Z_GPS - d_L cos θ

Furthermore, by using the GPS data and IMU data indicating the position and attitude of the mobile object 100 and the distance measurement result of the LiDAR sensor 310, the position of the ranging target point P_L can be calculated, together with the elevation h_ground, from the distance and azimuth between the position of the mobile object (P_GPS) and the ranging target point P_L. A specific method of calculating the latitude and longitude of P_L from the distance and azimuth from P_GPS is described in the public reference [Geospatial Information Authority of Japan, Technical Report B5-No.19], and this method can be used. The elevation calculated in this way is used as the elevation data in the image synthesis processing.
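For illustration, Formula 1 can be written directly as code; the function name ground_elevation and the example values below are illustrative.

```python
import math

def ground_elevation(z_gps_m: float, d_l_m: float, theta_rad: float) -> float:
    """Formula 1: the LiDAR ray of length d_L, tilted theta from the
    gravity direction, reaches d_L * cos(theta) below the aircraft."""
    return z_gps_m - d_l_m * math.cos(theta_rad)

# Aircraft at 120 m (from GPS), 100.5 m range measured 5 degrees off vertical:
h = ground_elevation(120.0, 100.5, math.radians(5.0))  # approx. 19.9 m
```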

[2-3. Composite image creation processing]

Next, the details of the composite image creation processing will be described with reference to the flowchart in Figure 21. The composite image creation processing is similar whether the elevation data of the ground surface is acquired from map data or acquired by LiDAR.

First, in step S51, the two coordinate systems to be used in the composite image creation processing are defined. The coordinate systems to be defined are the ground orthogonal coordinate system Σ_G, which represents the imaging range of the imaging device 200 associated with the movement of the mobile object 100 in an orthogonal coordinate system, and the composite image coordinate system Σ_I, which serves as the arrangement destination of the images for image synthesis. The ground orthogonal coordinate system Σ_G corresponds to the "first coordinate system" in the claims, and the composite image coordinate system Σ_I corresponds to the "second coordinate system".

The ground orthogonal coordinate system Σ_G and the composite image coordinate system Σ_I are shown in Figure 22. The origin of the ground orthogonal coordinate system Σ_G is chosen as an arbitrary point in the imaging range expressed at 0 m elevation, with meters as the unit. Regarding the directions of the XYZ axes of the ground orthogonal coordinate system Σ_G, the +X direction is treated as east, the +Y direction as north, and the +Z direction as vertically upward. The composite image coordinate system Σ_I, which is an image coordinate system, is defined with pixels as the unit, and is configured to contain the entire imaging range and so that no image protrudes. The pixel width and pixel height of the image can be set to any resolution.

Next, in step S52, the latitude and longitude in the elevation database shown in Figure 17 are converted into the ground orthogonal coordinate system Σ_G. The conversion method is shown in Figure 23. First, the distance d and the azimuth θ between two points are calculated from the origin P_G of the ground orthogonal coordinate system Σ_G and an arbitrary position P_hi among the multiple positions corresponding to the elevation data and the latitude and longitude shown in Figure 17. The method of calculating the distance d and azimuth θ between two points from latitude and longitude is described in the public reference [Geospatial Information Authority of Japan, Technical Report B5-No.19], and this method can be used.

When the distance d and azimuth θ have been calculated, the coordinates (x, y) of the arbitrary position P_hi in the ground orthogonal coordinate system Σ_G are determined as (d cos θ, d sin θ). In addition, if the height of the original elevation data is treated as the height of P_hi, the value of P_hi in the ground orthogonal coordinate system Σ_G can be determined.
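For illustration, the conversion into Σ_G can be sketched as below. The patent computes the distance d and azimuth θ by the Geospatial Information Authority method; the sketch substitutes a small-area equirectangular approximation that yields the east/north components directly, an assumption adequate only over ranges small relative to the Earth's radius.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def to_ground_coords(origin_lat, origin_lon, lat, lon):
    """Map latitude/longitude onto the ground orthogonal system
    (x east, y north, in metres) around the origin P_G."""
    x = EARTH_R * math.radians(lon - origin_lon) * math.cos(math.radians(origin_lat))
    y = EARTH_R * math.radians(lat - origin_lat)
    return x, y

# An elevation-database entry becomes (x, y, elevation) in Sigma_G:
# x, y = to_ground_coords(pg_lat, pg_lon, rec.lat, rec.lon)
```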

In addition, the elevation database converted into the ground orthogonal coordinate system Σ_G in step S52 is used to create the mesh structure shown in Figure 24. The mesh structure is created by connecting the adjacent points included in the elevation database to create triangles. Each piece of triangular mesh information configured in this way is used to calculate the projected positions of the four corners of the image, as described later.

Next, in step S53, the position of the mobile object 100 indicated by the GPS data acquired by the GPS module 102 of the mobile object 100 and the attitude of the mobile object 100 indicated by the IMU data acquired by the IMU module 103 are converted into the ground orthogonal coordinate system Σ_G. Because the position of the mobile object 100 is output as latitude and longitude information from the GPS data, it needs to be converted into the ground orthogonal coordinate system Σ_G, but the conversion method is similar to the conversion of the elevation database into the ground orthogonal coordinate system Σ_G in step S52.

Next, in step S54, the projected positions of the four corners of the image on the ground surface are calculated in the ground orthogonal coordinate system Σ_G. The four corners of the image refer to the vertices of the four corners of the image having a quadrilateral shape. Figure 25 is a diagram of the vectors extending from the imaging device 200 installed on the mobile object 100 to the four corners of the image, and of the elevation of the ground surface (the mesh structure). The points where the vectors extending to the four corners of the image intersect the mesh surface become the projected positions of the four corners of the image to be calculated.

Here, the method of calculating the projected positions of the four corners of the image will be described. Note that the following describes a method of calculating the projected position of one of the four corners of the image. By executing the method for all four corners, the projected positions of all four corners of the image can be calculated.

First, as shown in Figure 26, consider the face ABC serving as the triangular mesh of the minimum unit of the elevation data. Assuming that n is the unit normal vector of the face ABC and P0 is the center coordinate of ABC, n is calculated from the cross product using formula 2, as shown in Figure 26.

[Formula 2]

n = ((B - A) × (C - A)) / |(B - A) × (C - A)|

Next, as shown in Fig. 27A, the position of the imaging device 200 of the mobile object 100 is denoted P_d, and the unit vectors extending from the focal point of the imaging device 200 to the four corner points of the image are denoted L. Note that there are in fact four unit vectors L, but in Fig. 27A only one is illustrated for ease of description. Assuming that the face ABC and the vector extending from the position P_d of the imaging device 200 have an intersection point, the vector can be expressed as dL using a scalar value d.

Here, if the expression for an arbitrary point P in the plane ABC is (P - P0) · n = 0, then the intersection point L_i of the vector dL and the face ABC can be calculated using the following formula 3.

[formula 3]

(dL + P_d - P0) · n = 0

Solving for the scalar d gives d = ((P0 - P_d) · n) / (L · n), and the intersection point is then L_i = P_d + dL.

Next, whether the intersection point L_i between the face ABC and the ray dL is included within the face ABC is examined. As shown in Fig. 27B, in the case where the intersection point L_i is included in the face ABC, L_i is treated as the projected position of the corresponding corner of the image.

By executing this processing for all vectors to the four corners of the image, the projected positions of the four corners of the image on the ground surface can be calculated.
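The corner-projection computation of step S54 and formulas 2 and 3 can be sketched as a ray/triangle intersection. The containment check of Fig. 27B is implemented here with a standard barycentric test, which the patent does not specify, so treat this as one possible realization.

```python
import numpy as np

def project_corner(p_d, l_unit, tri):
    """Intersect the ray P = p_d + d*L with triangle face ABC:
    (d*L + p_d - P0) . n = 0 (formula 3), n from the cross product
    of the edges (formula 2), then test whether L_i lies in ABC."""
    a, b, c = (np.asarray(v, dtype=float) for v in tri)
    n = np.cross(b - a, c - a)
    n /= np.linalg.norm(n)                 # unit normal of face ABC
    p0 = (a + b + c) / 3.0                 # centre coordinate of ABC
    denom = float(np.dot(l_unit, n))
    if abs(denom) < 1e-12:                 # ray parallel to the face
        return None
    d = float(np.dot(p0 - np.asarray(p_d, dtype=float), n)) / denom
    if d < 0:                              # face lies behind the camera
        return None
    li = np.asarray(p_d, dtype=float) + d * np.asarray(l_unit, dtype=float)
    # Barycentric containment test for L_i inside ABC:
    v0, v1, v2 = c - a, b - a, li - a
    d00, d01, d02 = v0 @ v0, v0 @ v1, v0 @ v2
    d11, d12 = v1 @ v1, v1 @ v2
    inv = 1.0 / (d00 * d11 - d01 * d01)
    u = (d11 * d02 - d01 * d12) * inv
    v = (d00 * d12 - d01 * d02) * inv
    return li if u >= 0 and v >= 0 and u + v <= 1 else None
```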

Returning to the description of the flowchart in Figure 21: next, in step S55, the projected positions of the four corners of the image calculated in step S54 are converted into the composite image coordinate system Σ_I. For the conversion, the ground orthogonal coordinate system Σ_G and composite image coordinate system Σ_I defined in step S51 are used. Figure 28 is a conceptual diagram of the conversion between the ground orthogonal coordinate system Σ_G and the composite image coordinate system Σ_I and of the arrangement of the images. By converting the projected position of each of the four corners of the image in the ground orthogonal coordinate system Σ_G into the composite image coordinate system Σ_I, the image is projected onto the composite image coordinate system Σ_I and arranged as shown in Figure 28. With this arrangement, as described with reference to Fig. 15, the images are arranged according to the actual image size, instead of composing the images at a size different from the size of the actually acquired images as shown in Fig. 14.

Next, in step S56, the image is arranged based on the projected positions of the four corners of the image in the composite image coordinate system Σ_I calculated in step S55. When arranging the image, a projective transformation is executed using the four corners of the image calculated in step S55. For correct image arrangement, the four corners of the image must be correctly projected, but because the elevation database does not indicate the elevation at all positions of the ground surface in the imaging range, the elevation indicated in the elevation database does not necessarily indicate the elevation at the four corners of the image. Therefore, the image is arranged by calculating the position of each of the four corners of the image in the ground orthogonal coordinate system Σ_G in this way and converting and projecting these positions onto the composite image coordinate system Σ_I. With this arrangement, the images can be arranged at the correct size to create the composite image.
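A sketch of the placement in step S56, assuming OpenCV: the projective transformation maps the whole captured image onto the quadrilateral spanned by the four converted corner positions in Σ_I. The name place_image and the pixel-mask compositing are illustrative.

```python
import numpy as np
import cv2

def place_image(canvas, image, corners_sigma_i):
    """Warp the image so its corners land on their projected positions
    (in pixels, order: top-left, top-right, bottom-right, bottom-left)
    and composite it onto the canvas, a preallocated numpy array."""
    h, w = image.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    m = cv2.getPerspectiveTransform(src, np.float32(corners_sigma_i))
    size = (canvas.shape[1], canvas.shape[0])
    warped = cv2.warpPerspective(image, m, size)
    mask = cv2.warpPerspective(np.ones((h, w), np.uint8), m, size)
    canvas[mask > 0] = warped[mask > 0]
    return canvas
```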

Then, in step S57, it is determined whether the processing from step S53 to step S56 has been executed on all images and the images have been arranged. In the case where not all images have been processed, the processing proceeds to step S53, and steps S53 to S57 are repeated until all images have been arranged (step S57, No). Then, in the case where all images have been arranged, the processing ends, and a single composite image containing the multiple captured images is complete (step S57, Yes).

As described above, the composite image creation processing according to the second embodiment is executed. Note that, although the present embodiment uses a LiDAR sensor, any kind of sensor can be used as the ranging sensor, as long as the sensor can measure distance. In addition, the present embodiment describes the case of acquiring the elevation of the ground surface from map data and the case of acquiring the elevation of the ground surface using a ranging sensor, but the elevation of the ground surface may also be acquired by other methods. For example, there is also a method of calculating the parallax of two images (a stereo pair) of the same scene captured and acquired by using the imaging device 200 mounted on the mobile object 100, calculating the distance to the target (the ground surface) by triangulation, and calculating the difference between the altitude of the mobile object 100 and the distance to the ground surface as the elevation of the ground surface. Any kind of technique may be used as long as the height of the ground surface can be calculated.

Note that the image may also be arranged by assuming that the elevation data at the position closest to the position indicated by the GPS data associated with the image is the elevation at all four corners of the image.

As described above, in the second embodiment, the size of each arranged image varies according to the elevation of the ground surface. Therefore, when multiple images are arranged to create the composite image, there is a possibility that gaps appear between the images. However, gaps can be prevented by shortening the interval between successive imagings to increase the overlap between images in the direction between the imaging positions shown in Fig. 7 (the direction of travel of the mobile object 100). In addition, gaps can be prevented by shortening the distance between flight routes to increase the overlap between images in the direction of the distance between the flight routes of the mobile object 100 shown in Fig. 7. By referring to map data in advance before the mobile object 100 flies and performs imaging, the above can also be built into the flight plan. Furthermore, the elevation of the ground surface can be acquired in real time using a ranging sensor such as LiDAR in parallel with the flight and imaging of the mobile object 100, and the flight plan of the mobile object 100 can be changed in real time based on the elevation to increase the overlap.

In the embodiment, the flight altitude of the mobile object 100 is described as fixed. However, it is conceivable that the flight altitude will change in the actual flight of the mobile object 100. Therefore, correction may be executed according to the change in the altitude of the mobile object 100 acquired by the GPS module 102 of the mobile object 100.

<3. Modifications>

The embodiments of the present technology have been specifically described above, but the present technology is not limited to the foregoing embodiments, and various modifications based on the technical idea of the present technology are possible.

The drone serving as the mobile object 100 is not limited to a drone provided with rotors as shown in the examples, and may also be a so-called fixed-wing aircraft.

The mobile object 100 according to the embodiments of the present technology is not limited to a drone, and may also be an automobile, a ship, a robot, or the like that moves automatically without being driven by a human.

The present technology is not limited to the case of flying a drone and capturing images from the sky, and also applies to the case where the imaging device 200 and the sensor device 300 are loaded onto a ship serving as the mobile object, undersea images are captured while moving by ship, and a map of the seabed is created as the composite image.

In addition, the present technology also applies to the following case: the imaging device 200 and the sensor device 300 are mounted on an automobile serving as the mobile object, the landscape is imaged by the imaging device 200 facing sideways, and a panoramic photograph of the landscape along the road is created as the composite image.

In the case where the imaging device 200 is installed firmly on the mobile object 100 in a fixed state rather than through a camera mount functioning as a gimbal, the attitude of the mobile object 100 and the attitude of the imaging device 200 become equal. In this case, because the inclination of the mobile object 100 and the inclination of the imaging device 200 become equal, it is sufficient to provide an IMU module in any one of the mobile object 100, the sensor device 300, and the imaging device 200. Note that even in the case of using a camera mount functioning as a gimbal, the attitude of the imaging device 200 can be estimated, for example, by working together with the control of the angle of the camera mount. In this case as well, it is sufficient to provide an IMU module in any one of the mobile object 100, the sensor device 300, and the imaging device 200.

The imaging device 200 is not limited to one provided with a single image sensor, and may be provided with multiple, that is, two or more, image sensors. For example, the first image sensor can be an image sensor for visible light, and the second image sensor can be an image sensor for infrared radiation. With this arrangement, multiple types of composite images can be created in a single imaging pass. For example, in the case of using infrared images to create a map image obtained by imaging a farm field from the air, the state of the crops can be checked by examining the chlorophyll of the crops from the map image. In addition, the imaging device 200 may also be a multispectral camera that acquires images in multiple different wavelength regions (wavelength bands), and can also be used for the purpose of checking vegetation indices such as the normalized difference vegetation index (NDVI).

In the embodiments, the imaging device 200 and the sensor device 300 are mounted on the mobile object 100 as separate devices, but the imaging device 200 and the sensor device 300 may also be configured as an integrated device and mounted on the mobile object 100. In addition, the mobile object 100 may also be configured to be equipped in advance with the functions of the imaging device 200 and the sensor device 300.

The terminal device 400 and/or the cloud 500 may also receive the images captured by the imaging device 200 and the sensor information acquired by the sensor device 300, and associate the images with the sensor information.

Any kind of device can be used as the imaging device 200, such as a digital camera, a smartphone, a mobile phone, a portable game machine, a laptop PC, or a tablet terminal, as long as the device is provided with an imaging function and can be mounted on the mobile object 100.

The imaging device 200 may also be provided with an input unit, a display unit, and the like. In addition, in the case of not being connected to the mobile object 100, the imaging device 200 can also act as a standalone imaging device.

The imaging device 200 may also be provided with a battery, and electric power may be configured to be supplied from the battery 6 of the imaging device 200 to the mobile object 100 and the sensor device 300. Alternatively, all of the mobile object 100, the imaging device 200, and the sensor device 300 may be provided with batteries.

Additionally, the present technology may also be configured as below.

(1) An information processing apparatus configured to:

acquire an image captured by an imaging device mounted on a mobile object according to an imaging instruction;

acquire, from a sensor device, sensor information including position information of the mobile object according to a signal sent from the imaging device in response to the imaging instruction; and

associate the acquired sensor information with the acquired image.

(2) The information processing apparatus according to (1), wherein the sensor information further includes altitude information of the mobile object.

(3) The information processing apparatus according to (1) or (2), wherein the sensor information further includes inclination information of the mobile object.

(4) The information processing apparatus according to any one of (1) to (3), wherein the signal indicates a photographing timing related to the capture of the image by the imaging device.

(5) The information processing apparatus according to any one of (1) to (4), wherein

the sensor information is the most recent sensor information at the point in time of the photographing timing.

(6) The information processing apparatus according to any one of (1) to (5), wherein

the information processing apparatus calculates new sensor information from multiple pieces of sensor information, and associates the new sensor information with the image.

(7) The information processing apparatus according to (4), wherein the photographing timing is the timing of the end of the exposure executed by the imaging device to capture the image.

(8) The information processing apparatus according to (4), wherein the photographing timing is the timing of the beginning of the exposure executed by the imaging device to capture the image.

(9) The information processing apparatus according to (4), wherein the photographing timing is the timing of issuing the imaging instruction for the imaging device to capture the image.

(10) The information processing apparatus according to any one of (1) to (9), wherein the information processing apparatus creates a composite image by arranging multiple images based on the sensor information.

(11) The information processing apparatus according to (10), wherein the information processing apparatus arranges the multiple images while correcting positions for the multiple images based on the sensor information.

(12) The information processing apparatus according to (10), wherein

the sensor information further includes altitude information of the mobile object, and

the information processing apparatus arranges the images while enlarging or reducing the images based on the altitude information.

(13) The information processing apparatus according to (10), wherein

the sensor information further includes inclination information of the mobile object, and

the information processing apparatus arranges the images while executing keystone correction and/or trimming processing based on the inclination information.

(14) The information processing apparatus according to any one of (1) to (13), wherein

the information processing apparatus acquires elevation data indicating the elevation of the ground surface captured by the imaging device, and associates the elevation data with information included in the sensor information.

(15) The information processing apparatus according to (14), wherein the information processing apparatus associates the elevation data with the image.

(16) The information processing apparatus according to (14) or (15), wherein the elevation data is acquired, based on the position information of the mobile object, from map data containing information on position and elevation.

(17) The information processing apparatus according to (14) or (15), wherein the elevation data is acquired based on distance information obtained by a ranging sensor installed on the mobile object.

(18) The information processing apparatus according to (17), wherein the information processing apparatus acquires the elevation data from the ranging sensor according to the signal sent from the imaging device in response to the imaging instruction, and associates the acquired elevation data with information included in the sensor information.

(19) An information processing method, comprising:

acquiring an image captured by an imaging device mounted on a mobile object according to an imaging instruction;

acquiring, from a sensor device, sensor information including position information of the mobile object according to a signal sent from the imaging device in response to the imaging instruction; and

associating the acquired sensor information with the acquired image.

(20) An information processing program,

the information processing program causing a computer to execute an information processing method, the information processing method comprising:

acquiring an image captured by an imaging device mounted on a mobile object according to an imaging instruction;

acquiring, from a sensor device, sensor information including position information of the mobile object according to a signal sent from the imaging device in response to the imaging instruction; and

associating the acquired sensor information with the acquired image.

(21) An image processing apparatus configured to

receive the provision of multiple images associated with sensor information, including position information of a mobile object, acquired according to a signal sent from an imaging device in response to an imaging instruction, and create a composite image by arranging the multiple images based on the sensor information.

(22) The image processing apparatus according to (21), further configured to create the composite image by arranging the multiple images based on elevation data indicating the elevation of the ground captured by the imaging device.

(23) The image processing apparatus according to (22), wherein the image processing apparatus

defines a first coordinate system corresponding to the imaging range of the imaging device associated with the movement of the mobile object, and a second coordinate system corresponding to the plane on which the images are arranged;

calculates, based on the sensor information and the elevation data, the projected positions of the image on the ground surface in the first coordinate system; and

arranges the image by converting the projected positions into the second coordinate system.

(24) The image processing apparatus according to (23), wherein the projected positions of the image are calculated for each of the four corners of the image having a quadrilateral shape.

(25) The image processing apparatus according to (24), wherein the image processing apparatus arranges the image by converting the projected positions of the four corners of the image into the second coordinate system.

(26) The image processing apparatus according to any one of (22) to (25), wherein the elevation data is acquired from map data containing information on position and elevation.

(27) The image processing apparatus according to (26), wherein the elevation data is acquired, based on the position information of the mobile object, by referring to map data containing information on position and elevation.

(28) The image processing apparatus according to any one of (22) to (25), wherein the elevation data is acquired by a ranging sensor mounted on the mobile object.

(29) An image processing system, comprising:

a mobile object;

an imaging device installed on the mobile object;

a sensor device installed on the mobile object and configured to detect sensor information including position information of the mobile object according to a signal sent from the imaging device in response to an imaging instruction; and

an information processing unit configured to associate the sensor information with an image captured by the imaging device.

(30) The image processing system according to (29), wherein the information processing unit creates a composite image by arranging multiple images based on the sensor information.

(31) The image processing system according to (29) or (30), further comprising: an image processing apparatus configured to receive the provision of multiple images captured by the imaging device and associated with the sensor information, and create a composite image by arranging the multiple images based on the sensor information.

Reference Signs List

100 Mobile object

200 Imaging device

300 Sensor device

250 Information processing unit

1000 Image processing system
