System and method for synchronizing frame timing between physical layer frame and video frame

Document No.: 1776846 | Publication date: 2019-12-03

Description: This technology, "System and method for synchronizing frame timing between physical layer frame and video frame," was designed and created by 王晓东 (Wang Xiaodong) on 2017-08-25. The disclosure relates to a system, a method, and an unmanned aerial vehicle (UAV) for synchronizing video transmission with a physical layer. The UAV may include at least one video recording device, at least one processor, and at least one UAV transceiver. The at least one video recording device may be configured to record video data including a plurality of video frames. The at least one processor may be configured to determine a first time point corresponding to a frame header of a video frame of the plurality of video frames, and to determine, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame. The at least one UAV transceiver may be configured to start transmitting the video frame at the second time point.

1. A method for synchronizing video transmission with a physical layer, implemented on a computing device including at least one processor and a storage device, the method comprising:

determining a first time point corresponding to a frame header of a video frame;

determining, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and

starting to transmit the video frame at the second time point.

2. The method of claim 1, further comprising:

generating the physical layer frame at the second time point.

3. The method of claim 1, wherein determining, based at least in part on the first time point, the second time point corresponding to the frame header of the physical layer frame further includes:

determining the second time point based at least in part on the first time point; and

synchronizing a third time point corresponding to the frame header of the physical layer frame with the second time point, the physical layer frame corresponding to the video frame.

4. The method of claim 1, further comprising:

compressing the video frame before transmitting the video frame at the second time point.

5. The method of claim 1, further comprising:

dividing the video frame into a plurality of subframes; and

compressing data associated with each subframe of the plurality of subframes.

6. The method of claim 5, wherein determining the second time point includes:

determining a time period for compressing a subframe of the plurality of subframes; and

determining the second time point based on the first time point and the time period for compressing the subframe.

7. The method of claim 1, wherein determining the second time point based on the first time point further includes:

determining a time period for compressing at least a portion of the video frame; and

determining the second time point based on the first time point and the time period.

8. The method of claim 1, wherein the frame header of the video frame corresponds to a frame synchronization pulse signal.

9. The method of claim 1, wherein the video frame is extracted from a real-time video stream sent by a video recording device.

10. The method of claim 1, wherein the video frame is extracted from a real-time video received at a data interface communicatively connected to a video recording device.

11. The method of claim 1, further comprising:

obtaining a frame rate of the video frame; and

configuring a frame rate of the physical layer frame based on the frame rate of the video frame.

12. The method of claim 11, wherein the frame rate of the physical layer frame is an integral multiple of the frame rate of the video frame.

13. A system for synchronizing video transmission with a physical layer, the system comprising:

a memory storing one or more computer-executable instructions; and

one or more processors configured to communicate with the memory, wherein when executing the one or more computer-executable instructions, the one or more processors are directed to:

determine a first time point corresponding to a frame header of a video frame;

determine, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and

start transmitting the video frame at the second time point.

14. The system of claim 13, wherein the one or more processors are further directed to:

generate the physical layer frame at the second time point.

15. The system of claim 13, wherein to determine, based at least in part on the first time point, the second time point corresponding to the frame header of the physical layer frame, the one or more processors are further directed to:

determine the second time point based at least in part on the first time point; and

synchronize a third time point corresponding to the frame header of the physical layer frame with the second time point, the physical layer frame corresponding to the video frame.

16. The system of claim 13, wherein the one or more processors are further directed to:

compress the video frame before transmitting the video frame at the second time point.

17. The system of claim 13, wherein the one or more processors are further directed to:

divide the video frame into a plurality of subframes; and

compress data associated with each subframe of the plurality of subframes.

18. The system of claim 17, wherein to determine the second time point, the one or more processors are directed to:

determine a time period for compressing a subframe of the plurality of subframes; and

determine the second time point based on the first time point and the time period for compressing the subframe.

19. The system of claim 13, wherein to determine the second time point based on the first time point, the one or more processors are directed to:

determine a time period for compressing at least a portion of the video frame; and

determine the second time point based on the first time point and the time period.

20. The system of claim 13, wherein the frame header of the video frame corresponds to a frame synchronization pulse signal.

21. The system of claim 13, wherein the video frame is extracted from a real-time video stream sent by a video recording device.

22. The system of claim 13, wherein the video frame is extracted from a real-time video received at a data interface communicatively connected to a video recording device.

23. The system of claim 13, wherein the one or more processors are further directed to:

obtain a frame rate of the video frame; and

configure a frame rate of the physical layer frame based on the frame rate of the video frame.

24. The system of claim 23, wherein the frame rate of the physical layer frame is an integral multiple of the frame rate of the video frame.

25. A non-transitory computer-readable medium including executable instructions that, when executed by at least one processor, cause the at least one processor to implement a method comprising:

determining a first time point corresponding to a frame header of a video frame;

determining, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and

starting to transmit the video frame at the second time point.

26. The non-transitory computer-readable medium of claim 25, wherein the executable instructions, when executed by the at least one processor, cause the at least one processor to implement the method further comprising:

obtaining a frame rate of the video frame; and

configuring a frame rate of the physical layer frame based on the frame rate of the video frame.

27. An unmanned aerial vehicle (UAV), comprising:

at least one video recording device configured to record video data including a plurality of video frames;

at least one processor configured to:

determine a first time point corresponding to a frame header of a video frame of the plurality of video frames; and

determine, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and

at least one UAV transceiver configured to start transmitting the video frame at the second time point.

28. The UAV of claim 27, wherein the at least one processor is further configured to:

generate the physical layer frame at the second time point.

29. The UAV of claim 27, wherein to determine, based at least in part on the first time point, the second time point corresponding to the frame header of the physical layer frame, the at least one processor is further configured to:

determine the second time point based at least in part on the first time point; and

synchronize a third time point corresponding to the frame header of the physical layer frame with the second time point, the physical layer frame corresponding to the video frame.

30. The UAV of claim 27, wherein the at least one processor is further configured to:

compress the video frame before transmitting the video frame at the second time point.

Technical Field

The present application relates to systems and methods for transmission synchronization, and more particularly, to systems and methods for synchronizing frame timing between a physical layer frame and a video frame.

Background

Unmanned movable platforms (UMPs), such as unmanned aerial vehicles (UAVs), are widely used in various fields such as aerial photography, surveillance, scientific research, geological exploration, and remote sensing. A user may control the maneuvering of a UAV via a ground terminal. The UAV may record video data during flight and send the video data to the ground terminal, which may display the video data in synchronization with the recording.

To display the video data at the ground terminal without causing lag, it is important for the UAV to reduce the time delay in sending the video data from the UAV to the ground terminal.

Summary

According to one aspect of the present disclosure, a method for synchronizing video transmission with a physical layer is provided. The method may be implemented on a computing device including at least one processor and a storage device. The method may include: determining a first time point corresponding to a frame header of a video frame; determining, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and starting to transmit the video frame at the second time point.

In some embodiments, the method may further include generating the physical layer frame at the second time point.

In some embodiments, determining, based at least in part on the first time point, the second time point corresponding to the frame header of the physical layer frame may include: determining the second time point based at least in part on the first time point; and synchronizing a third time point corresponding to the frame header of the physical layer frame with the second time point, the physical layer frame corresponding to the video frame.

In some embodiments, the method may further include compressing the video frame before transmitting the video frame at the second time point.

In some embodiments, the method may further include: dividing the video frame into a plurality of subframes; and compressing data associated with each subframe of the plurality of subframes.

In some embodiments, determining the second time point may include: determining a time period for compressing a subframe of the plurality of subframes; and determining the second time point based on the first time point and the time period for compressing the subframe.

In some embodiments, determining the second time point based on the first time point may include: determining a time period for compressing at least a portion of the video frame; and determining the second time point based on the first time point and the time period.
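
Written as a simple relation (notation ours, consistent with the steps above): if the frame header of the video frame corresponds to a first time point $t_1$ and compressing the relevant portion of the video frame takes a time period $\Delta t_c$, the second time point may be taken as

```latex
t_2 = t_1 + \Delta t_c ,
```

so that a physical layer frame header placed at $t_2$ coincides with the moment the compressed data becomes ready for transmission.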

In some embodiments, the frame header of the video frame may correspond to a frame synchronization pulse signal.

In some embodiments, the video frame may be extracted from a real-time video stream sent by a video recording device.

In some embodiments, the video frame may be extracted from a real-time video received at a data interface communicatively connected to a video recording device.

In some embodiments, the method may further include: obtaining a frame rate of the video frame; and configuring a frame rate of the physical layer frame based on the frame rate of the video frame.

In some embodiments, the frame rate of the physical layer frame may be an integral multiple of the frame rate of the video frame.
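
As a worked numeric example (numbers ours, for illustration only): if the video frame rate is $f_v = 30$ frames per second and the integral multiple is $N = 4$, then

```latex
f_p = N \cdot f_v = 4 \times 30 = 120 \ \text{physical layer frames per second} ,
```

so every video frame header lands exactly on every fourth physical layer frame header; after a single initial alignment, no per-frame timing adjustment is needed.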

According to another aspect of the present disclosure, a system for synchronizing video transmission with a physical layer is provided. The system may include: a memory storing one or more computer-executable instructions; and one or more processors configured to communicate with the memory. When executing the one or more computer-executable instructions, the one or more processors may be directed to: determine a first time point corresponding to a frame header of a video frame; determine, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and start transmitting the video frame at the second time point.

According to another aspect of the present disclosure, a non-transitory computer-readable medium is provided. The non-transitory computer-readable medium may include executable instructions. When executed by at least one processor, the executable instructions may cause the at least one processor to implement a method. The method may include: determining a first time point corresponding to a frame header of a video frame; determining, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame; and starting to transmit the video frame at the second time point.

According to another aspect of the present disclosure, an unmanned aerial vehicle (UAV) is provided. The UAV may include at least one video recording device, at least one processor, and at least one UAV transceiver. The at least one video recording device may be configured to record video data including a plurality of video frames. The at least one processor may be configured to determine a first time point corresponding to a frame header of a video frame of the plurality of video frames, and to determine, based at least in part on the first time point, a second time point corresponding to a frame header of a physical layer frame. The at least one UAV transceiver may be configured to start transmitting the video frame at the second time point.

Additional features will be set forth in part in the description that follows, and in part will become apparent to those skilled in the art upon examination of the following description and the accompanying drawings, or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities, and combinations set forth in the detailed examples discussed below.

Brief Description of the Drawings

The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:

Fig. 1 is a schematic diagram of an exemplary unmanned aerial vehicle (UAV) system according to some embodiments of the present disclosure;

Fig. 2 is a block diagram of an exemplary unmanned aerial vehicle (UAV) according to some embodiments of the present disclosure;

Fig. 3 is a flowchart of an exemplary process for video transmission in a UAV system according to some embodiments of the present disclosure;

Fig. 4 is a block diagram of an exemplary ground terminal in a UAV system according to some embodiments of the present disclosure;

Fig. 5 is a block diagram of an exemplary processor in a UAV system according to some embodiments of the present disclosure;

Fig. 6 is a flowchart of an exemplary process for transmitting a video frame in a UAV system according to some embodiments of the present disclosure;

Fig. 7 is a flowchart of an exemplary process for configuring a frame rate of a physical layer frame in a UAV system according to some embodiments of the present disclosure;

Fig. 8A and Fig. 8B are two schematic diagrams of video transmission in a UAV system according to some embodiments of the present disclosure; and

Fig. 9 is a schematic diagram of an exemplary Open Systems Interconnection (OSI) model.

Detailed Description

In the following detailed description, numerous specific details are set forth by way of example in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in the art that the present disclosure may be practiced without such details. In other instances, well-known methods, procedures, components, and/or circuitry have been described at a relatively high level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.

It will be understood that the terms "system," "unit," "module," and/or "engine" used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by other expressions if the other expressions achieve the same purpose.

It will be understood that when a unit, module, or engine is referred to as being "on," "connected to," or "coupled to" another unit, module, or engine, it may be directly on, connected or coupled to, or communicate with the other unit, module, or engine, or an intervening unit, module, or engine may be present, unless the context clearly indicates otherwise. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

The terminology used herein is for the purpose of describing particular examples and embodiments only and is not intended to be limiting. As used herein, the singular forms "a," "an," and "the" may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "include" and/or "comprise," when used in the present disclosure, specify the presence of stated integers, devices, behaviors, features, steps, elements, operations, and/or components, but do not preclude the presence or addition of one or more other integers, devices, behaviors, features, steps, elements, operations, components, and/or combinations thereof.

The present disclosure provides systems and methods for synchronizing video transmission in a UAV system. The present disclosure adjusts the frame timing of physical layer transmission according to the frame timing of video stream transmission. The video stream may be received directly from a video recording device carried by the UAV, or received at a data interface communicatively connected to the video recording device carried by the UAV. By synchronizing the frame timing between the physical layer frame and the video frame, the transmission delay of the video stream from the UAV to the ground terminal caused by the waiting time associated with each video frame can be reduced. In addition, the present disclosure configures the frame rate of the physical layer frame as an integral multiple of the frame rate of the video frame. Accordingly, the number of frame timing adjustments between the UAV and the ground terminal during video stream transmission can be reduced.
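
The overall flow can be pictured with a short sketch (a minimal single-threaded illustration under our own assumptions; `phy`, `adjust_frame_header`, and `transmit` are hypothetical interfaces standing in for the components described with Figs. 2-6, not an API defined by the disclosure):

```python
from typing import Callable, Iterable, Tuple

def send_video_stream(
    video_source: Iterable[Tuple[float, bytes]],  # yields (t1, raw frame) pairs
    phy,                                          # physical layer handle (hypothetical)
    compress: Callable[[bytes], bytes],           # codec: raw frame -> data packet
    compress_period: float,                       # known time period for compression
) -> None:
    """Synchronize each video frame's transmission with the physical layer.

    t1 is the first time point, corresponding to the video frame header
    (e.g., derived from a frame synchronization pulse signal).
    """
    for t1, frame in video_source:
        packet = compress(frame)        # compress and/or pack the frame
        t2 = t1 + compress_period       # second time point: packet is ready
        phy.adjust_frame_header(t2)     # align a physical layer frame header with t2
        phy.transmit(packet, start=t2)  # start sending the video frame at t2
```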

Fig. 1 is a schematic diagram of an exemplary unmanned aerial vehicle (UAV) system 100 according to some embodiments of the present disclosure. The UAV system 100 may include a UAV 102, a ground terminal 104, a network 106, a server 108, and a storage device 110.

The UAV 102 may be configured to collect data during flight and send the collected data to the ground terminal 104 and/or the server 108. The data may include information associated with the flight status of the UAV 102, the battery usage of the UAV 102, the surrounding environment, etc. The data may also include text data, video data, audio data, etc. The video data may include videos, images, graphics, animations, audio, etc. The UAV 102 may send the data to the ground terminal 104 during flight, so that the content of the data is displayed on the ground terminal 104 synchronously.

The UAV 102 may operate fully autonomously (e.g., operated by a computing system such as an onboard controller), semi-autonomously, or manually (e.g., by a user operating a control application implemented on a mobile device). In some embodiments, a user may operate the UAV 102 via the ground terminal 104. The UAV 102 may receive commands from an entity (e.g., a human user or an autonomous control system) and respond to such commands by performing one or more actions. For example, the UAV 102 may be controlled to take off from the ground, move in the air, move to a target location or a sequence of target locations, hover in the air, land on the ground, etc. As another example, the UAV 102 may be controlled to move in the air at a specified velocity and/or acceleration or along a specified route. In addition, the commands may be used to control one or more components of the UAV 102 described in Fig. 2 (e.g., the video recording device 206, the sensor 210, the flight controller 208, etc.). For example, certain commands may be used to control the position, orientation, and/or operation of the video recording device 206.

The ground terminal 104 may be configured to send, receive, output, display, and/or process information. For example, the ground terminal 104 may receive information from the UAV 102, the network 106, the server 108, the storage device 110, etc. As another example, the ground terminal 104 may send commands generated by a user for controlling the UAV 102. The commands may include information for controlling the velocity, acceleration, altitude, and/or orientation of the UAV 102. As another example, the ground terminal 104 may display to the user images captured by the UAV 102 or play videos captured by the UAV 102. As another example, the ground terminal 104 may process information received from the server 108 to update an application installed on the ground terminal 104.

In some embodiments, the ground terminal 104 may include a desktop computer, a mobile device, a laptop computer, a tablet computer, etc., or any combination thereof. In some embodiments, the mobile device may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a smart television, a smart camera, an intercom, etc., or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart footwear, smart glasses, a smart watch, a smart helmet, smart clothing, a smart backpack, a smart accessory, etc., or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a gaming device, a navigation device, a point-of-sale (POS) device, etc., or any combination thereof.

The network 106 may be configured to facilitate the exchange of information. In some embodiments, one or more components in the UAV system 100 (e.g., the UAV 102, the ground terminal 104, the server 108, and the storage device 110) may send information to other components in the UAV system 100 via the network 106. For example, the ground terminal 104 may receive videos and/or images from the UAV 102 via the network 106. In some embodiments, the network 106 may be any type of wired or wireless network, or a combination thereof. Merely by way of example, the network 106 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, etc., or any combination thereof. In some embodiments, the network 106 may include wired or wireless network access points, such as base stations and/or Internet exchange points (not shown), through which one or more components of the UAV system 100 may connect to the network 106 to exchange information. In some embodiments, the base station and/or Internet exchange point may be a Wi-Fi station. In some embodiments, the UAV 102 and/or the ground terminal 104 may access the network 106 through a contention-based random access mechanism or a contention-free random access mechanism.

The server 108 may be configured to process data. The data may be received from the UAV 102, the ground terminal 104, the network 106, the storage device 110, etc. For example, the server 108 may archive flight log information from the UAV 102 in the storage device 110. As another example, the server 108 may back up information from the ground terminal 104 in the storage device 110. The server 108 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination thereof. In some embodiments, the server 108 may be integrated into the ground terminal 104.

The storage device 110 may be configured to acquire and/or store information. The information may be received from components of the UAV system 100 (e.g., the UAV 102, the ground terminal 104, or the server 108). For example, the storage device 110 may acquire information from the ground terminal 104. In some embodiments, the information acquired by and/or stored in the storage device 110 may include programs, software, algorithms, functions, files, parameters, data, text, numbers, images, etc., or any combination thereof. For example, the storage device 110 may store images collected by the UAV 102. As another example, the storage device 110 may store parameters (e.g., the latitude, longitude, and altitude of the UAV 102) from the ground terminal 104. In some embodiments, the storage device 110 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM), etc., or any combination thereof. In some embodiments, the storage device 110 may be implemented on a cloud platform, which may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, etc., or any combination thereof.

It should be noted that the above description of the UAV system 100 is provided for illustration purposes only and is not intended to limit the scope of the present disclosure. Various variations or modifications may be made by those of ordinary skill in the art under the teachings of the present disclosure. For example, the UAV 102 may be any type of remote device that records and sends video data, including but not limited to a surveillance device, a wireless sensor network device, a smart home device, an airborne video device, etc. However, such variations and modifications do not depart from the scope of the present disclosure.

Fig. 2 is a block diagram of an exemplary unmanned aerial vehicle (UAV) 102 according to some embodiments of the present disclosure. The UAV 102 may include a UAV transceiver 202, a processor 204, a video recording device 206, a flight controller 208, a sensor 210, an inertial measurement unit (IMU) 212, and a storage medium 214.

The UAV transceiver 202 may send and/or receive data. The data may include text, videos, images, audio, animations, graphics, etc., or any combination thereof. In some embodiments, the UAV 102 may communicate with the ground terminal 104 via the UAV transceiver 202. For example, the UAV transceiver 202 may send a video processed by the processor 204 (e.g., a compressed video) to the ground terminal 104. As another example, the UAV transceiver 202 may receive commands from the ground terminal 104 for maneuvering the movement of the UAV 102. The UAV transceiver 202 may be any type of transceiver. For example, the UAV transceiver 202 may be a radio frequency (RF) transceiver that sends or receives data over a wireless network. More specifically, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc. In some embodiments, the UAV transceiver 202 may include a transmitter and a receiver, which may each implement some or all of the functions of the UAV transceiver 202.

The processor 204 may process data. The data may be received from other components of the UAV 102 (e.g., the UAV transceiver 202, the video recording device 206, the storage medium 214, etc.). For example, the processor 204 may process data received by the UAV transceiver 202. As another example, the processor 204 may process data to be sent to the ground terminal 104 via the UAV transceiver 202. As another example, the processor 204 may receive video data from the video recording device 206. The processor 204 may compress the video data, adjust the video data, and send the adjusted video data to the ground terminal 104 via the UAV transceiver 202. The adjustment may include synchronizing the transmission of the video data with the physical layer. The processor 204 may receive data from the sensor 210, the flight controller 208, and the IMU 212 to assess the status of the UAV 102 and determine a course of action. For example, the processor 204 may communicate continuously and/or periodically with the IMU 212, which may measure the velocity and attitude data of the UAV 102, and adaptively adjust the position of the UAV 102. In some embodiments, the processor 204 may include one or more processors (e.g., single-core processors or multi-core processors). In some embodiments, the processor 204 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination thereof.

The video recording device 206 may capture video data. The video data may include images, videos, audio, graphics, animations, etc. The video recording device 206 may be a camera, a video camera, a video recorder, a digital camera, an infrared camera, an ultraviolet camera, etc. The video recording device 206 may send the captured video data to the processor 204 for processing. For example, the video recording device 206 may send the captured video data to the processor 204, which may compress the video data and cause the compressed video data to be transmitted to the ground terminal 104. The ground terminal 104 may receive and decompress the video data. In some embodiments, the UAV 102 may include a holder/pan-tilt device (not shown in Fig. 2) for mounting and/or stabilizing the video recording device 206, such as a gimbal with at least one axis. The processor 204 may control the operation of the holder/pan-tilt device to adjust the position of the video recording device 206.

The flight controller 208 may control the propulsion of the UAV 102 to control the pitch angle, roll angle, and/or yaw angle of the UAV 102. The flight controller 208 may change the speed, orientation, and/or position of the UAV 102. For example, upon receiving data including a user-defined route plan from the ground terminal 104, the processor 204 may interpret the data and send corresponding instructions to the flight controller 208, which may change the speed and/or position of the UAV 102 based on the instructions.

The sensor 210 may collect relevant data. The relevant data may include information related to the UAV status, the surrounding environment, or objects in the environment. The sensor 210 may include a position sensor (e.g., a global positioning satellite (GPS) sensor, a mobile device transmitter supporting position triangulation), a vision sensor (e.g., an imaging device capable of detecting visible, infrared, or ultraviolet light, such as a camera), a proximity or range sensor (e.g., an ultrasonic sensor, LIDAR (light detection and ranging), a time-of-flight or depth camera), an inertial sensor (e.g., an accelerometer, a gyroscope, an inertial measurement unit (IMU)), an altitude sensor, an attitude sensor (e.g., a compass, an IMU), a pressure sensor (e.g., a barometer), an audio sensor (e.g., a microphone), a field sensor (e.g., a magnetometer, an electromagnetic sensor), etc., or any combination thereof. The sensor 210 may send the collected data to the processor 204.

The IMU 212 may measure the angular velocity (e.g., attitude changes) and the linear acceleration (e.g., velocity changes) of the UAV 102. For example, the IMU 212 may include one or more gyroscopes for measuring attitude changes of the UAV 102 (e.g., absolute or relative pitch, roll, and/or yaw angles), and may include one or more accelerometers for measuring linear velocity changes of the UAV 102 (e.g., accelerations along the x, y, and/or z directions). In some embodiments, the IMU 212 may be integrated into the sensor 210.

The storage medium 214 may store data. The data may be acquired from the UAV transceiver 202, the processor 204, the video recording device 206, the flight controller 208, the sensor 210, the IMU 212, and/or any other device. The data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc. The storage medium 214 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash drive, an optical disk drive, etc.), a digital video recorder, etc., or any combination thereof.

It should be noted that the above description of the UAV 102 is provided for illustration purposes only and is not intended to limit the scope of the present disclosure. Various variations or modifications may be made by those of ordinary skill in the art under the teachings of the present disclosure. For example, some other components may be implemented in the UAV 102; for instance, a battery may be implemented as a power supply in the UAV 102. As another example, the UAV 102 may include an electronic speed controller (ESC) for controlling or adjusting the rotation speed of a motor installed therein. However, such variations and modifications do not depart from the scope of the present disclosure.

Fig. 3 is a flowchart of an exemplary process 300 for video transmission in a UAV system according to some embodiments of the present disclosure. In some embodiments, the exemplary process 300 may be implemented by one or more processors of the UAV 102.

In step 302, video data may be recorded. In some embodiments, step 302 may be implemented by the video recording device 206 (shown in Fig. 2). The video data may include videos, audio, images, graphics, animations, etc., or any combination thereof. In some embodiments, the video data may include a plurality of video frames. In some embodiments, step 302 may be performed in response to a request received from the processor 204 or the ground terminal 104. For example, a user of the ground terminal 104 may send a request for recording video data via a user interface (e.g., the user interface 408 shown in Fig. 4). After receiving the request, the video recording device 206 may be activated to record the video data. As another example, commands for recording video data may be preprogrammed by the user via the ground terminal 104 and stored in the storage medium 214 of the UAV 102. The recording of the video data may be controlled by the processor 204 through executing the preprogrammed commands stored in the storage medium 214.

In step 304, a video frame may be extracted from the video data. In some embodiments, step 304 may be implemented by the processor 204. As used herein, a video frame may refer to a frame of the video data. A video frame may correspond to a length of time that depends on the compression algorithm. In some embodiments, a video frame may be a still image. A video frame may include a frame header. The frame header may indicate the start of the video frame transmission. The frame header may include information of the video frame, such as synchronization information, address information, error-control information, encoding information, etc. The synchronization information may include a start time point and/or an end time point of the video frame. In some embodiments, the frame header may correspond to a frame synchronization pulse signal, which may indicate the start time point of the video frame. In some embodiments, the video frame may be divided into a plurality of subframes. Data associated with each of the plurality of subframes may be compressed and/or packed into data packets for transmission. Each subframe of the plurality of subframes may correspond to a portion of the video frame. The lengths of the subframes may be the same or different.
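
For concreteness, the frame layout described above might be modeled as follows (field and type names ours, purely illustrative; the disclosure does not prescribe an in-memory layout):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FrameHeader:
    sync_start: float                  # start time point of the video frame
    sync_end: Optional[float] = None   # optional end time point
    address: int = 0                   # address information
    error_check: int = 0               # error-control information
    codec: str = "H.264"               # encoding information

@dataclass
class VideoFrame:
    header: FrameHeader
    payload: bytes

    def split(self, n: int) -> List[bytes]:
        """Divide the frame into n subframes (equal lengths here, though
        subframe lengths may differ in general)."""
        step = -(-len(self.payload) // n)  # ceiling division
        return [self.payload[i:i + step] for i in range(0, len(self.payload), step)]
```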

In step 306, the transmission of a physical layer frame may be synchronized with the transmission of the video frame in the physical layer. In some embodiments, step 306 may be implemented by the processor 204. In the Open Systems Interconnection (OSI) architecture, the physical layer defines the means of transmitting raw bits over a physical link connecting network nodes. A bitstream may be converted into a physical signal for transmission over a hardware transmission medium. The physical layer includes a physical signaling sublayer, which interfaces with the media access control (MAC) sublayer of the data link layer and performs signaling control over transmission and reception. A physical layer frame according to the present application refers to a frame generated by the physical layer of the OSI architecture, and signaling control over transmission and reception is performed via the physical layer. The transmission of data packets (e.g., compressed and/or packed video frames or subframes thereof) through the physical layer is controlled by physical layer frames. Specifically, the transmission of the first bit of a physical layer frame may signal that the first bit of a data packet (e.g., a compressed and/or packed video frame or subframe thereof) is allowed to be sent. That is, the timing of sending the first bit of a data packet (also referred to as the frame timing of the video frame) needs to be synchronized with the timing of sending the first bit of a physical layer frame (also referred to as the frame timing of the physical layer frame). When the frame timing of the video frame and the frame timing of the physical layer frame are not synchronized, the video frame may not be sent immediately. When it is determined that the frame timing of the video frame is synchronized with the frame timing of one of the physical layer frames subsequent to the current physical layer frame, the video frame may be allowed to be sent through the physical layer. Since the UAV 102 may access the network 106 through a contention-free random access mechanism, the physical layer frames used for transmission control may be generated randomly. In some embodiments, the physical layer frames may be generated according to the schedule of a timer of the physical layer. The frame timing of the physical layer frames and the frame timing of the video frames may be asynchronous. As a result, the start of the transmission of a physical layer frame (also referred to as the frame header of the physical layer frame) and the start of the transmission of the video frame through the physical layer (also referred to as the time point at which a compressed and/or packed subframe can be sent) may be asynchronous. For example, when a subframe of the video frame has been compressed and/or packed into a data packet and is ready to be sent in the physical layer, the expected transmission start time may fall in the middle of the transmission of a physical layer frame. The compressed subframe (i.e., the data packet) may have to wait for a period of time (also referred to as the waiting time) before transmission, thereby causing a transmission delay. Since the physical layer frames used for transmission control may be generated randomly or according to the schedule of the timer of the physical layer, the waiting time before a data packet is allowed to be sent is unpredictable. To reduce the waiting time before sending a data packet, the frame header of the physical layer frame may be adjusted. For example, the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed subframe is expected to be sent. The detailed process of the synchronization may be found elsewhere in the present disclosure (e.g., in the descriptions of Fig. 6, Fig. 8A, Fig. 8B, etc.).
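
The unpredictability of the waiting time can be made concrete (notation ours): if a data packet becomes ready at time $t_r$ and, without adjustment, physical layer frame headers occur every $T_p$ at times $k T_p$, the packet waits

```latex
t_w = \left\lceil \frac{t_r}{T_p} \right\rceil T_p - t_r , \qquad 0 \le t_w < T_p ,
```

which may fall anywhere in $[0, T_p)$ depending on where $t_r$ lands within the current physical layer frame; moving the next frame header to $t_r$ drives $t_w$ to zero.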

In step 308, the video frame may be sent. In some embodiments, step 308 may be implemented by the processor 204. The video frame may be sent to one or more other components of the UAV 102, such as the UAV transceiver 202, the storage medium 214, etc. For example, the video frame may be sent to the UAV transceiver 202 for processing. The processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc. The UAV transceiver 202 may send the processed data to the ground terminal 104. As another example, the video frame may be stored in the storage medium 214. In some embodiments, the video frame may be sent in the physical layer. As described in step 306, the physical layer frame may be synchronized with the video frame, so that the time point at which the video frame (or a compressed and/or packed subframe thereof) can be sent is synchronized with the frame header of the physical layer frame. The video frame (or a compressed and/or packed subframe thereof) may be sent together with the physical layer frame in the physical layer.

It should be noted that the steps shown in Fig. 3 are for illustration purposes and are not intended to limit the scope of the present disclosure. In some embodiments, the process 300 may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed above. In addition, the order in which the steps of the process 300 are performed in Fig. 3 is not intended to be limiting. For example, one or more other optional steps may be added between any two steps shown in Fig. 3. Examples of such steps may include storing or buffering the video data, etc.

Fig. 4 is a block diagram of an exemplary ground terminal 104 in a UAV system according to some embodiments of the present disclosure. The ground terminal 104 may include a ground terminal transceiver 402, a processor 404, a display 406, a user interface 408, and a memory 410.

The ground terminal transceiver 402 may send and/or receive data. The data may include text, videos, images, audio, animations, graphics, etc., or any combination thereof. In some embodiments, the ground terminal 104 may communicate with the UAV 102 via the ground terminal transceiver 402. For example, the ground terminal transceiver 402 may send instructions from the processor 404 (e.g., an instruction for recording video data) to the UAV transceiver 202. Upon receiving the instruction from the ground terminal 104, the UAV transceiver 202 may forward the instruction to the video recording device to start recording video data. The ground terminal transceiver 402 may be any type of transceiver. For example, the ground terminal transceiver 402 may be a radio frequency (RF) transceiver that sends or receives data over a wireless network. More specifically, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc. In some embodiments, the ground terminal transceiver 402 may include a transmitter and a receiver, which may each implement some or all of the functions of the ground terminal transceiver 402.

The processor 404 may process data. The data may be received from the ground terminal transceiver 402, the memory 410, etc. The data may include information related to the status of the UAV 102 (e.g., velocity, acceleration, altitude, etc.), image data, video data, user instructions (e.g., an instruction for increasing the altitude of the UAV 102), etc. In some embodiments, the processing of the data may include storing, classifying, selecting, transforming, calculating, estimating, encoding, decoding, etc., or any combination thereof. For example, the processor 404 may decompress compressed video frames received from the UAV 102. As another example, the ground terminal 104 may receive an application update package from the server 108 via the ground terminal transceiver 402, and the processor 404 may process the application update package to update a related mobile application installed on the ground terminal 104. As another example, the processor 404 may process data from the memory 410 to view historical flight logs. In some embodiments, the processor 404 may include one or more microprocessors, field-programmable gate arrays (FPGAs), central processing units (CPUs), digital signal processors (DSPs), etc.

The display 406 may display information. The information may include text, audio, videos, images, etc., or any combination thereof. The display 406 may include a liquid crystal display (LCD), a light-emitting diode (LED)-based display, a flat panel display or curved screen, a television device, a cathode ray tube (CRT), etc., or a combination thereof. In some embodiments, the display 406 may include a touch screen. In some embodiments, the information displayed on the display 406 may relate to the status of the UAV 102 (e.g., altitude, velocity, etc.). In some embodiments, images and/or videos captured by the video recording device 206 may be displayed on the display 406.

The user interface 408 may receive user interactions with the ground terminal 104 and generate one or more instructions for operating one or more components of the ground terminal 104 or other components in the UAV system 100. The one or more instructions may include instructions for operating the ground terminal transceiver 402 to communicate with the UAV transceiver 202, instructions for operating the processor 404 to process data received from the ground terminal transceiver 402, instructions for operating the display 406 to display received images and/or videos, instructions for operating the memory 410 to store data, instructions for operating the UAV 102 to capture video data, etc., or any combination thereof. In some embodiments, the user interface 408 may include one or more input devices, such as a touch screen, a keyboard, a mouse, a trackball, a joystick, a stylus, a voice recognition device or application, etc. For example, a keyboard may be integrated into the ground terminal 104, and instructions may be generated in response to the user pressing a plurality of keys on the keyboard in a certain sequence. In some embodiments, the instructions may instruct the UAV 102 to adjust its flight attitude. In some embodiments, the instructions may instruct the video recording device 206 to take photos or record videos. In some embodiments, the instructions may instruct the display 406 to display photos or videos. In some embodiments, the instructions may instruct the memory 410 to store data. For example, after a video is obtained, the memory 410 may receive an instruction to store the video.

The memory 410 may store data. The data may be acquired from the ground terminal transceiver 402, the processor 404, the display 406, the user interface 408, and/or any other component in the UAV system 100. The data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc. In some embodiments, the memory 410 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash drive, an optical disk drive, etc.), a digital video recorder, etc., or any combination thereof.

It should be noted that the above description of the ground terminal 104 is provided for illustration purposes only and is not intended to limit the scope of the present disclosure. Various variations or modifications may be made by those of ordinary skill in the art under the teachings of the present disclosure. For example, one or more components of the ground terminal 104 may each include an independent memory block (not shown). However, such variations and modifications do not depart from the scope of the present disclosure.

Fig. 5 is a block diagram of an exemplary processor 204 in a UAV system according to some embodiments of the present disclosure. The processor 204 may include a data acquisition module 502, a data compression and packetization module 504, a physical channel processing module 506, and a storage module 508. The modules in the processor 204 may be connected in a wired or wireless manner.

The data acquisition module 502 may acquire data. The data may be related to the UAV 102 (e.g., the velocity of the UAV 102), the surrounding environment (e.g., temperature, atmospheric pressure, etc.), or objects in the environment. The data may include image data, audio data, video data, etc., or any combination thereof. The data may be acquired from the UAV transceiver 202, the video recording device 206, the storage medium 214, or other components of the UAV system 100. In some embodiments, the video data may include a real-time video stream. The real-time video stream may be sent by the video recording device 206, or received at a data interface communicatively connected to the video recording device 206. The video data may include a plurality of video frames, each of which may have a frame header. The frame header of each video frame of the plurality of video frames may correspond to a frame synchronization pulse signal, which may indicate the start time point at which each video frame of the plurality of video frames is to be sent in the physical layer.

The data compression and packetization module 504 may compress and/or pack data. The data may be received from the data acquisition module 502 or the storage module 508. The data compression and packetization module 504 may be configured to reduce redundancy in the data. The redundancy may include temporal redundancy, spatial redundancy, statistical redundancy, perceptual redundancy, etc. The data may be compressed with various compression algorithms. The compression algorithms may include lossless data compression and lossy data compression. Lossless data compression may include run-length encoding (RLE), Lempel-Ziv compression, Huffman coding, prediction by partial matching (PPM), bzip2 compression, etc., or any combination thereof. Lossy data compression may include fractal compression, vector quantization, wavelet compression, linear predictive coding, etc., or any combination thereof. In some embodiments, the data may be video data, which may be compressed under a video coding standard. The video coding standard may include, but is not limited to, H.120, H.261, H.262, H.263, H.264, High Efficiency Video Coding, MPEG-4, etc., or any combination thereof.

In some embodiments, the data compression and packetization module 504 may also be configured to pack the compressed data. For example, the compressed data may be packed in various formats. Exemplary packing formats may include the AVI format, DV-AVI format, WMV format, MP4 format, RMVB format, MOV format, FLV format, etc., or any combination thereof. In some embodiments, the compressed data may be packed into data packets conforming to the physical layer protocol.

As described elsewhere in the present disclosure, a video frame may be divided into a plurality of subframes of the same or different lengths. Each subframe of the plurality of subframes may correspond to a portion of the video frame. The data compression and packetization module 504 may compress data associated with each subframe of the plurality of subframes. Alternatively or additionally, the data compression and packetization module 504 may further pack the compressed data.
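
As a rough illustration of subframe-level compression (a minimal sketch, not the module's actual implementation; zlib stands in for the video codecs named above, and all names are ours):

```python
import zlib
from typing import List

def compress_subframes(frame_bytes: bytes, num_subframes: int) -> List[bytes]:
    """Divide a raw video frame into subframes and compress each one.

    Compressing per subframe allows transmission of early subframes to
    start before the whole frame has been processed.
    """
    step = -(-len(frame_bytes) // num_subframes)  # ceiling division: subframe length
    packets = []
    for start in range(0, len(frame_bytes), step):
        subframe = frame_bytes[start:start + step]
        packets.append(zlib.compress(subframe))  # stand-in for a real codec
    return packets

# Example: split a 1 KB frame into 4 subframes and compress each.
packets = compress_subframes(bytes(1024), num_subframes=4)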

The physical channel processing module 506 may be configured to process data. The data may be acquired from the data acquisition module 502 or the data compression and packetization module 504. In some embodiments, the processing of the data may include extracting a frame header from a video frame, determining time points and/or time periods, synchronizing two or more time points, etc., or any combination thereof. For example, the data acquisition module 502 may acquire a video frame from the video recording device 206. Subframes of the video frame may then be compressed and/or packed by the data compression and packetization module 504. The compressed and/or packed subframes may be sent in the physical layer. Physical layer frames may control the data transmission in the physical layer (including the transmission of the compressed and/or packed subframes). The frame header of a physical layer frame, corresponding to the time point at which data transmission starts, may be determined. However, the start of the transmission of the physical layer frame (or the frame header of the physical layer frame) and the start of the transmission of the video frame (or the time point at which the compressed and/or packed subframes can be sent) may be asynchronous. Therefore, a subframe may have to wait for a period of time (also referred to as the waiting time) before transmission, which may cause a delay. To reduce the waiting time and the delay, the frame header of the physical layer frame may be adjusted. For example, the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed subframe can be sent. The detailed process of the synchronization may be found in, for example, Fig. 6 and its description.

The physical channel processing module 506 may include a time determination unit 510, a time adjustment unit 520, and a data transmission and reception unit 530.

The time determination unit 510 may be configured to determine time points and/or time periods. A time point may correspond to the frame header of a video frame or the frame header of a physical layer frame. The time points may also include a time point at which the compression and/or packing of a video frame starts or completes, a time point at which the compression and/or packing of a subframe of a video frame starts or completes, a time point at which a camera (e.g., the video recording device 206) starts recording video data, etc. In some embodiments, a time point may be determined based on a signal. For example, the frame header of a video frame may correspond to a frame synchronization pulse signal, and the time point corresponding to the frame header of the video frame may be determined based on the frame synchronization pulse signal. As another example, the frame header of a physical layer frame may correspond to a physical layer pulse signal, and the time point corresponding to the frame header of the physical layer frame may be determined based on the physical layer pulse signal. Merely by way of example, the frame synchronization pulse signal may indicate the time point at which the video frame is received from the video recording device.

The time periods may include a period for compressing and/or packetizing a video frame, a period for compressing and/or packetizing a subframe of the video frame, a period for transmitting a physical layer frame (and/or a compressed/packetized subframe), a period during which the processor 204 operates, etc. In some embodiments, the period for compressing and/or packetizing a video frame (or a subframe thereof) may be determined based on the length of the video frame and the compression and/or packetization algorithm used. In some embodiments, the time determination unit 510 may be implemented by a timer. The timer may be a counter that provides an internal reading. The reading may be incremented every fixed period (e.g., a microsecond, a millisecond, a second, etc.). In some embodiments, when the reading reaches a preset number, the timer may generate a signal and the reading may be reset to zero. In some other embodiments, the timer may generate signals periodically without resetting the reading to zero; for example, a signal may be generated whenever the reading is a multiple of 100. In some embodiments, when a frame synchronization pulse signal is detected (e.g., at the time point corresponding to the frame header of a video frame), the timer may record its internal reading. In some embodiments, the timer may record the period of a compressed and/or packetized subframe (i.e., the increment of its internal reading).
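
As a concrete illustration, the C sketch below models the timer just described: a counter incremented once per fixed tick, generating a signal when a preset count is reached and latching its reading when a frame synchronization pulse is detected. The names and the tick count are illustrative assumptions, not taken from the disclosure.

```c
#include <stdint.h>
#include <stdbool.h>

#define PLF_PERIOD_TICKS 1000u   /* hypothetical physical layer frame period */

typedef struct {
    uint32_t reading;            /* internal reading, incremented per tick */
    uint32_t latched_reading;    /* reading recorded at a frame sync pulse */
} plf_timer;

/* Called once per fixed period (e.g., every microsecond). Returns true
 * when a physical layer frame header should be generated. */
bool plf_timer_tick(plf_timer *t)
{
    t->reading++;
    if (t->reading >= PLF_PERIOD_TICKS) {
        t->reading = 0;          /* reset-to-zero variant described above */
        return true;             /* signal: start of a physical layer frame */
    }
    return false;
}

/* Called when a frame synchronization pulse is detected, i.e., at the
 * time point corresponding to the frame header of a video frame. */
void plf_timer_on_frame_sync(plf_timer *t)
{
    t->latched_reading = t->reading;
}
```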

The time adjustment unit 520 may be configured to adjust the time points and/or time periods determined by the time determination unit 510. For example, the time adjustment unit 520 may adjust the time point corresponding to the frame header of a physical layer frame. More specifically, the time point may be adjusted so that the time point corresponding to the frame header of the physical layer frame is synchronized with the time point at which the compressed and/or packetized subframe is ready. The compressed and/or packetized subframe may be transmitted together with the physical layer frame in the physical layer. In some embodiments, the period of the current physical layer frame may be extended or shortened so that the frame header of the subsequent physical layer frame is synchronized with the subframe to be transmitted. In some other embodiments, the internal reading of the timer may control the frame header of the physical layer frame. For example, a physical layer frame may be generated when the reading of the timer reaches a certain value. By changing the reading of the timer, the frame header of the physical layer frame can be shifted, so that the frame header of the subsequent physical layer frame is synchronized with the time point at which the subframe is ready. Detailed descriptions of the adjustment can be found in Fig. 6, Fig. 8A, and Fig. 8B.
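
Continuing the previous sketch, the adjustment can be expressed as shifting the timer's internal reading so that the next frame header fires exactly when the subframe becomes ready. The helper below is a hedged illustration reusing plf_timer and PLF_PERIOD_TICKS from the timer sketch; how ready_in_ticks is obtained is an assumption.

```c
/* ready_in_ticks: ticks from now until the compressed/packetized subframe
 * is ready for transmission. Shifting the reading extends or shortens the
 * current physical layer frame. */
void plf_timer_align_next_header(plf_timer *t, uint32_t ready_in_ticks)
{
    /* After plf_timer_tick() runs ready_in_ticks more times, the reading
     * reaches PLF_PERIOD_TICKS and the next frame header is generated. */
    t->reading = (PLF_PERIOD_TICKS - (ready_in_ticks % PLF_PERIOD_TICKS))
                 % PLF_PERIOD_TICKS;
}
```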

The data transmission and reception unit 530 may be configured to transmit and receive data. The data may include data received from the data compression and packetization module 504 (e.g., compressed and/or packetized video frames, compressed and/or packetized subframes), the storage module 508, or other components of the UAV system 100 (e.g., the UAV transceiver 202, the storage medium 214, etc.). For example, the compressed and/or packetized subframes received from the data compression and packetization module 504 may be sent by the data transmission and reception unit 530 to the UAV transceiver 202 for processing. The compressed and/or packetized subframes may be transmitted in the physical layer. The processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc. The UAV transceiver 202 may send the processed data to the ground terminal 104. As another example, the data transmission and reception unit 530 may obtain data from the storage module 508.

The storage module 508 may be configured to store data. The data may be obtained from the data acquisition module 502, the data compression and packetization module 504, the physical channel processing module 506, and/or other devices. The data may include programs, software, algorithms, functions, files, parameters, text, numbers, images, video, audio, etc., or any combination thereof. In some embodiments, the storage module 508 may include a hard disk drive, a solid state drive, a removable storage drive (e.g., a flash drive, an optical disc drive, etc.), a digital video recorder, etc., or a combination thereof.

It should be noted that the above description of the processor 204 is merely for illustration purposes and is not intended to limit the scope of the present disclosure. Various changes and modifications may be made by those skilled in the art under the teaching of the present disclosure. In some embodiments, the storage module 508 may be omitted from the processor 204 and/or integrated into the storage medium 214. However, those changes and modifications do not depart from the scope of the present disclosure.

Fig. 6 is a flowchart of an exemplary process 600 for transmitting a video frame in a UAV system according to some embodiments of the present disclosure. In some embodiments, the process 600 may be implemented by the processor 204.

In step 602, video data may be recorded. In some embodiments, step 602 may be implemented by the video recording device 206. Step 602 may be similar to step 302 of Fig. 3, and the description is not repeated here.

In step 604, a video frame may be extracted from the video data. In some embodiments, step 604 may be implemented by the data acquisition module 502. As used herein, a video frame may refer to a frame of the video data. The video frame may correspond to a length of time. In some embodiments, the video frame may be a still image.

In step 606, the video frame may be compressed. In some embodiments, step 606 may be implemented by the data compression and packetization module 504. The video frame may be divided into multiple subframes, and the data associated with each subframe of the multiple subframes may be compressed. Each subframe of the multiple subframes may correspond to a portion of the video frame. The video frame may be divided into multiple subframes of identical or different lengths. Related descriptions of the compression can be found in the description of the data compression and packetization module 504.

In step 608, the period for compressing the video frame may be determined. In some embodiments, step 608 may be implemented by the physical channel processing module 506. The period for compressing the video frame may be determined based on the length of the video frame and the compression algorithm used. In some embodiments, the period for compressing the video frame may refer to the sum of the periods for compressing all subframes of the video frame. In some embodiments, the period for compressing the video frame in this application may refer to the period for compressing a subframe of the video frame.

In some embodiments, the compressed video frame (or subframes thereof) may also be packetized in step 606. In this case, the period determined in step 608 may include the period for both compressing and packetizing the video frame (or subframes thereof). In some embodiments, the period for compressing the video frame may include the time for other processing of the video frame before transmission, for example, the time for adding redundancy check bits to the packetized video frame.
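
One plausible way to realize the period determination of step 608 is to scale the subframe length by an assumed compressor throughput and add a fixed packetization overhead (including redundancy check bits). The sketch below is only an estimate under those assumptions; the disclosure itself does not specify a formula.

```c
#include <stdint.h>

typedef struct {
    uint32_t bytes_per_tick;   /* assumed compressor throughput */
    uint32_t packetize_ticks;  /* assumed fixed packetization overhead */
} codec_profile;

/* Returns the estimated period (in timer ticks) to compress and
 * packetize one subframe of the given length. */
uint32_t estimate_subframe_period(const codec_profile *p,
                                  uint32_t subframe_bytes)
{
    uint32_t compress_ticks = (subframe_bytes + p->bytes_per_tick - 1)
                              / p->bytes_per_tick;   /* round up */
    return compress_ticks + p->packetize_ticks;
}
```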

In step 610, a frame header may be extracted from the video frame. In some embodiments, step 610 may be implemented by the data acquisition module 502. The frame header may include control information of the video frame. The control information may include synchronization information, address information, error control information, coding information, etc. The synchronization information may include a start time point and/or an end time point of the video frame.

In step 612, a first time point corresponding to the frame header of the video frame may be determined. In some embodiments, step 612 may be implemented by the physical channel processing module 506. As used herein, the first time point may refer to the start time point of the video frame. In some embodiments, the frame header of the video frame may correspond to a frame synchronization pulse signal. The time point at which the frame synchronization pulse signal is received in the physical layer may be determined as the first time point.

In step 614, a second time point may be determined based on the first time point obtained in step 612 and the period for compressing (and/or packetizing) the video frame obtained in step 608. In some embodiments, step 614 may be implemented by the physical channel processing module 506. As used herein, the second time point may refer to the time point at which the compression and/or packetization of the video frame is completed. For example, the second time point may refer to the time point at which the compression and/or packetization of a subframe is completed. Detailed descriptions of the determination of the second time point can be found in Fig. 8A and Fig. 8B.

In step 616, a third time point corresponding to the frame header of a physical layer frame may be synchronized with the second time point. In some embodiments, step 616 may be implemented by the physical channel processing module 506. For example, the third time point corresponding to the frame header of the physical layer frame may be synchronized with the second time point at which the compression and/or packetization of the subframe is completed. The physical layer frame may control data transmission, and the frame header of the physical layer frame may indicate the starting point of the data transmission. For example, the frame header of the physical layer frame may indicate the starting point of the compressed and/or packetized subframe to be transmitted through the physical layer. The transmission time of a physical layer frame may be a fixed value, because the length of the physical layer frame may be configured as a fixed value. In some embodiments, the transmission time of the current physical layer frame may be extended or shortened so that the frame header of the subsequent physical layer frame is synchronized with the second time point (i.e., the time point at which the compression and/or packetization of the subframe is completed). In some embodiments, the transmission times of one or more subsequent physical layer frames, starting from the current physical layer frame, may be adjusted in a manner similar to that of the current physical layer frame. In some embodiments, only the transmission time of the current physical layer frame is adjusted, and by maintaining a certain relationship between the frame rates of the video frames and the physical layer frames, the transmission times of subsequent physical layer frames may not need to be adjusted. Alternatively, the transmission time of the current physical layer frame may be adjusted whenever a video frame is received for compression and transmission.

In some embodiments, the frame synchronization pulse signal may be sent to the physical layer, the signal indicating the time point at which the video frame is received from the video recording device. A snapshot of the physical layer timer may be taken to record the time point at which the frame synchronization pulse signal occurs in the physical layer. The physical channel processing module 506 may calculate the start time point for transmitting the video frame based on the time point at which the frame synchronization pulse signal occurs (i.e., the time point at which the video frame from the video recording device is delivered) and the time taken to compress and/or packetize the video frame. The physical channel processing module 506 may also compare the calculated start time point for transmitting the video frame with the scheduled time points for generating physical layer frames, and determine whether the calculated start time point for transmitting the video frame is synchronized with the next scheduled time point for generating a physical layer frame. When the calculated start time point for transmitting the video frame is determined to be out of sync with the next scheduled time point for generating a physical layer frame, the physical channel processing module 506 may adjust the value of the physical layer timer, so that the next scheduled time point for generating a physical layer frame is adjusted to be identical to the calculated start time point for transmitting the video frame. Detailed descriptions of the synchronization can be found in Fig. 8A and Fig. 8B.
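
The compare-and-adjust logic just described can be condensed as follows, assuming all time points are expressed as absolute ticks of the physical layer timer; the function and that representation are illustrative assumptions.

```c
#include <stdint.h>

/* t_sync: timer snapshot taken at the frame synchronization pulse.
 * compress_ticks: estimated compression/packetization period.
 * next_scheduled_header: next scheduled physical layer frame header.
 * Returns the (possibly adjusted) tick of the next frame header. */
uint64_t sync_phy_header(uint64_t next_scheduled_header,
                         uint64_t t_sync, uint64_t compress_ticks)
{
    /* Calculated start time point for transmitting the video frame. */
    uint64_t tx_start = t_sync + compress_ticks;

    if (tx_start != next_scheduled_header)
        return tx_start;   /* move the next header to the calculated start */
    return next_scheduled_header;   /* already synchronized */
}
```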

In step 618, the compressed video frame may be transmitted at the third time point. In some embodiments, step 618 may be implemented by the physical channel processing module 506. For example, the compressed and/or packetized subframe may be transmitted at the third time point together with the corresponding physical layer frame in the physical layer. In some embodiments, the compressed and/or packetized video frame may be transmitted via the UAV transceiver 202 to a remote device (e.g., the ground terminal 104, the server 108, the storage device 110, etc.).

In step 620, the process 600 may determine whether the transmission of the video frame (or its subframes) is completed. If the transmission of the video frame is not completed, the process 600 may return to step 616. For example, multiple third time points respectively corresponding to the multiple subframes may be obtained. Each subframe of the multiple subframes may be transmitted in turn at its corresponding third time point. The multiple subframes may be transmitted by repeating steps 616 to 618. If the transmission of the video frame is completed, the process 600 may proceed to step 622. In step 622, the processor 204 may wait for the next video data. When the next video data is received or recorded, the processor 204 may process it by repeating steps 604 to 620.
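
Taken together, steps 616 through 620 amount to the loop sketched below, in which each compressed subframe is released at its own third time point. wait_until() and phy_transmit() are assumed platform primitives and do not appear in the original disclosure.

```c
#include <stddef.h>
#include <stdint.h>

void wait_until(uint64_t tick);                     /* assumed primitive */
void phy_transmit(const uint8_t *pkt, size_t len);  /* assumed primitive */

typedef struct {
    const uint8_t *data;   /* compressed/packetized subframe */
    size_t len;
    uint64_t ready_tick;   /* tick at which its compression completes */
} subframe_pkt;

void send_video_frame(const subframe_pkt *pkts, size_t n_pkts)
{
    for (size_t i = 0; i < n_pkts; i++) {
        /* Step 616: the third time point is aligned with the tick at
         * which this subframe's compression completes. */
        wait_until(pkts[i].ready_tick);
        /* Step 618: transmit together with the physical layer frame. */
        phy_transmit(pkts[i].data, pkts[i].len);
    }
    /* Step 622: return and wait for the next video data. */
}
```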

It should be noted that the steps shown in Fig. 6 are for illustration purposes and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed above. In addition, the order in which the steps of the process 600 are performed in Fig. 6 is not intended to be limiting. For example, step 606 and step 610 may be performed simultaneously or sequentially. As another example, step 614 may be merged into step 616 as a single step. As another example, step 616 may be divided into two steps: determination of the third time point and adjustment of the third time point. In some embodiments, the video data may include multiple video frames. Each video frame may be processed by repeating steps 604 to 620.

Fig. 7 is a flowchart of an exemplary process 700 for configuring the frame rate of physical layer frames in a UAV system according to some embodiments of the present disclosure. In some embodiments, the exemplary process 700 may be implemented by the processor 204.

In step 702, multiple video frames may be received. The video frames may be extracted from the recorded video data in a manner similar to that disclosed in step 602 and/or step 302.

In step 704, the frame rate of the multiple video frames may be obtained. Each video frame may be a still image. The frame rate of the multiple video frames may refer to the number of still images acquired per second. For example, if 20 video frames are acquired, the frame rate may be obtained by dividing 20 by the total time (in seconds) taken to acquire those 20 frames.

In step 706, the frame rate of multiple physical layer frames may be configured based on the frame rate of the multiple video frames. The frame rate of the multiple physical layer frames may refer to the number of physical layer frames per unit time (e.g., per second or per millisecond). As described elsewhere in this disclosure, the frame header of the current physical layer frame may be synchronized with the current video frame to reduce waiting time and delay. In step 706, the frame rate of the physical layer frames may be configured so that subsequent physical layer frames are respectively synchronized with subsequent video frames. More specifically, the frame rate of the physical layer frames may be adjusted so that each frame header of the physical layer frames is synchronized, or nearly synchronized, with the time point at which the compression of a video frame is completed. In some embodiments, the frame rate of the multiple physical layer frames may be an integer multiple of the frame rate of the multiple video frames. Merely by way of example, the integer may be 2, 4, 6, 8, 10, 20, 25, 30, etc.
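
A minimal sketch of steps 704 and 706 follows, assuming the frame rate is measured over a batch of frames and the physical layer rate is simply a fixed integer multiple of it; the multiple of 20 echoes the example given for Fig. 8B below.

```c
#include <stdint.h>

#define PLF_RATE_MULTIPLE 20u   /* illustrative integer multiple */

/* Step 704: frames_acquired still images captured in total_time_s seconds. */
uint32_t video_frame_rate(uint32_t frames_acquired, double total_time_s)
{
    return (uint32_t)(frames_acquired / total_time_s + 0.5);  /* round */
}

/* Step 706: configure the physical layer frame rate (frames per second). */
uint32_t configure_plf_rate(uint32_t video_rate)
{
    return video_rate * PLF_RATE_MULTIPLE;
}
```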

It should be noted that the steps shown in Fig. 7 are for illustration purposes and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed above. In addition, the order of the steps shown in Fig. 7 is not intended to be limiting.

Fig. 8A and Fig. 8B show two schematic diagrams of video transmission in a UAV system according to some embodiments of the present disclosure. In some embodiments, Fig. 8A and Fig. 8B illustrate video transmission from the UAV 102 to the ground terminal 104.

In some embodiments, as shown in Fig. 8A and Fig. 8B, the video transmission may include: capturing video data via the UAV 102; receiving video frames at the UAV 102; compressing the video frames at the UAV 102; processing the video frames at the UAV 102; transmitting the video frames via the UAV 102 according to the timing of the physical layer frames; wirelessly transmitting the video frames from the UAV 102 to the ground terminal 104; receiving and decompressing the video frames at the ground terminal 104, etc. Multiple video frames may be extracted from a real-time video. The multiple video frames may include video frame 802-1, video frame 802-2, etc. In some embodiments, the real-time video may be received at a data interface (e.g., a universal serial bus (USB) interface, an IEEE 1394 interface, an RS-232 interface, etc.) communicatively connected to a camera (e.g., the video recording device 206). The frame timing of the camera may include multiple signals indicating the start time points at which the camera records video. The multiple signals may include signal 800-1, signal 800-2, signal 800-3, etc. Video frame 802-1 and video frame 802-2 may be divided into subframes and compressed into multiple compressed packets, including compressed packet 804-1, compressed packet 804-2, ..., compressed packet 804-7, etc. The division and compression methods may be similar to those described for the data compression and packetization module 504. The multiple compressed packets may be transmitted through the physical layer under the control of multiple physical layer frames, including physical layer frame 806-1, physical layer frame 806-2, ..., physical layer frame 806-8, etc. The multiple compressed packets corresponding to the video frames may be received at the ground terminal 104 and decompressed into multiple decompressed video frames, including decompressed video frame 808-1, decompressed video frame 808-2, ..., decompressed video frame 808-5, etc.

As shown in Fig. 8A, time point T1 may correspond to the frame header of video frame 802-1. Time point T4 may correspond to the time point at which compressed packet 804-1 is received and decompressed (the decompressed video frame corresponding to compressed packet 804-1 is denoted by 808-1). Time period t3 may correspond to the period for compressing and/or packetizing the first subframe of video frame 802-1. Time point T2 may be determined based on time point T1 and time period t3. Time point T2 may indicate that compressed packet 804-1 has been compressed and is ready to be transmitted. However, time point T2 is not synchronized with the time point corresponding to the frame header of a physical layer frame. Therefore, compressed packet 804-1 cannot be transmitted until the time point corresponding to the frame header of the next physical layer frame 806-3 (shown as time point T3). A waiting period t2 may be defined as the length of time that compressed packet 804-1 needs to wait before transmission. A time delay may be defined as the period from the initial time point of a video frame to the time point at which the video frame is available for streaming and/or playback at the ground terminal 104. As shown in Fig. 8A, the time delay t1 may refer to the period from time point T1 to time point T4, which may be unnecessarily prolonged due to the waiting period t2. The methods disclosed in this disclosure may be implemented to eliminate or reduce the waiting time, thereby reducing the time delay.
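
Restated in code, the Fig. 8A quantities reduce to two differences; the tick-based representation below is an assumption made for illustration.

```c
#include <stdint.h>

typedef struct {
    uint64_t T1;  /* frame header of video frame 802-1 */
    uint64_t T2;  /* packet 804-1 ready: T2 = T1 + t3 */
    uint64_t T3;  /* frame header of physical layer frame 806-3 */
    uint64_t T4;  /* packet received and decompressed at ground terminal 104 */
} fig8a_timing;

/* Waiting period t2: how long packet 804-1 waits before transmission. */
uint64_t waiting_period(const fig8a_timing *f) { return f->T3 - f->T2; }

/* Time delay t1: from the start of the video frame to availability for
 * streaming/playback; eliminating t2 (Fig. 8B) shortens t1 accordingly. */
uint64_t time_delay(const fig8a_timing *f) { return f->T4 - f->T1; }
```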

Fig. 8B shows a schematic diagram of video transmission similar to Fig. 8A, but with the physical layer frame timing adjusted. As shown in Fig. 8B, time point T5 may correspond to the frame header of video frame 802-1. Time point T8 may correspond to the time point at which compressed packet 804-1 is received and decompressed (the decompressed video frame corresponding to compressed packet 804-1 is denoted by 808-1). Time period t5 may correspond to the period for compressing and/or packetizing the first subframe of video frame 802-1. Time point T7 may be determined based on time point T5 and time period t5. Time point T7 may indicate that compressed packet 804-1 has been compressed and is ready to be transmitted. Time point T6 may correspond to the frame header of physical layer frame 806-1. To synchronize the time point corresponding to the frame header of physical layer frame 806-2, physical layer frame 806-1 may be adjusted. For example, the length of physical layer frame 806-1 may be extended. As another example, the internal reading of the timer corresponding to the frame header of physical layer frame 806-1 may be increased. Specifically, the reading of the timer may be adjusted so that the time point corresponding to the frame header of physical layer frame 806-2 is synchronized with time point T7. The transmission time of physical layer frame 806-1 may be extended or shortened so that the time point corresponding to the frame header of physical layer frame 806-2 is synchronized with time point T7. The time delay t4 may be defined as the period from time point T5 to time point T8. With the timing adjustment of physical layer frame 806-1, compressed packet 804-1 can be transmitted immediately when its compression is completed, that is, without the waiting period t2 shown in Fig. 8A. In some embodiments, only the transmission time of physical layer frame 806-1 is adjusted, and the transmission times of physical layer frame 806-2 and subsequent physical layer frames are not adjusted until a subsequent video frame is received.

In some embodiments, as shown in Fig. 8B, each video frame of the multiple video frames may be transmitted with timing adjustments applied to multiple physical layer frames. The number of physical layer frames whose timing needs to be adjusted may be related to the frame rate of the multiple physical layer frames and the frame rate of the multiple video frames. In some embodiments, the frame rate of the multiple physical layer frames may be configured as an integer multiple (e.g., 20) of the frame rate of the multiple video frames, so that the number of physical layer frames whose timing needs to be adjusted may be greatly reduced.

Fig. 9 shows a schematic diagram of an exemplary OSI model. The OSI model 900 characterizes and standardizes the communication functions of a telecommunication or computing system. The OSI model 900 defines a networking framework to implement protocols in seven layers. From bottom to top, the seven layers may include a physical layer 902, a data link layer 904, a network layer 906, a transport layer 908, a session layer 910, a presentation layer 912, and an application layer 914.

The physical layer 902 defines the electrical and physical specifications of the data connection. The physical layer 902 defines the relationship between a device and a physical transmission medium (e.g., copper or fiber optic cable, radio frequency, etc.). The relationship may include the layout of pins, voltages, line impedance, cable specifications, signal timing and similar characteristics of connected devices, the frequencies of wireless devices (e.g., 5 GHz or 2.4 GHz), etc. The physical layer 902 may transmit a bit stream (e.g., electrical pulses, light or radio signals, etc.) through the network at the electrical and mechanical level. The physical layer 902 may provide the hardware means for sending and receiving data on a carrier, including defining cables, cards, and physical aspects. In some embodiments, the physical channel processing module 506 and/or the user interface 408 may operate in the physical layer 902.

The data link layer 904 may provide transmission protocol knowledge and management, error handling for the physical layer 902, flow control, frame synchronization, etc. In the data link layer 904, data packets may be encoded and decoded into bits. In some embodiments, the data link layer 904 may be divided into two sublayers: the media access control (MAC) layer and the logical link control (LLC) layer. The MAC layer may control how a computer on the network gains access to data and permission to transmit it. The LLC layer may control frame synchronization, flow control, and error checking. In some embodiments, the processor 204 and the processor 404 of the UAV system 100 may operate in the data link layer 904.

The network layer 906 may provide switching and routing technologies and create logical paths for transmitting data from node to node. In some embodiments, the network 106, the UAV transceiver 202, and/or the ground terminal transceiver 402 may operate in the network layer 906. In some embodiments, the access of the UAV 102 to the network 106 may operate in the network layer 906. The access may be contention-based random access or contention-free random access.

The transport layer 908 may provide transparent data transfer between end systems or hosts. The transport layer 908 may be responsible for end-to-end error recovery and flow control. The transport layer 908 may ensure complete data transfer. In some embodiments, the network 106, the UAV transceiver 202, and/or the ground terminal transceiver 402 may operate in the transport layer 908. In some embodiments, the protocols operating in the transport layer 908 may include TCP, UDP, SPX, etc.

The session layer 910 may control the connections between devices. The session layer 910 may establish, manage, and terminate connections between devices. In some embodiments, the processor 204 and/or the processor 404 may operate in the session layer 910.

The presentation layer 912 may provide independence from differences in data representation (e.g., encryption) by converting from network format to application format, and vice versa. The presentation layer 912 may be used to transform data into a form that the application layer 914 can accept. In some embodiments, the data compression and packetization module 504 may operate in the presentation layer 912.

The application layer 914 may provide application services, such as file transfer, e-mail, or other network software services. The application layer 914 may perform functions including identifying communication partners, determining resource availability, etc. In some embodiments, the protocols operating in the application layer 914 may include FTP, HTTP, DNS, etc.

It should be noted that the OSI model 900 is provided for illustration purposes only and is not intended to limit the scope of the present disclosure. Various changes or modifications may be made by those of ordinary skill in the art under the teaching of the present disclosure. For example, a device or module of the UAV system 100 may work in multiple layers sequentially or simultaneously. However, those changes and modifications do not depart from the scope of the present disclosure.

Having thus described the basic concepts, it may be apparent to those skilled in the art, after reading this detailed disclosure, that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur and are contemplated by those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure, and are within the spirit and scope of the exemplary embodiments of this disclosure.

Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms "one embodiment," "an embodiment," and/or "some embodiments" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be understood that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the present disclosure.

Further, those skilled in the art will appreciate that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, microcode, etc.), or by an embodiment combining software and hardware, all of which may generally be referred to herein as a "block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code stored thereon.
