Method and device for synchronously playing audio and video

Document No.: 1675916  Publication date: 2019-12-31

Abstract: The application provides a method and a device for synchronously playing audio and video, applied to a multimedia device based on the HTML5 platform. The method comprises: by running a foreground thread, sending, by the foreground thread, audio and video timestamps to a background thread different from the foreground thread; by running the background thread, determining, by the background thread according to each frame's audio and video timestamp, whether to play the audio and video data corresponding to that timestamp, and if so, returning a notification to play the audio and video data to the foreground thread so that the foreground thread plays it; wherein the background thread and the foreground thread run independently of each other. Audio and video synchronization on the HTML5 platform is achieved by combining a foreground thread and a background thread. Because the foreground thread achieves synchronization by sending only audio and video timestamps, the performance cost of data transfer between the foreground thread and the background thread is reduced, and real-time playback of the audio and video data by the foreground thread is ensured.

1. A method for synchronously playing audio and video, characterized in that the method is applied to a multimedia device based on an HTML5 platform, the multimedia device runs a foreground thread, and the foreground thread is used for playing audio and video data, the method comprising the following steps:

by running the foreground thread, the foreground thread sends an audio and video timestamp to a background thread different from the foreground thread;

by running the background thread, the background thread determines, according to each frame's audio and video timestamp, whether to play the audio and video data corresponding to that timestamp, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays the audio and video data;

wherein the background thread and the foreground thread run independently of each other.

2. The method of claim 1, wherein sending, by the foreground thread, an audio-video timestamp to a background thread comprises:

monitoring whether the number of unsent video timestamps in a video timestamp linked list reaches a preset value or not;

when it is monitored that the preset value is reached, copying the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list;

and sending the copied audio and video time stamp to the background thread.

3. The method of claim 1, wherein sending, by the foreground thread, an audio-video timestamp to a background thread comprises:

when a data request from the background thread is received, copying the video timestamps that have not been sent in the video timestamp linked list and the audio timestamps that have not been sent in the audio timestamp linked list;

and sending the copied audio and video time stamp to the background thread.

4. The method of claim 1, wherein determining, by the background thread, whether to play audio/video data corresponding to each frame of audio/video timestamp based on each frame of audio/video timestamp comprises:

reading the first frame's audio and video timestamp and the system time, determining that frame's video timestamp as a timestamp reference point, determining the read system time as a system-time reference point, and directly determining to play the audio and video data corresponding to that frame's audio and video timestamp;

and, starting from the second frame's audio and video timestamp, each time one frame's audio and video timestamp and one system time are read, determining whether to play the audio and video data corresponding to that frame's timestamp by using that frame's audio and video timestamp, the read system time, the timestamp reference point, and the system-time reference point.

5. An apparatus for synchronously playing audio and video, characterized in that the apparatus is applied to a multimedia device based on an HTML5 platform, the multimedia device runs a foreground thread, and the foreground thread is used for playing audio and video data, the apparatus comprising:

the first running module is used for running the foreground thread so that the foreground thread sends the audio and video timestamps to a background thread different from the foreground thread;

the second running module is used for running the background thread so that the background thread determines, according to each frame's audio and video timestamp, whether to play the audio and video data corresponding to that timestamp, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays the audio and video data;

wherein the background thread and the foreground thread run independently of each other.

6. The apparatus according to claim 5, wherein the first running module is specifically configured to monitor whether the number of unsent video timestamps in the video timestamp linked list reaches a preset value; when it is monitored that the preset value is reached, copy the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list; and send the copied audio and video timestamps to the background thread.

7. The apparatus according to claim 5, wherein the first running module is specifically configured to copy, when receiving a data request from the background thread, the video timestamps that have not been sent in the video timestamp linked list and the audio timestamps that have not been sent in the audio timestamp linked list, and send the copied audio and video timestamps to the background thread.

8. The apparatus of claim 5, wherein the second running module comprises:

the first determining submodule is used for reading the first frame's audio and video timestamp and the system time, determining that frame's video timestamp as a timestamp reference point, determining the read system time as a system-time reference point, and directly determining to play the audio and video data corresponding to that frame's audio and video timestamp;

and the second determining submodule is used for determining, starting from the second frame's audio and video timestamp and each time one frame's audio and video timestamp and one system time are read, whether to play the audio and video data corresponding to that frame's timestamp by using that frame's audio and video timestamp, the read system time, the timestamp reference point, and the system-time reference point.

9. A multimedia device, characterized in that the device comprises a readable storage medium and a processor;

wherein the readable storage medium is configured to store machine executable instructions;

the processor is configured to read the machine executable instructions from the readable storage medium and execute the instructions to implement the steps of the method of any one of claims 1-4.

Technical Field

The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for audio and video synchronous playing.

Background

Audio-video synchronization refers to synchronization of video and audio, that is, the currently played sound and the currently displayed picture are consistent in the time domain. Existing audio and video synchronous playing methods are generally based on the Windows, Linux, Android, and Apple platforms, where audio and video data are transmitted directly among threads through parallel processing by multiple foreground threads. However, since the HTML5 (Hypertext Markup Language 5) platform cannot run multiple foreground threads in parallel, these methods are not suitable for the HTML5 platform.

Disclosure of Invention

In view of this, the present application provides a method and an apparatus for audio and video synchronous playing to solve the problem that audio and video cannot be synchronously played in an HTML5 platform.

According to a first aspect of the embodiments of the present application, a method for audio and video synchronous playing is provided, where the method is applied to a multimedia device based on an HTML5 platform, the multimedia device runs a foreground thread, and the foreground thread is used to play audio and video data, and the method includes:

by running the foreground thread, the foreground thread sends an audio and video timestamp to a background thread different from the foreground thread;

by running the background thread, the background thread determines, according to each frame's audio and video timestamp, whether to play the audio and video data corresponding to that timestamp, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays the audio and video data;

wherein the background thread and the foreground thread run independently of each other.

According to a second aspect of the embodiments of the present application, a device for audio and video synchronous playing is provided, where the device is applied to a multimedia device based on an HTML5 platform, the multimedia device runs a foreground thread, and the foreground thread is used for playing audio and video data, the device comprising:

the first running module is used for running the foreground thread so that the foreground thread sends the audio and video timestamps to a background thread different from the foreground thread;

the second running module is used for running the background thread so that the background thread determines, according to each frame's audio and video timestamp, whether to play the audio and video data corresponding to that timestamp, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays the audio and video data;

wherein the background thread and the foreground thread run independently of each other.

According to a third aspect of embodiments herein, there is provided a multimedia device, the device comprising a readable storage medium and a processor;

wherein the readable storage medium is configured to store machine executable instructions;

the processor is configured to read the machine executable instructions from the readable storage medium and execute the instructions to implement the steps of the method of any one of the first-aspect embodiments described above.

By applying the embodiments of the application, audio and video synchronous playing on the HTML5 platform is realized by combining a foreground thread and a background thread: the foreground thread sends audio and video timestamps to the background thread, which is different from the foreground thread; the background thread determines, according to each frame's audio and video timestamp, whether to play the corresponding audio and video data, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays it. Because the foreground thread achieves synchronization by sending only audio and video timestamps, the performance cost of data transfer between the foreground thread and the background thread is reduced, and real-time playback of the audio and video data by the foreground thread is ensured.

Drawings

Fig. 1 is a flowchart illustrating an embodiment of a method for audio and video synchronized playback according to an exemplary embodiment of the present application;

fig. 2 is a flowchart illustrating an embodiment of another method for audio-video synchronized playback according to an exemplary embodiment of the present application;

FIG. 3 is a diagram illustrating a hardware configuration of a multimedia device according to an exemplary embodiment of the present application;

fig. 4 is a block diagram of an embodiment of an apparatus for synchronously playing audio and video according to an exemplary embodiment of the present application;

fig. 5 is a block diagram of an embodiment of another apparatus for synchronously playing audio and video according to an exemplary embodiment of the present application.

Detailed Description

Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.

It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination", depending on the context.

Fig. 1 is a flowchart of an embodiment of a method for audio and video synchronous playing according to an exemplary embodiment of this application. This embodiment may be applied to a multimedia device based on an HTML5 platform. In this embodiment, a foreground thread and a background thread run on the multimedia device: the foreground thread, which may also be referred to as a page thread, is used to play audio and video data, and the background thread is used to determine whether to play the audio and video data synchronously. On the HTML5 platform, the background thread may be implemented with the Web Worker technology. Because the foreground thread and the background thread run independently of each other, the running of the background thread does not affect the performance of the foreground thread. Also, because on the HTML5 platform the foreground thread and the background thread cannot share data directly, data transfer between them may be implemented via postMessage. As shown in fig. 1, the method for synchronously playing audio and video includes the following steps:
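To make the foreground/background split concrete, the following is a minimal, hypothetical sketch. In a browser, the background side would live in a Web Worker (e.g. `new Worker('sync.js')`) and the two sides would exchange the same messages via postMessage/onmessage; here the channel is mocked with plain callbacks so the structure can run anywhere, and all names are illustrative rather than taken from this application.

```javascript
// Background side: receives only timestamps (never media data) and replies
// with notifications. The synchronization decision itself is elided in this
// sketch -- every frame is simply approved for playback.
function createBackgroundThread(postToForeground) {
  return {
    onmessage(msg) {
      for (const videoTs of msg.video) {
        postToForeground({ videoTs, flag: 'play' });
      }
    },
  };
}

// Foreground side: keeps the media data locally, sends only timestamps,
// and plays a frame when the corresponding notification comes back.
function createForegroundThread() {
  const played = [];
  const background = createBackgroundThread((note) => {
    if (note.flag === 'play') played.push(note.videoTs);
  });
  return {
    sendTimestamps(videoTsList, audioTsList) {
      // In a browser: worker.postMessage({ video: ..., audio: ... })
      background.onmessage({ video: videoTsList, audio: audioTsList });
    },
    played,
  };
}
```

Because only timestamps cross the channel, the cost of each postMessage stays small regardless of the size of the audio and video frames themselves.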

step 101: and by running the foreground thread, the foreground thread sends the audio and video timestamp to a background thread different from the foreground thread.

In an embodiment, the foreground thread may store the audio and video data and the audio and video timestamps in different storage units. Because the performance of postMessage-based data exchange between the foreground thread and the background thread depends on the amount of data transferred, the foreground thread sends only the audio and video timestamps to the background thread. This reduces the performance cost of data transfer between the two threads and ensures that the foreground thread can play the audio and video data in real time.

In an embodiment, the foreground thread may send the audio and video timestamps to the background thread in an active sending mode. Specifically, the foreground thread may monitor whether the number of unsent video timestamps in the video timestamp linked list reaches a preset value; when it does, the foreground thread copies the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list, and sends the copied audio and video timestamps to the background thread.

That is, the foreground thread caches a preset number of frames' video timestamps in the video timestamp linked list before each send. The preset value can be set according to actual requirements, as long as it does not affect real-time video playback. In addition, after the foreground thread sends the copied audio and video timestamps to the background thread, it can delete the sent video timestamps from the video timestamp linked list and the sent audio timestamps from the audio timestamp linked list.
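A hypothetical sketch of this active sending mode, with the linked lists represented as plain arrays and postMessage injected as a callback so the sketch runs outside a browser (all names are illustrative):

```javascript
// Foreground-side sender: flushes a batch of timestamps to the background
// thread whenever the count of unsent video timestamps reaches presetCount.
function makeTimestampSender(presetCount, postMessage) {
  const videoTs = [];
  const audioTs = [];
  return {
    pushVideo(ts) {
      videoTs.push(ts);
      if (videoTs.length >= presetCount) {
        // Copy the unsent timestamps and send only timestamps, never media data...
        postMessage({ video: videoTs.slice(), audio: audioTs.slice() });
        // ...then delete the sent entries from both lists.
        videoTs.length = 0;
        audioTs.length = 0;
      }
    },
    pushAudio(ts) { audioTs.push(ts); },
  };
}
```

With a preset value of 3, nothing is sent until the third video timestamp arrives, at which point one batch crosses the channel and both lists are cleared.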

In an embodiment, the foreground thread may instead send the audio and video timestamps to the background thread passively. Specifically, when receiving a data request from the background thread, the foreground thread may copy the video timestamps not yet sent in the video timestamp linked list and the audio timestamps not yet sent in the audio timestamp linked list, and send the copied audio and video timestamps to the background thread.

A data request from the background thread indicates that the audio and video timestamps previously sent to it have all been processed and that new timestamps are needed. In addition, after the foreground thread sends the copied audio and video timestamps to the background thread, it can likewise delete the sent video timestamps from the video timestamp linked list and the sent audio timestamps from the audio timestamp linked list.
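A sketch of the passive mode, where the foreground thread flushes its timestamp lists only when the background thread asks. As before, postMessage is injected so the sketch runs outside a browser, and the names are illustrative:

```javascript
// Foreground-side sender for the passive mode: timestamps accumulate until
// the background thread signals (via a data request) that it has processed
// everything sent so far and is ready for more.
function makeOnRequestSender(postMessage) {
  const videoTs = [];
  const audioTs = [];
  return {
    pushVideo(ts) { videoTs.push(ts); },
    pushAudio(ts) { audioTs.push(ts); },
    onDataRequest() {
      // Copy and send the unsent timestamps...
      postMessage({ video: videoTs.slice(), audio: audioTs.slice() });
      // ...then delete the sent entries.
      videoTs.length = 0;
      audioTs.length = 0;
    },
  };
}
```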

In an embodiment, the foreground thread may also combine active and passive sending. Specifically, after the foreground thread and the background thread are started, when the number of video timestamps in the video timestamp linked list reaches the preset value, the foreground thread copies the video timestamps in the video timestamp linked list and the audio timestamps in the audio timestamp linked list and sends the copies to the background thread; thereafter, whenever it receives a data request from the background thread, the foreground thread directly sends the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list to the background thread. In this way, the background thread can keep making playback decisions without affecting real-time playback by the foreground thread.

Step 102: by running the background thread, the background thread determines whether to play audio and video data corresponding to each frame of audio and video time stamp according to each frame of audio and video time stamp, if so, step 103 is executed, and if not, step 104 is executed.

The processing procedure of step 102 may be exemplarily referred to the following description of the embodiment shown in fig. 2, and will not be described in detail here.

Step 103: and returning a notice of playing the audio and video data to the foreground thread by the background thread so as to enable the foreground thread to play the audio and video data.

In an embodiment, the notification sent by the background thread may carry a video timestamp, an audio timestamp, and a play flag, so that the foreground thread can synchronously play the video data corresponding to the video timestamp and the audio data corresponding to the audio timestamp according to the play flag. In addition, the foreground thread may delete the corresponding video and audio data after playing them.

It should be noted that because audio and video synchronization services usually treat video as primary and audio as secondary, when the background thread determines, according to a frame's audio and video timestamp, whether to play the corresponding data and finds that the video timestamp is smaller than the audio timestamp (that is, the audio is late relative to the video), the notification may carry only the video timestamp and the play flag, so that the foreground thread plays only the video data corresponding to the video timestamp according to the play flag.
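A possible shape for such notifications, under the assumption that the flag is a simple string field; the field names are illustrative, not from this application:

```javascript
// Build the notification the background thread returns to the foreground
// thread. When the audio is late relative to the video (videoTs < audioTs),
// the notification carries only the video timestamp with the play flag.
function buildNotification(videoTs, audioTs, shouldPlay) {
  if (!shouldPlay) return { flag: 'discard', videoTs, audioTs };
  if (videoTs < audioTs) return { flag: 'play', videoTs };
  return { flag: 'play', videoTs, audioTs };
}
```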

Step 104: and returning a notice of discarding the audio and video data to the foreground thread by the background thread so as to enable the foreground thread to discard the audio and video data.

In an embodiment, the notification sent by the background thread may carry a video timestamp, an audio timestamp, and a discard flag, so that the foreground thread can discard the video data corresponding to the video timestamp and the audio data corresponding to the audio timestamp according to the discard flag.

In the embodiments of the application, audio and video synchronous playing on the HTML5 platform is realized by combining a foreground thread and a background thread: the foreground thread sends audio and video timestamps to the background thread, which is different from the foreground thread; the background thread determines, according to each frame's audio and video timestamp, whether to play the corresponding audio and video data, and if so, returns a notification to play the audio and video data to the foreground thread so that the foreground thread plays it. Because the foreground thread achieves synchronization by sending only audio and video timestamps, the performance cost of data transfer between the foreground thread and the background thread is reduced, and real-time playback of the audio and video data by the foreground thread is ensured.

Fig. 2 is a flowchart of another embodiment of a method for audio and video synchronous playing according to an exemplary embodiment of this application. This embodiment explains how the background thread determines, according to each frame's audio and video timestamp, whether to play the corresponding audio and video data. As shown in fig. 2, the method includes the following steps:

step 201: and the background thread reads the first frame audio and video time stamp and the system time, determines the frame video time stamp as a reference time point, determines the read system time as a reference time point, and directly determines to play audio and video data corresponding to the frame audio and video time stamp.

In an embodiment, before performing step 201, the background thread may determine whether the number of video timestamps is greater than 1; if so, step 201 is performed, and if not, the notification to play the audio and video data is sent directly to the foreground thread.

In an embodiment, the background thread may reset the synchronization reference for the first frame's audio and video timestamp that is read: the first frame's video timestamp serves as the timestamp reference point, used subsequently to determine the actual offset duration of each frame's video timestamp, while the read system time serves as the system-time reference point, used subsequently to determine the reference offset duration of each frame's video timestamp.

Step 202: and the background thread starts from the second frame of audio and video time stamp, and determines whether to play audio and video data corresponding to the frame of audio and video time stamp by using the frame of audio and video time stamp, the read system time, the reference time point and the reference time point every time when reading one frame of audio and video time stamp and one system time.

In an embodiment, to make this determination, the background thread may compute the difference between the frame's video timestamp and the timestamp reference point as the actual offset duration, compute the difference between the read system time and the system-time reference point as the reference offset duration, then compute the difference between the actual offset duration and the reference offset duration, and use that difference together with the audio timestamp to determine whether to play the audio and video data corresponding to the frame's timestamp.

In an embodiment, the difference and the audio timestamp are used as follows: if the difference is between the positive threshold and the negative threshold and the frame's audio timestamp is less than or equal to the frame's video timestamp, the background thread determines to play the audio and video data corresponding to the frame's timestamp; if the difference is between the positive threshold and the negative threshold and the frame's audio timestamp is greater than the frame's video timestamp, it determines to play only the video data corresponding to the frame's video timestamp; if the difference is smaller than the negative threshold, it discards the audio and video data corresponding to the frame's timestamp; and if the difference is greater than the positive threshold, it waits a preset duration, reads the system time again, and returns to the step of determining whether to play the audio and video data by using the frame's audio and video timestamp, the newly read system time, the timestamp reference point, and the system-time reference point.

Because audio and video synchronization services generally treat video as primary and audio as secondary, the difference between the actual offset duration and the reference offset duration is checked first, and the relation between the audio timestamp and the video timestamp is checked only when that difference satisfies the condition. A positive difference indicates that the playing time has not yet been reached; a negative difference indicates that the playing time has passed; zero indicates that the playing time has just been reached. The range between the positive threshold and the negative threshold is the allowable range of the difference and can be set from practical experience. In addition, when the playing time has not yet been reached, the background thread can wait for the next comparison, that is, read the system time again after waiting a preset duration. The preset duration (in milliseconds) may be smaller than the length of the interval between the positive and negative thresholds; for example, with a positive threshold of 5 and a negative threshold of -5, the preset duration would be less than 10 milliseconds.

In an example scenario, a frame is read with a video timestamp of 14:47:00, an audio timestamp of 14:46:59, and a system time of 14:47:01; the timestamp reference point is 14:46:00 and the system-time reference point is 14:46:01, where the timestamp format is minutes:seconds:milliseconds and the positive and negative thresholds are 5 and -5, respectively. The actual offset duration is the frame's video timestamp minus the timestamp reference point, i.e. 1 second, and the reference offset duration is the read system time minus the system-time reference point, i.e. 1 second. The difference between the actual offset duration and the reference offset duration is therefore 0, which lies between -5 and 5, and the audio timestamp is smaller than the video timestamp, so the background thread determines to play the audio and video data corresponding to this frame's timestamp.

It will be understood by those skilled in the art that the time stamps in the above scenario are merely illustrative and do not represent time stamps in an actual application.
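The per-frame decision described above can be sketched as a single function. Timestamps and thresholds are plain millisecond numbers here (so 14:47:00 in minutes:seconds:milliseconds becomes 887000 ms), and the function name and argument shapes are illustrative, not from this application:

```javascript
// Decide what to do with one frame, given the frame's timestamps and the two
// reference points. Returns 'play', 'play-video-only', 'discard', or 'wait'
// (wait = re-read the system time after a preset delay and try again).
function decidePlayback(frame, refs, posThreshold, negThreshold) {
  const actualOffset = frame.videoTs - refs.tsRef;        // frame timestamp offset
  const referenceOffset = frame.systemTime - refs.sysRef; // elapsed wall-clock time
  const diff = actualOffset - referenceOffset;
  if (diff > posThreshold) return 'wait';    // too early: playing time not reached
  if (diff < negThreshold) return 'discard'; // too late: playing time has passed
  // On time; if the audio is late relative to the video, play video only.
  return frame.audioTs <= frame.videoTs ? 'play' : 'play-video-only';
}
```

Running the example scenario through this sketch (video 887000, audio 886059, system time 887001, references 886000 and 886001, thresholds 5 and -5) yields a zero difference and an audio timestamp below the video timestamp, hence a play decision.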

It should be noted that before using the frame's audio and video timestamp, the read system time, the timestamp reference point, and the system-time reference point to determine whether to play the corresponding audio and video data, the background thread may check whether an abnormal condition exists, so as to ensure the accuracy of playback. If no abnormal condition exists, the determination proceeds as described above. If an abnormal condition exists, the synchronization reference is reset: the current frame's audio and video timestamp is determined as the new timestamp reference point, the currently read system time is determined as the new system-time reference point, and a notification to discard the audio and video data is returned to the foreground thread.

To detect an abnormal condition, the background thread may first check whether the reference offset duration of the current frame is smaller than the reference offset duration of the previous frame; if so, an abnormal condition exists and the synchronization reference is reset. If not, it further computes the adjacent difference between the current frame's video timestamp and the previous frame's video timestamp; if the adjacent difference exceeds a certain threshold range, or if the current frame's actual offset duration is negative, an abnormal condition exists and the synchronization reference is likewise reset.
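A sketch of this anomaly check, assuming the per-frame offsets have already been computed as above; the field names and the adjacent-gap threshold are illustrative:

```javascript
// Detect the abnormal conditions that force a reset of the synchronization
// reference: wall-clock time moving backwards between frames, an oversized
// jump between adjacent video timestamps, or a frame older than the
// timestamp reference point (negative actual offset).
function isAbnormal(cur, prev, maxAdjacentGap) {
  if (cur.referenceOffset < prev.referenceOffset) return true; // clock went backwards
  const adjacentDiff = Math.abs(cur.videoTs - prev.videoTs);
  if (adjacentDiff > maxAdjacentGap || cur.actualOffset < 0) return true;
  return false;
}
```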

It should be further noted that, before reading each frame's audio and video timestamp, the background thread may determine whether the currently received audio and video timestamps have all been processed; if so, it sends a data request to the foreground thread, and if not, it reads the next frame's audio and video timestamp.

In this embodiment, for the audio/video timestamps sent by the foreground thread to the background thread, the first frame's audio/video timestamp and the system time are read first; the first frame's video timestamp is determined as the base time point, the read system time is determined as the reference time point, and the audio/video data corresponding to the first frame's timestamp is directly determined to be played.
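A sketch of this first-frame initialization, assuming the same illustrative field names as above (`baseTimePoint`, `referenceTimePoint`); the `notifyPlay` callback stands in for the notification sent back to the foreground thread:

```javascript
// The first video timestamp becomes the base time point, the current system
// time the reference time point, and the first frame is played unconditionally.
function handleFirstFrame(firstVideoTimestamp, nowMs, notifyPlay) {
  const state = {
    baseTimePoint: firstVideoTimestamp,   // base time point
    referenceTimePoint: nowMs,            // reference time point
    prevVideoTimestamp: firstVideoTimestamp,
    prevRefOffset: 0,
  };
  notifyPlay(firstVideoTimestamp);        // first frame: play directly
  return state;
}
```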

Fig. 3 is a hardware block diagram of a multimedia device according to an exemplary embodiment of the present application. The multimedia device includes: a communication interface 301, a processor 302, a machine-readable storage medium 303, and a bus 304, where the communication interface 301, the processor 302, and the machine-readable storage medium 303 communicate with each other via the bus 304. The processor 302 may execute the above method for audio/video synchronous playing by reading and executing, from the machine-readable storage medium 303, the machine-executable instructions corresponding to the control logic of the method; for the specific content of the method, reference is made to the above embodiments, which is not repeated here.

The machine-readable storage medium 303 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be a RAM (Random Access Memory), a volatile memory, a non-volatile memory, a flash memory, a storage drive (e.g., a hard drive), any type of storage disc (e.g., an optical disc or a DVD), a similar storage medium, or a combination thereof.

Further, the multimedia device may be a variety of terminal or backend devices, such as a camera, server, mobile phone, Personal Digital Assistant (PDA), mobile audio or video player, game console, Global Positioning System (GPS) receiver, or portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.

Fig. 4 is a structural diagram of an embodiment of an apparatus for audio and video synchronous playing according to an exemplary embodiment of the present application. This embodiment may be applied to a multimedia device based on the HTML5 platform, where the multimedia device runs a foreground thread used to play audio/video data. As shown in Fig. 4, the apparatus for audio and video synchronous playing includes:

the first running module 40 is configured to run the foreground thread, so that the foreground thread sends the audio/video timestamps to a background thread different from the foreground thread;

the second running module 41 is configured to run the background thread, so that the background thread determines, according to each frame's audio/video timestamp, whether to play the audio/video data corresponding to that timestamp; if so, the background thread returns a notification of playing the audio/video data to the foreground thread, so that the foreground thread plays the audio/video data;

wherein the background thread and the foreground thread run independently of each other.
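On the HTML5 platform, this foreground/background split maps naturally onto the Web Worker API. The sketch below shows only the foreground side; in a real page the `worker` argument would be `new Worker('sync-worker.js')`, and the file name, message shapes, and handler names are illustrative assumptions. Note that only timestamps cross the thread boundary; the media payload never leaves the foreground thread, which is what keeps the `postMessage` cost low.

```javascript
function wireForeground(worker, handlers) {
  worker.onmessage = (e) => {
    const { type, timestamp } = e.data;
    if (type === 'play') {
      handlers.play(timestamp);      // play the matching audio/video data
    } else if (type === 'discard') {
      handlers.discard(timestamp);   // sync reference was reset in the worker
    } else if (type === 'request') {
      // Background finished its batch: send the unsent timestamps (only).
      worker.postMessage({ type: 'timestamps', frames: handlers.collectUnsent() });
    }
  };
}
```

Because the function takes the worker as a parameter, it can be exercised against a mock object in tests while wiring to a real `Worker` in the page.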

In an optional implementation manner, the first running module 40 is specifically configured to: monitor whether the number of unsent video timestamps in the video timestamp linked list reaches a preset value; when the preset value is reached, copy the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list; and send the copied audio/video timestamps to the background thread.
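This batching strategy can be sketched as follows. The preset value, the list representation (plain arrays with a `sent` flag rather than a true linked list), and all names are assumptions for illustration:

```javascript
const PRESET_COUNT = 25; // assumed batch size

function flushIfReady(videoList, audioList, postToWorker) {
  const unsentVideo = videoList.filter((e) => !e.sent);
  if (unsentVideo.length < PRESET_COUNT) return false;
  const unsentAudio = audioList.filter((e) => !e.sent);
  // Copy plain values so the foreground lists can keep growing independently.
  postToWorker({
    video: unsentVideo.map((e) => e.ts),
    audio: unsentAudio.map((e) => e.ts),
  });
  unsentVideo.concat(unsentAudio).forEach((e) => { e.sent = true; });
  return true;
}
```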

In an optional implementation manner, the first running module 40 is specifically configured to: when a data request from the background thread is received, copy the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list; and send the copied audio/video timestamps to the background thread.

Fig. 5 is a structural diagram of another embodiment of an apparatus for synchronously playing audio and video according to an exemplary embodiment of the present application. Based on the embodiment shown in Fig. 4, as shown in Fig. 5, the second running module 41 includes:

the first determining submodule 411 is configured to read the first frame's audio/video timestamp and the system time, determine that frame's video timestamp as the base time point, determine the read system time as the reference time point, and directly determine to play the audio/video data corresponding to that frame's timestamp;

and the second determining submodule 412 is configured to, from the second frame's audio/video timestamp onward, each time one frame's audio/video timestamp and one system time are read, determine whether to play the audio/video data corresponding to that frame's timestamp by using the frame's audio/video timestamp, the read system time, the base time point, and the reference time point.
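The per-frame decision made by submodule 412 can be sketched as below. The offset definitions are inferred from the embodiment and the names are illustrative: a frame is due once the system time elapsed since the reference time point has caught up with the frame's timestamp offset from the base time point.

```javascript
function shouldPlay(state, avTimestamp, nowMs) {
  const refOffset = avTimestamp - state.baseTimePoint;    // reference offset duration
  const actualOffset = nowMs - state.referenceTimePoint;  // actual offset duration
  return actualOffset >= refOffset;                       // play once caught up
}
```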

The implementation process of the functions and actions of each unit in the above apparatus is described in detail in the implementation process of the corresponding steps in the above method, and is not repeated here.

For the apparatus embodiments, since they substantially correspond to the method embodiments, reference may be made to the relevant parts of the description of the method embodiments. The apparatus embodiments described above are merely illustrative: units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present application. A person of ordinary skill in the art can understand and implement the solution without inventive effort.

The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.
