System and method for synchronizing audio content on a mobile device to a separate visual display system

Document No.: 1967151  Publication date: 2021-12-14

Reading note: This technique, "System and method for synchronizing audio content on a mobile device to a separate visual display system", was created by 陈敬权 and 李绍正 on 2020-05-08. Its main content is as follows: A system and method of synchronizing slave audio content configured to be played on a mobile device with master audio content or master visual content configured to be played on a separate visual display system is provided, to enhance the auditory and visual experience of audience members in a public audiovisual presentation.

1. A method of synchronizing audio content configured for playback on a mobile device to visual content configured for display on a separate visual display system, the method comprising the steps of:

presenting the visual content on a visual output device controlled by the visual display system at a first location and time;

obtaining a visual display system real time from the visual display system based on a visual display system local time and a visual display system time offset;

generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message comprises the visual display system real time, a visual content state, and a visual content time code;

sending, by the processing server, the synchronization message to the mobile device at periodic time intervals; and

determining an audio content time code based on the visual content time code, a mobile device real time, and the visual display system real time, wherein when the audio content is triggered to play on the mobile device at a second location and time while the visual content state is in a play mode, the audio content is played at the audio content time code that is synchronized with the visual content time code associated with a current location of the visual content on the visual display system.

2. The method of claim 1, comprising the steps of:

determining whether the audio content time code is out of synchronization with the visual content time code on the visual display system by more than a predetermined time interval; and

adjusting playback of the audio content on the mobile device to an audio content time code that is synchronized with the visual content time code associated with the current location of the visual content on the visual display system, in response to the audio content time code being out of synchronization by more than the predetermined time interval.

3. The method of claim 1, wherein the audio content time code is determined based on a summation of the visual content time code and a time difference between the mobile device real time and the visual display system real time.

4. The method of claim 1, wherein the synchronization message is transmitted when there is an adjustment to the visual content time code.

5. The method of claim 1, wherein the synchronization message is transmitted when there is a change of the visual content state.

6. The method of claim 3, wherein the mobile device real time is determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device.

7. The method of claim 1, wherein the visual display system time offset is a time difference between the visual display system local time and a timeserver current time derived from a timeserver in communication with the visual display system.

8. The method of claim 2, wherein the predetermined time interval is approximately 200 ms.

9. A method of synchronizing slave audio content configured for playback on a mobile device to master audio content synchronized with a live visual display, wherein the master audio content is configured for playing on a separate visual display system, the method comprising the steps of:

playing the master audio content on the visual display system at a first location and time;

obtaining a visual display system real time from the visual display system based on a visual display system local time and a visual display system time offset;

generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message comprises the visual display system real time, a master audio content state, and a master audio content time code;

sending, by the processing server, the synchronization message to the mobile device at periodic time intervals;

determining a slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time, wherein when the slave audio content is triggered to play on the mobile device at a second location and time while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with a current location of the master audio content on the visual display system.

10. The method of claim 9, comprising the steps of:

determining whether the slave audio content time code is out of synchronization with the master audio content time code on the visual display system by more than a predetermined time interval; and

adjusting playback of the slave audio content on the mobile device to a slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system, in response to the slave audio content time code being out of synchronization by more than the predetermined time interval.

11. The method of claim 9, wherein the slave audio content comprises a slave audio cue track for initiating a portion of the live visual display, whereby the slave audio cue track is transmitted by one or more computing devices to one or more control systems such that the slave audio cue track is configured for synchronization with the master audio content.

12. The method of claim 9, wherein the slave audio content comprises an audio track associated with a voice recording configured for synchronization with the master audio content.

13. The method of claim 9, wherein the slave audio content comprises an audio track associated with a musical accompaniment configured for synchronization with the master audio content.

14. The method of claim 9, wherein the master audio content comprises an audio cue track configured for triggering a portion of the live visual display by a controller or a manual operator.

15. The method of claim 9, wherein the slave audio content comprises a slave audio cue track configured for triggering, by a manual operator, a portion of the live visual display, the slave audio cue track being configured for synchronization with the master audio content.

16. The method of claim 9, wherein the synchronization message is transmitted when there is an adjustment to the master audio content time code.

17. The method of claim 9, wherein the synchronization message is transmitted when there is a change of the master audio content state.

18. The method of claim 9, wherein the slave audio content time code is determined based on a summation of the master audio content time code and a time difference between the mobile device real time and the visual display system real time, and wherein the mobile device real time is determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device.

19. The method of claim 9, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system.

20. The method of claim 9, wherein the predetermined time interval is approximately 200 ms.

21. A method configured for implementation on a mobile device having at least one processor, at least one computer-readable storage medium, and a synchronization application connected to a network, the method comprising:

transmitting audio content configured for storage on the mobile device;

obtaining a mobile device real time, the mobile device real time determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time obtained from a time server in communication with the mobile device;

receiving a synchronization message at periodic time intervals from a processing server in wireless communication with a visual display system, wherein the synchronization message comprises a visual display system real time, a visual content state, and a visual content time code;

determining an audio content time code based on the visual content time code, the mobile device real time, and the visual display system real time, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system;

wherein when the audio content is triggered to play on the mobile device while the visual content state is in a play mode, the audio content is played at the audio content time code that is synchronized with the visual content time code associated with the current location of the visual content on the visual display system.

22. A method of synchronizing slave audio content configured for playback on a mobile device to master audio content configured for play on a separate visual display system, the method comprising the steps of:

obtaining a visual display system real time from the visual display system, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system;

generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message comprises the visual display system real time, a master audio content state associated with an operational mode, and a master audio content time code associated with a current location of the master audio content on the visual display system;

sending, by the processing server, the synchronization message to the mobile device at periodic time intervals; and

determining a slave audio content time code, the slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time; wherein the mobile device real time is based on the mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time obtained from a time server in communication with the mobile device;

wherein when the slave audio content is triggered to play on the mobile device while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system.

23. A system for synchronizing slave audio content configured for playback on a mobile device to master audio content configured for play on a separate visual display system, the system comprising:

a memory;

one or more processors coupled with the memory, wherein the memory includes processor-executable code that, when executed by the processors, causes the processors to perform operations comprising:

obtaining a visual display system real time from the visual display system, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system;

generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message comprises the visual display system real time, a master audio content state associated with an operational mode, and a master audio content time code associated with a current location of the master audio content on the visual display system;

sending, by the processing server, the synchronization message to the mobile device at periodic time intervals; and

determining a slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time, wherein the mobile device real time is based on the mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device;

wherein when the slave audio content is triggered to play on the mobile device while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system.

Technical Field

The present disclosure relates generally to audiovisual systems, and in particular to systems and methods for enhancing the audio experience of audience members in public audiovisual shows. More particularly, the present disclosure relates to synchronizing playback of audio content on one or more slave computing devices to audio content or visual content on a separate master computing device.

Background

The following discussion of the background to the invention is intended to facilitate an understanding of the present invention. However, it should be appreciated that the discussion is not an acknowledgement or admission that any of the material referred to was published, known or part of the common general knowledge in any jurisdiction as at the priority date of the application.

In many cases, audiovisual performances in public venues suffer from poor sound because of weak, low-power amplifiers or speakers, noise pollution, radio frequency (RF) interference (when using RF devices such as radio headphones), or because the distance between the speakers and the viewer is too large (the audio falls out of synchronization with the visual display owing to the difference between the speeds of light and sound). Such poor sound and/or large distances between audience members and visual displays can significantly degrade the overall experience, diminishing audience members' enjoyment of large events.

Some applications have attempted to synchronize audio content playing on a mobile device to visual content playing on a separate visual display system using streaming technologies over the internet or over a local communication network such as Wi-Fi. However, streaming is resource intensive on mobile devices, consuming significant data and battery power. Furthermore, streamed audio content tends to fall out of synchronization with video content because of network delays or interference experienced by Wi-Fi or cellular data connections. Wi-Fi connections are also prone to dropping, and each router supports only a limited number of connections. Even slight synchronization problems and packet loss can be problematic when viewing audiovisual performances in public venues. Moreover, mobile devices typically do not maintain an accurate or consistent clock and, owing to differences in hardware and software, tend to drift as audio content is played. Thus, there is a high risk that audio content on the mobile device will not remain synchronized with visual content played on a visual display system separate from the mobile device, and audience members' enjoyment of the audiovisual presentation will be reduced.

Accordingly, it is an object of the present disclosure to address at least some of the problems and limitations discussed above.

Disclosure of Invention

The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.

According to a first aspect of the present disclosure, there is provided a method of synchronizing audio content configured for playback on a mobile device to visual content configured for display on a separate visual display system. The method comprises the steps of: presenting the visual content on a visual output device controlled by the visual display system at a first location and time; obtaining a visual display system real time from the visual display system based on a visual display system local time and a visual display system time offset; and generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message includes the visual display system real time, a visual content state, and a visual content time code. The method further comprises: sending, by the processing server, the synchronization message to the mobile device at periodic time intervals; and determining an audio content time code based on the visual content time code, a mobile device real time, and the visual display system real time, wherein when the audio content is triggered to play on the mobile device at a second location and time while the visual content state is in a play mode, the audio content is played at the audio content time code that is synchronized with the visual content time code associated with the current location of the visual content on the visual display system.
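As an illustrative sketch of the message-generation step (not the disclosed implementation; the names `SyncMessage`, `build_sync_message`, the field names, and the JSON wire format are all assumptions), the visual display system might assemble the periodic synchronization message as follows:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SyncMessage:
    """Hypothetical layout of the synchronization message described above."""
    display_real_time: float   # visual display system real time (seconds since epoch)
    content_state: str         # e.g. "play", "pause", "stop"
    content_time_code: float   # current position within the visual content (seconds)

def build_sync_message(local_time: float, time_offset: float,
                       state: str, time_code: float) -> str:
    # Real time = local time + time offset, per the disclosure.
    real_time = local_time + time_offset
    # Serialize for transmission to the processing server.
    return json.dumps(asdict(SyncMessage(real_time, state, time_code)))
```

The processing server would then relay this payload unchanged to subscribed mobile devices at each periodic interval.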

Preferably, the method determines whether the audio content time code is out of synchronization with the visual content time code on the visual display system by more than a predetermined time interval, and, in response, adjusts playback of the audio content on the mobile device to an audio content time code that is synchronized with the visual content time code associated with the current location of the visual content on the visual display system.

Preferably, the audio content time code is determined based on a summation of the visual content time code and a time difference between the mobile device real time and the visual display system real time.
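The summation above can be written directly. The function name and the choice of seconds as the unit are illustrative assumptions; the arithmetic follows the disclosure: the elapsed time between the display system stamping the message and the mobile device acting on it is added to the visual content time code.

```python
def audio_content_time_code(visual_time_code: float,
                            mobile_real_time: float,
                            display_real_time: float) -> float:
    # audio time code = visual time code + (mobile real time - display real time)
    # The difference accounts for content that has advanced since the
    # synchronization message was generated.
    return visual_time_code + (mobile_real_time - display_real_time)
```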

Preferably, the synchronization message is transmitted when there is an adjustment to the visual content time code.

Preferably, the synchronization message is transmitted when there is a change of the visual content state.

Preferably, the mobile device real time is determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device.
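A minimal sketch of this offset computation, assuming the time server's current time has already been fetched. Network round-trip delay is ignored here for simplicity; a production implementation would typically compensate for it (e.g. with an NTP-style midpoint estimate), which the disclosure does not specify.

```python
def clock_offset(local_time: float, server_time: float) -> float:
    # Time offset = time server current time - device local time.
    return server_time - local_time

def device_real_time(local_time: float, offset: float) -> float:
    # Real time = local time + previously computed offset.
    return local_time + offset
```

The same two functions apply to the visual display system's clock, which the disclosure defines symmetrically.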

Preferably, the visual display system time offset is a time difference between the visual display system local time and a time server current time obtained from a time server in communication with the visual display system.

Preferably, the predetermined time interval is about 200 ms.

According to a second aspect of the present disclosure, there is provided a method of synchronizing slave audio content configured for playback on a mobile device to master audio content synchronized with a live visual display, wherein the master audio content is configured for playing on a separate visual display system. The method comprises the steps of: playing the master audio content on the visual display system at a first location and time; obtaining a visual display system real time from the visual display system based on a visual display system local time and a visual display system time offset; and generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message includes the visual display system real time, a master audio content state, and a master audio content time code. The method further comprises: sending, by the processing server, the synchronization message to the mobile device at periodic time intervals; and determining a slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time, wherein when the slave audio content is triggered to play on the mobile device at a second location and time while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system.

Preferably, the method determines whether the slave audio content time code is out of synchronization with the master audio content time code on the visual display system by more than a predetermined time interval, and, in response, adjusts playback of the slave audio content on the mobile device to a slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system.

Preferably, the slave audio content comprises a slave audio cue track for initiating a portion of the live visual display, whereby the slave audio cue track is transmitted to one or more control systems by one or more computing devices such that the slave audio cue track is configured for synchronization with the master audio content.

Preferably, the slave audio content comprises an audio track associated with a voice recording configured for synchronization with the master audio content.

Preferably, the slave audio content comprises an audio track associated with a musical accompaniment configured for synchronization with the master audio content.

Preferably, the master audio content includes an audio cue track configured for triggering a portion of a live visual display by a controller or a manual operator.

Preferably, the slave audio content comprises a slave audio cue track configured for triggering, by a manual operator, a portion of a live visual display, the slave audio cue track being configured for synchronization with the master audio content.

Preferably, the synchronization message is transmitted when there is an adjustment to the master audio content time code.

Preferably, the synchronization message is transmitted when there is a change of the master audio content state.

Preferably, the slave audio content time code is determined based on a summation of the master audio content time code and a time difference between the mobile device real time and the visual display system real time, and wherein the mobile device real time is determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is the time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device.

Preferably, the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system.

Preferably, the predetermined time interval is about 200 ms.

According to a third aspect of the present disclosure, there is provided a method configured for implementation on a mobile device having at least one processor, at least one computer-readable storage medium, and a synchronization application connected to a network. The method includes transmitting audio content configured for storage on the mobile device; obtaining a mobile device real time determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device; and receiving a synchronization message at periodic time intervals from a processing server in wireless communication with a visual display system, wherein the synchronization message includes a visual display system real time, a visual content state, and a visual content time code. The method also includes determining an audio content time code based on the visual content time code, the mobile device real time, and the visual display system real time, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time obtained from a time server in communication with the visual display system, and wherein, when the audio content is triggered to play on the mobile device while the visual content state is in a play mode, the audio content is played at the audio content time code that is synchronized with the visual content time code associated with a current location of the visual content on the visual display system.

According to a fourth aspect of the disclosure, there is provided a method of synchronizing slave audio content configured for playback on a mobile device to master audio content configured for play on a separate visual display system. The method comprises the steps of: obtaining a visual display system real time from the visual display system based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system; generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message includes the visual display system real time, a master audio content state associated with an operational mode, and a master audio content time code associated with a current location of the master audio content on the visual display system; and sending, by the processing server, the synchronization message to the mobile device at periodic time intervals. The method also includes determining a slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time, wherein the mobile device real time is determined based on a mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device, and wherein, when the slave audio content is triggered to play on the mobile device while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with a current location of the master audio content on the visual display system.

According to a fifth aspect of the disclosure, there is provided a system for synchronizing slave audio content configured for playback on a mobile device to master audio content configured for play on a separate visual display system, the system comprising a memory and one or more processors coupled with the memory, wherein the memory comprises processor-executable code that, when executed by the processors, causes the processors to perform operations comprising: obtaining a visual display system real time from the visual display system, wherein the visual display system real time is determined based on a visual display system local time and a visual display system time offset, wherein the visual display system time offset is a time difference between the visual display system local time and a time server current time derived from a time server in communication with the visual display system; generating, by the visual display system, a synchronization message configured for transmission to a processing server, wherein the synchronization message comprises the visual display system real time, a master audio content state associated with an operational mode, and a master audio content time code associated with a current location of the master audio content on the visual display system; and sending, by the processing server, the synchronization message to the mobile device at periodic time intervals. The operations further include determining a slave audio content time code based on the master audio content time code, a mobile device real time, and the visual display system real time, wherein the mobile device real time is based on the mobile device local time and a mobile device time offset, wherein the mobile device time offset is a time difference between the mobile device local time and a time server current time derived from a time server in communication with the mobile device.
Wherein, when the slave audio content is triggered to play on the mobile device while the master audio content state is in a play mode, the slave audio content is played at the slave audio content time code that is synchronized with the master audio content time code associated with the current location of the master audio content on the visual display system.

Drawings

The drawings illustrate, by way of example and not limitation, some embodiments of the invention.

In the drawings, like reference numerals generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. The dimensions of various features or elements may be arbitrarily expanded or reduced for clarity. In the following description, various embodiments of the present invention are described with reference to the following drawings, in which:

FIG. 1 shows a block diagram illustrating a high-level system architecture for synchronizing slave audio content on one or more computing devices to master audio content or master visual content configured for playing on a separate visual display system, in accordance with various embodiments.

FIG. 2 sets forth a flow chart illustrating an exemplary method for synchronizing slave audio content on one or more computing devices to master audio content or master visual content configured for playback on a separate visual display system according to various embodiments.

FIG. 3 illustrates a high-level system overview for synchronizing audio content on a mobile device to visual content on a separate visual display system.

FIG. 4 is a flow diagram illustrating an exemplary method on a visual display system for enabling synchronization of separate audio content to visual content on a separate visual display system.

FIG. 5 sets forth a flow chart illustrating an exemplary method for synchronizing audio content on one or more mobile devices to visual content on a separate visual display system.

FIG. 6 illustrates an alternative high-level system overview of synchronizing audio content on a mobile device to visual content on a separate visual display system.

FIG. 7 is a flow diagram illustrating an exemplary method on a visual display system to enable synchronization of separate audio content on a mobile device to visual content on a separate visual display system.

FIG. 8 illustrates a flow chart of an exemplary method for synchronizing audio content on one or more mobile devices to visual content on a separate visual display system.

Detailed Description

The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural and logical changes may be made without departing from the scope of the present invention. The various embodiments are not necessarily mutually exclusive, as some embodiments may be combined with one or more other embodiments to form new embodiments.

The embodiments described in the context of one of these systems or methods are equally valid for the other system or method. Similarly, embodiments described in the context of a system are equally valid for a method, and vice versa.

Features described in the context of one embodiment may correspondingly apply to other embodiments, even if not explicitly described in those other embodiments. Furthermore, features described for one feature in the context of one embodiment may additionally and/or alternatively apply correspondingly to the same or similar features in other embodiments.

As used herein and in the context of the various embodiments, the expression "visual content" may refer to any type of visual media, such as video or static visual media, that is displayed on any electronic device and that can be moved, animated, altered, or visually modified as viewed by a user. "Visual content" as used in the context of the various embodiments may be used interchangeably with "visual data", which may refer to data used to manipulate visual elements such as, but not limited to, fireworks, lasers, light projections, fountains, and the like.

As used herein and in the context of the various embodiments, the expression "audio content" may be used interchangeably with "audio data".

As used herein and in the context of various embodiments, the articles "a" and "an" as used in connection with a feature or element include reference to one or more features or elements.

As used herein and in the context of various embodiments, the term "and/or" includes any and all combinations of one or more of the associated listed items.

Thus, in one or more example embodiments, the functions described may be implemented in hardware, software, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a computer-readable medium.

As used herein and in the context of various embodiments, "second" and "first" as used in connection with location and time are used to distinguish, without limitation, location and time. For example, "second location and time" may be provided without the need for "first location and time" or, conversely, "first location and time" may be provided without the need for "second location and time".

In the specification, the term "comprising" is to be understood as having a broad meaning similar to the term "including", and will be understood to imply the inclusion of a stated integer or step or group of integers or steps, but not the exclusion of any other integer or step or group of integers or steps. This definition also applies to variations of the term "comprising", such as "comprise" and "comprises".

In this specification, "timecode" refers to any symbolic designation of a constant and timed interval throughout the running of audio or visual media, which may be used to reference the location of a "playing" or "stopped" audio or video content file.

In order that the invention may be readily understood and put into practical effect, several specific embodiments will now be described, by way of example and not limitation, with reference to the accompanying drawings. It should be understood that any of the features described herein with respect to a particular system may also be true for any of the systems described herein. It is to be understood that any feature described herein with respect to a particular method may also be true for any of the methods described herein. Further, it should be understood that for any system or method described herein, not all components or steps are necessarily included in the system or method, but may include only some (but not all) components or steps.

The present disclosure is directed to providing a solution to the above-described problems by providing a system and method that enables audio content of a piece of audiovisual content to be pre-downloaded to a user's mobile device and then played (preferably through headphones) in synchronization with the visual content being played on a separate visual display system.

The present disclosure also allows for synchronizing playback of slave audio content on one or more mobile devices to master audio and/or master visual content on a separate visual display system. In other words, in some embodiments, the present disclosure allows for synchronizing slave audio content on one or more mobile devices to master audio content on a separate visual display system. In other embodiments, the present disclosure allows for synchronizing slave audio content on one or more mobile devices to master visual content on a separate visual display system. In yet other embodiments, the present disclosure allows for synchronizing slave audio content on one or more mobile devices to master audio and master visual content on a separate visual display system.

FIG. 1 illustrates a system 60 for synchronizing slave audio content on one or more computing devices 50, 51, 52 to master audio content or master visual content on a separate visual display system 10. According to some embodiments, the separate visual display system 10 is located remotely from the one or more mobile devices 50, 51, 52. In the system 60, each user of a mobile device accesses an audio synchronization application on the mobile device. The mobile device may be any type of stationary or portable device, including a mobile handset, unit, device, multimedia tablet, phablet, communicator, desktop computer, laptop computer, personal digital assistant, or any combination thereof, specifically configured to perform the functions of a mobile device having a speaker, a headset, or an earpiece wired or wirelessly connected to the mobile device. The mobile devices 50, 51, 52 may communicate with the processing server 30, the data storage server 40, and the time server 20 via a communication network (not shown). The audio synchronization application may be downloaded by the user from the data storage server 40 at a predetermined time or when needed.

The visual display system 10 may be in communication with the processing server 30, the data storage server 40, and the time server 20 via a communication network (not shown). In some embodiments, a visual display player application may be downloaded and installed on the visual display system to provide an interface for controlling and playing visual content. The visual output device 11 may be connected to the visual display system by a wired or wireless connection. According to various embodiments, the visual output device may be a visual display output, such as a television, projector, or monitor, configured for viewing by an audience in a public facility. When the visual display system is activated by an operator to play the primary visual content, the primary visual content is displayed on the visual output device. Primary visual content may include, but is not limited to: movies, television, digital visual content, and the like. In some embodiments, the present disclosure allows for synchronizing slave audio content on one or more mobile devices to master visual content on a separate visual display system. For example, in the scenario of a show or exhibition displayed on a visual output device at a large event, slave audio content played on the mobile devices of audience members at the large event may be synchronized with the master visual content being played on the visual display system. The primary visual content may be downloaded or played from the data storage server or stored in a memory of the visual display system.

The time server 20 is a server that reads the actual time from the reference clock and distributes the information using a communication network. The time server may be a server utilizing Network Time Protocol (NTP) for distributing and synchronizing time over a communication network.

The processing server 30 may be a single server or a group of servers. The set of servers may be centralized or distributed (e.g., the processing server 30 may be a distributed system). In some embodiments, the processing server 30 may be local or remote. For example, the processing server 30 may access information and/or data stored in a visual display system or mobile device over a communications network. As another example, the processing server 30 may be directly connected to a visual display system to access stored information and/or data. In some embodiments, the processing server 30 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, inter-cloud, multi-cloud, and the like, or any combination thereof.

The data storage server 40, which may be in network communication with the visual display system 10 and the mobile devices 50, 51, 52, may include separate slave audio content and master visual or audio content. The primary visual content may be downloaded to the visual display system via a communication network and stored within a memory of the visual display system. The visual display system also includes a processor (not shown) that performs operations to execute or run code or programming to perform a plurality of functions; in particular a video player application (not shown). In some embodiments, the primary visual content may not need to be downloaded from the data storage server 40. In some embodiments, it may be uploaded onto a visual display system via a direct connection to a hard disk drive. In some embodiments, the audio content may be downloaded from the data storage server to the mobile device via the communication network by the mobile device and stored in a memory of the mobile device.

In some embodiments, the present disclosure allows for synchronizing slave audio content on one or more mobile devices to master audio content on a separate visual display system. For example, in the scenario of a live visual display performance at a first location, slave audio content played on a mobile device of an audience member at a second location may be synchronized with master audio content being played on the visual display system. In some embodiments, the visual display system may be in wired or wireless communication with the control system 12. The control system 12 is a system or set of devices that commands or adjusts the behavior of other devices or systems to achieve a desired result. In some embodiments, the control system 12 includes an input device, such as a button, key, or switch, to trigger portions of the live visual display performance. The master audio content configured to be played on the visual display system may include one or more audio cue tracks that may be embedded, mixed, or combined into one or more audio tracks to form the master audio content, such that the audio cue tracks can be referenced at precise times relative to the audio tracks. The slave audio content configured to be played on the mobile device may include one or more slave audio cue tracks that may be embedded, mixed, or combined into one or more audio tracks to form the slave audio content, such that the predetermined duration of the slave audio content is the same as that of the master audio content. The audio cue tracks are configured to control or activate portions of the live visual display performance. The audio cue track may include audio cues or audio timecodes, such as SMPTE timecode, FSK timecode, or LTC (linear timecode), each of which is a standard protocol for timecodes.
The audio cue tracks may include cues, which may take the form of fireworks, timing, lights, or audible cues that control or direct portions of the live visual display performance. For example, in a firework display, the audio cue track may include firework cues, each of which activates a different set of fireworks at a different time. Alternatively, the primary audio content may comprise an audio track, which may be a musical content or a recording, which may be input to a separate audio output device (not shown). In some embodiments, an audio cue track is fed into the control system from the visual display system. In some embodiments, the audio cue track is fed into the one or more control systems by one or more external devices (not shown), which may be slave mobile devices that play the audio cue track in synchronization with the visual display system. In other embodiments, the control system includes a control signal corresponding to an audio time code for activating a cue of a live visual display performance. On the other hand, the slave audio content may include an audio track, which may be music content or a recording, that may be played by the slave mobile device of the audience member.

The master audio content is an audio track of predetermined duration. As previously described, the slave audio content may include a slave audio cue track, which may include an audio time code, a light cue, a sound cue, or a fireworks cue, or a combination thereof. Each of the one or more slave audio cue tracks has the same predetermined duration as the master audio content. The slave audio cue tracks can be played on one or more slave external devices through the audio synchronization application. In this manner, by allowing the slave audio cue tracks to be played from an external device (e.g., a mobile device) in synchronization with the master audio content being played on the visual display system, the operator of the slave mobile device can control or direct portions of the live visual display performance. For example, in one embodiment, the slave mobile device may be in wired or wireless communication with a second control system that is different from the control system in wired or wireless communication with the visual display system. The second control system may in turn be configured to trigger a light or fireworks cue upon receipt of a control signal corresponding to the slave audio cue track being played on the slave mobile device. In another embodiment, the slave mobile device may not communicate with the second control system; the operator of the mobile device will hear the slave audio cue track playing on the mobile device and, upon receiving a cue, will trigger a light or fireworks cue on the second control system.

As an example, the visual display system may play master audio content containing an audio cue track that includes an audio time code such as SMPTE, FSK, or LTC. The audio cue track is fed into a first control system having one or more fireworks cues associated with a plurality of time code locations. In some embodiments, a slave audio cue track, including an audio time code configured to control a visual or sound effect, may be fed by the mobile device to a second or third control system at a separate location remote from the visual display system and the first control system. A light show operator having a mobile device can listen to an audio cue track that includes verbal cues to trigger preset light cues at a plurality of preset locations of the live visual display show. The light show operator's mobile device plays the slave audio cue track in synchronization with the master audio cue track being played on the visual display system. In some embodiments, an audience member may listen on a slave mobile device to an audio track, which may contain a music track, that is played in synchronization with the visual display system. This enables the various audiovisual components of a large fireworks event to work in synchronization with each other.

By connecting the mobile devices 50, 51, 52 and the visual display system 10 to the processing server 30, the data storage server 40, and the time server 20 via a communication network, slave audio content configured for playing on the mobile devices can be synchronized with master audio content and/or master visual content playing on the separate visual display system 10, thereby enabling audience members who are watching the visual output device 11 displaying the master visual content to have a better audio and visual experience at large events.

As used herein, mobile devices 50, 51, 52, visual display system 10, visual output device 11, and control system 12 may exchange data via any communication network, such as a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a proprietary network, and/or an Internet Protocol (IP) network, such as the internet, an intranet, or an extranet. Each device, module or component within the system may be connected via a network or may be directly connected. Those skilled in the art will recognize that the terms "network," "computer network," and "online" are used interchangeably and do not imply a particular network embodiment. In general, any type of network may be used to implement the online or computer network embodiments of the present disclosure. The network may be maintained by a server or a combination of servers or the network may be serverless. Further, any type of protocol (e.g., HTTP, FTP, ICMP, UDP, WAP, SIP, H.323, NDMP, TCP/IP) may be used to communicate across the network. Devices and systems as described herein may communicate via one or more such communication networks.

FIG. 2 sets forth a flow chart illustrating an exemplary method for synchronizing slave audio content on one or more computing devices to master audio content or master visual content configured for playback on a separate visual display system according to various embodiments. An audience member may initiate the downloading of the audio synchronization application when the audience member arrives at a location where primary visual content of a predetermined duration can be viewed on the visual output device, for example, at the date and time of a show previously notified to the audience member, or at a live visual display show where primary audio content of a predetermined duration is played on the visual display system. The audio synchronization application allows the audience member to download from the data storage server the slave audio content of a performance that the audience member is interested in viewing, which has the same predetermined duration as the primary visual content or the primary audio content. In some embodiments, downloading the slave audio content from the data storage server may be performed at any location before the predetermined location and time is reached. In some embodiments, audience members are offered a choice of one or more shows for purchase, whether at the predetermined location or anywhere else, and the user is prompted to pay using, but not limited to, a payment gateway, an electronic wallet, a credit card, or a prepaid card. Upon successful payment for the selected show, the audience member may download the selected show's slave audio content into the memory of the mobile device.

At a predetermined location and time, the operator will initiate a broadcast of primary visual content or primary audio content on the visual display system. In some embodiments, the operator will initiate a broadcast of primary visual content or primary audio content on a visual display player application that provides the operator with an interface for controlling and playing the primary visual content or primary audio content. The primary visual content is output to a visual output device, such as a screen, projector, television, or the like. In step 71, when the operator triggers the primary visual content or the primary audio content to be played, the visual display system obtains the real time of the visual display system. The visual display system true time is determined based on the visual display system local time and the visual display system time offset and is based on the following formula:

visual display system real time = visual display system local time + visual display system time offset

The visual display system time offset is the time difference between the visual display system local time and the time server current time obtained from the time server in communication with the visual display system.
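
As an illustration only, the real-time computation above can be sketched in Python; the function names and sample values are hypothetical and not part of the disclosure:

```python
def time_offset(local_time: float, server_time: float) -> float:
    """Time offset = time server current time - local clock time.

    A positive offset means the local clock is running behind the
    time server.
    """
    return server_time - local_time

def real_time(local_time: float, offset: float) -> float:
    """Real time = local time + time offset, per the formula above."""
    return local_time + offset

# Example: the visual display system's clock reads 1000.0 s while the
# time server reports 1002.5 s, giving an offset of +2.5 s.
offset = time_offset(1000.0, 1002.5)
print(real_time(1000.0, offset))  # 1002.5
```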

In step 72, the visual display system proceeds to generate a synchronization message containing:

the visual display system real time;

a primary visual content time code or a primary audio content time code; and

a primary visual content state or a primary audio content state.

The primary visual content time code corresponds to a current temporal location of primary visual content being played on the visual display system. The primary visual content state or the primary audio content state is associated with the operational mode or the state of the primary visual content or the state of the primary audio content, respectively. The operating mode or state may be "playing" or "stopped," which indicates whether the primary visual content or the primary audio content continues to play or whether the content has stopped playing. Once the synchronization message is generated, it is sent to the processing server.
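
The three fields of the synchronization message can be illustrated with a minimal sketch; the field names are assumptions for illustration, not names taken from the disclosure:

```python
def make_sync_message(real_time: float, content_state: str,
                      content_time_code: float) -> dict:
    """Assemble a synchronization message with the three fields above.

    content_state is the operational mode ("playing" or "stopped");
    content_time_code is the current position, in seconds, of the
    primary visual or audio content on the visual display system.
    """
    if content_state not in ("playing", "stopped"):
        raise ValueError('state must be "playing" or "stopped"')
    return {
        "visual_display_system_real_time": real_time,
        "content_state": content_state,
        "content_time_code": content_time_code,
    }

# A message reporting that the content is playing at 341.2 s.
message = make_sync_message(1002.5, "playing", 341.2)
```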

In step 73, the processing server sends the synchronization message to the mobile device at periodic time intervals when the mobile device is triggered by the audience member to play the slave audio content. The transmission of synchronization messages to the mobile device at periodic time intervals is achieved through the WebSocket communication protocol. This enables the mobile device to listen to, or continuously communicate with, the processing server so that synchronization messages are sent at periodic intervals, for example, once per second. The WebSocket communication protocol also enables the mobile device to receive a synchronization message whenever the time code is adjusted forward or backward on the visual display system, or when the state changes from "playing" to "stopped", or vice versa. In other words, whenever an operator makes a manual adjustment to the primary audio content time code or the primary visual content time code, or a change to the primary audio content state or the primary visual content state, respectively, one or more synchronization messages will be transmitted to the mobile device accordingly.

In step 74, the mobile device will determine the mobile device real time. The mobile device real time is determined based on the mobile device local time and the mobile device time offset and is based on the following equation:

mobile device real time = mobile device local time + mobile device time offset

The mobile device time offset is the time difference between the mobile device local time and the time server current time obtained from the time server in communication with the mobile device.

In step 75, the mobile device will determine a slave audio content time code that is based on the master visual content time code or master audio content time code, the mobile device real time, and the visual display system real time. The slave audio content time code is determined based on the following formulas:

delay = mobile device real time - visual display system real time

slave audio content time code = master visual content time code (or master audio content time code) + delay

The slave audio content time code is determined based on the master visual content time code or the master audio content time code summed with a delay determined as the time difference between the mobile device real time and the visual display system real time. The slave audio content time code corresponds to a current location of a master visual content time code or a master audio content time code being played on the visual display system.
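
The two formulas above can be combined into a short sketch (the names and sample values are illustrative, not from the disclosure):

```python
def slave_time_code(master_time_code: float,
                    mobile_real_time: float,
                    display_real_time: float) -> float:
    """Apply the formulas above:

    delay = mobile device real time - visual display system real time
    slave time code = master time code + delay
    """
    delay = mobile_real_time - display_real_time
    return master_time_code + delay

# The synchronization message carried time code 120.0 s stamped at
# display real time 1002.5 s; the mobile device applies it at real
# time 1003.1 s, so playback should start about 0.6 s further in.
print(slave_time_code(120.0, 1003.1, 1002.5))  # ~120.6
```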

In step 76, upon detecting that the primary visual content or the primary audio content is in a play mode, the audio synchronization application will determine whether the slave audio content time code is out of synchronization with the primary visual content time code or the primary audio content time code. Since the mobile device maintains a persistent connection with the processing server via the WebSocket communication protocol, the mobile device receives synchronization messages at periodic intervals. This enables the audio synchronization application to check within a predetermined time interval whether the slave audio content time code is out of synchronization with the primary visual content time code or the primary audio content time code. The formula for determining whether the slave audio content time code is out of synchronization with the primary visual content time code or the primary audio content time code is as follows:

time difference = slave audio content time code - current slave audio content time code

In step 77, if the slave audio content time code is out of synchronization with the master visual content time code or master audio content time code by more than ± 200 ms (i.e., the time difference exceeds 200 ms in either direction), for example, the mobile device will adjust playback to the slave audio content time code that is synchronized with the master audio content time code or master visual content time code associated with the current location of the content on the visual display system. In some embodiments, a time difference of more than about +200 ms or less than about -200 ms is considered out of synchronization with the primary visual content time code or the primary audio content time code. If the time difference is less than ± 200 ms, the mobile device will continue to play the slave audio content in synchronization with the master visual content time code or the master audio content time code in step 78.
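
The ± 200 ms check in steps 76 to 78 can be sketched as follows (the threshold value is taken from the example above; the names are illustrative):

```python
DRIFT_THRESHOLD_S = 0.200  # +/- 200 ms, as in the example above

def needs_adjustment(expected_time_code: float,
                     current_time_code: float) -> bool:
    """Return True when slave audio playback has drifted out of
    synchronization by more than the threshold in either direction."""
    time_difference = expected_time_code - current_time_code
    return abs(time_difference) > DRIFT_THRESHOLD_S

print(needs_adjustment(120.6, 120.5))  # False: 100 ms drift, keep playing
print(needs_adjustment(120.6, 121.0))  # True: 400 ms drift, re-seek
```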

Example 1

FIG. 3 shows an overview of the components of the system 100 that enable the separated visual content 112 and audio content 114 of an audiovisual content file to be played on a visual display system and one or more mobile devices in synchronization with each other. When triggered to play, the visual display system notifies the time server of the visual content time code associated with the current location of the visual content being played on the visual display system and the visual content status of the visual content being played on the visual display system ("playing" or "stopped"). The time server records the visual content time code along with the current time (timestamp) and the visual content status. The user's mobile device will play the audio content of the audiovisual content by retrieving the latest current time, visual content status, visual content time code, and timestamp from the time server, and then adjust the playing of the audio content to be synchronized with the visual content being played on the visual display system. In some embodiments, the audiovisual content file may include shows such as, but not limited to: movies, television, video content, fireworks, light, water, or laser shows, and the like.

A time code is a series of digital codes generated at regular intervals by a timing synchronization system. Time codes are used in video production, performance control, and applications that require time coordination or logging of recordings or actions. In the context of the present disclosure, a "timecode" refers to any symbolic designation of a constant and timed interval throughout the running of audio or visual media, and may be used to reference the location of a "playing" or "stopped" audio or video content file. In order to synchronize separate audio and visual content, their time codes should match exactly when they are played.

Both the visual display system and the mobile device will check the time server transit time when connecting with the time server, taking into account network delays. These transit times are used to adjust the audio content playback on the mobile device.

The data storage server 110 has separate visual content 112 and audio content 114 of a particular audiovisual content file in storage. The visual content 112 of the audiovisual content file is downloaded to the visual display system 120 via the internet (not shown) or other similar communication network and stored in the memory 126. The visual display system 120 also includes a processor 122 that performs operations to execute or run code or programming to perform a plurality of functions; for example, the processor may execute or run a visual display application. In some embodiments, the visual content may not need to be downloaded from the data storage server 110, but may be loaded onto the visual display system 120 from an external storage source (e.g., a hard drive) via a direct connection.

The audio content 114 of a particular audiovisual content file is downloaded to the mobile device 150 via the internet (not shown) or a communication network and stored in the memory 156 of the device. In some embodiments, the audio content may not need to be downloaded from the server 110, but may be loaded into the memory of the mobile device from an external device or storage source. In some embodiments, the audio content may be retrieved from an external device or storage source wirelessly or via a wired connection. In some embodiments, the external device or storage source may be a hard disk drive.

The time server 140 is a server that reads the actual time from the reference clock and distributes the information using a communication network. The time server 140 also has a database for recording information.

When the visual display system 120 is set to begin playing the visual content 128 (triggered using controls that may include, but are not limited to, a keyboard, a mouse, voice controls, a scheduled time, etc.), there will be a visual display output via a connection to a visual output device 162, such as a projector, television, or monitor. At the same time, the visual display system 120, using the processor 122 and the visual display system clock 124, will check the visual display system time server transit time using ping or a similar method over the internet (not shown) or a similar communication network. The visual display system time server transit time is the time it takes to transmit a ping from the visual display system to the time server and back. The visual display system will then send the visual content time code, the visual display system time server transit time, and the visual content status ("playing" or "stopped") to the time server 140. The time server immediately records the time at which it received the visual content time code, referred to herein as the timestamp. The time server 140 records the timestamp, the visual content time code, the visual display system time server transit time, and the current status ("playing" or "stopped") into its database.
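
Under the usual assumption that network delay is symmetric, half of the round-trip transit time can be used to correct a time reading obtained from the time server. A minimal sketch (names and sample values are illustrative):

```python
def estimate_current_time(server_time: float,
                          transit_time: float) -> float:
    """Estimate the current real time from a time server reading.

    transit_time is the round-trip ping time to the time server; the
    server's reply is assumed to have taken half of it to arrive, so
    its reported time is already transit_time / 2 old on receipt.
    """
    return server_time + transit_time / 2.0

# The time server reported 1002.5 s and the ping round trip took 80 ms.
print(estimate_current_time(1002.5, 0.080))  # ~1002.54
```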

Mobile device 150 may be any type of stationary or portable computing device, including a mobile handset, unit, device, multimedia tablet, phablet, communicator, desktop computer, laptop computer, personal digital assistant, or any combination thereof, specifically configured to perform mobile device functions with a speaker, headset, or earpiece wired or wirelessly connected to mobile device 150. In this discussion, the terms "mobile device" and "computing device" are used interchangeably.

The mobile device 150 may access a communication network and communicate with other computers and systems, such as a processing server, a database and/or time server, a visual display system, and a display device operatively connected to the visual display system, all interconnected via the communication network.

The mobile device 150 also includes a processor 152 for executing or running code or programming applications. In some embodiments, the processor implements execution or running of an audio synchronization application. The audio synchronization application may be downloaded by the user or pre-installed on the mobile device. When the audio content 158 is played on the mobile device 150, the device will first check the mobile device time server transit time via the internet (not shown) or a similar communication network using a ping or similar method, using the processor 152 and the mobile device clock 154. The mobile device time server transit time is the time it takes to transmit a ping from the mobile device to the time server and back. Subsequently, the mobile device 150 will obtain the timestamp, visual content time code, visual display system time server transit time, and visual content status ("playing" or "stopped") of the visual display system 120 from the time server 140. Using the timestamps (of the visual display system and the mobile device), the visual content time code, the visual content status, and the time server transit times, the mobile device 150 adjusts the audio content 158 to be played back in synchronization with the visual content on the visual display system. To do so, the mobile device uses the NTP protocol to determine the mobile device real time: the mobile device checks the time server to obtain the current time of the time server and determines the mobile device real time based on the current server time and the mobile device time server transit time. The audio synchronization application then determines an audio time code for playing the audio content at a position synchronized with the current position of the visual content time code played on the visual display system.
The audio content time code is determined by the following formula:

audio content time code = (mobile device real time - (timestamp - visual display system time server transit time / 2)) + visual content time code
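As a hedged illustration, the formula above can be checked numerically. The function and variable names below are ours, not the patent's, and all values are in seconds:

```python
def audio_content_timecode(mobile_real_time, timestamp,
                           vds_transit_time, visual_timecode):
    """Example 1 synchronization formula.

    'timestamp' is the time at which the time server received the visual
    content time code; subtracting half the visual display system's
    round-trip transit time estimates the moment the visual display
    system actually sent it.
    """
    send_time = timestamp - vds_transit_time / 2
    elapsed = mobile_real_time - send_time  # time since the code was current
    return elapsed + visual_timecode

# The visual display system reported time code 10.0 s; the time server
# stamped it at t = 1000.0 after a 0.2 s round trip; the mobile device
# starts audio playback at real time 1003.0.
tc = audio_content_timecode(1003.0, 1000.0, 0.2, 10.0)  # about 13.1 s
```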

Audience member 170 will experience the visual content 128 from the visual output device 162 and the audio content 158 through the mobile device 150 seamlessly and in audiovisual synchronization.

As used herein, mobile devices 150 may exchange information via any communication network, such as a Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), proprietary network, and/or Internet Protocol (IP) network, such as the internet, an intranet, or an extranet. Each device, module or component within the system may be connected via a network or may be directly connected. Those skilled in the art will recognize that the terms "network," "computer network," and "online" are used interchangeably and do not imply a particular network embodiment. In general, any type of network may be used to implement the online or computer network embodiments of the present disclosure. The network may be maintained by a server or a combination of servers or the network may be serverless. Further, any type of protocol (e.g., HTTP, FTP, ICMP, UDP, WAP, SIP, H.323, NDMP, TCP/IP) may be used to communicate across the network. Devices and systems as described herein may communicate via one or more such communication networks.

FIG. 4 is a flow diagram illustrating an exemplary method for enabling the synchronization of separate audio content to visual content on a visual display system. The visual display system is connected to the internet (or similar communication network). In step 210, the operator downloads and installs the visual display player application into the operating system of the visual display system. In step 230, the operator selects the audiovisual content file to be scheduled for play at the predetermined date and time and downloads the visual content of the audiovisual content file, stores it in memory, or opens and loads the visual content from local memory of the visual display system.

In some embodiments, and in step 250, the operator plays the visual content in the visual display player application on the visual display system at the location, date, and time of the show. In step 280, the visual content is output to a visual output device, such as a screen, projector, television, or the like. Meanwhile, the visual display player application will check the time server transit time using ping or a similar method in step 262 and send the visual content time code, the time server transit time, and the visual content status ("playing" or "stopped") to the time server in step 264. The time server immediately records the time at which it received the visual content time code, referred to herein as the timestamp. In step 270, the time server records this timestamp, the visual content time code, the server transit time, and the current status ("playing" or "stopped") into its database.

According to various embodiments, the time server is a server that reads the actual time from a reference clock and distributes this information using a communication network. In this case, the time server also has a database for recording information. The time server may be a local network time server or an internet server. In some embodiments, Network Time Protocol (NTP) is used to distribute and synchronize time over a communication network.

FIG. 5 illustrates a flow chart of an exemplary method for synchronizing audio content on a mobile device to visual content on a separate visual display system. The mobile device is connected to the internet (or a similar communication network). In step 310, the audience member downloads and installs the audio synchronization application into the operating system of the mobile device. In step 320, the audience member selects a show to purchase, and then, as an optional step, the audience member is prompted to pay for the show using, but not limited to, a payment gateway, an internal wallet, a subscription fee, a credit card, and the like. Alternatively, if the show does not require any payment, the audience member will not need to pay. Upon successful payment, or if no payment is required, the audio content of the show is downloaded and stored in the memory of the mobile device.

In step 360, the audience member arrives at the location at the specified date and time of the show, and in step 370, the audience member triggers the application to play the audio content. In step 372, the mobile device will check the time server using the internet (or a similar communication network) to see if the show has started. In step 374, if the show has started, the mobile device confirms that the following records from the visual display system are present: a timestamp, a time code, and a visual content state in the "playing" state. If the mobile device does not detect a record, or if a visual content state in the "stopped" state is detected in the time server database, the audio content will not play. If the mobile device detects the presence of the timestamp, time code, and visual content state in the "playing" state, the mobile device will check the server transit time using ping or a similar method in step 376. In step 378, the mobile device will extract the current time, timestamp, time code, and visual content status ("playing" or "stopped") of the visual display system from the time server database and use this information, along with the mobile device's server transit time, to calculate an adjustment to the audio content so that it plays back in synchronization with the visual content (step 380). To adjust the audio content to play synchronously with the visual content on the visual display system, the mobile device uses the timestamps (of the visual display system and the mobile device), the visual content time code, the visual content status, and the time server transit times. The mobile device determines the mobile device real time using the NTP protocol: the mobile device checks the time server to obtain the current time of the time server and determines the mobile device real time based on the current server time and the mobile device time server transit time.
The audio synchronization application then determines an audio timecode for playing the audio content at a location configured to be synchronized with a current location of the visual content timecode played on the visual display system. The audio content time code is determined based on the following formula:

audio content time code = (mobile device real time - (timestamp - visual display system time server transit time / 2)) + visual content time code

In some embodiments, the timeserver and database 140 may be a single server or a group of servers. The set of servers can be centralized or distributed (e.g., can be a distributed system). In some embodiments, the time server and database 140 may be local or remote. In some embodiments, the time server and database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, inter-cloud, multi-cloud, and the like, or any combination thereof.

Example 2

Fig. 6 shows an overview of the components of a system 400 that enables visual content 412 and audio content 414, separated out of a single piece of audiovisual content, to be played back on separate devices in synchronization with each other. In some embodiments, the audiovisual content is a show, including but not limited to: movies, television, video content, fireworks, light, water, or laser shows, and the like.

Fig. 6 shows an alternative embodiment in which the time server is a separate entity, i.e. different from the time server disclosed in embodiment 1 above, and the communication between the mobile device and the visual display system is performed by the processing server 445. A time server as disclosed in the present embodiment refers to a publicly available time server device on the internet, such as Google public NTP time server, which reads actual time from a reference clock and distributes the information using NTP protocol.

The data storage server 410 has separate visual content 412 and audio content 414 for a particular performance in storage. The visual content 412 of the show is downloaded to the visual display system 420 via the internet (not shown) or other similar communication network and stored within the visual display system's memory 426. In some embodiments, the visual content and the audio content are downloaded to the visual display system via a communication network. The visual display system 420 also includes a processor 422 that performs operations to execute or run code or programming to perform a number of functions. For example, the processor may be programmed to run a visual display application. In some embodiments, the visual content may not be downloaded from the data storage server 410, but may be loaded onto the visual display system 420 from an external data source via a wireless or wired connection. For example, the external data source may be a hard disk drive.

Audio content 414 for a particular show is downloaded to the mobile device 450 via the internet (not shown) or similar communication network and stored in the device's memory 456. The audio content may not need to be downloaded from the data storage server 410. In some embodiments, the audio content may be loaded into the memory of the mobile device from an external data source via a wireless or wired connection. For example, the external data source may be a hard disk drive.

The time server 440 is a server device that reads the actual time from a reference clock and distributes this information using the NTP protocol; for example, a time server that is publicly available on the internet. The NTP protocol synchronizes a client to a network server using several packet exchanges, where each exchange is a request/reply pair. When a request is made, the client stores its own time (origin timestamp) in the packet being sent. When the server receives such a packet, it stores its own time (receive timestamp) in the packet and returns the packet after adding a transmit timestamp. Upon receiving the reply, the client again records its own reception time to estimate the travel time of the packet. The round-trip delay is estimated as the total elapsed time minus the remote processing time; assuming symmetric paths, the one-way travel time is half of that delay.
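The NTP exchange described above can be expressed with the standard four-timestamp calculation. This sketch is illustrative, with assumed example values:

```python
def ntp_delay_and_offset(t0, t1, t2, t3):
    """Standard NTP round-trip delay and clock offset from one exchange.

    t0: client send (origin timestamp)     t1: server receive
    t2: server send (transmit timestamp)   t3: client receive
    """
    delay = (t3 - t0) - (t2 - t1)          # total elapsed minus remote processing
    offset = ((t1 - t0) + (t2 - t3)) / 2   # assumes symmetric one-way delays
    return delay, offset

# Example: the client clock is 5 s behind the server, the one-way network
# delay is 0.05 s, and server processing takes 0.01 s.
delay, offset = ntp_delay_and_offset(100.00, 105.05, 105.06, 100.11)
```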

There is also a processing server 445 for transmitting synchronization messages from the visual display system 420 to all mobile devices 450. The synchronization messages may be transmitted over lightweight and fast connections established via the processing server 445 using WebSocket or a similar protocol. Such a constant connection between the processing server and the mobile device makes it possible to send frequent and regular synchronization messages from the visual display system to the mobile device. The mobile device and the visual display system also regularly check the mobile device time offset and the visual display system time offset against time server 440.

When the visual display system 420 is set to begin playing the visual content 428 (triggered using controls that may include, but are not limited to, a keyboard, a mouse, voice controls, time, etc.), there will be visual display output via connection to the visual output device 462. In some embodiments, the visual output device is a projector, a television, a monitor. At the same time, the visual display system 420 using the processor 422 and clock 424 will check the current time from the time server 440 via the internet (not shown) or similar communication network and calculate its visual display system time offset. Using its local time and the time offset from the time server time, it can derive the visual display system real time. It will then send the visual content time code, the visual display system true time, and the visual content status to the processing server 445 via the internet (not shown) or similar communication network. The visual content state may include a "playing" or "stopped" state. This process will be done periodically throughout the audiovisual presentation.

The mobile device 450 also includes a processor 452 for executing or running code or programming. In some embodiments, the processor is programmed to run an audio synchronization application. When the audio content 458 is to be played on the mobile device 450, the device will check the time of the time server 440 via the internet (not shown) or a similar communication network and calculate its own time offset using the processor 452 and the mobile device clock 454. When the mobile device is triggered to play the audio content, the mobile device will connect with the processing server to receive synchronization messages from the visual display system. The mobile device 450 will receive the synchronization messages from the processing server 445 via the internet (not shown) or a similar communication network.

Each synchronization message includes:

visual content status ("playing" or "stopped");

a visual content time code;

the visual display system real time.

Upon receiving the synchronization message, the mobile device will perform one of two operations. The "playing" visual content state indicates that the visual content is playing on the visual display output device or visual display system. If the mobile device receives a synchronization message in which the visual content state is the "stopped" state, the audio content will not play.

The mobile device will check its mobile device time offset with the time server 440. The mobile device then uses the mobile device time offset along with data from the synchronization message to calculate the current visual content time code, or current position, of the visual content playing on the visual display system 420, and adjusts playback of the audio content on the mobile device to be synchronized with the visual content displayed on the visual output device. The mobile device time offset is the time difference between the mobile device local time and the current time obtained from the time server. Correspondingly, the visual display system time offset is the time difference between the visual display system local time and the current time obtained from the time server.
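A minimal sketch of the offset definitions above; the names and example values are illustrative:

```python
def clock_offset(local_now, server_now):
    """Time offset such that real time = local time + offset."""
    return server_now - local_now

# The visual display system's local clock reads 500.0 s when the time
# server reads 502.5 s, so its offset is 2.5 s:
vds_offset = clock_offset(500.0, 502.5)
vds_real_time = 500.0 + vds_offset  # recovers the time server's 502.5 s
```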

Audience member 470 will experience the visual content 428 from the visual output device 462 and the audio content 458 through the mobile device 450 seamlessly and in audiovisual synchronization.

For each synchronization message, a calculation is performed according to the following formulas. Here is an example of a synchronization message:

synchronization message = {
    the visual display system real time;
    the visual content time code (current position);
    the visual content status ("playing" or "stopped")
}

mobile device real time = mobile device local time + mobile device time offset

visual display system real time = visual display system local time + visual display system time offset

delay = mobile device real time - visual display system real time

The delay is interpreted as the transit time of a "synchronization message" between the visual display system and the mobile device.

audio content time code = visual content time code + delay

time difference = audio content time code - current audio content time code

The audio application on the mobile device checks whether the player is at the audio content time code. If the time difference between the audio content time code and the current audio content time code is more than 200 ms, the player will seek to the audio content time code.
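Putting the formulas together, a sketch of the per-message adjustment follows. The 200 ms threshold comes from the text, while the function names and message fields are illustrative assumptions:

```python
SYNC_THRESHOLD = 0.200  # seconds; the text specifies a 200 ms tolerance

def target_audio_timecode(sync_msg, mobile_real_time):
    """audio content time code = visual content time code + delay."""
    delay = mobile_real_time - sync_msg["visual_display_system_real_time"]
    return sync_msg["visual_content_timecode"] + delay

def maybe_resync(current_audio_timecode, sync_msg, mobile_real_time):
    """Return the time code to seek to, or None if within the tolerance."""
    target = target_audio_timecode(sync_msg, mobile_real_time)
    if abs(target - current_audio_timecode) > SYNC_THRESHOLD:
        return target
    return None

msg = {"visual_display_system_real_time": 502.5,
       "visual_content_timecode": 34.0,
       "visual_content_status": "playing"}
# The message took 0.1 s to arrive and the audio has drifted to 34.5 s,
# so the 0.4 s difference exceeds the 0.2 s tolerance and a seek occurs:
seek_to = maybe_resync(34.5, msg, 502.6)
```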

There is always a risk of losing synchronization due to variable network conditions and performance degradation of the mobile device 450 and the visual display system 420; therefore, the mobile device and the visual display system establish lightweight and fast connections via the processing server 445 using WebSocket or a similar protocol. Such a constant connection enables frequent and regular "synchronization messages" to be sent from the visual display system to the mobile device. The mobile device and the visual display system also regularly check the mobile device time offset and the visual display system time offset, respectively. This constant and frequent process of checking the mobile device real time and the visual display system real time, together with sending "synchronization messages", ensures that the visual content is synchronized with the audio content to within 200 ms.

The WebSocket communication protocol defines a mechanism for fast, secure, near real-time, and bi-directional communication between clients (i.e., mobile devices and visual display systems) and a processing server via a communication network. Data is transferred over a full-duplex single-socket connection, enabling data packets to be sent and received by both endpoints in real time. To establish a WebSocket connection, a specific HTTP-based handshake is exchanged between the client and the server. If successful, the application-layer protocol is "upgraded" from HTTP to WebSocket using the previously established TCP transport-layer connection. After the handshake, HTTP is no longer used, and data can be sent or received by both endpoints using the WebSocket protocol until the WebSocket connection closes. Thus, data is transferred via the WebSocket communication protocol, through the processing server, between the audio synchronization application (on the mobile device) and the visual display application (on the visual display system) and vice versa.

FIG. 7 is a flow diagram illustrating an exemplary method for enabling synchronization of audio content on a separate mobile device to visual content on a visual display system 500. At a predetermined location and time, the operator will initiate a broadcast of visual content on a visual display system in wired or wireless communication with the network. The primary visual content is output to a visual output device, such as a screen, projector, television, or the like. In step 510, the visual display player application is downloaded and installed or written into the operating system of the visual display system. In step 520, the operator will select a show on the visual display player application. Once selected, the visual content of the show will be downloaded and stored in the memory of the visual display system. In some embodiments, the visual content of the show has been preloaded onto the memory of the visual display system. In some embodiments, the visual content of the show may be downloaded from an external storage device.

At the predetermined location, date, and time of the show, the operator may trigger the visual display system to play the visual content in the application in step 550. In step 580, the visual content is displayed on a visual output device. In some embodiments, the visual output device is a screen, projector, television, or the like. Meanwhile, in step 562, when the operator triggers the playing of the visual content, the visual display system or the visual display application obtains the visual display system real time. The visual display system real time is determined based on the visual display system local time and the visual display system time offset, according to the following formula:

visual display system real time = visual display system local time + visual display system time offset

The visual display system time offset is the time difference between the visual display system local time and the time server time obtained from the time server in communication with the visual display system.

In step 564, the visual display system proceeds to generate a synchronization message containing:

the visual display system real time;

a visual content time code; and

visual content status.

The visual content time code corresponds to a current time position of the visual content being played on the visual display system. The visual content state is associated with an operational mode or state of the visual content. The operating mode or state may be "playing" or "stopped," which indicates whether the visual content continues to play or whether the visual content has stopped playing. Once the synchronization message is generated, it is sent to the processing server.

In step 570, when the mobile device is triggered by the audience member to play the audio content, the processing server sends synchronization messages to the mobile device at periodic time intervals. The transmission of synchronization messages to the mobile device at periodic time intervals is achieved through the WebSocket communication protocol. This enables the mobile device to listen to, or continuously communicate with, the processing server so that synchronization messages are sent at periodic intervals, for example, once per second. The WebSocket communication protocol also enables the mobile device to receive a synchronization message whenever the time code is adjusted forward or backward on the visual display system, or when the state changes from "playing" to "stopped" or vice versa. In other words, whenever an operator makes a manual adjustment to the master audio content time code or the master visual content time code, or a change to the corresponding content state, one or more synchronization messages will be transmitted to the mobile device accordingly.
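A minimal sketch of how a synchronization message might be serialized and broadcast over such a channel. The patent does not specify a wire format, so the JSON field names and the callable-based client list are assumptions:

```python
import json

def make_sync_message(vds_real_time, visual_timecode, status):
    """Serialize one synchronization message for the WebSocket channel."""
    assert status in ("playing", "stopped")
    return json.dumps({
        "visual_display_system_real_time": vds_real_time,
        "visual_content_timecode": visual_timecode,
        "visual_content_status": status,
    })

def broadcast(clients, message):
    """Send the same message to every connected mobile device."""
    for send in clients:  # each 'send' stands in for a client's send callable
        send(message)

# Illustrative use: one "client" that just collects messages.
received = []
broadcast([received.append], make_sync_message(502.5, 34.0, "playing"))
```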

FIG. 8 illustrates a flow chart of an exemplary method for synchronizing audio content on a mobile device to visual content on a separate visual display system. The mobile device is connected to the internet (not shown) or a similar communication network. In step 610, the audio synchronization application is downloaded and installed or written into the operating system of the mobile device. In step 620, the audience member will select a show to purchase. In some embodiments, audience members are prompted to pay for the show using, but not limited to, a payment gateway, an internal wallet, a subscription fee, a credit card, and the like. In some embodiments, the show may not require payment and is given to audience members free of charge. Upon successful payment, or if no payment is required because the show is free, the audio content of the show is downloaded and stored in the memory of the mobile device.

At step 660, upon reaching the predetermined location, date, and time of the show, the audience member triggers the audio synchronization application to play the audio content. In step 672, the mobile device will connect with the processing server using the internet (not shown) or a similar communication network to listen for synchronization messages from the visual display system. Upon receiving a synchronization message in step 678, the mobile device will perform one of two operations. The "playing" visual content state indicates that the visual content is playing on the visual output device or visual display system. If the mobile device receives a synchronization message in which the visual content state is the "stopped" state, the audio content will not play.
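The two operations above can be sketched as a small dispatch. The callbacks stand in for the audio synchronization application's player controls and are illustrative:

```python
def handle_sync_message(msg, stop_audio, resync_audio):
    """Perform one of two operations on receipt of a synchronization message.

    'stop_audio' and 'resync_audio' are hypothetical callbacks standing in
    for the audio synchronization application's player controls.
    """
    if msg["visual_content_status"] == "stopped":
        stop_audio()       # the audio content will not play
    else:                  # "playing": proceed to the timing steps
        resync_audio(msg)

# Illustrative use: record which operation each message triggers.
events = []
handle_sync_message({"visual_content_status": "stopped"},
                    lambda: events.append("stopped"),
                    lambda m: events.append("resync"))
```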

In step 676, the mobile device will determine the mobile device real time. The mobile device real time is determined based on the mobile device local time and the mobile device time offset and is based on the following equation:

mobile device real time = mobile device local time + mobile device time offset

The mobile device time offset is the time difference between the mobile device local time and the time server time obtained from the time server with which the mobile device is communicating.

In step 680, the mobile device will determine an audio content time code based on the visual content time code, the mobile device real time, and the visual display system real time that enables the audio synchronization application to adjust the audio content to be played at the audio content time code in synchronization with the visual content being played on the visual display system. The audio content time code is determined based on the following formula:

delay = mobile device real time - visual display system real time

audio content time code = visual content time code + delay

The audio content time code is determined based on a visual content time code summed with a delay determined as a time difference between a mobile device real time and a visual display system real time. The audio content time code corresponds to a current location of the visual content time code being played on the visual display system.

When the visual content is detected to be in the play mode, the audio synchronization application will determine whether the audio content timecode is not synchronized with the visual content timecode. Since the mobile device maintains a persistent connection with the processing server via the WebSocket communication protocol, the mobile device receives synchronization messages at periodic intervals. The processing server also transmits a synchronization message whenever there is an adjustment to the visual content time code or a change to the state of the visual content state. This enables the audio synchronization application to check whether the audio content time code is not synchronized with the visual content time code within a predetermined time interval. The formula for determining whether the audio content time code is not synchronized with the visual content time code is as follows:

time difference = audio content time code - current audio content time code

For example, if the audio content time code is more than ±200 ms out of sync with the visual content time code, the mobile device will adjust playback to the audio content time code synchronized with the visual content time code associated with the current position of the content on the visual display system. In some embodiments, a time difference of more than about +200 ms or less than about −200 ms is considered out of sync with the visual content time code. If the time difference is less than 200 ms, the mobile device will continue to play the audio content in synchronization with the visual content time code.

In use, the process of synchronizing audio content on the audience member's mobile device to visual content displayed on a separate visual output device may be used in an outdoor public movie venue. For example, a movie operator will use the visual display system and/or visual display application to download the visual portion (or visual content) of the movie into the system's memory. Audience members will download the audio content of the movie via the audio synchronization application on their mobile devices. When the visual display system begins playing, the audience members may trigger their mobile devices to play the audio content. The synchronization process uses clocks, a time server (NTP), time codes, synchronization messages sent via the server, and calculations to adjust the audio content to play in synchronization with the visual content, so that audience members can watch a movie with synchronized audio and visuals, as if the audio and visual portions of the movie had never been separated. The audio content may be the original audio from the movie or alternative audio, such as director's commentary or a different-language soundtrack of the movie.

An alternative application may be to synchronize audio content on the audience member's mobile device to a live visual display, such as a fireworks, light, water, or laser show. For example, the operator would use the visual display system 420 and its application to download the master audio content into the system's memory 426. The master audio content may include an audio cue track, which may include an audio time code, such as SMPTE, FSK, or a linear time code. The audio time code track may be fed into a digital control system that can read the audio time code and has been pre-configured to initiate a fireworks cue at certain positions of the time code track. The audio cue track may also include pre-recorded verbal cues for use by a manual operator to trigger a series of fireworks. The audio cue track can carry control signals for any component of the live visual display, such as a lighting console, a fireworks control module, a digital firing system, and the like. A manual fireworks operator may listen to the audio cue track through a slave mobile device that plays the audio cue track in synchronization with the visual display system's master audio content. Audience members may download the slave audio content of the live performance through an application on their mobile devices 450. The slave audio content may include an audio soundtrack corresponding to, or synchronized with, the master audio content played on the visual display system, providing an immersive audio experience to the audience members. In some embodiments, the audio tracks may include voice commentary, singing performances, and/or music tracks.
Since the digital control system is synchronized to the visual display system by the audio time code, and any slave audio device playing the audio cue track is likewise synchronized to the visual display system, the slave audio content on the audience member's mobile device 450 will be synchronized with the real-time visual display of fireworks, lasers, lights, water effects, and the like.
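A digital control system of the kind described above, which reads a running time code and fires pre-configured cues at certain positions, can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the names (CueController, parse_timecode, the "HH:MM:SS:FF" cue map) are assumptions, and a real firing system would read the time code from an audio stream (e.g., SMPTE LTC) rather than from strings.

```python
def parse_timecode(tc, fps=30):
    """Convert an SMPTE-style 'HH:MM:SS:FF' string to an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

class CueController:
    """Fires each pre-configured cue once when the time code reaches its position."""
    def __init__(self, cues, fps=30):
        # cues: {"HH:MM:SS:FF": action-name}, e.g. a fireworks or lighting cue
        self.pending = sorted(
            (parse_timecode(tc, fps), action) for tc, action in cues.items()
        )
        self.fired = []

    def tick(self, tc, fps=30):
        """Call with the current time code; returns the actions now due to fire."""
        now = parse_timecode(tc, fps)
        due = []
        while self.pending and self.pending[0][0] <= now:
            due.append(self.pending.pop(0)[1])
        self.fired.extend(due)
        return due
```

Because every device, whether a firing console or an audience member's phone, derives its position from the same time code, a cue configured at 00:00:01:00 fires at the same moment the corresponding point in the slave audio track is heard.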

Yet another use case may include several different slave audio tracks synchronized with the master audio track, so that multiple workers in a live performance can each listen to a respective cue audio track synchronized with the master audio track.

For example, in a live theatrical performance, the lighting, sound, smoke-machine, and scenery operators may each listen on their respective personal mobile devices 450 to their particular cue tracks, all synchronized with the master audio track on the visual display system 420 controlled by the stage manager.

In yet another example, in a fireworks show, multiple manual firing-board operators may listen to the same slave audio cue track on their respective mobile devices 450 to trigger fireworks launches synchronized with the master audio track played on the visual display system 420. At the same time, audience members may watch the fireworks show while listening on their respective mobile devices 450 to the audience audio track (e.g., commentary and musical score) of the show, without the fireworks launch cues.

While the present invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is, therefore, indicated by the appended claims, and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
