Calibrating media playback channels for synchronous presentation

Document No.: 1617320; Publication date: 2020-01-10

Note: This technology, Calibrating media playback channels for synchronous presentation (校准媒体回放信道用于同步呈现), was created by L·M·瓦萨达, V·桑德拉姆, W·M·布姆加纳, D·H·洛伊德, C·J·桑德斯, and S·A·拉 on 2019-06-25. Abstract: The present disclosure relates to calibrating a media playback channel for synchronous presentation. In some implementations, a computing device may calibrate a media playback channel for presenting media content through a media system by determining a media propagation delay through the media system. For example, the computing device may send calibration content (e.g., audio data, video data, etc.) to various playback devices (e.g., playback channels) of the media system and record timestamps indicating when the calibration content was sent. While the playback device is rendering the calibration content, a sensor device (e.g., a remote control device, a smartphone, etc.) may detect the rendering of the calibration content. The sensor device may send calibration data (e.g., a media sample that may include calibration content and/or a timestamp indicating when the media sample was detected by the sensor device) to the computing device. The computing device may determine a propagation delay (e.g., a presentation delay) based on calibration data received from the sensor device.

1. A method, comprising:

the media device sending calibration content to a first playback device associated with a first playback channel;

the media device storing a transfer time indicating when the calibration content was sent to the first playback device, the transfer time determined based on a first clock on the media device;

a sensor device detecting a portion of calibration content presented by the first playback device;

the sensor device generating calibration data comprising the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected, the detection time determined based on a second clock on the sensor device;

the sensor device sending the calibration data to the media device; and

the media device calculating a propagation delay value based on the transfer time, the portion of the detected calibration content, and the detection time indicated in the calibration data.

2. The method of claim 1, wherein the first clock is a system clock and the second clock is a Bluetooth clock.

3. The method of claim 1, wherein the first clock and the second clock are system clocks of the media device and the sensor device, respectively.

4. The method of claim 1, wherein the calibration content comprises a first media segment followed by a calibration media segment followed by a second media segment.

5. The method of claim 1, wherein the calibration content is audio content.

6. The method of claim 1, wherein the calibration content is video content.

7. The method of claim 4, wherein the calibration media segment is associated with a time offset, and the method further comprises:

the media device calculating a propagation delay value based on the transfer time, the detection time, and the time offset of the calibration media segment.

8. A system, comprising:

one or more processors; and

a computer-readable medium storing instructions that, when executed by the one or more processors, cause the processors to perform operations comprising:

the media device sending calibration content to a first playback device associated with a first playback channel;

the media device storing a transfer time indicating when the calibration content was sent to the first playback device, the transfer time determined based on a first clock on the media device;

a sensor device detecting a portion of calibration content presented by the first playback device;

the sensor device generating calibration data comprising the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected, the detection time determined based on a second clock on the sensor device;

the sensor device sending the calibration data to the media device; and

the media device calculating a propagation delay value based on the transfer time, the portion of the detected calibration content, and the detection time indicated in the calibration data.

9. The system of claim 8, wherein the first clock is a system clock and the second clock is a Bluetooth clock.

10. The system of claim 8, wherein the first clock and the second clock are system clocks of the media device and the sensor device, respectively.

11. The system of claim 8, wherein the calibration content comprises a first media segment followed by a calibration media segment followed by a second media segment.

12. The system of claim 8, wherein the calibration content is audio content.

13. The system of claim 8, wherein the calibration content is video content.

14. The system of claim 11, wherein the calibration media segment is associated with a time offset, and wherein the operations further comprise:

the media device calculating a propagation delay value based on the transfer time, the detection time, and the time offset of the calibration media segment.

Technical Field

The present disclosure generally relates to synchronizing playback of audio/video data over multiple channels.

Background

Various types of wired and/or wireless media systems are on the market today. Many of these systems present audio and/or video data over multiple channels (e.g., devices, speakers, displays, headphones, etc.). For example, to play music throughout a house, a user may place different speakers in each room of the house. To simulate cinema surround sound while watching a movie, a user may place different speakers at different locations in a room with a television and/or other media devices (e.g., streaming device, set-top box, etc.). To avoid a dissonant playback experience, the playback of audio and/or video at the various playback devices (e.g., speakers, televisions, etc.) must be synchronized so that each playback device presents the same media content at the same time.

Disclosure of Invention

In some implementations, a computing device may calibrate a media playback channel for presenting media content through a media system by determining a media propagation delay through the media system. For example, the computing device may send calibration content (e.g., audio data, video data, etc.) to various playback devices (e.g., playback channels) of the media system and record timestamps indicating when the calibration content was sent. While the playback device is rendering the calibration content, a sensor device (e.g., a remote control device, a smartphone, etc.) may detect the rendering of the calibration content. The sensor device may send calibration data (e.g., a media sample that may include calibration content and/or a timestamp indicating when the media sample was detected by the sensor device) to the computing device. The computing device may determine a propagation delay (e.g., a presentation delay) based on calibration data received from the sensor device.

Particular implementations provide at least the following advantages. The media system may be calibrated for synchronized playback at multiple playback devices with different types of sensor devices (e.g., a dedicated remote control device, a smartphone, a tablet, etc.). The media system may be calibrated for synchronized playback through a third-party playback device (e.g., Bluetooth speaker, Bluetooth headset, etc.). The media system may be calibrated with or without explicit user input to initiate the calibration process. For example, the calibration process may be performed in the background while the user is performing other tasks on or with the sensor device. Thus, the calibration process may be performed automatically, dynamically, and/or frequently without requiring the user to provide explicit input for performing the calibration process.

The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, aspects, and potential advantages will become apparent from the description and drawings, and from the claims.

Drawings

Fig. 1 is a block diagram of an exemplary media system for calibrating media playback channels for synchronized playback based on a Bluetooth clock at a sensor device.

Fig. 2 is a block diagram of an exemplary media system for calibrating media playback channels for synchronized playback based on a system clock at a sensor device.

Fig. 3 is a block diagram of an example media system for calibrating media playback channels for synchronized playback based on visual calibration content detected at a sensor device.

Fig. 4 is a block diagram of an exemplary media system for calibrating a media playback channel for synchronized playback based on the time at which calibration data was received at a transmitting media device.

Fig. 5 illustrates exemplary calibration content for determining propagation delay on a communication channel of a media system.

Fig. 6 is a flow diagram of an exemplary process for calibrating a media playback channel for synchronized playback based on a detection time determined by a sensor device.

Fig. 7 is a flow diagram of an exemplary process for calibrating a media playback channel for synchronous presentation based on a time of receipt determined by a media device and a time of flight for data transmitted between a sensor device and the media device.

Fig. 8 is a block diagram of an exemplary computing device that may implement the features and processes of fig. 1-7.

Like reference symbols in the various drawings indicate like elements.

Detailed Description

The techniques described herein provide several mechanisms for calibrating media playback channels for synchronized playback. For example, media devices (e.g., computing devices, laptops, set-top boxes, streaming media players, etc.) within a media system can determine and/or calculate propagation delays of media content through various playback channels of the media system, such that the media devices can adjust transmission timing of the media content through the playback channels in order to provide synchronized presentation of the media content at the playback devices (e.g., speakers, displays, televisions, headphones, etc.).

The playback channel may correspond to a communication path (e.g., including the playback device) through which media content travels from a sending (e.g., originating) media device to a playback device that presents the media content (e.g., audibly and/or visually) to a user of the media system. The playback channel may be a wired playback channel (e.g., HDMI, RCA cable, coaxial cable, Ethernet, speaker line, etc.) or a wireless playback channel (e.g., Bluetooth, Wi-Fi, etc.) to a respective playback device (e.g., television, speaker, monitor, display, etc.).

The propagation delay (e.g., presentation delay) may correspond to the amount of time that elapses between the media device sending media content and that content being perceptibly presented (e.g., audibly or visually) at the playback device for enjoyment by the user (e.g., propagation delay = presentation time − transmission time). The media device may determine a propagation delay on each of a plurality of playback channels (e.g., wired, wireless, Wi-Fi, Bluetooth, etc.) corresponding to a plurality of playback devices (e.g., televisions, speakers, headsets, set-top boxes, computing devices, etc.). The media device may then adjust the timing at which the media content is sent to each channel based on the determined propagation delay for each channel such that the media content is presented synchronously (e.g., the same portion of the media content is played simultaneously) by the playback devices associated with each playback channel.
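The delay relationship above can be sketched as a simple calculation (a minimal illustration; the function name and example times are hypothetical, not from this disclosure):

```python
def propagation_delay(transmission_time: float, presentation_time: float) -> float:
    """Propagation delay = presentation time - transmission time (seconds)."""
    return presentation_time - transmission_time

# Example: calibration content sent at t = 10.0 s and presented by the
# playback device at t = 12.0 s implies a 2-second delay on that channel.
delay = propagation_delay(10.0, 12.0)
```

The media device would evaluate this once per playback channel and retain the results when adjusting transmission timing.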

Fig. 1 is a block diagram of an exemplary media system 100 for calibrating media playback channels for synchronized playback based on a Bluetooth clock at a sensor device. For example, a Bluetooth clock at the sensor device may be used by the media system 100 to determine when the playback device is presenting the calibration content so that the propagation delay through the media system 100 may be determined.

In some implementations, the media system 100 may include a media device 110. For example, the media device 110 may be a computing device (e.g., a laptop, a set-top box, a streaming media player, a smart phone, etc.) capable of delivering streaming media to other playback devices. Media device 110 may communicate streaming media to other playback devices using multiple wired (e.g., HDMI, RCA cable, coaxial cable, Ethernet, speaker line, etc.) or wireless (e.g., Bluetooth, Wi-Fi, etc.) communication channels.

In some implementations, media device 110 may include a media module 112. For example, media module 112 may be a software module configured to perform various media management functions on media device 110. Media module 112 may transmit media content (e.g., audio content, video content, etc.) for enjoyment by a user to various playback devices over respective playback channels, e.g., according to an output configuration specified by the user of media device 110. The media module 112 may manage the transfer of media content to the playback devices such that the playback devices render the media content in a synchronized manner. For example, the media module 112 may adjust the timing at which media content is transmitted to playback devices according to the propagation delay determined for each playback device and/or the respective playback channel.

In some implementations, media module 112 may perform a calibration procedure to determine the propagation delay for each playback channel and/or corresponding playback device. For example, media module 112 may have a playback mode and a calibration mode. The media module 112 may operate normally in playback mode when sending media to various playback devices in response to a user request to play music, videos, movies, or some other media content for the user's entertainment. In some implementations, media module 112 may enter a calibration mode to determine propagation delays for the respective playback channels and calibrate the respective playback channels of media system 100 based on the determined propagation delays to ensure synchronized playback of media content on the playback channels and/or playback devices.

Media module 112 may enter the calibration mode in a variety of ways. For example, media module 112 may enter the calibration mode in response to a user providing input to media device 110. For example, media device 110 may present a graphical user interface on television 130 and the user may provide an input selecting a calibration menu item to cause media module 112 to enter a calibration mode.

As another example, media module 112 may automatically enter the calibration mode when a user of sensor device 140 and/or media device 110 enables (e.g., turns on) microphone 144. For example, a user of the sensor device 140 may press a button on the sensor device 140 (e.g., a dedicated remote control device) to enable voice input for providing input to the media device 110. While microphone 144 is enabled, media module 112 may enter a calibration mode to determine a propagation delay of a playback channel in media system 100 and calibrate the playback channel based on sounds detected while microphone 144 is enabled to enable synchronized playback of media content. Thus, the media system 100 may be automatically and dynamically (e.g., frequently) calibrated without bothering the user to provide explicit input for calibrating the media system 100.

In some implementations, when the media module 112 enters the calibration mode, the media module 112 may cause the sensor device 140 to enter the calibration mode. For example, the media module 112 may send a message over the network 150 (e.g., peer-to-peer Bluetooth, peer-to-peer Wi-Fi, local area network, etc.) to cause the sensor device 140 and/or the remote module 142 to enter the calibration mode. While in the calibration mode, the remote module 142 may sample sound detected by the microphone 144 (e.g., or video captured by a camera of the sensor device 140) so that the media module 112 may determine when the calibration content 118 is being presented by a playback device (e.g., the speaker 132, the speaker 160, the television 130, etc.) and determine propagation delays, as described further below.

To determine the propagation delay and perform the calibration process, media module 112 may send calibration content 118 to the playback device over the corresponding playback channel. For example, media module 112 may send calibration content 118 to television 130 and speakers 132 over playback channel 126 (e.g., an HDMI channel). The speaker 132 may be, for example, a speaker attached or connected to the television 130.

In some implementations, calibration content 118 may be media content that includes audio content and/or video content created specifically for calibrating media system 100. In general, the calibration content may be configured to include an initial media segment, followed by a calibration media segment (e.g., an audio or video pattern useful for calibration), followed by an ending media segment. The initial media segment and the ending media segment may be configured to be aurally or visually pleasing to the user so that they mask, or make more tolerable, the calibration media segment, which may be less appealing to the user. For example, the initial media segment and the ending media segment may have a longer duration than the calibration media segment, making the calibration media segment less noticeable to the user. By placing the calibration media segment between the initial media segment and the ending media segment, the media module 112 may determine the offset at which the calibration media segment is presented within the calibration content. This offset may allow for greater accuracy in determining the propagation delay on the playback channel, as further described below with reference to fig. 5. In the example of fig. 1, calibration content 118 may include audio data to be presented by television 130 and/or speakers 132. However, in other implementations (e.g., fig. 3), the calibration content 118 may include video content.
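The three-segment layout described above can be modeled roughly as follows (a sketch; the class and field names are illustrative assumptions, and the durations are placeholders):

```python
from dataclasses import dataclass

@dataclass
class CalibrationContent:
    """Durations, in seconds, of the three segments of calibration content."""
    initial_segment: float      # pleasing lead-in audio/video
    calibration_segment: float  # the pattern actually used for calibration
    ending_segment: float       # pleasing trailing audio/video

    @property
    def calibration_offset(self) -> float:
        # The calibration media segment starts right after the initial
        # segment, so its offset within the content is known in advance.
        return self.initial_segment

content = CalibrationContent(initial_segment=3.0,
                             calibration_segment=0.5,
                             ending_segment=3.0)
```

The known `calibration_offset` is what later lets the media device work backward from the detection time of the calibration segment to the presentation start of the content as a whole.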

As described above, the purpose of the calibration process is to determine the amount of time (e.g., propagation delay) between media module 112 sending calibration content 118 to television 130 and/or speaker 132 and calibration content 118 being presented by speaker 132, so that the output of media content from media module 112 can be calibrated (e.g., timed) for synchronized playback. For example, an important source of delay in a playback channel that includes a display device (e.g., television 130) is the video processing performed by the display device. Thus, media content sent over a playback channel that does not include a display device or other device performing video processing may need to be delayed to accommodate the latency associated with the video processing performed at the display device in order to provide a synchronized playback experience for the user on all playback channels.

In some implementations, when transmitting the calibration content 118, the media module 112 may determine the time (e.g., transmission time) at which the calibration content 118 was transmitted based on the system clock 120. For example, system clock 120 may be an internal clock used by media device 110 to perform various computing operations. In some implementations, the system clock 120 may be synchronized with a network clock using well-known protocols. Media module 112 may record and/or store the system time at which calibration content 118 was sent to the playback device (e.g., television 130 and/or speakers 132) so that, when the propagation delay on the playback channel (e.g., playback channel 126) is calculated, the transmission time may be compared to the presentation time (e.g., detected by sensor device 140) at which the calibration content was presented by the playback device.

When television 130 and/or speaker 132 receive calibration content 118, speaker 132 may render calibration content 118. For example, the speaker 132 may present sounds (e.g., pleasant sounds, then audible test patterns, then pleasant sounds) corresponding to the audio data in the calibration content 118.

In some implementations, the media system 100 can include a sensor device 140. For example, the sensor device 140 may be a computing device, such as a remote control device, a smartphone, a tablet, or another device configured with sound and/or image sensors and capable of communicating with the media device 110 over the network 150 (e.g., a Bluetooth network, a Wi-Fi network, a peer-to-peer network, etc.). In the specific example of media system 100, sensor device 140 may correspond to a dedicated remote control device for controlling media device 110 and/or providing input to media device 110 using a Bluetooth connection.

In some implementations, the sensor device 140 can include a remote module 142. For example, the remote module 142 may be a software module that provides remote control capabilities of the sensor device 140 relative to the media device 110. The remote module 142 may acquire media samples (e.g., audio samples, video samples, etc.) generated by the sensor device 140 and provide calibration data (including the media samples) to the media module 112 for use in determining propagation delays through various playback channels of the media system 100.

In some implementations, the sensor device 140 can include a microphone 144 (e.g., a sound sensor) for detecting sounds, such as voice commands for remotely controlling the media device 110. The microphone 144 may also be used by the remote module 142 to detect calibration content 118 presented by the speaker 132 or any other audio playback device (e.g., speaker 160, headphones, etc.) while in the calibration mode.

In some implementations, while the remote module 142 is in the calibration mode, the remote module 142 can monitor sounds detected by the microphone 144 and periodically send calibration data to the media device 110. For example, the calibration data may include media samples (e.g., sound samples, video samples, etc.) detected and/or generated by the sensor device 140 using the sound and/or image sensors of the sensor device 140. The calibration data may include a timestamp indicating the time at which the media sample in the calibration data was detected and/or generated by the sensor device 140. While in the calibration mode, the remote module 142 may periodically generate and transmit calibration data. For example, the remote module 142 may periodically sample sensor data generated by sensors (e.g., sound sensors, image sensors, etc.) on the sensor device 140 and generate calibration data for each sampling period. The remote module 142 may then send the calibration data for the sampling period (including the media sample newly collected for the current sampling period) to the media device 110. For example, when in calibration mode, the sampling period may be 50 milliseconds, 1 second, and so on. Each instance of calibration data may or may not include calibration content and, more importantly, may or may not include a calibration media segment. Accordingly, the media module 112 on the media device 110 may analyze each instance of calibration data as it is received to determine whether it includes a calibration media segment, as described further below.
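The periodic sampling described above might bundle each media sample with its detection timestamp along these lines (a sketch with hypothetical names; the clock source is injected so it could be a system clock or a Bluetooth clock reader):

```python
import time

SAMPLE_PERIOD = 0.05  # e.g., one calibration sample every 50 ms

def make_calibration_data(media_sample: bytes, clock=time.time) -> dict:
    """Pair a newly captured media sample with the time it was detected."""
    return {"sample": media_sample, "timestamp": clock()}

# Example with a fixed clock for illustration:
data = make_calibration_data(b"\x00\x01\x02", clock=lambda: 42.0)
```

Each sampling period, the remote module would send one such record to the media device, which checks whether the sample contains a calibration media segment.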

In some implementations, the remote module 142 can use the Bluetooth clock 146 to determine a timestamp for the calibration data. For example, the sensor device 140 may not have a system clock. Accordingly, the remote module 142 may utilize the Bluetooth clock 146 to obtain the current time (e.g., timestamp) when the calibration data is generated. For example, the current time may be acquired from the Bluetooth clock 146 through an API (application programming interface) of the Bluetooth controller 148. The remote module 142 may then store the timestamp in the calibration data that includes the media sample for the current sampling period. After generating the calibration data for the current sampling period, the remote module 142 may send a message 149 to the media device 110 that includes the calibration data generated by the remote module 142 for the current sampling period.

When the message 149 is received by the media device 110, the media module 112 may determine a system time that corresponds to the Bluetooth time at which the sound sample included in the message 149 was detected by the sensor device 140. For example, as part of the Bluetooth communication protocol, the Bluetooth clock 146 on the sensor device 140 and the Bluetooth clock 116 on the media device 110 may be synchronized. However, the Bluetooth clock 116 and the system clock 120 on the media device 110 may not be synchronized. Thus, when message 149 is received, media module 112 may obtain the current time of Bluetooth clock 116 (e.g., from Bluetooth controller 114) and of system clock 120 to determine a mapping between system time and Bluetooth time on media device 110. For example, media module 112 may determine the amount of time (e.g., 20 milliseconds, 5 seconds, 30 seconds, etc.) by which the system time of system clock 120 is ahead of (or behind) the Bluetooth time of Bluetooth clock 116. Media module 112 may then add (or subtract) this amount of time to (or from) the Bluetooth timestamp included in the calibration data to determine the system time at which the calibration data was generated and/or the calibration media sample was detected by the sensor device 140.
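The Bluetooth-to-system time mapping can be sketched as below (illustrative function names; it assumes the two clocks are read back-to-back when the message arrives and that their offset is constant over the calibration interval):

```python
def bt_to_system_time(bt_timestamp: float, bt_now: float, system_now: float) -> float:
    """Map a Bluetooth-clock timestamp onto the media device's system clock.

    bt_now and system_now are read together when the calibration message
    arrives; their difference is the offset between the two clocks.
    """
    offset = system_now - bt_now
    return bt_timestamp + offset

# Example: the system clock reads 20 ms ahead of the Bluetooth clock, so a
# sample stamped 100.00 s in Bluetooth time maps to ~100.02 s system time.
mapped = bt_to_system_time(100.0, 200.0, 200.02)
```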

After determining the system time at which the sensor device 140 generated the calibration data in message 149, the media module 112 may determine a presentation time at which the playback device began presentation of the calibration content 118. For example, the media module 112 may determine the presentation time based on the time at which the playback device presented the calibrated media segment, as described below with reference to fig. 5.

Media module 112 may then compare the system time (e.g., transmission time) at which media module 112 transmitted calibration content 118 to the playback device (e.g., television 130 and/or speaker 132) to the time (e.g., presentation time) at which the playback device presented calibration content 118 to determine the propagation delay (e.g., presentation latency) on playback channel 126. For example, the media module 112 may subtract the transmission time from the presentation time to determine the propagation delay on the playback channel 126.
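Folding in the calibration segment's offset (per claim 7 and fig. 5), the per-channel delay computation might look like this (a sketch; the names and example times are hypothetical):

```python
def channel_propagation_delay(transmission_time: float,
                              detection_time: float,
                              segment_offset: float) -> float:
    # The sensor detects the calibration media segment, which begins
    # segment_offset seconds into the calibration content, so the content
    # itself started presenting at detection_time - segment_offset.
    presentation_time = detection_time - segment_offset
    return presentation_time - transmission_time

# Content sent at t = 10.0 s; the calibration segment (offset 3.0 s) is
# detected at t = 15.0 s, so the channel's propagation delay is 2.0 s.
```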

In some implementations, media module 112 may determine the propagation delay through other (e.g., additional) playback channels in a manner similar to that described above with reference to playback channel 126. For example, when media system 100 is in the calibration mode, media module 112 may send calibration content 118 to speaker 160 (e.g., smart speaker, Bluetooth speaker, headset, wireless ear buds, etc.) over playback channel 162. The calibration content 118 sent over playback channel 162 may include the same calibration media segment (e.g., audio pattern, video pattern, etc.) as the calibration content sent over playback channel 126, or it may include a different calibration media segment (e.g., a different audio or video calibration pattern). The sensor device 140 may then detect the calibration content 118 presented by the speaker 160, generate calibration data, and transmit the calibration data to the media device 110 so that the media module 112 may determine the propagation delay through the playback channel 162, as described above.

In some implementations, media module 112 may send calibration content 118 to playback channel 126 and playback channel 162 simultaneously. For example, media module 112 may determine all playback channels (e.g., playback channel 126, playback channel 162, etc.) or playback devices (e.g., television 130, speakers 132, speakers 160, etc.) through which media device 110 is configured to transmit media content. The media module 112 may then send the calibration content 118 over each channel and/or to each playback device such that when received at the playback device, the playback device renders the calibration content 118. As described above, the calibration content 118 may include the same calibration media segments for each playback channel, or the calibration content 118 may include different calibration media segments for each playback channel. For example, by sending different calibration media segments to each playback channel, media module 112 may determine which calibration data (e.g., detected calibration media segments) corresponds to which playback channel by matching the calibration media segments in the calibration data to the calibration media segments sent by media module 112 to each playback channel.
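One way to attribute detected samples to channels, as described above, is to give each channel a distinct calibration pattern and match against it (a toy sketch; the byte patterns and channel names are invented for illustration):

```python
# Hypothetical per-channel calibration signatures.
CHANNEL_PATTERNS = {
    "hdmi_tv": b"\x01\x02\x03\x04",
    "bt_speaker": b"\x05\x06\x07\x08",
}

def match_channel(media_sample: bytes):
    """Return the playback channel whose pattern appears in the sample."""
    for channel, pattern in CHANNEL_PATTERNS.items():
        if pattern in media_sample:
            return channel
    return None  # the sample contained no calibration media segment
```

In practice the match would operate on audio/video features (e.g., correlating the sample against a known tone or visual pattern) rather than raw byte containment.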

In any case, when calibration content 118 is received by a playback device, the playback device may present calibration content 118. Due to differences in propagation delay on each channel (e.g., channel 126, channel 162, etc.), the playback devices may present calibration content 118 at different times or at the same time. However, sensor device 140 may detect the calibration content 118 presented by each playback device (e.g., television 130, speaker 132, and/or speaker 160), generate calibration data, and transmit the calibration data to media device 110, such that media module 112 may calculate a propagation delay for each playback channel 126 and/or 162 based on the calibration data for each channel, as described herein.

In some implementations, media module 112 may use the propagation delay calculated for each playback channel to synchronize media content playback across the playback channels. For example, video processing (e.g., performed by television 130) is typically the largest source of propagation delay within media system 100. Thus, if media module 112 determines that playback channel 126 (e.g., television 130, speaker 132) has a two (2) second propagation delay and playback channel 162 (e.g., speaker 160) has a one (1) second propagation delay, media module 112 may send media content to television 130 and/or speaker 132 one second before sending the media content to speaker 160 for presentation. In other words, media module 112 may delay sending media content to speaker 160 for one second after sending the media content to television 130 and/or speaker 132, such that television 130, speaker 132, and speaker 160 present the media content simultaneously. Accordingly, media module 112 may calibrate the transmission of media content on the respective playback channels, based on the propagation delay determined for each playback channel, such that the playback device associated with each playback channel presents the media content simultaneously and in synchronization with the other playback devices.
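The timing adjustment in the example above amounts to delaying every channel relative to the slowest one (a sketch; the function and channel names are illustrative):

```python
def send_offsets(delays: dict) -> dict:
    """Per-channel transmission delays that align presentation times."""
    slowest = max(delays.values())
    return {channel: slowest - delay for channel, delay in delays.items()}

# The example from the text: a 2 s TV channel and a 1 s speaker channel
# mean the speaker's media is sent 1 s after the TV's.
offsets = send_offsets({"tv": 2.0, "speaker": 1.0})
```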

Fig. 2 is a block diagram of an exemplary media system 200 for calibrating media playback channels for synchronized playback based on a system clock at a sensor device. For example, the media system 200 may correspond to the media system 100 described above. In some implementations, a system clock at the sensor device may be used by the media system 200 to determine when the playback device is presenting the calibration content so that the propagation delay through the media system 200 may be determined. For example, the sensor device 140 may be a computing device (e.g., a smartphone, a tablet, etc.) that includes a system clock 202. The calibration of the media system 200 may be performed similarly to the calibration of the media system 100 described above, but the timestamp of the calibration data may be determined based on the system clock 202 of the sensor device 140 instead of the Bluetooth clock 146. The system clock 202 may be synchronized with the system clock 120 of the media device 110 using well-known network protocols. Thus, the media module 112 may use the calibration data timestamp directly (e.g., without converting a Bluetooth clock time to a system time) when determining the propagation delay for each playback channel.

To calibrate the playback channels of the system 200, the media module 112 of the media device 110 may, upon entering the calibration mode, send a notification to the sensor device 140 (e.g., a smartphone, tablet, smart watch, or other computing device) to cause the sensor device 140 to enter the calibration mode as well. As described above, media module 112 may enter the calibration mode when a user selects the calibration menu item presented by media module 112 on television 130. For example, the sensor device 140 may present a calibration notification on a display of the sensor device 140. A user of the sensor device 140 may provide an input in response to the notification to cause the remote module 142 to enter the calibration mode.

Alternatively, a user of the sensor device 140 may invoke a remote module 142 (e.g., a software application) on the sensor device 140 and provide input to the remote module 142 to cause the media module 112 on the media device 110 and/or the remote module 142 to enter the calibration mode. While in the calibration mode, the remote module 142 may sample (e.g., record for a period of time) the sensor data generated by the microphone 144 and periodically generate calibration data, as described above.

When the remote module 142 generates calibration data, the remote module 142 may determine the time at which the media sample was collected by requesting the current time from the system clock 202. The remote module 142 may include the sample time in the calibration data and send the calibration data to the media module 112 on the media device 110 in message 204.

When message 204 is received by media module 112, media module 112 may determine a system time at which sensor device 140 generated the calibration data in message 204 based on the timestamp in the calibration data. After determining the system time at which the sensor device 140 generated the calibration data in message 204, the media module 112 may determine a presentation time at which the playback device began presentation of the calibration content 118. For example, the media module 112 may determine the presentation time based on the time at which the playback device presented the calibrated media segment, as described below with reference to fig. 5.

To determine the propagation delay on a playback channel, the media module 112 may calculate the difference between the presentation time and the transmission time (e.g., the system time at which the media module 112 sent the calibration content 118 to the playback device). After determining the propagation delay for each playback channel, the media module 112 may calibrate the transmission of the media content on each playback channel such that each playback device associated with a playback channel presents the media content synchronously, as described above.

Fig. 3 is a block diagram of an example media system 300 for calibrating media playback channels for synchronized playback based on visual calibration content detected at a sensor device. For example, the media system 300 may correspond to the system 200 described above. However, rather than determining the propagation delay of the playback channel 304 based on the audio calibration content, the media system 300 may utilize the video calibration content to determine the propagation delay of the playback channel 304. Similar to media system 200, media system 300 may use a system clock at the sensor device to determine when samples of video calibration content are detected by camera 302 so that propagation delay through media system 300 (e.g., playback channel 304) may be determined. Since the playback channel 162 to the speaker 160 does not include a video playback device, the propagation delay through the playback channel 162 may be determined (e.g., simultaneously or separately) using audio calibration content, as described above.

In some implementations, the media module 112 may determine the type and/or capabilities of the playback device to which the media module 112 is configured to send the media content. For example, upon establishing the playback channel 304 to the television 130, the media module 112 may determine that the television 130 is a type of playback device capable of rendering audio and video content. Upon establishing the playback channel 162 to the speaker 160, the media module 112 may determine that the speaker 160 is a type of playback device that is only capable of rendering audio content. Thus, in transmitting calibration content 118 to television 130 and/or speaker 160, media module 112 may select video and/or audio calibration content according to the capabilities of the playback devices associated with each playback channel. Alternatively, the calibration content 118 can include both audio and video calibration content, and the playback device can render the audio and/or video content according to the capabilities of the playback device.

In the example of fig. 3, media module 112 may choose to send video calibration content to television 130 and audio calibration content to speaker 160, and may record the system time at which the calibration content was sent to each playback device. When the television 130 receives the video calibration content 118, the television 130 may present the video calibration content on a display of the television 130.

In some implementations, the sensor device 140 may be configured with a camera 302 and/or a microphone 144. The sensor device 140 may use the microphone 144 (e.g., a sound sensor) to detect the presentation of the audio calibration content 118 by the speaker 160, as described above. The sensor device 140 may use a camera 302 (e.g., an image sensor) to detect the presentation of the video calibration content 118 by the television 130. For example, when the media module 112 enters the calibration mode, the media module 112 may send a notification to the sensor device 140 (e.g., a smartphone, a tablet, etc.). The notification may include information indicating that media device 110 has entered calibration mode. The notification may include information indicating the type of calibration content (e.g., video content, audio content, etc.) to be used for calibration of the media system 300. The sensor device 140 may present a calibration notification on a display of the sensor device 140.

When a user of the sensor device 140 selects or interacts with a calibration notification presented on the sensor device 140 to cause the sensor device 140 to enter a calibration mode, the remote module 142 may present instructions for performing video calibration of the media system 300. For example, when the sensor device 140 enters the calibration mode, the remote module 142 may enable (e.g., turn on) the microphone 144 and/or the camera 302 and instruct the user to orient the sensor device 140 such that the lens of the camera 302 is directed at the television 130. Accordingly, when the television 130 presents video calibration content (e.g., calibration content 118), the camera 302 may detect the presentation of the video calibration content. For example, while in the calibration mode, the remote module 142 may sample (e.g., record for a period of time) the sensor data generated by the camera 302 and periodically generate calibration data, as described above.

When the remote module 142 generates calibration data, the remote module 142 may determine the time at which the media sample was collected by requesting the current time from the system clock 202. The remote module 142 may include the sample time in the calibration data and send the calibration data to the media module 112 on the media device 110 in message 306.

When the message 306 is received by the media module 112, the media module 112 may determine a system time at which the sensor device 140 generated the calibration data in the message 306 based on the timestamp in the calibration data. After determining the system time at which the sensor device 140 generated the calibration data (e.g., sample data) in the message 306, the media module 112 may determine a presentation time at which the playback device began presentation of the calibration content 118. For example, the media module 112 may determine the presentation time based on the time at which the playback device presented the calibrated media segment, as described below with reference to fig. 5.

To determine the propagation delay on a playback channel, the media module 112 may calculate the difference between the presentation time and the transmission time (e.g., the system time at which the media module 112 sent the calibration content 118 to the playback device). After determining the propagation delay for each playback channel, the media module 112 may calibrate (e.g., adjust the timing of) the transmission of the media content on each playback channel so that each playback device associated with a playback channel presents the media content synchronously, as described above.

Fig. 4 is a block diagram of an exemplary media system 400 for calibrating a media playback channel for synchronized playback based on the time at which calibration data is received at a transmitting media device. For example, the system 400 may correspond to the system 100 described above. However, in system 400, remote module 142 may not have access to bluetooth clock 146 (or a system clock) to determine when calibration content is being presented by the playback device and/or detected by microphone 144. Accordingly, the media system 400 may be configured to determine the propagation delay based on the system clock of the media device 110 and the time of flight for transmitting calibration data from the sensor device 140 to the media device 110. For example, the media module 112 may calculate the time (e.g., the sampling time) at which the sensor device 140 generated the media sample in the calibration data by subtracting the time of flight (e.g., the amount of time it takes to transmit data from the sensor device 140 to the media device 110 over the communication channel 404) from the time (e.g., the reception time) at which the media device 110 received the message 402 that included the detected calibration content 118.

After determining the system time (e.g., sample time) at which the sensor device 140 generated the calibration data in message 402, the media module 112 may determine a presentation time at which the playback device began presentation of the calibration content 118. For example, the media module 112 may determine the presentation time based on the time at which the playback device presented the calibrated media segment, as described below with reference to fig. 5.

In some implementations, the media module 112 can send the calibration content 118 to the playback device over various playback channels. For example, media module 112 may send calibration content 118 to television 130 and/or speaker 132 via playback channel 126, as described above. The media module 112 may send the calibration content 118 to the speaker 160 over the playback channel 162, as described above.

In some implementations, while the remote module 142 is in the calibration mode, the remote module 142 can monitor sounds detected by the microphone 144 and periodically send calibration data to the media device 110. For example, the calibration data may include media samples (e.g., sound samples, video samples, etc.) detected and/or generated by the sensor device 140 using sound and/or image sensors of the sensor device 140. However, in the example of fig. 4, the remote module 142 may not have access to any clock (e.g., bluetooth clock 146) on the sensor device 140 (e.g., the sensor device 140 may simply be a remote control device without any system clock). Thus, the remote module 142 may transmit calibration data, including media samples, without a corresponding timestamp indicating when the calibration data and/or media samples were generated.

While in the calibration mode, the remote module 142 may periodically generate and transmit calibration data. For example, the remote module 142 may periodically sample sensor data generated by sensors (e.g., sound sensors, image sensors, etc.) on the sensor device 140 and generate calibration data for each sampling period. The remote module 142 may then send the calibration data (including the newly collected media sample) for the sampling period to the media device 110. For example, while in the calibration mode, the sampling period may be 50 milliseconds, 1 second, and so on. Each instance of calibration data may or may not include calibration content and, in particular, may or may not include a calibration media segment. Accordingly, the media module 112 may analyze each instance of calibration data as it is received to determine whether the calibration data includes a calibration media segment, as described further below. After generating the calibration data for the current sampling period, the remote module 142 may send a message 402 (including the calibration data generated by the remote module 142) to the media device 110.

When media device 110 receives message 402, media module 112 may calculate a difference (e.g., a round trip time) between the system time at which media module 112 sent the calibration content to the one or more playback devices and the time at which media module 112 received message 402. Media module 112 may then subtract the time-of-flight value from the round trip time to determine (e.g., estimate) when sensor device 140 generated the calibration data and/or media sample included in message 402.

In some implementations, the time-of-flight value may be determined based on the amount of time it takes for a message transmitted by the sensor device 140 to be received by the media device 110. For example, the time-of-flight value may be determined based on bluetooth clocks at the sensor device 140 and the media device 110. For example, while the remote module 142 may not have access to the bluetooth clock 146 to determine the time at which the calibration content 118 was detected by the sensor device 140, the bluetooth controller 148 may include a bluetooth clock time in the message 402 that indicates the time at which the message 402 was transmitted by the sensor device 140 as part of the bluetooth communication protocol. When message 402 is received at media device 110, bluetooth controller 114 may determine a bluetooth time at which message 402 was received based on bluetooth clock 116. Bluetooth controller 114 may determine the time of flight by calculating the difference between the bluetooth time (e.g., transmission time) in message 402 and the bluetooth time at which message 402 was received at media device 110. This calculated time of flight may be provided to the media module 112. Media module 112 may then subtract the time of flight from the round trip time to determine when the calibration data and/or media sample in message 402 was generated.
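The bluetooth-clock bookkeeping described above might be sketched as follows. This is a simplified model under stated assumptions: the function names are illustrative, and clock values are expressed directly in seconds (a real Bluetooth clock advances in 312.5 µs ticks shared across the piconet):

```python
def time_of_flight(tx_bt_time, rx_bt_time):
    """Flight time of one message, from the bluetooth clock time stamped at
    transmission by the sensor device and the bluetooth clock time observed
    at reception by the media device (both in seconds)."""
    return rx_bt_time - tx_bt_time

def estimate_sample_time(receive_time, tof):
    """Estimate when the sensor device generated the media sample: the
    message arrived roughly one flight time after the sample was sent."""
    return receive_time - tof

# Message stamped at bluetooth time 5.000 s, received at bluetooth time
# 5.020 s; the media device's system clock read 10.0 s on arrival.
tof = time_of_flight(5.000, 5.020)
sample_time = estimate_sample_time(10.0, tof)
```

The key point of the design is that the sensor device never needs to read a clock itself; the media device recovers the sample time entirely from timestamps it can observe.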

In some implementations, the time-of-flight value may be a statistical value (e.g., a minimum value) determined from time-of-flight values calculated for a plurality of messages sent from the sensor device 140 to the media device 110. For example, over time, the sensor device 140 may send multiple (e.g., hundreds, thousands, etc.) bluetooth messages to the media device 110. The media module 112 may store a time-of-flight value generated for each message received from the sensor device 140 over a period of time (e.g., all times, last week, previous hour, etc.). In some implementations, media module 112 may determine a minimum time-of-flight value from all of these messages and use the minimum time-of-flight value in calculating the propagation delay based on the time-of-flight between sensor device 140 and media device 110, as described above. In some implementations, the media module 112 may calculate other statistical time-of-flight values, such as a median, an average, etc., and use these other statistical time-of-flight values in calculating the propagation delay based on the time-of-flight between the sensor device 140 and the media device 110, as described above.
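A hedged sketch of the statistical reduction described above; the function name is illustrative. The minimum is a natural default because queuing and retransmission only ever add delay, so the smallest observation is closest to the true flight time:

```python
import statistics

def robust_time_of_flight(observed, method="min"):
    """Collapse many per-message time-of-flight observations (seconds)
    into a single value using the requested statistic."""
    if method == "min":
        return min(observed)
    if method == "median":
        return statistics.median(observed)
    return statistics.fmean(observed)  # arithmetic mean

samples = [0.012, 0.009, 0.031, 0.010]
best = robust_time_of_flight(samples)             # minimum observation
mid = robust_time_of_flight(samples, "median")    # middle of the pack
```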

After determining the system time at which the sensor device 140 generated the calibration data in message 402, the media module 112 may determine a presentation time at which the playback device began presentation of the calibration content 118. For example, the media module 112 may determine the presentation time based on the time at which the remote module 142 generated the calibration data and/or media sample, as described below with reference to fig. 5.

The media module 112 may then compare the system time at which the media module 112 transmitted the calibration content 118 to the playback device (e.g., television 130) (e.g., the transmission time) with the time at which the playback device presented the calibration content 118 (e.g., the presentation time) to determine the propagation delay (e.g., latency) on the playback channel. For example, the media module 112 may subtract the transmission time from the presentation time to determine the propagation delay on the playback channel.

After calculating the propagation delay on each playback channel using the time-of-flight calculation as described above, media module 112 may calibrate each playback channel based on the propagation delay calculated for each playback channel, as described above.

Fig. 5 illustrates exemplary calibration content 500 for determining propagation delay on a communication channel of a media system. For example, the calibration content 500 may correspond to the calibration content 118 of the media systems 100, 200, 300, and/or 400 described above. As described above, the calibration content 500 may include video content and/or audio content.

In some implementations, the calibration content 500 may include a beginning media segment 502, a calibration segment 504, and an ending media segment 506. For example, media segments 502 and 506 may include some audibly or visually pleasing media. Calibration segment 504 may include an audio or video pattern that can be matched by media device 110 when performing the calibration process described herein. For example, in determining when the calibration segment 504 is presented by the playback device, the media module 112 may compare the calibration segment 504 to audio and/or video sample data to determine whether the sample data includes the calibration segment. When presenting the calibration content 500, a playback device (e.g., television 130, speaker 132, speaker 160, etc.) may present the media segment 502 for a first duration (e.g., from time 510 to time 512), the calibration segment 504 for a second duration (e.g., from time 512 to time 514), and the media segment 506 for a third duration (e.g., from time 514 to time 516). For example, the duration of the calibration segment 504 may be shorter than the duration of the media segments 502 and/or 506. Accordingly, the calibration content 500 may be presented for a total duration extending from time 510 to time 516.

In some implementations, the sensor device 140 can capture a sample of sensor data that includes a portion of the calibration content 500. For example, the sensor device 140 may capture the sample 520. The sample 520 may be a sample of audio data or video data captured and/or generated by a sound sensor (e.g., a microphone) or an image sensor (e.g., a camera) of the sensor device 140. As described above, the remote module 142 may periodically sample sensor data while in the calibration mode. Sample 520 is an example of sample data generated by the remote module 142. The samples 520 may be sent by the remote module 142 to the media device 110 in calibration data. The time indicated in the calibration data may correspond to the time 522 at which the sample 520 was captured or generated by the remote module 142.

As shown in fig. 5, the sample 520 may begin at time 522 and end at time 542. Thus, the sample 520 may not include all of the calibration content 500 (e.g., which extends from time 510 to time 516). Further, the start of the sample 520 (e.g., time 522) may not coincide with the start of the calibration content 500 (e.g., time 510). The difference between the time at which the playback device begins to render the calibration content 500 (e.g., time 510) and the time at which the media module 112 sent the calibration content 500 to the playback device is the propagation delay of the communication channel to the playback device; thus, the media module 112 needs to determine time 510 in order to calculate the propagation delay.

In some implementations, media module 112 can use calibration segment 504 to determine when the playback device begins to render calibration content 500 even though the beginning of calibration content 500 at time 510 is not part of sample 520. For example, the media module 112 may determine the calibration time offset 530 of the calibration segment 504. For example, calibration time offset 530 may correspond to a difference between time 512 (e.g., the beginning of calibration segment 504) and time 510 (e.g., the beginning of calibration content 500). The media module 112 can use the calibration time offset 530 of the calibration segment 504 to determine when the playback device begins to render the calibration content 500. For example, if the media module 112 can determine the time at which the calibration segment 504 was presented, the media module 112 can subtract the calibration offset 530 from this time to determine when the playback device begins to present the calibration content 500. This "start time" or presentation time may be used by media module 112 to calculate the propagation delay from media device 110 through the playback device that is presenting calibration content 500.

In some implementations, when the media module 112 receives calibration data including the sample 520 from the sensor device 140, the media module 112 may analyze the sample 520 to determine a sample time offset 540 (e.g., the difference between time 512 and time 522) corresponding to where the calibration segment 504 begins within the sample 520. For example, the calibration segment 504 may begin at a one (1) second sample time offset 540 relative to the beginning of the sample 520.
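As a rough sketch of how the sample time offset 540 might be located, assuming the media sample and calibration segment are available as discrete sequences; this naive exact-match scan is illustrative only, and a real implementation would likely use cross-correlation to tolerate noise and level differences:

```python
def find_segment_offset(sample, segment, sample_rate):
    """Scan `sample` for an exact occurrence of `segment` and return its
    offset in seconds from the start of the sample, or None if absent."""
    n, m = len(sample), len(segment)
    for i in range(n - m + 1):
        if sample[i:i + m] == segment:
            return i / sample_rate
    return None

# With a 1 Hz toy "signal", the segment [1, 2, 3] starts 3 samples in:
offset = find_segment_offset([0, 0, 0, 1, 2, 3, 0], [1, 2, 3], sample_rate=1)
```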

When the calibration data is sent to the media module 112, the media module 112 may determine the time at which the sample 520 was generated (e.g., the time at which the beginning of the sample 520 was captured). For example, the calibration data time may be obtained from the calibration data itself (e.g., the system or bluetooth clock time determined at the sensor device 140) or may be derived from a time-of-flight calculation, as described above.

To determine the time at which the calibration content 500 was first presented (e.g., time 510), the media module 112 may add the sample time offset 540 to the calibration data time (e.g., time 522) and subtract the calibration time offset 530. The result of this calculation corresponds to the presentation time of the calibration content, i.e., the time at which the playback device begins to present the calibration content 500.
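The arithmetic above can be sketched as follows; the function names are illustrative, not part of the systems described herein:

```python
def presentation_time(sample_time, sample_time_offset, calibration_time_offset):
    """Time at which the playback device began presenting the calibration
    content (time 510 in fig. 5): advance from the start of the captured
    sample (time 522) to the start of the calibration segment (time 512),
    then back up by the segment's known offset within the calibration
    content (offset 530)."""
    return sample_time + sample_time_offset - calibration_time_offset

def propagation_delay(presentation, transmission):
    """Delay through a playback channel: presentation time minus the
    system time at which the calibration content was sent."""
    return presentation - transmission

# E.g., sample captured at t = 100 s, calibration segment found 1 s into
# the sample, segment known to start 2 s into the calibration content,
# and the content originally sent at t = 95 s:
start = presentation_time(100.0, 1.0, 2.0)
delay = propagation_delay(start, 95.0)
```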

Exemplary procedure

In order to enable the reader to clearly understand the technical concepts described herein, the following process describes specific steps performed in a particular order. However, one or more steps of a particular process may be rearranged and/or omitted while remaining within the intended scope of the techniques disclosed herein. Further, different processes and/or steps thereof may be combined, recombined, rearranged, omitted, and/or performed in parallel to create different process flows that are also within the intended scope of the techniques disclosed herein. Moreover, although the following processes may omit or briefly summarize some details of the techniques disclosed herein for the sake of clarity, details described in the above paragraphs may be combined with process steps described below to obtain a more complete and thorough understanding of these processes and the techniques disclosed herein.

Fig. 6 is a flow diagram of an exemplary process 600 for calibrating a media playback channel for synchronized playback based on a detection time determined by a sensor device. For example, process 600 may be performed by media systems 100, 200, and/or 300 described above. Process 600 may be performed to determine a propagation delay for each playback channel (e.g., including a playback device) in media systems 100, 200, and/or 300. The determined propagation delay for each playback channel may then be used by media device 110 to adjust the transmission time of the media content through each playback channel so that the media content is presented in a synchronized manner on all playback devices.

At step 602, media device 110 may cause media device 110 and sensor device 140 to enter a calibration mode. For example, media device 110 may receive explicit user input indicating that a user wishes to calibrate media system 100, 200, and/or 300. The user input may be received by media device 110 through a remote control (e.g., sensor device 140) associated with media device 110.

In some implementations, media device 110 may detect when a user activates a sensor (e.g., microphone, camera, etc.) on sensor device 140 and utilize this opportunity (e.g., without explicit user input) to calibrate media systems 100, 200, and/or 300. For example, a user may enable a microphone on the sensor device 140 to provide voice input to the media device 110. The media device 110 may receive a message from the sensor device 140 indicating that the microphone is active or on and cause the media device 110 and the sensor device 140 to enter a calibration mode. Thus, the media device 110 can opportunistically cause the media device 110 and the sensor device 140 to enter a calibration mode when a user enables a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140.

In some implementations, media device 110 may calibrate media systems 100, 200, and/or 300 periodically. For example, media device 110 may repeatedly calibrate media systems 100, 200, and/or 300 periodically (e.g., daily, weekly, etc.). If media device 110 has not recently calibrated media systems 100, 200, and/or 300, media device 110 may automatically enter calibration mode at the end of the configured period and send a notification to sensor device 140 to cause sensor device 140 to enter calibration mode. For example, a user of the sensor device 140 may interact with the notification to allow the sensor device 140 to enter a calibration mode and activate a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140, as described above.

At step 604, media device 110 may send calibration content to the playback device over the playback channel and record the transmission time. For example, the media device 110 may determine the sensor capabilities of the sensor device 140 (e.g., sound sensor/microphone, image sensor/camera, etc.). Media device 110 may determine the media presentation capabilities (e.g., audio and video, audio only, etc.) of the playback devices in media systems 100, 200, and/or 300. The media device 110 may select calibration content to send to each playback device on each playback channel based on the determined capabilities of the sensor device 140 and the playback device. For example, when the sensor device 140 can only detect sound (e.g., is configured with only a microphone), the media device 110 can transmit audio calibration data to the respective playback devices in the media systems 100, 200, and/or 300. When the sensor device 140 is capable of detecting both sound and images, the media device 110 may select audio or video calibration data according to the output capabilities of the playback device. For example, video calibration data may be sent to a playback device having a display. Audio calibration data may be sent to a playback device having a speaker. In transmitting the calibration data to the playback device over the playback channel, media device 110 may record the local system time at which the calibration data was transmitted over the playback channel (e.g., using a system clock of media device 110). When the playback device receives the calibration data, the playback device may render the calibration data (e.g., with a display or speaker of the playback device).
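The capability-based content selection at step 604 might look like the following sketch; the capability labels and function name are hypothetical, chosen only to illustrate the decision:

```python
def select_calibration_content(sensor_caps, playback_caps):
    """Choose calibration content for one playback channel based on what
    the sensor device can detect and what the playback device can render.
    Prefers video when both ends support it; falls back to audio."""
    if "camera" in sensor_caps and "display" in playback_caps:
        return "video"
    if "microphone" in sensor_caps and "speaker" in playback_caps:
        return "audio"
    return None  # no compatible calibration path

# A microphone-only remote paired with a television (display + speaker)
# and with a standalone speaker:
tv_content = select_calibration_content({"microphone"}, {"display", "speaker"})
speaker_content = select_calibration_content({"microphone"}, {"speaker"})
```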

At step 606, the sensor device 140 may detect the calibration content presented by the playback device and record the time of detection. For example, while in the calibration mode, one or more calibration sensors of the sensor device 140 may remain enabled (e.g., active, on) such that sound and/or images corresponding to calibration content presented by the playback device may be detected by the sensor device 140. The remote module 142 may sample (e.g., periodically record) the sounds and/or images detected by the sensor device 140 while in the calibration mode. The remote module 142 may record the time each sample was recorded and/or calibration data was generated. For example, the sensor device 140 may record the current bluetooth clock time or the current system time on the sensor device 140 at the time the sensor data is sampled and/or calibration data is generated. When the system clock is not available on the sensor device 140, the sensor device 140 may record the current bluetooth time.

At step 608, the sensor device 140 may send calibration data to the media device 110. For example, the remote module 142 may send the calibration data for the current sampling period to the media device 110.

In some implementations, steps 606 and 608 can be performed iteratively while in the calibration mode. For example, the remote module 142 may periodically (e.g., every 50 milliseconds, every second, etc.) sample data generated by a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140, store the sensor data (e.g., the detected calibration data), record the current time of the current sample, and send the current sample and the current time to the media device 110 for analysis. While in the calibration mode, the remote module 142 may iterate through a number of sampling periods. Accordingly, the remote module 142 may send multiple instances of calibration data to the media module 112 on the media device 110.

At step 610, media device 110 may calculate a propagation delay based on the calibration data and the transmission time of the calibration content. For example, media module 112 may analyze each instance of calibration data to determine which instance or instances of calibration data (when multiple playback channels are calibrated) include calibration segments (e.g., audio and/or video calibration patterns). When media module 112 identifies an instance of calibration data that includes a calibration segment, media module 112 may determine a time at which the calibration segment of calibration data (e.g., sampled sensor data) was presented by the playback device and/or received by sensor device 140. For example, the time at which the calibration segment is presented may be determined by adding a sample offset to the time indicated in the calibration data (e.g., the sample time). Media module 112 may then determine a time at which the calibration content was presented by the playback device (e.g., a presentation time) based on the calibration offset of the calibration segment, as described above. Media module 112 may then calculate a propagation delay based on the difference between the transmission time of the calibration content and the presentation time of the calibration segment. For example, media module 112 may subtract the transfer time recorded when calibration content 118 was sent to the playback device over the playback channel from the presentation time determined based on the calibration data.

At step 612, media device 110 may adjust a transmission delay for the playback channel based on the presentation delay determined for the playback channel. For example, media module 112 may compare the presentation delay of a playback channel to the presentation delays calculated for other playback channels and adjust the transmission delay (e.g., the amount of time by which sending the media content is delayed) of each playback channel to accommodate the playback channel with the longest presentation delay. For example, if the playback channel 126 has a presentation delay of 5 seconds and the playback channel 162 has a presentation delay of 2 seconds, the media module 112 may transmit the same media content on the playback channel 162 with a delay of 3 seconds after transmitting the media content on the playback channel 126, such that the media content will be presented simultaneously by the playback devices associated with each playback channel.
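The per-channel adjustment can be illustrated with a small sketch; the channel identifiers and dict-based interface are assumptions for illustration.

```python
def transmission_delays(presentation_delays):
    """Compute an extra send delay per channel so every channel presents
    media at the same moment as the slowest (longest-delay) channel.

    presentation_delays -- dict mapping channel id -> measured delay (seconds)
    """
    longest = max(presentation_delays.values())
    # Channels faster than the slowest one are held back by the difference.
    return {channel: longest - delay
            for channel, delay in presentation_delays.items()}
```

With the delays from the example above, `transmission_delays({"channel_126": 5.0, "channel_162": 2.0})` gives channel 162 an extra 3-second transmission delay and channel 126 none.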

Fig. 7 is a flow diagram of an example process 700 for calibrating a media playback channel for synchronous presentation based on a time of receipt determined by a media device and a time of flight for data transmitted between a sensor device and the media device. For example, process 700 may be performed by media system 400 described above. Process 700 may be performed to determine a propagation delay for each playback channel (e.g., including a playback device) in media system 400. The determined propagation delay for each playback channel may then be used by media device 110 to adjust the transmission time of the media content through each playback channel so that the media content is presented in a synchronized manner on all playback devices.

At step 702, media device 110 may cause media device 110 and sensor device 140 to enter a calibration mode. For example, media module 112 may receive explicit user input indicating that the user wishes to calibrate media system 400. The user input may be received by media module 112 via a remote control (e.g., sensor device 140) associated with media device 110.

In some implementations, the media module 112 can detect when a user activates a sensor (e.g., microphone, camera, etc.) on the sensor device 140 and take advantage of this opportunity (e.g., without explicit user input) to calibrate the media system 400. For example, a user may enable a microphone on the sensor device 140 to provide voice input to the media module 112. The media module 112 may receive a message from the sensor device 140 indicating that the microphone is active or on and cause the media module 112 and the remote module 142 to enter a calibration mode. Thus, the media module 112 may opportunistically cause the media module 112 and the remote module 142 to enter a calibration mode when a user enables a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140.

In some implementations, the media module 112 may calibrate the media system 400 periodically. For example, the media module 112 may recalibrate the media system 400 on a recurring schedule (e.g., daily, weekly, etc.). If the media module 112 has not recently calibrated the media system 400, the media module 112 may automatically enter the calibration mode at the end of the configured period and send a notification to the sensor device 140 to cause the sensor device 140 to enter the calibration mode. For example, a user of the sensor device 140 may interact with the notification to allow the remote module 142 to enter the calibration mode and activate a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140, as described above.
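The periodic-recalibration check might look like the following sketch; the function name and parameters are hypothetical.

```python
import time

def should_recalibrate(last_calibrated, period, now=None):
    """Return True when the configured calibration period has elapsed.

    last_calibrated -- epoch seconds of the last completed calibration
    period          -- configured period in seconds (e.g., 86400 for daily)
    """
    now = time.time() if now is None else now
    return (now - last_calibrated) >= period
```

When this returns True, the media module 112 would enter the calibration mode and notify the sensor device 140 as described above.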

At step 704, media device 110 may send calibration content to the playback device over the playback channel and record the transmission time. For example, media module 112 may determine sensor capabilities of sensor device 140 (e.g., sound sensor-microphone, image sensor-camera, etc.). The media module 112 may determine the media presentation capabilities (e.g., audio, video only, etc.) of the playback devices in the media system 400. The media module 112 may select calibration content to send to each playback device on each playback channel based on the determined capabilities of the sensor device 140 and the playback device. For example, when the sensor device 140 can only detect sound (e.g., configured with only a microphone), then the media module 112 can send audio calibration data to the various playback devices in the media system 400.

When the sensor device 140 is capable of detecting sound and images, the media module 112 may select audio or video calibration data according to the output capabilities of the playback device. For example, the video calibration data may be sent to a playback device having a display. The audio calibration data may be sent to a playback device having a speaker. In transmitting the calibration data to the playback device over the playback channel, media module 112 may record the local system time (e.g., using the system clock of media device 110) at which the calibration data was transmitted over the playback channel. When the playback device receives the calibration data 118, the playback device may render the calibration data 118 using a display or speaker of the playback device.
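The capability-based selection in step 704 might be sketched as follows; the capability names and set-based interface are hypothetical, for illustration only.

```python
def select_calibration_content(sensor_capabilities, playback_capabilities):
    """Pick a calibration content type both devices support, or None.

    sensor_capabilities   -- e.g., {"microphone"} or {"microphone", "camera"}
    playback_capabilities -- e.g., {"speaker"} or {"display", "speaker"}
    """
    # Prefer video calibration when both a camera and a display are present.
    if "camera" in sensor_capabilities and "display" in playback_capabilities:
        return "video"
    # Fall back to audio calibration for microphone/speaker pairings.
    if "microphone" in sensor_capabilities and "speaker" in playback_capabilities:
        return "audio"
    return None  # no compatible sensor/output pairing for this channel
```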

At step 706, the sensor device 140 may detect calibration content presented by the playback device. For example, while in the calibration mode, one or more calibration sensors of the sensor device 140 may remain enabled (e.g., active, on) such that sounds and/or images corresponding to calibration content presented by the playback device may be detected by the sensor device 140. The remote module 142 may sample (e.g., periodically record) the sounds and/or images detected by the sensor device 140 while in the calibration mode. In the example of the media system 400, the remote module 142 does not record the time at which each sample was recorded and/or the calibration data was generated because the remote module 142 does not have access to a clock on the sensor device 140.

At step 708, the sensor device 140 may send calibration data to the media device 110. For example, the remote module 142 may send the calibration data for the current sampling period to the media device 110.

In some implementations, steps 706 and 708 can be performed iteratively while in the calibration mode. For example, the remote module 142 may periodically (e.g., every 50 milliseconds, every second, etc.) sample data generated by a calibration sensor (e.g., microphone, camera, etc.) on the sensor device 140, store the sensor data (e.g., the detected calibration data), and send the current sample to the media device 110 for analysis. While in the calibration mode, the remote module 142 may iterate through a number of sampling periods. Accordingly, the remote module 142 may send multiple instances of calibration data to the media module 112 on the media device 110.

At step 710, media device 110 may determine a time at which the media device received calibration data from sensor device 140. For example, when media device 110 receives the calibration data, media module 112 may obtain the current system time from a system clock on media device 110 and store the current system time as the time of receipt of the calibration data.

At step 712, the media device 110 may calculate a propagation delay based on the transmission time, the calibration data, the time of receipt of the calibration data, and the time of flight of the transmission between the sensor device 140 and the media device 110. For example, media module 112 may analyze each instance of calibration data to determine which instance or instances of calibration data (when multiple playback channels are calibrated) include calibration segments (e.g., audio and/or video calibration patterns). When media module 112 identifies an instance of calibration data that includes a calibration segment, media module 112 may determine a time at which the calibration segment of calibration data (e.g., sampled sensor data) was presented by the playback device and/or received by sensor device 140. For example, the time at which the calibration segment is presented can be determined by adding a sampling offset to the time at which the samples in the calibration data were captured (e.g., the sampling time). This sampling time may be estimated by subtracting a time-of-flight value (e.g., corresponding to an estimated amount of time for a message to travel from the sensor device 140 to the media device 110) from the calibration data reception time determined at step 710. Media module 112 may then determine a time (e.g., a presentation time) at which the calibration content was presented by the playback device based on the calibration offset of the calibration segment (e.g., subtracting the calibration offset from the time at which the calibration segment was presented), as described above. Media module 112 may then calculate a propagation delay based on the difference between the transmission time of the calibration content and the presentation time of the calibration segment. For example, media module 112 may subtract the transfer time recorded when calibration content 118 was sent to the playback device over the playback channel from the presentation time determined based on the calibration data.
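The time-of-flight variant of the calculation (step 712) can be sketched in the same style; all names are illustrative, and times are assumed to be seconds on media device 110's own clock.

```python
def propagation_delay_tof(transfer_time, receive_time, time_of_flight,
                          sample_offset, calibration_offset):
    """Estimate the presentation delay when the sensor reports no timestamps.

    transfer_time      -- when the calibration content was sent
    receive_time       -- when the calibration data arrived at the media device
    time_of_flight     -- estimated sensor-to-media-device transmission time
    sample_offset      -- position of the calibration segment within the sample
    calibration_offset -- position of the calibration segment within the content
    """
    # Estimate when the sample was captured: arrival time minus the
    # estimated time of flight from the sensor device.
    sample_time = receive_time - time_of_flight
    # When the calibration segment was detected within that sample.
    segment_time = sample_time + sample_offset
    # When the calibration content began playing on the playback device.
    presentation_time = segment_time - calibration_offset
    return presentation_time - transfer_time
```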

At step 714, media device 110 may adjust a transmission delay for the playback channel based on the presentation delay determined for the playback channel. For example, media module 112 may compare the presentation delay of a playback channel to the presentation delays calculated for other playback channels and adjust the transmission delay (e.g., the amount of time by which sending the media content is delayed) of each playback channel to accommodate the playback channel with the longest presentation delay. For example, if the playback channel 126 has a presentation delay of 5 seconds and the playback channel 162 has a presentation delay of 2 seconds, the media module 112 may transmit the same media content on the playback channel 162 with a delay of 3 seconds after transmitting the media content on the playback channel 126, such that the media content will be presented simultaneously by the playback devices associated with each playback channel.

Graphical user interfaces

The present disclosure describes various Graphical User Interfaces (GUIs) for implementing various features, processes, or workflows above. These GUIs may be presented on a variety of electronic devices including, but not limited to, laptop computers, desktop computers, computer terminals, television systems, tablets, e-book readers, and smart phones. One or more of these electronic devices may include a touch-sensitive surface. The touch-sensitive surface may process multiple simultaneous input points, including processing data related to the pressure, degree, or location of each input point. Such processing may facilitate gestures performed with multiple fingers, including pinching and swiping.

When the present disclosure refers to "selecting" a user interface element in a GUI, these terms are understood to include clicking or "hovering" over the user interface element using a mouse or other input device, or touching, tapping, or gesturing on the user interface element using one or more fingers or a stylus. The user interface elements may be virtual buttons, menus, selectors, switches, sliders, brushes, knobs, thumbnails, links, icons, radio boxes, check boxes, and any other mechanism for receiving input from or providing feedback to a user.

Privacy

The present disclosure recognizes that the use of personal information data in the techniques of the present disclosure may be useful to benefit a user. For example, personal information data (e.g., samples of detected audio and/or video calibration data) may be used to calibrate playback devices such that audio and/or video data may be presented in a synchronized manner on different playback devices. Thus, the use of such personal information data enables calculated control of the presented audio and/or video content. While in some cases, the user's voice and/or other sounds in the vicinity of the sensor device may be recorded while the audio and/or video calibration data is being sampled, the systems described herein maintain and protect the user's privacy by recording and/or detecting audio/video data only in response to user input indicating that such audio and/or video detection sensors (e.g., cameras, microphones, etc.) should be activated, enabled, or turned on. Thus, outside of the particular calibration processes described herein, the techniques described herein are not configured to record audio and/or video data without the knowledge and/or consent of the user.

The present disclosure also contemplates that entities responsible for the collection, analysis, disclosure, transmission, storage, or other use of such personal information data will comply with established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. For example, personal information from a user should be collected for legitimate and reasonable uses of an entity and not shared or sold outside of those legitimate uses. In addition, such collection should occur only after receiving the informed consent of the user. Additionally, such entities should take any needed steps to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities may subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.

Regardless of the foregoing, the present disclosure also contemplates embodiments in which a user selectively prevents use of, or access to, personal information data. That is, the present disclosure contemplates that hardware elements and/or software elements may be provided to prevent or block access to such personal information data. For example, in the case of an ad delivery service, the techniques of this disclosure may be configured to allow a user to "opt in" or "opt out" of participation in the collection of personal information data during registration with the service. As another example, the user may choose not to provide location information for the targeted content delivery service. As yet another example, the user may choose not to provide precise location information, but permit the transmission of location zone information.

Exemplary System architecture

Fig. 8 is a block diagram of an exemplary computing device 800 that may implement the features and processes of fig. 1-7. Computing device 800 may include a memory interface 802, one or more data processors, image processors and/or central processing units 804, and a peripheral interface 806. The memory interface 802, the one or more processors 804, and/or the peripherals interface 806 can be separate components or can be integrated in one or more integrated circuits. The various components in computing device 800 may be coupled by one or more communication buses or signal lines.

Sensors, devices, and subsystems can be coupled to peripherals interface 806 to facilitate multiple functions. For example, motion sensor 810, light sensor 812, and proximity sensor 814 may be coupled to peripheral interface 806 to facilitate orientation, lighting, and proximity functions. Other sensors 816 may also be connected to the peripheral interface 806, such as a Global Navigation Satellite System (GNSS) (e.g., GPS receiver), temperature sensor, biometric sensor, magnetometer, or other sensing device to facilitate related functions.

Camera subsystem 820 and optical sensor 822, e.g., a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) optical sensor, may be utilized to facilitate camera functions, such as taking photographs and video clips. The camera subsystem 820 and optical sensor 822 may be used to collect images of a user to be used during authentication of the user, for example, by performing facial recognition analysis.

Communication functions can be facilitated by one or more wireless communication subsystems 824, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 824 may depend upon the communication network or networks over which the computing device 800 is intended to operate. For example, the computing device 800 may include communication subsystems 824 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystem 824 may include hosting protocols such that the device 800 may be configured as a base station for other wireless devices.

Audio subsystem 826 may be coupled to speaker 828 and microphone 830 to facilitate voice-enabled functions such as speaker recognition, voice replication, digital recording, and telephony functions. Audio subsystem 826 may be configured to facilitate, for example, processing of voice commands, voiceprint authentication, and voice authentication.

I/O subsystem 840 may include a touch-surface controller 842 and/or one or more other input controllers 844. Touch-surface controller 842 may be coupled to touch-surface 846. Touch surface 846 and touch-surface controller 842 may, for example, detect contact and movement or breaking thereof using any of a variety of touch-sensitive technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch surface 846.

One or more other input controllers 844 may be coupled to other input/control devices 848, such as one or more buttons, rocker switches, thumb wheels, infrared ports, USB ports, and/or pointer devices, such as styluses. The one or more buttons (not shown) may include an up/down button for volume control of the speaker 828 and/or the microphone 830.

In one implementation, pressing the button for a first duration releases the lock on touch surface 846; and pressing the button for a second duration that is longer than the first duration can turn power on or off to the computing device 800. Pressing the button for a third duration can activate a voice control or voice command module that enables the user to speak a command into microphone 830 to cause the device to execute the spoken command. The user can customize the functionality of one or more buttons. For example, touch surface 846 may also be used to implement virtual or soft buttons and/or a keyboard.

In some implementations, the computing device 800 may present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the computing device 800 may include the functionality of an MP3 player, such as an iPod™.

The memory interface 802 may be coupled to a memory 850. The memory 850 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 850 may store an operating system 852, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.

Operating system 852 may include instructions for handling basic system services and for performing hardware related tasks. In some implementations, the operating system 852 may be a kernel (e.g., a UNIX kernel). In some implementations, the operating system 852 can include instructions for performing voice authentication. In addition, the operating system 852 may implement the media system calibration features described with reference to fig. 1-7.

The memory 850 can also store communication instructions 854 to facilitate communication with one or more additional devices, one or more computers, and/or one or more servers. Memory 850 may include graphical user interface instructions 856 to facilitate graphical user interface processing; sensor processing instructions 858 to facilitate sensor-related processes and functions; telephone instructions 860 to facilitate telephone-related processes and functions; electronic message instructions 862 to facilitate processes and functions related to electronic message processing; web browsing instructions 864 that facilitate web browsing-related processes and functions; media processing instructions 866 to facilitate media processing-related processes and functions; GNSS/navigation instructions 868 that facilitate GNSS and navigation related processes and instructions; and/or camera instructions 870 to facilitate camera-related processes and functions.

Memory 850 may store other software instructions 872 that facilitate other processes and functions, such as the media system calibration processes and functions described with reference to fig. 1-7.

Memory 850 may also store other software instructions 874, such as Web video instructions to facilitate Web video-related processes and functions; and/or online shopping instructions that facilitate processes and functions related to online shopping. In some implementations, the media processing instructions 866 are divided into audio processing instructions and video processing instructions for facilitating audio processing-related processes and functions and video processing-related processes and functions, respectively.

Each of the instructions and applications identified above may correspond to a set of instructions for performing one or more functions described above. The instructions need not be implemented as separate software programs, procedures or modules. Memory 850 may include additional instructions or fewer instructions. Further, various functions of the computing device 800 may be implemented in hardware and/or software, including in one or more signal processing and/or application specific integrated circuits.

Exemplary embodiments

Some embodiments may include a method comprising: the sensor device detecting a portion of calibration content presented by the first playback device, the portion of calibration content being transmitted from the media device to the first playback device at a transmission time determined based on a first clock at the media device; the sensor device generating calibration data comprising the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected by the sensor device, the detection time being determined based on a second clock on the sensor device; and the sensor device transmitting the calibration data to the media device, wherein the media device calculates a propagation delay value based on the transmission time, the detected portion of the calibration content, and the detection time indicated in the calibration data.

The method may include embodiments wherein the first clock is a system clock and the second clock is a bluetooth clock. The method can include embodiments wherein the first clock and the second clock are system clocks of the media device and the sensor device, respectively. The method may include embodiments in which the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The method may include embodiments wherein the calibration content is audio content. The method may include an embodiment wherein the calibration content is video content. The method can include embodiments wherein the sensor device is a remote control device for remotely controlling the media device.

Some embodiments may include a system comprising: a plurality of computing devices including a media device, a sensor device; and a plurality of non-transitory computer-readable media comprising one or more sequences of instructions which, when executed by a computing device, cause the computing device to perform operations comprising: the media device sending calibration content to a first playback device associated with a first playback channel; the media device storing a transfer time indicating when the calibration content was sent to the first playback device, the transfer time determined based on a first clock on the media device; the sensor device detecting a portion of the calibration content presented by the first playback device; the sensor device generating calibration data, the calibration data including a portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected, the detection time determined based on a second clock on the sensor device; the sensor device sends calibration data to the media device; the media device calculates a propagation delay value based on the transmission time, the detected portion of the calibration content, and the detection time indicated in the calibration data.

The system may include embodiments in which the first clock is a system clock and the second clock is a bluetooth clock. The system may include embodiments in which the first clock and the second clock are system clocks of the media device and the sensor device, respectively. The system may include an embodiment in which the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The system may include an embodiment wherein the calibration content is audio content. The system may include an embodiment wherein the calibration content is video content. The system may include embodiments in which the calibration media segment is associated with a time offset, and in which the instructions cause the computing device to perform operations comprising: the media device calculates a propagation delay value based on the transmission time, the detection time, and the time offset of the calibration media segment.

Some embodiments may include a media device comprising: one or more processors; and a non-transitory computer-readable medium comprising one or more sequences of instructions which, when executed by one or more processors, cause the processors to perform operations comprising: the media device sending calibration content to a first playback device associated with a first playback channel; the media device storing a transfer time indicating when the calibration content was sent to the first playback device, the transfer time determined based on a first clock on the media device; receiving, by the media device, calibration data from the sensor device, the calibration data including a portion of calibration content rendered by the first playback device and detected by the sensor device; the media device determining a receive time indicating when the calibration content was received by the media device, the receive time determined based on a first clock on the media device; and the media device calculating a propagation delay value based on the transmission time, the reception time, the detected portion of the calibration content, and a time-of-flight value representing an amount of time it takes for the message to be received at the media device after being transmitted by the sensor device.

The media device may include an implementation in which the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The media device may include an embodiment wherein the calibration content is audio content. The media device may include an embodiment wherein the calibration content is video content. The media device may include embodiments in which the instructions result in operations comprising: a detection time for a portion of the calibration content is determined based on the time of receipt and the time of flight value. The media device may include an implementation in which the calibration media segment is associated with a time offset, and in which the instructions result in operations comprising: the media device calculates a propagation delay value based on the transmission time, the detection time, and the time offset of the calibration media segment.
