Integrated video and data system

Publication No. 1436282, published 2020-03-20.

Abstract: This technology, "Integrated Video and Data System," was created by W. Zhang, Yong Li, Xuemin Chen, and C. L. Lou on 2019-09-12. The present application is directed to an integrated video and data system. A system for integrated video content and data provision includes: an upstream signal path to process upstream data traffic received from a first transport network for transmission to a second transport network; and a downstream signal path to process downstream data traffic and downstream video traffic for transmission from the second transport network to the first transport network. The upstream signal path includes an analog-to-digital converter (ADC) and an upstream demodulator, and the downstream signal path includes an Ethernet processor, a digital signal processor (DSP), and a digital-to-analog converter (DAC).

1. A system for integrating video content and data provision, the system comprising:

an upstream signal path configured to process upstream data traffic received from a first transport network for transmission to a second transport network; and

a downstream signal path configured to process downstream data traffic and downstream video traffic for transmission from the second transport network to the first transport network,

wherein:

the upstream signal path includes an analog-to-digital converter (ADC) and an upstream demodulator, and

the downstream signal path includes an Ethernet processor, a digital signal processor (DSP), and a digital-to-analog converter (DAC).

2. The system of claim 1, wherein said upstream data traffic comprises Data Over Cable Service Interface Specification (DOCSIS) data flows.

3. The system of claim 1, wherein the first transport network comprises at least one of a coaxial cable network or a fiber optic network, and wherein the second transport network comprises an Ethernet network.

4. The system of claim 1, wherein the upstream demodulator comprises a DOCSIS demodulator configured to demodulate the upstream data traffic for Ethernet processing.

5. The system of claim 4, wherein the Ethernet processor comprises a DOCSIS upstream receiver configured to receive demodulated upstream data traffic and provide DOCSIS upstream content to an Internet Protocol (IP) framer for conversion to IP frames.

6. The system of claim 4, wherein the Ethernet processor further comprises an IP deframer, a payload analyzer, a DOCSIS downstream interface, and a video stream interface, and wherein the payload analyzer is configured to analyze a payload of deframed IP frames to provide DOCSIS downstream data and video data streams to the DOCSIS downstream interface and video stream interface, respectively.

7. The system of claim 6, wherein the downstream signal path further comprises a DOCSIS downstream processor configured to process the DOCSIS downstream data by performing DOCSIS functions comprising bandwidth control and de-jitter.

8. The system of claim 6, wherein the downstream signal path further comprises a broadcast video processor configured to process the video data stream by performing Moving Picture Experts Group (MPEG) transport stream processing comprising removing network jitter and performing Program Clock Reference (PCR) correction and conditional access.

9. The system of claim 6, wherein the DSP comprises a downstream channel formation and scheduler, a digital modulator, and a digital channel combiner configured to combine the video data stream and the DOCSIS downstream data.

10. The system of claim 1, further comprising a system management unit configured to uniformly manage and control operation of the upstream signal path and the downstream signal path.

11. A method of integrating video content and data, the method comprising:

configuring an upstream signal path including an ADC and an upstream demodulator to process upstream data traffic received from a first transport network;

configuring a downstream signal path comprising an Ethernet processor, a DSP, and a DAC to process downstream data traffic and downstream video traffic;

combining the processed downstream data traffic with the processed downstream video traffic to provide analog downstream data and video traffic; and

transmitting the analog downstream data and video traffic over the first transport network.

12. The method of claim 11, wherein the upstream demodulator comprises a DOCSIS demodulator, and wherein configuring the upstream signal path comprises configuring the DOCSIS demodulator to demodulate the upstream data traffic for Ethernet processing.

13. The method of claim 12, wherein the Ethernet processor comprises a DOCSIS upstream receiver, and wherein configuring the downstream signal path comprises configuring the DOCSIS upstream receiver to receive demodulated upstream data traffic and provide DOCSIS upstream content to an IP framer for conversion to IP frames.

14. The method of claim 12, wherein the Ethernet processor further comprises an IP deframer, a payload analyzer, a DOCSIS downstream interface, and a video stream interface, and wherein configuring the downstream signal path comprises configuring the payload analyzer to analyze a payload of deframed IP frames to provide DOCSIS downstream data and video data streams to the DOCSIS downstream interface and video stream interface, respectively.

15. The method of claim 14, wherein the downstream signal path further comprises a DOCSIS downstream processor, and wherein configuring the downstream signal path further comprises configuring the DOCSIS downstream processor to process the DOCSIS downstream data by performing DOCSIS functions comprising bandwidth control and dejitter.

16. The method of claim 14, wherein the downstream signal path further comprises a broadcast video processor, and wherein configuring the downstream signal path further comprises configuring the broadcast video processor to process the video data stream by performing MPEG transport stream processing including removing network jitter and performing PCR correction and conditional access.

17. The method of claim 14, wherein the DSP comprises a downstream channel formation and scheduler, a digital modulator, and a digital channel combiner, and wherein configuring the downstream signal path further comprises configuring the digital channel combiner to combine the video data stream and the DOCSIS downstream data.

18. A content distribution system, comprising:

at least one processor circuit configured to:

process upstream data traffic received from a first transport network over an upstream signal path for transmission to a second transport network; and

process downstream data traffic and downstream video traffic through a common downstream signal path for transmission to the first transport network,

wherein at least one of the downstream data traffic or the downstream video traffic is receivable from the second transport network.

19. The content distribution system of claim 18, wherein the second transport network comprises an Ethernet network.

20. The content distribution system of claim 18, wherein the first transport network comprises at least one of a coaxial cable network or a fiber optic network.

Technical Field

This specification relates generally to content distribution systems and, more particularly, for example and without limitation, to integrated video and data systems.

Background

As urban population densities increase, broadband service providers are moving fiber optic network endpoints closer to buildings (or areas) associated with high population densities, such as the basements of multi-dwelling units. For example, a broadband service provider may place a fiber optic network endpoint, such as an Optical Network Terminal (ONT), in the basement of a large high-rise apartment building containing several apartments. The head end of the broadband service provider may include an Optical Line Terminal (OLT) that is communicatively coupled to the ONTs, e.g., via fiber optic cables. The ONT may be separately coupled to a gateway device and/or a user device located in an individual dwelling unit via a non-optical network medium, such as a coaxial transmission line, and provide broadband services (e.g., television, telephone, and/or the internet) to the user device. The user device may include, for example, a set-top box, a mobile phone, a tablet device, or other communication device. Thus, the ONTs may each include and/or may be coupled to a media converter that transforms optical signals received from the OLT of the head end over a fiber optic network into electrical signals that may be transmitted over a non-optical network medium (e.g., coaxial cable) to a gateway in an individual residential unit, and vice versa.

Disclosure of Invention

In one aspect, the present application is directed to a system for integrating video content and data provision, the system comprising: an upstream signal path configured to process upstream data traffic received from a first transport network for transmission to a second transport network; and a downstream signal path configured to process downstream data traffic and downstream video traffic for transmission from the second transport network to the first transport network, wherein: the upstream signal path includes an analog-to-digital converter (ADC) and an upstream demodulator, and the downstream signal path includes an Ethernet processor, a Digital Signal Processor (DSP), and a digital-to-analog converter (DAC).

In another aspect, the present application is directed to a method of integrating video content and data, the method comprising: configuring an upstream signal path including an ADC and an upstream demodulator to process upstream data traffic received from a first transport network; configuring a downstream signal path comprising an Ethernet processor, a DSP, and a DAC to process downstream data traffic and downstream video traffic; combining the processed downstream data traffic with the processed downstream video traffic to provide analog downstream data and video traffic; and transmitting the analog downstream data and video traffic over the first transport network.

In another aspect, the present application is directed to a content distribution system comprising: at least one processor circuit configured to: process upstream data traffic received from a first transport network over an upstream signal path for transmission to a second transport network; and process downstream data traffic and downstream video traffic through a common downstream signal path for transmission to the first transport network, wherein at least one of the downstream data traffic or the downstream video traffic is capable of being received from the second transport network.

Drawings

Certain features of the technology are set forth in the appended claims. However, for purposes of explanation, several embodiments of the present technology are set forth in the following figures.

FIG. 1 illustrates an example environment in which a content distribution system may be implemented.

Fig. 2A is a high-level block diagram of an example of an integrated video and data system in accordance with one or more implementations of the present technique.

Fig. 2B is a block diagram of an example implementation of the modem and set-top box of fig. 2A, in accordance with one or more implementations of the present technology.

Fig. 3 is a schematic diagram illustrating an example of an integrated content distribution system, in accordance with one or more implementations of the present technique.

Fig. 4 is a schematic diagram illustrating an example of an Ethernet Internet Protocol (IP) processing system in accordance with one or more implementations of the present technique.

Fig. 5 is a schematic diagram illustrating an example of integrating video and data channel content, in accordance with one or more implementations of the present technique.

Fig. 6 is a schematic diagram illustrating an example of a video auxiliary channel, in accordance with one or more implementations of the present technique.

Fig. 7 is a schematic diagram illustrating an example of data, video content, and cable plant spectrum in accordance with one or more implementations of the present technique.

Fig. 8 is a schematic diagram illustrating an example of data and video content spectra.

Fig. 9 is a schematic diagram illustrating an example of broadcast video content and data service content spectrum in accordance with one or more implementations of the present technique.

Fig. 10 is a schematic diagram illustrating an example implementation of a user device, in accordance with one or more implementations of the present technology.

Fig. 11 is a flow diagram of an example of a method of integrating video content and data in accordance with one or more implementations of the present technique.

Figure 12 conceptually illustrates an electronic system with which any implementation of the present technology may be implemented.

Detailed Description

The detailed description set forth below is intended as a description of various configurations of the present technology and is not intended to represent the only configurations in which the present technology may be practiced. The accompanying drawings are incorporated herein and constitute a part of the detailed description, which includes specific details for providing a thorough understanding of the present technology. However, the present techniques are not limited to the specific details set forth herein and may be practiced without one or more of the specific details. In some instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the present technology.

The present technology is directed to an integrated video and data system. The disclosed integrated video and data system addresses both bi-directional data services and broadcast video services. The present technology introduces a device concept for delivering high quality broadcast programs, such as 4k and 8k content, over broadcast video channels. In the disclosed method, the integrated device is used as a front-end server for both data services and broadcasts that are controlled and managed under a unified management system.

FIG. 1 illustrates an example environment 100 in which a content distribution system may be implemented. Environment 100 includes a network 110, such as the internet, a Data Over Cable Service Interface Specification (DOCSIS) Cable Modem Termination System (CMTS) 120, a video server 130, a conditional access module 140, a channel formation module 150, a modulator 160, and a Radio Frequency (RF) combiner network 170. DOCSIS is a two-way data communication protocol that enables a variety of services. Owing to the bi-directional nature of DOCSIS and its capability for real-time interaction, many data services provided over DOCSIS data networks have fewer timing problems than video broadcast services. The DOCSIS CMTS 120 (hereinafter "CMTS 120") operates as a bi-directional DOCSIS headend that provides bi-directional data services for data applications such as internet browsers and net conferencing for end user devices. The CMTS 120 provides a spectrum with multiple channels, as explained later.

Video server 130 is a video broadcast system, which is a distribution network for broadcasting video and/or audio programs. In some embodiments, video server 130 may utilize a narrowband or broadband communication system as a return channel to obtain end-user information to further optimize network utilization. Each video and/or audio program is comprised of one of a plurality of Moving Picture Experts Group (MPEG) transport streams. MPEG content can be protected by various conditional access systems with timing sensitive encryption mechanisms. Timing synchronization is critical between the MPEG transport stream for a program and the timing sensitive encryption information corresponding to the program. Many MPEG transport streams are combined to form a channel. Each channel may occupy a 6 to 8MHz frequency band on a cable plant to deliver a raw data rate of 28 to 50 Mbps. Each broadcast program (video or audio) is limited to one of the channels, and each channel may contain more than one broadcast program. The bit rate required for each broadcast video program tends to vary dramatically due to compression and video quality; therefore, some null packets are added to maintain a constant bit rate per channel, and those null packets are considered wasted bandwidth. The raw data for each channel is then converted to an RF band by various digital and analog circuitry. Broadcast video quality is sometimes limited by the total bandwidth available within each individual channel, particularly for high quality video, such as live sports in 8K format.
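
For a rough sense of the waste described above, the following is an illustrative Python sketch (not part of the patent disclosure; the 38 Mbps channel rate and the sample program rates are assumptions) that estimates the fraction of a constant-bit-rate channel consumed by MPEG null packets when it carries a single variable-rate program:

    # Hypothetical illustration: padding a variable-rate program to a
    # constant-bit-rate channel with MPEG null packets and measuring how much
    # channel capacity the padding wastes. Rates are assumed values.
    TS_PACKET_BITS = 188 * 8          # one MPEG-2 transport stream packet
    CHANNEL_RATE_BPS = 38_000_000     # assumed raw rate of a 6-8 MHz channel

    def null_padding_ratio(program_rates_bps):
        """Average fraction of channel capacity spent on null packets,
        evaluated over a series of one-second rate snapshots."""
        total_pkts = CHANNEL_RATE_BPS / TS_PACKET_BITS
        wasted = 0.0
        for rate in program_rates_bps:
            used_pkts = rate / TS_PACKET_BITS       # packets carrying program data
            wasted += max(total_pkts - used_pkts, 0.0) / total_pkts
        return wasted / len(program_rates_bps)

    # A bursty high-quality program, sampled over four seconds.
    print(null_padding_ratio([18e6, 31e6, 12e6, 25e6]))   # ~0.43, i.e. ~43% null packets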

Conditional access module 140 may include logic, processing circuitry, firmware, and/or software for protecting content, such as by using timing sensitive encryption mechanisms. Timing synchronization is crucial between a program's MPEG transport stream and the program's timing sensitive encryption information. Channel formation module 150 may include logic, processing circuitry, firmware, and/or software for converting the raw data for each channel to one RF band. Channel formation module 150 may dynamically assign content to a particular channel based on information from end user requests fed through the return channel. The modulator 160 may comprise logic, processing circuitry, firmware, and/or software for modulating a baseband signal onto a carrier signal using amplitude, frequency, or phase modulation to generate a broadcast RF signal 162. The RF combiner network 170 may comprise logic, processing circuitry, firmware, and/or software for combining the RF data signals 122 provided by the CMTS 120 with the broadcast RF signals 162 provided by the modulator 160 to provide downstream data and video content as part of the stream 175. Stream 175 further includes upstream data uploaded by an end user device (e.g., a Set Top Box (STB), computer, tablet, or other handheld communication device) and delivered to CMTS 120 as upstream data 172. The return channel may carry end user broadcast video service information back to the server to perform network optimization or service configuration.

Fig. 2A is a high-level block diagram of an example of an integrated video and data system 200 in accordance with one or more implementations of the present technique. The integrated video and data system 200 includes a DOCSIS and broadcast video block 210 coupled via a first transmission medium 203 to a first transmission network 202, including but not limited to an Ethernet network. DOCSIS and broadcast video block 210 is also coupled via a second transmission medium 205 to a second transmission network 204 that includes a modem (e.g., a cable modem) 220 and an STB (e.g., an STB with Conditional Access (CA)) 230. The first transmission medium 203 and the second transmission medium 205 include coaxial cables (also referred to as "cabling") and/or optical fibers. DOCSIS and broadcast video block 210 includes: an upstream signal path to process upstream data traffic received by the first transport network 202 for transmission to the second transport network 204; and a downstream signal path that processes downstream data traffic and downstream video traffic for transmission to the second transport network 204, as will be discussed in greater detail herein.

Fig. 2B is a block diagram of an example implementation of modem 220 and STB 230 of fig. 2A, in accordance with one or more implementations of the present technology. The modem 220 may be a cable modem or an Optical Network Terminal (ONT) and is coupled to a network 250, such as a home IP network or other IP network (e.g., Wi-Fi). Modem 220 includes an MPEG Transport Stream (TS) and/or IP conversion module 222, an upstream modulator 224, and a downstream demodulator 226, which are known modules and may include logic, processing circuitry, firmware, and/or software for tuning upstream data received from an end-user device, such as Customer Premises Equipment (CPE) connected to network 250, to a DOCSIS band, and for transmitting downstream data to the end-user device over network 250.

A plurality of cable modems (e.g., modem 220) and CPEs (e.g., STB 230) are connected to DOCSIS and broadcast video block 210 via cable plant 205. For broadcast video, the STB 230 is typically used to receive content and display it on a display device 235, such as a Television (TV). STB 230 includes, but is not limited to, a tuner/demodulator 232, a conditional access engine 234, and a video decoder 236. Tuner/demodulator 232 may select an RF channel from cable plant 205 and convert the downstream RF signal to digital content (bits). Conditional access engine 234 may descramble/decrypt the digital content for an authorized end user (e.g., CPE). The video decoder 236 may decode the compressed video content and send it to display device 235 (e.g., a TV).

Fig. 3 is a schematic diagram illustrating an example of an integrated content distribution system 300, in accordance with one or more implementations of the present technique. The integrated content distribution system 300 (hereinafter referred to as "distribution system 300") is an integrated device (e.g., a chip such as a semiconductor chip) and can be used as a front-end server for both data services and video broadcasting under the control of the unified management unit 306. Distribution system 300 is connected to a backbone server (not shown for simplicity) through one of a plurality of Ethernet connections via Ethernet IP processor 320. Distribution system 300 includes an upstream signal path 302 and a downstream signal path 304 coupled to Ethernet IP processor 320. Upstream signal path 302 includes, but is not limited to, an analog-to-digital converter (ADC) 312 (e.g., a high-speed ADC), an upstream demodulator 310, and other RF circuitry (not shown for simplicity). ADC 312 receives upstream analog data from cable plant 205 of fig. 2A and converts the analog upstream data to upstream digital data for further processing by upstream demodulator 310, which may demodulate the upstream digital data into DOCSIS upstream data.
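
As a purely illustrative sketch (the 12-bit depth and full-scale range are assumptions, not taken from the patent), the first upstream step can be pictured as the high-speed ADC quantizing the analog upstream waveform into digital samples for the upstream demodulator:

    # Hypothetical sketch of the upstream ADC stage: quantize an analog
    # upstream burst into signed digital codes for the upstream demodulator.
    import numpy as np

    def adc_sample(analog_waveform, bits=12, full_scale=1.0):
        """Quantize a real-valued waveform to signed integer sample codes."""
        levels = 2 ** (bits - 1)
        clipped = np.clip(analog_waveform, -full_scale, full_scale)
        return np.round(clipped / full_scale * (levels - 1)).astype(np.int16)

    # A short upstream carrier burst, digitized for further demodulation.
    t = np.arange(256)
    burst = 0.5 * np.cos(2 * np.pi * 0.05 * t)
    codes = adc_sample(burst)
    print(codes.dtype, codes.min(), codes.max())   # int16 codes within +/-1024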

Downstream signal path 304 includes ethernet IP processor 320, DOCSIS downstream processor 330, broadcast video processor 340, Digital Signal Processor (DSP)350, and digital-to-analog converter (DAC) 360. DOCSIS downstream processor 330 may include logic, processing circuitry, firmware, and/or software for processing to comply with the DOCSIS protocol. For example, DOCSIS downstream processor 330 performs important functions of layering various DOCSIS traffic based on timing requirement sensitivity. DOCSIS downstream processor 330 also performs functionality such as bandwidth control, dejitter, and other conventional DOCSIS functionality.
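
As an illustration only (the flow attributes and latency thresholds below are assumptions, not values from the patent), the layering step can be sketched as a classification of DOCSIS downstream flows by how much latency they tolerate:

    # Hypothetical sketch of layering DOCSIS downstream traffic by timing
    # sensitivity so that the scheduler can place latency-critical flows first.
    from dataclasses import dataclass

    @dataclass
    class DocsisFlow:
        service_flow_id: int
        max_latency_ms: float     # latency tolerance of the flow (assumed attribute)
        rate_limit_bps: int       # cap applied by bandwidth control (assumed attribute)

    def layer_flows(flows):
        """Group flows into timing layers (thresholds are illustrative)."""
        layers = {"realtime": [], "interactive": [], "best_effort": []}
        for f in flows:
            if f.max_latency_ms <= 10:
                layers["realtime"].append(f)
            elif f.max_latency_ms <= 100:
                layers["interactive"].append(f)
            else:
                layers["best_effort"].append(f)
        return layers

    print(layer_flows([DocsisFlow(1, 5, 20_000_000),
                       DocsisFlow(2, 50, 100_000_000),
                       DocsisFlow(3, 500, 50_000_000)]))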

Broadcast video processor 340 is used to perform conventional MPEG TS processing on each of the program streams provided by one or more video servers (e.g., 130 of fig. 1). For example, the broadcast video processor 340 may divide a high bandwidth stream into a plurality of sub-TS streams, where the peak rate of each sub-TS stream is less than the maximum bandwidth limit of the original RF channel. Broadcast video processor 340 may further remove network jitter, perform Program Clock Reference (PCR) correction and conditional access, and other functionality.
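
A minimal sketch of the sub-stream split, assuming a 38 Mbps per-channel cap and simple round-robin packet distribution (the patent does not specify the splitting rule, and PCR re-stamping is omitted here):

    # Hypothetical sketch: split a high-bandwidth MPEG transport stream into
    # sub-TS streams whose peak rate stays under an assumed per-channel cap.
    CHANNEL_CAP_BPS = 38_000_000      # assumed legacy RF channel limit

    def split_into_substreams(ts_packets, source_peak_bps):
        """Distribute TS packets round-robin over enough sub-streams that each
        sub-stream's share of the peak rate is below CHANNEL_CAP_BPS."""
        n_sub = -(-source_peak_bps // CHANNEL_CAP_BPS)   # ceiling division
        substreams = [[] for _ in range(n_sub)]
        for i, pkt in enumerate(ts_packets):
            substreams[i % n_sub].append(pkt)
        return substreams

    # An 8K program peaking at ~90 Mbps needs three sub-streams under the cap.
    packets = [bytes(188) for _ in range(10)]
    subs = split_into_substreams(packets, 90_000_000)
    print(len(subs), [len(s) for s in subs])   # 3 [4, 3, 3]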

DSP 350 includes a downstream channel formation and/or scheduler 352, a digital modulator 354, and a digital channel combiner 356. Downstream channel formation and/or scheduler 352 may include logic, processing circuitry, firmware, and/or software for scheduling and locating packets from each individual video or DOCSIS flow throughout the available band. The scheme for scheduling content in the available bandwidth is based on timing sensitivity, delay and jitter requirements, and other overall quality of service (QoS) attributes of the video and DOCSIS content. The QoS information for the content may be provided by DOCSIS downstream processor 330 and broadcast video processor 340. For example, broadcast video may be preferentially placed into the bandwidth of each channel and mixed with some DOCSIS traffic, which is less time sensitive and may be used to fill the remaining available bandwidth. In this way, the conventional channel concept still exists, and high bandwidth broadcast content and DOCSIS content can be spread over several or all conventional channels. The channel concept applies only to legacy devices. An important advantage of this approach is that the full remaining bandwidth (previously filled with null packets) is used for some of the DOCSIS services. On the receiver side, a client device (e.g., a cable modem or set-top box) is used to demodulate and separate the mixed video and DOCSIS packets in the downstream channel into their corresponding packet streams. In particular, a full bandwidth capable client device (e.g., with a full band capture tuner and demodulator) may be used to receive any video and DOCSIS content in any downstream channel.
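
The scheduling idea described above (video first, DOCSIS filling what would otherwise be null packets) can be sketched as follows; the per-interval slot count and packet labels are assumptions for illustration only:

    # Hypothetical sketch of channel formation: timing-sensitive broadcast
    # video packets are placed first, and the leftover slots that a legacy
    # system would fill with null packets carry DOCSIS data instead.
    def fill_channel(slots_per_interval, video_packets, docsis_packets):
        """Return the packet sequence for one channel interval."""
        schedule = list(video_packets[:slots_per_interval])
        remaining = slots_per_interval - len(schedule)
        schedule.extend(docsis_packets[:max(remaining, 0)])
        return schedule

    video = [f"VID{i}" for i in range(6)]
    docsis = [f"DOC{i}" for i in range(20)]
    print(fill_channel(10, video, docsis))
    # ['VID0', ..., 'VID5', 'DOC0', ..., 'DOC3'] -- no null packets remain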

Digital modulator 354 is a known block and may comprise logic, processing circuitry, firmware, and/or software for modulating signals received from downstream channel formation and/or scheduler 352. The modulated signal comprises a DOCSIS CMTS output and at least one broadcast video RF output, which are mixed by a digital channel combiner 356 to generate a digital RF downstream. The digital RF downstream is converted to analog RF downstream via DAC 360.
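
A minimal numerical sketch of a digital channel combiner follows; the sample rate, channel center frequencies, and test tones are assumptions, and real channel stacking would include filtering and rate conversion not shown here:

    # Hypothetical sketch: shift each modulated channel (complex baseband) to
    # its RF slot and sum the real parts into one composite digital downstream
    # that the DAC then converts to an analog RF downstream.
    import numpy as np

    FS = 1_228_800_000.0   # assumed DAC sample rate in Hz

    def combine_channels(channels):
        """channels: list of (complex_baseband_samples, center_freq_hz)."""
        n = max(len(s) for s, _ in channels)
        t = np.arange(n) / FS
        composite = np.zeros(n)
        for samples, fc in channels:
            s = np.zeros(n, dtype=complex)
            s[:len(samples)] = samples
            composite += np.real(s * np.exp(2j * np.pi * fc * t))
        return composite

    docsis_ch = (np.exp(2j * np.pi * 0.01 * np.arange(1024)), 600e6)   # test tone
    video_ch = (np.exp(2j * np.pi * 0.02 * np.arange(1024)), 666e6)    # test tone
    print(combine_channels([docsis_ch, video_ch]).shape)               # (1024,)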

Unified management unit 306 may include logic, processing circuitry, firmware, and/or software for controlling and managing the operation of the various modules and components of upstream signal path 302 and downstream signal path 304.

Fig. 4 is a schematic diagram illustrating an example of Ethernet IP processor 320 of fig. 3, in accordance with one or more implementations of the present technique. Ethernet IP processor 320 is connected to the backbone server by one of a plurality of Ethernet connections and includes an upstream path consisting of DOCSIS upstream receiver 410 and IP framer 420, and a downstream path including IP deframer 430, payload analyzer 440, DOCSIS downstream interface 450, and broadcast video (e.g., MPEG TS) module 460. IP deframer 430, payload analyzer 440, DOCSIS downstream interface 450, and broadcast video module 460 may each comprise logic, processing circuitry, firmware, and/or software. DOCSIS upstream receiver 410 interfaces with upstream demodulator 310 of fig. 3 to receive and provide DOCSIS upstream data to IP framer 420. IP framer 420 encapsulates the DOCSIS upstream data into IP frames and sends them to the backbone server via an Ethernet connection. IP deframer 430 strips the IP headers from the IP frames received from the backbone server. Payload analyzer 440 may separate DOCSIS downstream data traffic and broadcast video traffic for further processing by DOCSIS downstream processor 330 and broadcast video processor 340, respectively, of fig. 3. The separated DOCSIS downstream data traffic and broadcast video traffic are delivered to DOCSIS downstream processor 330 and broadcast video processor 340 via DOCSIS downstream interface 450 and broadcast video module 460, respectively.
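
The framing and payload dispatch can be pictured with the following sketch; the 5-byte header (a 1-byte payload-type tag plus a length field) is an assumption made for illustration, as the patent does not specify a frame format:

    # Hypothetical sketch of the Ethernet IP processor's upstream framing and
    # downstream deframing/payload analysis. Header layout is assumed.
    import struct

    TYPE_DOCSIS_DS = 1
    TYPE_VIDEO_TS = 2

    def ip_frame(payload_type, payload):
        """IP framer: prepend a minimal (type, length) header to upstream content."""
        return struct.pack("!BI", payload_type, len(payload)) + payload

    def ip_deframe(frame):
        """IP deframer: strip the header and return (type, payload)."""
        ptype, length = struct.unpack("!BI", frame[:5])
        return ptype, frame[5:5 + length]

    def payload_analyzer(frame, docsis_ds_queue, video_queue):
        """Separate DOCSIS downstream data traffic from broadcast video traffic."""
        ptype, payload = ip_deframe(frame)
        (docsis_ds_queue if ptype == TYPE_DOCSIS_DS else video_queue).append(payload)

    docsis_q, video_q = [], []
    payload_analyzer(ip_frame(TYPE_VIDEO_TS, b"\x47" + bytes(187)), docsis_q, video_q)
    print(len(docsis_q), len(video_q))   # 0 1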

Fig. 5 is a schematic diagram illustrating an example of integrated video and data channel content 500, in accordance with one or more implementations of the present technique. Integrated video and data channel content 500 includes timing sensitive content 510 comprising broadcast video and less timing sensitive content 520 comprising various DOCSIS data, such as downstream DOCSIS data.

Fig. 6 is a schematic diagram illustrating an example of a video auxiliary channel 600, in accordance with one or more implementations of the present technique. It can be appreciated that the entire video content can be delivered over multiple physical layer (PHY) channels. The video auxiliary channel 600 includes multiple channels, such as video channels 602, 604, 606, and 608, which are PHY-separated from a super-channel. The time-varying content 610, 612, 614, and 616 of the video channels 602, 604, 606, and 608 and their corresponding less time-varying content 620, 622, 624, and 626 are shown in fig. 6.

Fig. 7 is a schematic diagram illustrating an example of data, video content, and cable plant spectrum 700, in accordance with one or more implementations of the present technique. Example data, video content, and cable plant spectrum 700 corresponds to distribution system 300 of fig. 3 and includes DOCSIS CMTS downstream spectrum 710, video Quadrature Amplitude Modulation (QAM) spectrum 720, and cable plant spectrum 730. The DOCSIS CMTS downstream spectrum 710 includes a plurality of DOCSIS downstream bands, such as bands 712, 714, and 716 separated by two frequency intervals 713 and 715. Video QAM spectrum 720 includes QAM video bands 722 and 724, which are located in frequency intervals 713 and 715 of DOCSIS CMTS downstream spectrum 710. Cable plant spectrum 730 is the combined video content and DOCSIS downstream spectrum, which is the combination of DOCSIS CMTS downstream spectrum 710 and video QAM spectrum 720 and is completely filled by the DOCSIS downstream bands (712, 714, and 716) and the QAM video bands (722 and 724).
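
The fully-filled plan of fig. 7 can be checked with a short sketch; the band edges below are made-up values chosen only to mirror the figure's arrangement of DOCSIS bands and interleaved video QAM bands:

    # Hypothetical sketch: verify that video QAM bands placed in the gaps
    # between DOCSIS downstream bands leave the cable plant spectrum with no
    # unused intervals. Band edges (MHz) are assumed for illustration.
    docsis_bands = [(258, 402), (450, 594), (642, 786)]   # e.g. bands 712, 714, 716
    video_bands = [(402, 450), (594, 642)]                # e.g. bands 722, 724

    def is_fully_filled(bands):
        """True if the sorted bands tile the spectrum with no gaps or overlaps."""
        bands = sorted(bands)
        return all(hi == lo for (_, hi), (lo, _) in zip(bands, bands[1:]))

    print(is_fully_filled(docsis_bands + video_bands))   # True: spectrum is contiguous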

Fig. 8 is a schematic diagram illustrating an example of a data and video content spectrum 800. An example of the data and video content spectrum 800 corresponds to a conventional data and video content system, such as the content distribution system provided by the environment 100 of fig. 1. As shown in fig. 8, band 810 corresponding to video content and bands 820 and 830 corresponding to DOCSIS data are separated by frequency intervals 813 and 823, which are wasteful. This indicates that the integrated device of the present technology (e.g., 300 in fig. 3) is substantially more spectrally efficient than the conventional system (e.g., 100 of fig. 1).

Fig. 9 is a schematic diagram illustrating an example of a broadcast video content and data service content spectrum 900 in accordance with one or more implementations of the present technique. Spectrum 900 includes broadcast video content 910 (910-1, 910-2, 910-3, and 910-4) and DOCSIS data 912 (912-1, 912-2, 912-3, and 912-4) bandwidths for channels 902 (902-1, 902-2, 902-3, and 902-4) for a narrowband modulation scheme, and broadcast video content 920 and DOCSIS data 922 for channel 904 for a wideband modulation scheme. Regardless of the modulation scheme, broadcast video content (e.g., 910 or 920) and DOCSIS data (e.g., 912 or 922) coexist in either of downstream channels 902 or 904. In one or more implementations, the narrowband modulation scheme may be narrowband QAM and the wideband modulation scheme may be an Orthogonal Frequency Division Multiplexing (OFDM) modulation scheme. For end-user devices, whether legacy devices or the user devices of the present technology, frequency-based channelization is no longer a limitation for bandwidth-sensitive content services. In contrast, in the conventional method, different services are separated by frequency allocation and frequency division. For example, DOCSIS services remain within the assigned DOCSIS channel or band, and legacy broadcast video services are limited to 6 or 8 MHz bands.

Fig. 10 is a schematic diagram illustrating an example implementation of a user device 1000, in accordance with one or more implementations of the present technology. The user device 1000 includes one or more high-speed ADCs 1010, a digital tuner and digital demodulator 1020, a transport stream filter 1030, a content and/or stream recovery module 1040, and a content and/or service stream management unit 1050 that provides broadband services 1060. User device 1000 can recover information from a frequency-channelized RF system without being limited by the channelization. The high-speed ADC 1010 digitizes a wide frequency band or the entire frequency band containing multiple channelized RF systems based on frequency division, which may have different channel spacings and/or different modulation types. Digital tuner and digital demodulator 1020 may convert the digitized RF signals into a mixed transport stream. For each particular service, one or more of transport stream filters 1030 may be employed to select the relevant sub-streams. The relevant sub-streams may be selected from different frequency channels with different modulations. The content and/or stream recovery module 1040 may combine all relevant streams for a particular service of interest. The recovered stream is no longer limited by the RF channel frequency bandwidth or modulation type. The recovered content stream may then be managed by content and/or service stream management unit 1050 (e.g., a conventional one-way conditional access control system or a two-way communication system for access control) to deliver the content stream for the broadband service 1060.
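
The sub-stream selection and recovery step can be sketched as follows; the packet fields and the merge-by-sequence rule are assumptions for illustration, not details taken from the patent:

    # Hypothetical sketch of the user device's recovery path: packets
    # demodulated from several RF channels are filtered by service identifier
    # and merged back into one content stream, independent of which channel
    # or modulation carried them.
    from dataclasses import dataclass

    @dataclass
    class DemodPacket:
        rf_channel: int      # frequency channel the packet was demodulated from
        service_id: int      # broadband service the packet belongs to
        seq: int             # continuity/sequence number within the service
        payload: bytes

    def recover_service(mixed_stream, wanted_service_id):
        """Transport stream filter + content/stream recovery for one service."""
        selected = [p for p in mixed_stream if p.service_id == wanted_service_id]
        return [p.payload for p in sorted(selected, key=lambda p: p.seq)]

    mixed = [DemodPacket(1, 7, 0, b"a"), DemodPacket(3, 9, 0, b"x"),
             DemodPacket(2, 7, 2, b"c"), DemodPacket(1, 7, 1, b"b")]
    print(recover_service(mixed, 7))   # [b'a', b'b', b'c']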

Fig. 11 is a flow diagram of an example of a method 1100 of integrating video content and data in accordance with one or more implementations of the present technique. The method 1100 includes configuring an upstream signal path (e.g., 302 of fig. 3) comprised of an ADC (e.g., 312 of fig. 3) and an upstream demodulator (e.g., 310 of fig. 3) to process upstream data traffic received from a first transport network (e.g., 202 of fig. 2A) (1110). Downstream data traffic and downstream video traffic are processed using a downstream signal path (e.g., 304 of fig. 3) including an Ethernet processor (e.g., 320 of fig. 3), a DSP (e.g., 350 of fig. 3), and a DAC (e.g., 360 of fig. 3) (1120). The processed downstream data traffic and the processed downstream video traffic are combined (e.g., via 356 of fig. 3) to provide analog downstream data and video traffic (e.g., 362 of fig. 3) (1130). The analog downstream data and video traffic is transmitted to the user over a second transport network (e.g., 204 of fig. 2A) (1140).

Figure 12 conceptually illustrates an electronic system with which any implementation of the present technology may be implemented. Electronic system 1200 may be, for example, a network device, an STB device, a media converter, a desktop computer, a laptop computer, a tablet computer, a server, a switch, a router, a base station, a receiver, a telephone, or generally any electronic device that transmits signals over a network. Such an electronic system 1200 includes various types of computer-readable media and interfaces for various other types of computer-readable media. In one or more implementations, the electronic system 1200 may perform some of the functionality of the integrated device (e.g., 300 of fig. 3) of the present technology, e.g., execute one or more software modules of the distribution system 300. Electronic system 1200 includes bus 1208, one or more processing units 1212, system memory 1204, Read Only Memory (ROM) 1210, persistent storage 1202, input device interface 1214, output device interface 1206, and one or more network interfaces 1216, or subsets and variations thereof.

Bus 1208 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of electronic system 1200. In one or more implementations, bus 1208 communicatively connects the one or more processing units 1212 with the ROM 1210, the system memory 1204, and the permanent storage 1202. From these various memory units, the one or more processing units 1212 retrieve instructions to execute and data to process in order to perform the processes of the present invention. In different implementations, the one or more processing units 1212 may be a single processor or a multi-core processor. In one or more implementations, the one or more processing units 1212 may perform some of the processing functionality of the distribution system 300 of the present technology by executing respective software modules.

The ROM 1210 stores static data and instructions for the processing unit(s) 1212 and other modules of the electronic system. Persistent storage 1202, on the other hand, is a read-write memory device. Persistent storage 1202 is a non-volatile memory unit that stores instructions and data even when electronic system 1200 is turned off. One or more embodiments of the present invention use a mass storage device (e.g., a magnetic or optical disk and its corresponding disk drive) as persistent storage 1202.

Other implementations use removable storage (such as floppy disks or flash drives and their corresponding disk drives) as persistent storage 1202. Like persistent storage 1202, system memory 1204 is a read-write memory device. However, unlike persistent storage 1202, system memory 1204 is a volatile read-and-write memory such as a Random Access Memory (RAM). The system memory 1204 stores any of the instructions and data required by the one or more processing units 1212 at runtime. In one or more implementations, the processes of the present invention are stored in system memory 1204, persistent storage 1202 and/or ROM 1210. From these various memory units, one or more processing units 1212 retrieves instructions to execute and data to process in order to perform the processes of one or more implementations.

The bus 1208 is also connected to an input device interface 1214 and an output device interface 1206. The input device interface 1214 enables a user to communicate information and select commands to the electronic system 1200. Input devices used with input device interface 1214 include, for example, alphanumeric keyboards and pointing devices (also referred to as "cursor control devices"). The output device interface 1206 enables, for example, the display of images generated by the electronic system 1200. Output devices used with the output device interface 1206 include, for example, printers and display devices, such as Liquid Crystal Displays (LCDs), Light Emitting Diode (LED) displays, Organic Light Emitting Diode (OLED) displays, flexible displays, flat panel displays, solid state displays, projectors, or any other device for outputting information. One or more implementations include devices that function as both input devices and output devices, such as touch screens. In these implementations, the feedback provided to the user can be any form of sensory feedback, such as visual, auditory, or tactile feedback, and input from the user can be received in any form, including acoustic, speech, or tactile input.

Finally, as shown in fig. 12, bus 1208 also couples electronic system 1200 to one or more networks (not shown) through one or more network interfaces 1216. In this manner, the computer may be part of one or more networks of computers, such as a LAN, a Wide Area Network (WAN), an intranet, or a network of multiple networks, such as the internet. Any or all of the components of electronic system 1200 may be used in conjunction with the present invention.

Implementations within the scope of the present invention may be implemented, in part or in whole, using a tangible computer-readable storage medium (or multiple tangible computer-readable storage media of one or more types) encoding one or more instructions. Tangible computer-readable storage media may also be non-transitory in nature.

A computer-readable storage medium may be any storage medium that can be read, written, or otherwise accessed by a general purpose or special purpose computing device, including any processing electronics and/or processing circuitry capable of executing instructions. For example, and without limitation, a computer-readable medium may comprise any volatile semiconductor memory, such as RAM, DRAM, SRAM, T-RAM, Z-RAM, and TTRAM. The computer-readable medium may also include any non-volatile semiconductor memory, such as ROM, PROM, EPROM, EEPROM, NVRAM, flash memory, nvSRAM, FeRAM, FeTRAM, MRAM, PRAM, CBRAM, SONOS, RRAM, NRAM, racetrack memory, FJG, and Millipede memory.

Further, the computer-readable storage medium may include any non-semiconductor memory, such as optical disk memory, magnetic tape, other magnetic storage devices, or any other medium capable of storing one or more instructions. In some implementations, the tangible computer-readable storage medium may be directly coupled to the computing device, while in other implementations, the tangible computer-readable storage medium may be indirectly coupled to the computing device, e.g., via one or more wired connections, one or more wireless connections, or any combination thereof.

The instructions may be executed directly, or may be used to develop executable instructions. For example, the instructions may be implemented as executable or non-executable machine code, or may be implemented as instructions of a high-level language that may be compiled to produce executable or non-executable machine code. Further, instructions may also be implemented as or may contain data. Computer-executable instructions may also be organized in any format including routines, subroutines, programs, data structures, objects, modules, applications, applets, functions, and the like. As recognized by one of ordinary skill in the art, details including, but not limited to, number, structure, order, and organization of instructions may vary significantly without changing the underlying logic, function, processing, and output.

Although the discussion above has primarily referred to a microprocessor or multi-core processor executing Software (SW), one or more implementations are performed by one or more integrated circuits, such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA). In one or more implementations, such integrated circuits execute instructions stored on the circuit itself.

The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean "one and only one" (unless specifically so stated), but rather "one or more." The term "some" means one or more unless specifically stated otherwise. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. The headings and subheadings, if any, are for convenience only and do not limit the invention.

The terms "configured to," "operable to," and "programmed to" do not imply any particular tangible or intangible modification to an object, but are intended to be used interchangeably. For example, a processor configured to monitor and control an operation or component may also mean a processor programmed to monitor and control the operation or a processor operable to monitor and control the operation. Likewise, a processor configured to execute code may be constructed as a processor programmed to execute code or operable to execute code.

A phrase such as an "aspect" does not imply that such aspect is essential to the technology or that such aspect applies to all configurations of the technology. The disclosure in connection with an aspect may apply to all configurations or one or more configurations. A phrase such as an "aspect" may refer to one or more aspects and vice versa. A phrase such as a "configuration" does not imply that such configuration is essential to the present technology or that such configuration applies to all configurations of the present technology. The disclosure relating to configurations may apply to all configurations or one or more configurations. A phrase such as "configured" may refer to one or more configurations and vice versa.

The word "example" is used herein to mean "serving as an example or illustration. Any aspect or design described herein as an "example" is not necessarily to be construed as preferred or advantageous over other aspects or designs.

All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. § 112, paragraph 6, unless the element is expressly recited using the phrase "means for" or, in the case of a method claim, the element is recited using the phrase "step for." Furthermore, to the extent that the terms "includes," "has," "having," and the like are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.

Those of skill in the art will appreciate that the various illustrative blocks, modules, elements, components, methods, and algorithms described herein may be implemented as electronic hardware, computer software, or combinations of both. To illustrate this interchangeability of hardware and software, various illustrative blocks, modules, elements, components, methods, and algorithms have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application. The various components and blocks may be arranged differently (e.g., arranged in a different order or partitioned in a different manner), all without departing from the scope of the present technology.
