Multi-screen interaction method and system

Document No.: 1342051    Publication date: 2020-07-17

Note: This technology, "一种多屏交互方法及其系统" (Multi-screen interaction method and system), was designed and created by 李小波, 王振超 and 李昆仑 on 2020-06-15. Abstract: The application discloses a multi-screen interaction method and a system thereof, wherein the multi-screen interaction method comprises the following steps: receiving an access request sent by an operation terminal and completing device access; binding with the operation terminal that has currently completed device access, and establishing a real-time communication channel; and receiving and displaying interaction behaviors through the real-time communication channel, wherein the interaction types of the interaction behaviors comprise synchronization data or operation instructions. The method and the system can simplify the operation process of the display terminal and improve the effect of viewing the data of the operation terminal.

1. A multi-screen interaction method, characterized by comprising the following steps:

receiving an access request sent by an operation terminal and completing device access;

binding with the operation terminal that has currently completed device access, and establishing a real-time communication channel;

receiving and displaying interaction behaviors through the real-time communication channel, wherein the interaction types of the interaction behaviors comprise: synchronization data or operation instructions.

2. A multi-screen interaction method as claimed in claim 1, wherein the sub-steps of binding with the operation terminal that has currently completed device access and establishing a real-time communication channel are as follows:

sending access completion information to the operation terminal, wherein the access completion information at least comprises: an identity code of the operation terminal;

receiving a display screen preemption request fed back by the operation terminal after it receives the access completion information, and completing the display screen connection;

and after the display screen connection is completed, completing the establishment of the real-time communication channel.

3. A multi-screen interaction method as recited in claim 1, wherein the sub-steps of receiving and displaying interaction behavior via the real-time communication channel are as follows:

determining an interaction type of the interaction behavior, wherein the interaction type comprises: at least one of synchronization data or operational instructions;

and processing and displaying the interaction behavior according to the interaction type.

4. A multi-screen interaction method as recited in claim 3, wherein, when the interaction type is synchronous data, the sub-steps of receiving and displaying the synchronous data via the real-time communication channel are as follows:

acquiring synchronous data and judging the data type of the synchronous data;

performing data detection on the synchronous data according to the data type to generate a detection result;

processing the synchronous data according to the detection result to generate processed synchronous data;

and synchronizing the processed synchronous data to a display screen for displaying.

5. A multi-screen interaction method as claimed in claim 4, wherein if the data type is video data, the sub-step of detecting the video data is as follows:

obtaining the maximum allowable time and the average time for decoding one frame of image of the video data;

and obtaining the playing capability parameter by using the maximum allowable time and the average time.

6. A multi-screen interaction method as recited in claim 5, wherein the playing capability parameter, denoted P, is obtained by using the maximum allowable time and the average time, where t_avg denotes the average time and t_max denotes the maximum allowable time.

7. A multi-screen interaction system, comprising: a display terminal and an operation terminal;

wherein the display terminal is used for performing the multi-screen interaction method of any one of claims 1-6;

and the operation terminal is used for sending an access request to the display terminal, establishing a real-time communication channel with the display terminal, and sending synchronization data or an operation instruction to the display terminal through the real-time communication channel.

8. A multi-screen interaction system as recited in claim 7, wherein the display terminal comprises: a display screen, a data processing device and a cloud storage;

wherein the cloud storage is used for storing historical identification codes, for storing the operation logs reported by the display terminal, and for receiving the synchronized interaction behaviors;

the data processing device is used for acquiring the interaction behaviors from the cloud storage, processing the synchronization data in the interaction behaviors, and generating processed synchronization data;

and the display screen is used for receiving and displaying the synchronization data or the operation instructions.

9. A multi-screen interaction system as recited in claim 8, wherein the data processing device comprises: a data acquisition unit, a detection unit and a data processing unit;

wherein the data acquisition unit is used for acquiring the synchronization data from the cloud storage and determining the data type of the synchronization data;

the detection unit is used for performing data detection on the corresponding synchronization data according to the data type and generating a detection result;

and the data processing unit is used for receiving the detection result, processing the synchronization data according to the detection result, generating processed synchronization data, and synchronizing the processed synchronization data to the display screen for display.

10. A multi-screen interaction system as recited in claim 8, wherein the operation terminal has a display unit, and the size of the display unit is smaller than the size of the display screen.

Technical Field

The present application relates to the field of communications technologies, and in particular, to a multi-screen interaction method and system.

Background

With the development of multimedia compression technology and network communication technology, media service providers have introduced more and more video content with high compression ratios, high resolution and high frame rates, which greatly improves users' visual experience and enriches their entertainment. However, playing such content places high demands on the computing power and data processing capability of the terminal player. The operation process of playing devices with large display screens (such as set-top boxes and televisions) is complicated, and the user experience is poor. Meanwhile, when data is viewed directly on a mobile device with a small display screen (a mobile phone, a tablet computer, etc.), it is inconvenient to examine details.

In addition, when the playing device receives and plays data synchronized from the mobile terminal, frame skipping is prone to occur.

Disclosure of Invention

The purpose of the present application is to provide a multi-screen interaction method and a multi-screen interaction system that can simplify the operation process of the display terminal and improve the effect of viewing the data of the operation terminal.

In order to achieve the above object, the present application provides a multi-screen interaction method, comprising the following steps: receiving an access request sent by an operation terminal and completing device access; binding with the operation terminal that has currently completed device access, and establishing a real-time communication channel; and receiving and displaying interaction behaviors through the real-time communication channel, wherein the interaction types of the interaction behaviors comprise: synchronization data or operation instructions.

As above, the sub-steps of binding with the operation terminal that has currently completed device access and establishing the real-time communication channel are as follows: sending access completion information to the operation terminal, wherein the access completion information at least comprises the identity code of the operation terminal; receiving a display screen preemption request fed back by the operation terminal after it receives the access completion information, and completing the display screen connection; and after the display screen connection is completed, completing the establishment of the real-time communication channel.

As above, wherein the sub-steps of receiving and displaying the interactive behavior through the real-time communication channel are as follows: determining an interaction type of the interaction behavior, wherein the interaction type comprises: at least one of synchronization data or operational instructions; and processing and displaying the interaction behavior according to the interaction type.

As above, wherein, when the interaction type is synchronous data, the sub-steps of receiving and displaying the synchronous data through the real-time communication channel are as follows: acquiring synchronous data and judging the data type of the synchronous data; performing data detection on the synchronous data according to the data type to generate a detection result; processing the synchronous data according to the detection result to generate processed synchronous data; and synchronizing the processed synchronous data to a display screen for displaying.

As above, if the data type is video data, the sub-steps of detecting the video data are as follows: obtaining the maximum allowable time and the average time for decoding one frame of image of the video data; and obtaining the playing capability parameter by using the maximum allowable time and the average time.

As above, the playing capability parameter, denoted P, is obtained by using the maximum allowable time and the average time, where t_avg denotes the average time and t_max denotes the maximum allowable time.

The present application further provides a multi-screen interaction system, comprising a display terminal and an operation terminal, wherein the display terminal is used for executing the multi-screen interaction method described above, and the operation terminal is used for sending an access request to the display terminal, establishing a real-time communication channel with the display terminal, and sending synchronization data or an operation instruction to the display terminal through the real-time communication channel.

As above, the display terminal comprises a display screen, a data processing device and a cloud storage, wherein the cloud storage is used for storing historical identification codes, for storing the operation logs reported by the display terminal, and for receiving the synchronized interaction behaviors; the data processing device is used for acquiring the interaction behaviors from the cloud storage, processing the synchronization data in the interaction behaviors, and generating processed synchronization data; and the display screen is used for receiving and displaying the synchronization data or the operation instructions.

As above, the data processing device comprises a data acquisition unit, a detection unit and a data processing unit, wherein the data acquisition unit is used for acquiring the synchronization data from the cloud storage and determining the data type of the synchronization data; the detection unit is used for performing data detection on the corresponding synchronization data according to the data type and generating a detection result; and the data processing unit is used for receiving the detection result, processing the synchronization data according to the detection result, generating processed synchronization data, and synchronizing the processed synchronization data to the display screen for display.

As above, the operation terminal has a display unit, and the size of the display unit is smaller than the size of the display screen.

The beneficial effects achieved by the present application are as follows:

(1) The display terminal is used for display and the operation terminal is used for operation. The display screen (large screen) of the display terminal can present details that the display unit (small screen) of the operation terminal cannot, which makes it convenient for the user to view the detailed content of the data. Complex operations of the display terminal are carried out through the operation terminal, so interactive operations (such as changing the viewing angle, zooming into details, sliding and so on) can be performed more conveniently to browse the displayed content, improving the usability, convenience and reliability of user operation.

(2) The interaction between the display terminal and the operation terminal is mainly asynchronous, that is, there is no need to wait for a response; after an interaction behavior is received and decrypted, the content that requires a response is displayed or its effect is rendered. This yields higher transmission efficiency and lower communication delay.

Drawings

In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments described in the present application, and those skilled in the art can obtain other drawings from them.

FIG. 1 is a schematic diagram of a multi-screen interaction system according to an embodiment;

FIG. 2 is a flowchart of an embodiment of a multi-screen interaction method.

Detailed Description

The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.

The present application provides a multi-screen interaction method and a multi-screen interaction system, which can simplify the operation process of the display terminal and improve the effect of viewing the data of the operation terminal.

As shown in fig. 1, the present application provides a multi-screen interaction system, including: a display terminal 110 and an operation terminal 120.

The display terminal is used for performing the multi-screen interaction method described below.

The operation terminal is used for sending an access request to the display terminal, establishing a real-time communication channel with the display terminal, and sending synchronization data or an operation instruction to the display terminal through the real-time communication channel. Specifically, the operation terminal is a handheld device, for example a cell phone or a tablet (Pad).

Further, the display terminal comprises: a display screen, a data processing device and a cloud storage.

The cloud storage is used for storing historical identification codes, for storing the operation logs reported by the display terminal, and for receiving the synchronized interaction behaviors.

The data processing device is used for acquiring the interaction behaviors from the cloud storage, processing the synchronization data in the interaction behaviors, and generating the processed synchronization data.

The display screen is used for receiving and displaying the synchronization data or the operation instructions.

Further, the data processing device comprises: a data acquisition unit, a detection unit and a data processing unit.

The data acquisition unit is used for acquiring the synchronization data from the cloud storage and determining the data type of the synchronization data.

The detection unit is used for performing data detection on the corresponding synchronization data according to the data type and generating a detection result.

The data processing unit is used for receiving the detection result, processing the synchronization data according to the detection result, generating the processed synchronization data, and synchronizing the processed synchronization data to the display screen for display.

Further, the operation terminal has a display unit, and the size of the display unit is smaller than that of the display screen.
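The division of labour among the cloud storage, the data processing device and the display screen can be pictured with a minimal Python sketch; the class and method names are illustrative assumptions, not part of the patent.

    from dataclasses import dataclass, field

    @dataclass
    class CloudStorage:
        # stores historical identification codes, reported operation logs,
        # and the interaction behaviors synchronized by the operation terminal
        history_codes: set = field(default_factory=set)
        operation_logs: list = field(default_factory=list)
        interactions: list = field(default_factory=list)

        def push_interaction(self, behavior):
            self.interactions.append(behavior)

        def pop_interaction(self):
            return self.interactions.pop(0) if self.interactions else None

    class DataProcessingDevice:
        # pulls interaction behaviors from the cloud storage and prepares the
        # synchronization data they carry (detection and processing come later)
        def __init__(self, cloud):
            self.cloud = cloud

        def fetch(self):
            return self.cloud.pop_interaction()

    class DisplayScreen:
        def show(self, payload):
            print("displaying:", payload)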

As shown in fig. 2, the present application provides a multi-screen interaction method, comprising the following steps:

S210: receiving an access request sent by the operation terminal and completing device access.

Further, the sub-steps of receiving the access request sent by the operation terminal and completing device access are as follows:

Q1: receiving the access request sent by the operation terminal.

Specifically, after the display terminal is started, the operation terminal sends an access request to the display terminal through a temporary communication channel of the display terminal; after the display terminal receives the access request, Q2 is executed. The access request includes: a security certificate of the operation terminal and an identification code of the operation terminal.

Q2: and processing the access request to generate an authentication result.

Further, the sub-steps of processing the access request and generating the authentication result are as follows:

Q210: identifying the identification code in the access request and generating an identification result.

Specifically, the identification result includes: authentication not required and authentication required. After receiving the access request, the display terminal identifies the identification code in the access request; if the identification code is recognized as a historical identification code, the generated identification result is that authentication is not required, and if it is recognized as a new identification code, the generated identification result is that authentication is required. After the display terminal generates the identification result, Q220 is executed.

A historical identification code is the identification code of an operation terminal that has previously completed device access with the display terminal. A new identification code is the identification code of an operation terminal that sends an access request to the display terminal for the first time.

Q220: analyzing the identification result, and if the identification result is that authentication is not needed, directly generating an authentication result; and if the identification result is that the authentication is required, authenticating the security certificate in the access request and generating an authentication result.

Specifically, the authentication result includes: authentication success and authentication failure. If the display terminal determines from the identification result that authentication is not required, the authentication result is generated directly as authentication success. If the identification result is that authentication is required, the security certificate in the access request is authenticated: if the security certificate is legal, authentication success is generated; if it is illegal, authentication failure is generated. After the display terminal generates the authentication result, Q3 is executed.

Q3: if the authentication result is authentication success, the device access is completed; if the authentication result is authentication failure, an operation log is generated and automatically reported to the cloud storage.

Specifically, after generating the authentication result, the display terminal judges it: if the authentication result is authentication success, the device access is completed and S220 is executed; if the authentication result is authentication failure, the data related to the operation is assembled into an operation log and automatically uploaded to the cloud storage, where the stored operation log can be used for data analysis or risk monitoring.
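A minimal sketch of the access flow Q1-Q3, assuming the access request is a dictionary and certificate legality is a simple membership test; every name here is hypothetical.

    import time

    def handle_access_request(request, history_codes, legal_certificates, cloud_logs):
        # Q210: a historical identification code needs no authentication, a new one does
        code = request["terminal_code"]
        needs_auth = code not in history_codes

        # Q220: generate the authentication result
        if needs_auth:
            success = request["certificate"] in legal_certificates   # placeholder legality check
            if success:
                history_codes.add(code)        # the code becomes a historical one next time
        else:
            success = True

        # Q3: complete device access, or report an operation log to the cloud storage
        if success:
            return "device_access_completed"
        cloud_logs.append({"terminal": code, "event": "authentication_failed", "ts": time.time()})
        return "access_denied"

    print(handle_access_request({"terminal_code": "op-1", "certificate": "certA"},
                                history_codes=set(), legal_certificates={"certA"}, cloud_logs=[]))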

S220: and binding with the operation terminal which finishes the equipment access at present, and establishing a real-time communication channel.

Further, the sub-steps of binding with the operation terminal that has currently completed device access and establishing a real-time communication channel are as follows:

W1: sending access completion information to the operation terminal, wherein the access completion information at least comprises the identity code of the operation terminal.

Specifically, after the device access is completed, the display terminal sends the access completion information to the operation terminal; after receiving it, the operation terminal feeds back a display screen preemption request to the display terminal, and W2 is executed.

W2: and receiving a display screen preemption request fed back by the operation terminal after receiving the access completion information, and completing display screen connection.

Further, receiving a display screen preemption request fed back by the operation terminal, and completing the display screen connection according to the following substeps:

e1: the current connection state of the display screen is checked and a state result is generated.

Specifically, the display terminal checks the connection state of the display screen: if the display screen is currently connected to a historical operation terminal, the generated state result is occupied; if the display screen is not connected to any operation terminal, the generated state result is unoccupied. After the state result is generated, E2 is executed.

E2: and performing display screen connection on the operation terminal according to the state result.

Further, the sub-steps of connecting the display screen to the operation terminal according to the state result are as follows:

E210: analyzing the state result; if the state result is occupied, E220 is executed; if the state result is unoccupied, E230 is executed.

E220: sending a display screen preemption instruction to the newly accessed operation terminal, disconnecting the display screen from the historical operation terminal, and executing E230.

Specifically, when the state result is occupied, the display screen is currently connected to a historical operation terminal, and the display terminal sends a display screen preemption instruction to the newly accessed operation terminal. After the newly accessed operation terminal receives the instruction, it preempts the display screen; once preemption succeeds, data synchronization between the display terminal and the previous operation terminal is disconnected, and E230 is executed.

E230: and completing the display screen connection.

Specifically, after the display terminal is connected to the display screen of the operation terminal, W3 is executed.
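The state check and preemption in E1-E230 can be sketched as follows, assuming a dictionary holds the screen's connection state and `send` is any callable that delivers an instruction to a terminal; the names are hypothetical.

    def connect_display_screen(screen_state, new_terminal, send):
        # E1: check the current connection state of the display screen
        occupied = screen_state.get("connected_terminal") is not None

        # E210/E220: if occupied, let the new terminal preempt the screen and
        # drop data synchronization with the historical operation terminal
        if occupied:
            send(new_terminal, "display_screen_preemption_instruction")
            screen_state["connected_terminal"] = None

        # E230: complete the display screen connection with the new terminal
        screen_state["connected_terminal"] = new_terminal
        return "display_screen_connected"

    state = {"connected_terminal": "old-terminal"}
    print(connect_display_screen(state, "new-terminal", send=lambda t, msg: print("->", t, msg)))
    print(state)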

W3: and after the connection of the display screen is completed, the establishment of a real-time communication channel is completed.

Specifically, after the connection of the display screen is completed, the sub-step of completing the establishment of the real-time communication channel is as follows:

w310: and sending display screen connection success information to the operation terminal.

Specifically, after the display terminal completes the display screen connection with the operation terminal, it sends display screen connection success information to the operation terminal, and W320 is executed.

W320: judging whether an interaction request is received within a preset time range; if the interaction request is received within the preset time range, the establishment of the real-time communication channel is completed; if the interaction request is not received within the preset time range, a rebinding instruction is sent to the operation terminal. The interaction request is information sent by the operation terminal after it receives the display screen connection success information.

Specifically, as an embodiment, the preset time range is the period during which the display terminal sends 3 consecutive display screen connection success messages to the operation terminal. If the display terminal does not receive an interaction request from the same operation terminal within the preset time range, it actively disconnects the current operation terminal and sends a rebinding instruction to it; if the operation terminal needs to connect again, S220 is re-executed according to the rebinding instruction. If the display terminal receives the interaction request within the preset time range, the establishment of the real-time communication channel is completed, S230 is executed, and the operation terminal immediately performs data synchronization and playback with the display terminal.

Specifically, the communication protocol of the real-time communication channel broadcasts the current message together with the previous 3 messages as one information unit, which effectively reduces data inconsistency caused by packet loss, packet corruption, packet sticking and the like.
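The framing described above (the current message plus the previous three broadcast as one information unit) can be sketched as follows; the class name and message layout are illustrative only.

    from collections import deque

    class RedundantBroadcaster:
        def __init__(self, history=3):
            self.recent = deque(maxlen=history)   # the previous messages kept for redundancy

        def make_unit(self, message):
            unit = list(self.recent) + [message]  # previous messages first, current message last
            self.recent.append(message)
            return unit

    # each unit overlaps the previous ones, so a single lost packet can be recovered
    channel = RedundantBroadcaster()
    for seq in range(1, 6):
        print(channel.make_unit({"seq": seq}))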

S230: and receiving and displaying the interactive behaviors through a real-time communication channel.

Specifically, after the real-time communication channel is established, the operation terminal asynchronously sends the interaction behavior to the cloud storage, the data processing device of the display terminal obtains the interaction behavior from the cloud storage and processes the interaction behavior, and the processed synchronous data are synchronously displayed on the display screen.

Further, the sub-steps of receiving and displaying the interactive behavior through the real-time communication channel are as follows:

p1: determining an interaction type of the interaction behavior, wherein the interaction type comprises: synchronizing at least one of data or operational instructions.

Specifically, after the display terminal obtains the interactive behavior, the interactive type of the interactive behavior is determined.

P2: and processing and displaying the interaction behavior according to the interaction type.

Specifically, after the interaction type is determined, the interaction behavior is processed according to that type: if the interaction type is an operation instruction, the operation instruction is passed directly to the display screen and the corresponding operation is performed; if the interaction type is synchronous data, the synchronous data is detected and processed, and the processed synchronous data is displayed on the display screen.
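A minimal dispatch sketch for P1-P2; the dictionary layout of an interaction behavior and the function names are assumptions.

    def handle_interaction(behavior, display, process_sync_data):
        kind = behavior["type"]                              # P1: determine the interaction type
        if kind == "operation_instruction":
            display(behavior["payload"])                     # execute/display the instruction directly
        elif kind == "sync_data":
            display(process_sync_data(behavior["payload"]))  # detect and process before display
        else:
            raise ValueError("unknown interaction type: " + kind)

    handle_interaction({"type": "sync_data", "payload": "video-frames"},
                       display=print, process_sync_data=lambda d: "processed " + d)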

Further, when the interaction type is synchronous data, the sub-step of receiving and displaying the synchronous data through the real-time communication channel is as follows:

r1: and acquiring synchronous data and judging the data type of the synchronous data.

Specifically, after the data acquisition unit acquires the synchronization data from the cloud storage, the data acquisition unit judges the data type of the synchronization data, determines the data type, and executes R2.

Wherein, the data type of the synchronous data at least comprises: video data, image data, audio data, or text data.

R2: and carrying out data detection on the synchronous data according to the data type to generate a detection result.

Specifically, as a first embodiment, if the data type is image data, the clarity and the degree of damage of the image data are detected; if the clarity of the image data meets a preset threshold and the data is not damaged, the image data is synchronized to the display screen for display.

Specifically, as a second embodiment, if the data type is audio data, the clarity and the degree of damage of the audio data are detected; if the clarity of the audio data meets a preset threshold and the data is not damaged, the audio data is synchronized to the display screen for playback.

Specifically, as a third embodiment, if the data type is video data, the sub-step of detecting the video data is as follows:

r210: the maximum allowable time and the average time of one frame of image in the decoded video data are obtained.

Specifically, the detection unit obtains the maximum allowable time t_max for decoding one frame of image according to the frame rate of the acquired video data, and obtains the average time t_avg of the time required to decode a frame of image.

R220: obtaining the playing capability parameter by using the maximum allowable time and the average time.

Further, the playing capability parameter is obtained from the maximum allowable time and the average time, wherein P is the playing capability parameter, t_avg is the average time, and t_max is the maximum allowable time.

R230: judging the playing capability parameter to generate a detection result, wherein the detection result includes: frame skipping and no frame skipping.

Specifically, if the detection unit judges from the playing capability parameter that the video data acquired by the data acquisition unit is likely to exhibit frame skipping when displayed, the generated detection result is frame skipping, and the video data is sent to the data processing unit to execute R3.

If the detection unit judges from the playing capability parameter that the video data will not exhibit frame skipping when displayed, the generated detection result is no frame skipping, and the video data is synchronized directly to the display screen for display.
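The published formula for the playing capability parameter is not reproduced in this copy; the sketch below assumes it is the ratio of the average decode time t_avg to the maximum allowable per-frame time t_max (derived from the frame rate), with frame skipping suspected once the ratio reaches 1. Treat both the formula and the threshold as assumptions.

    def detect_video_playability(frame_rate_fps, decode_times_ms):
        t_max = 1000.0 / frame_rate_fps                          # per-frame decoding budget in ms
        t_avg = sum(decode_times_ms) / len(decode_times_ms)      # average measured decode time in ms
        capability = t_avg / t_max                               # assumed playing capability parameter
        result = "frame_skipping" if capability >= 1.0 else "no_frame_skipping"
        return {"t_max": t_max, "t_avg": t_avg, "capability": capability, "result": result}

    print(detect_video_playability(30, [28.0, 35.2, 41.1]))   # roughly 34.8 ms average vs a 33.3 ms budget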

R3: and processing the synchronous data according to the detection result to generate processed synchronous data.

Specifically, as an embodiment, when the synchronization data is video data, the synchronization data is processed according to the detection result, and the sub-step of generating the processed synchronization data is as follows:

r310: and determining the frame skipping position of the frame skipping data.

Specifically, the data processing unit receives the video data that may exhibit frame skipping and performs a secondary check on it. If the secondary check finds no frame skipping position, the video data does not actually have a frame skipping condition, the video data itself is used as the processed synchronization data, and R4 is executed. If the secondary check finds a frame skipping position, the frame skipping position is determined and R320 is executed.

R320: and carrying out image matching on the front frame image and the rear frame image of the determined frame skipping position, and carrying out interpolation to obtain an interpolation image.

Specifically, the image matching may use a compression-first-filtering (CPF) matching algorithm, but is not limited to it; a gray-scale-based matching algorithm, a feature-based matching algorithm, a relationship-based matching algorithm and the like may also be used.

R330: comparing the interpolated image with the image to be processed to generate a comparison result.

The image to be processed is the frame-skip coded frame obtained from the original input image with local compensation; the interpolated image is the interpolation reference image of the image to be processed.
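The interpolation performed in step R320 is not spelled out beyond the choice of matching algorithm; as a stand-in, the sketch below simply averages the frames before and after the frame-skipping position, which is an assumption rather than the CPF-based method named above.

    import numpy as np

    def interpolate_frame(prev_frame, next_frame):
        # average the neighbouring frames pixel by pixel to obtain an interpolated reference image
        prev_arr = np.asarray(prev_frame, dtype=np.float32)
        next_arr = np.asarray(next_frame, dtype=np.float32)
        return ((prev_arr + next_arr) / 2.0).astype(np.uint8)

    # usage with two tiny 2x2 grayscale "frames"
    print(interpolate_frame([[0, 50], [100, 150]], [[10, 70], [120, 170]]))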

Further, as another embodiment, the matching degree between co-located regions of the image to be processed and the interpolated image is obtained, and the comparison result is generated according to the matching degree; the sub-steps are as follows:

Y1: acquiring a first signal-to-noise ratio threshold and the objective signal-to-noise ratio of each block in the interpolated image.

Specifically, according to the matching probability, the signal-to-noise ratios of all blocks in the image are histogrammed to obtain the corresponding first signal-to-noise ratio threshold T1.

Specifically, a mismatch flag is obtained for each pair of co-located regions of the interpolated image and the image to be processed, wherein m_i is the mismatch flag of block i, i.e. the validity of the image matching model in block i; B'_i is the block of the interpolated image at the same position as block i; PSNR_i is the signal-to-noise ratio of block i; T1 is the first signal-to-noise ratio threshold, whose value is related to the matching probability; p is the matching probability; N is the total number of blocks within one frame of image; and i is a natural number.

In particular, the mismatch flag m_i of a block indicates whether the co-located regions match or mismatch. After m_i has been obtained for all blocks, Y2 is executed.

Y2: acquiring the skip flag bit of each co-located region of the image to be processed and the interpolated image by using the first signal-to-noise ratio threshold and the objective signal-to-noise ratio of the blocks in the interpolated image, and generating the comparison result according to the skip flags.

Further, the skip flag bit of each co-located region of the interpolated image (FR frame) and the image to be processed (F frame) is obtained as follows, wherein s_i is the skip flag bit of block i; F_i is the block of the image to be processed at the same position as block i; PSNR'_i is the signal-to-noise ratio of block i in the interpolated image; T is the signal-to-noise ratio decision threshold; T1 is the first signal-to-noise ratio threshold; T2 is an introduced second signal-to-noise ratio threshold; PSNR_avg is the average signal-to-noise ratio of the whole interpolated image; and d_min is the minimum distance between the signal-to-noise ratio of a mismatched block and the average signal-to-noise ratio.

In particular, if the skip flag of block i indicates that the block must be coded, the comparison result for that block is that processing is required; otherwise the block does not need to be coded, only the flag bit s_i is transmitted, and the comparison result is that no processing is required. The skip flag bit of a block thus expresses the matching degree.
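The thresholding formulas in Y1-Y2 appear only as images in the source; the sketch below assumes that the first signal-to-noise ratio threshold T1 is read from the histogram of per-block PSNR values at the quantile set by the matching probability, and that a block counts as matched when its PSNR reaches T1. Both assumptions are illustrative, not the published definition.

    def mismatch_flags(block_psnrs, matching_probability):
        ranked = sorted(block_psnrs)
        # assumed: T1 is the PSNR below which a fraction (1 - matching_probability) of blocks fall
        cut = int(round((1.0 - matching_probability) * (len(ranked) - 1)))
        t1 = ranked[cut]
        # assumed: a block is matched (flag 1) when its PSNR reaches T1, otherwise mismatched (flag 0)
        flags = [1 if psnr >= t1 else 0 for psnr in block_psnrs]
        return t1, flags

    print(mismatch_flags([28.0, 31.5, 22.4, 35.0, 30.1], matching_probability=0.8))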

Specifically, as another embodiment, the data processing unit compares the interpolated image with the image to be processed; if the similarity of the co-located regions of the interpolated image and the image to be processed is high (greater than or equal to a similarity threshold of 80%), the comparison result is that no processing is required; if the similarity of the co-located regions is low (less than the 80% similarity threshold), the comparison result is that processing is required.

R340: processing the image to be processed according to the comparison result to obtain a complement image.

Specifically, when the comparison result is that no processing is required, the co-located region of the interpolated image directly replaces the corresponding region of the image to be processed, and the replaced image to be processed is the complement image. When the comparison result is that processing is required, the corresponding region of the image to be processed is complement-coded to obtain the complement image. After the complement image is obtained, R350 is executed.

R350: inserting the complement image at the corresponding frame skipping position of the video data, the result serving as the processed synchronization data.

Specifically, after the data processing unit obtains the processed synchronization data, R4 is executed.
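A minimal sketch of the comparison and complement flow above, under simplifying assumptions: blocks are plain numeric lists, the similarity of each co-located block pair is precomputed, the 0.8 threshold mirrors the 80% criterion, and the complement coding is a placeholder transform; all names are hypothetical.

    def build_complement_image(to_process, interpolated, similarities, threshold=0.8):
        complement = []
        for blk_f, blk_fr, sim in zip(to_process, interpolated, similarities):
            if sim >= threshold:
                complement.append(blk_fr)   # no processing required: take the interpolated region
            else:
                # processing required: placeholder complement coding of the region
                complement.append([(a + b) // 2 for a, b in zip(blk_f, blk_fr)])
        return complement

    def insert_at_skip_positions(frames, skip_to_complement):
        # insert each complement image at its frame-skip position, largest index first
        out = list(frames)
        for pos in sorted(skip_to_complement, reverse=True):
            out.insert(pos, skip_to_complement[pos])
        return out

    # usage: two blocks per frame, one frame-skip position at index 2
    comp = build_complement_image([[10, 10], [200, 200]], [[12, 12], [90, 90]], [0.95, 0.4])
    print(insert_at_skip_positions(["f0", "f1", "f3"], {2: comp}))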

Further, as another embodiment, the interaction type may include both synchronous data and an operation instruction, where the operation instruction is a slide. When the synchronous data is image data or text data, the frame-skipping data is processed by combining the slide distance and the slide time of the finger on the screen of the operation terminal; after calculation, intermediate frames can be compensated automatically, so that the user does not visually perceive frame skipping.
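A sketch of the slide-compensation idea; the display frame interval, the linear interpolation of values and the function name are all assumptions, since the text only states that intermediate frames are computed from the slide distance and slide time.

    def compensate_slide_frames(start_value, end_value, slide_distance_px, slide_time_ms,
                                frame_interval_ms=16.7):
        steps = max(1, int(slide_time_ms / frame_interval_ms))          # frames that fit in the slide
        velocity = slide_distance_px / slide_time_ms                    # slide speed in px per ms
        intermediate = [start_value + (end_value - start_value) * i / steps for i in range(1, steps)]
        return {"velocity_px_per_ms": velocity, "intermediate_frames": intermediate}

    print(compensate_slide_frames(0, 300, slide_distance_px=300, slide_time_ms=100))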

R4: and synchronizing the processed synchronous data to a display screen for displaying.

Specifically, when the display screen supports it, display in the WebP format is preferred, which uses less bandwidth and shortens the user's waiting time.


While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the scope of protection of the present application is intended to be interpreted to include the preferred embodiments and all variations and modifications that fall within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
