Live broadcast page switching method, video page switching method, electronic device and storage medium

Document No.: 134863 | Publication date: 2021-10-22

Note: this technology, "Live broadcast page switching method, video page switching method, electronic device and storage medium", was designed and created by 涂方蕾 and 吴伟嘉 on 2021-08-12. Its main content is as follows: the application discloses a live broadcast page switching method, comprising: playing first stream data in the live broadcast page of a first live broadcast room; in response to a user's sliding touch event, defining a plurality of sliding phases within a sliding cycle; in a first sliding phase, in which the user's finger slides on the touch screen while remaining in contact with it, maintaining playback of the first live broadcast room's page, pulling second stream data for an adjacent second live broadcast room, and playing the second stream data in the second live broadcast room's page; and in a second sliding phase, after the user's finger leaves the touch screen and before the sliding animation ends, determining the target live broadcast room on which the view will settle once the animation ends. The application also discloses a video page switching method, an electronic device, and a storage medium. The application abstracts the sliding touch events between live broadcast pages into a plurality of sliding phases within a complete sliding cycle, and uses these phases to optimize the loading of live broadcast pictures during the slide.

1. A live broadcast page switching method, implemented by a touch screen terminal, characterized by comprising the following steps:

playing first stream data in a live broadcast page of a first live broadcast room;

defining a plurality of sliding phases within a sliding cycle in response to a sliding touch event of a user;

in a first sliding stage, in which the user's finger slides on the touch screen while remaining in contact with it, maintaining playback of the live broadcast page of the first live broadcast room, pulling second stream data for an adjacent second live broadcast room, and playing the second stream data in the live broadcast page of the second live broadcast room;

in a second sliding stage, after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target live broadcast room on which the view will settle after the sliding animation ends:

if the target live broadcast room is determined to be the second live broadcast room, maintaining the pulling and playing of the second stream data of the second live broadcast room;

and if the target live broadcast room is determined to be the first live broadcast room, maintaining playback of the first stream data in the live broadcast page of the first live broadcast room and stopping pulling the second stream data.

2. The live page switching method of claim 1, further comprising:

in the first sliding stage, muting the second stream data that is pulled and played.

3. The live page switching method according to claim 1, wherein, in the second sliding stage after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target live broadcast room on which the view will settle after the sliding animation ends comprises:

acquiring the sliding direction, sliding distance, sliding speed, or acceleration of the user's finger within a preset time period before the finger leaves the touch screen;

and determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends.

4. The live broadcast page switching method according to claim 3, wherein determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends comprises:

if it is determined that the sliding direction is not opposite to the position of the second live broadcast room relative to the first live broadcast room, determining that the target live broadcast room is the first live broadcast room;

and if the sliding direction is opposite to the position of the second live broadcast room relative to the first live broadcast room and the sliding distance exceeds a first threshold, determining that the target live broadcast room is the second live broadcast room.

5. The live broadcast page switching method according to claim 4, wherein determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends further comprises:

if the sliding direction is opposite to the position of the second live broadcast room relative to the first live broadcast room and the sliding distance does not exceed the first threshold, further determining whether the sliding speed or acceleration exceeds a second threshold;

and if it is determined that the sliding speed or acceleration exceeds the second threshold, determining that the target live broadcast room is the second live broadcast room.

6. The live broadcast page switching method according to claim 5, wherein determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends further comprises:

if it is determined that the sliding speed or acceleration does not exceed the second threshold, further determining whether the product of the sliding distance and the sliding speed or acceleration exceeds a third threshold;

if it is determined that the product of the sliding distance and the sliding speed or acceleration does not exceed the third threshold, determining that the target live broadcast room is the first live broadcast room;

and if it is determined that the product of the sliding distance and the sliding speed or acceleration exceeds the third threshold, determining that the target live broadcast room is the second live broadcast room.

7. The live page switching method according to any one of claims 1 to 6, further comprising:

in the second sliding stage, if the target live broadcast room is determined to be the second live broadcast room, enabling the sound of the second stream data.

8. The live page switching method according to any one of claims 1 to 6, further comprising:

in the second sliding stage, if the target live broadcast room is determined to be the second live broadcast room, muting the first stream data, stopping pulling the first stream data, and leaving the live broadcast page of the first live broadcast room on the last frame of the first stream data pulled.

9. The live page switching method according to any one of claims 1 to 6, further comprising:

in a third sliding stage after the sliding animation ends, if the target live broadcast room is determined to be the second live broadcast room, fully displaying the live broadcast page of the second live broadcast room and loading the other business processes of the second live broadcast room.

10. The live page switching method according to any one of claims 1 to 6, further comprising:

in the third sliding stage, if the target live broadcast room is determined to be the second live broadcast room, exiting the first live broadcast room and clearing and resetting the live broadcast page of the first live broadcast room.

11. The live page switching method according to any one of claims 1 to 6, further comprising:

in the second sliding stage, if the target live broadcast room is determined to be the first live broadcast room, muting the second stream data and leaving the live broadcast page of the second live broadcast room on the last frame of the second stream data pulled.

12. The live page switching method according to any one of claims 1 to 6, further comprising:

in a third sliding stage after the sliding animation ends, if the target live broadcast room is determined to be the first live broadcast room, exiting the second live broadcast room and clearing and resetting the live broadcast page of the second live broadcast room.

13. A video page switching method, characterized by comprising the following steps:

playing a first video in a first page;

determining whether a second video to be played in a second page is streaming media, wherein the second page is adjacent to the first page;

if the second video is not streaming media, preloading the second video;

if the second video is streaming media, not preloading the second video, and

defining a plurality of sliding phases within a sliding cycle in response to a sliding touch event of a user,

in a first sliding stage, in which the user's finger slides on the touch screen while remaining in contact with it, keeping playing the first video in the first page, pulling the second video, which is streaming media, and playing the second video in the second page,

in a second sliding stage, after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target page on which the view will settle after the sliding animation ends:

if the target page is determined to be the second page, maintaining the pulling and playing of the second video, which is streaming media,

and if the target page is determined to be the first page, stopping pulling the second video.

14. The method of claim 13, wherein if the second video is streaming media, the method further comprises:

in a third sliding stage after the sliding animation ends, if the target page is determined to be the second page, fully displaying the second page and loading the other business processes of the streaming media.

15. The method of claim 13, wherein if the second video is streaming media, the method further comprises:

in a third sliding stage after the sliding animation ends, if the target page is determined to be the first page, exiting the streaming media and clearing and resetting the second page.

16. An electronic device, comprising: a processor and a memory storing a computer program, the processor being configured to perform the method of any one of claims 1 to 15 when the computer program runs.

17. A storage medium, characterized in that the storage medium stores a computer program configured to perform the method of any one of claims 1 to 15 when executed.

Technical Field

The present application relates to the field of computer technologies, and in particular, to a live page switching method and a video page switching method. The application also relates to a related electronic device and a storage medium.

Background

With the rapid development of intelligent mobile terminals and mobile communication technologies, streaming media technologies represented by live broadcasting are widely applied to mobile terminals.

When playing streaming media, such as watching a live broadcast, users often demand high resolution, low latency, and a high frame rate from the played pictures, and also expect effective interaction with the anchor during playback, which places high demands on the processing capability and communication bandwidth of the mobile terminal.

When watching streaming media such as a live broadcast on a mobile terminal, an audience user can switch between different playing pages, such as live broadcast rooms, by sliding the screen. However, because streaming media involves large amounts of data, as described above, and places high demands on processing capacity and bandwidth, when the audience user switches between live broadcast rooms there is a buffering period between entering the new live broadcast room and seeing its first frame, which seriously affects the user experience.

This background description is for the purpose of facilitating understanding of relevant art in the field and is not to be construed as an admission of the prior art.

Disclosure of Invention

Therefore, embodiments of the present invention are intended to provide a live page switching method and apparatus, and a related electronic device and storage medium, which reduce or avoid the perceived delay when a user switches a streaming media playing page, such as a live page, providing a good user experience without excessively consuming the terminal's processing and communication bandwidth resources.

In a first aspect, a live page switching method implemented by a touch screen terminal is provided, and includes:

playing first stream data in a live broadcast page of a first live broadcast room;

defining a plurality of sliding phases within a sliding cycle in response to a sliding touch event of a user;

in a first sliding stage, in which the user's finger slides on the touch screen while remaining in contact with it, maintaining playback of the live broadcast page of the first live broadcast room, pulling second stream data for an adjacent second live broadcast room, and playing the second stream data in the live broadcast page of the second live broadcast room;

in a second sliding stage, after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target live broadcast room on which the view will settle after the sliding animation ends:

if the target live broadcast room is determined to be the second live broadcast room, maintaining the pulling and playing of the second stream data of the second live broadcast room;

and if the target live broadcast room is determined to be the first live broadcast room, maintaining playback of the first stream data in the live broadcast page of the first live broadcast room and stopping pulling the second stream data.

In some embodiments, the live page switching method further comprises:

in the first sliding stage, muting the second stream data that is pulled and played.

In some embodiments, determining, in the second sliding stage after the user's finger leaves the touch screen and before the sliding animation finishes, the target live broadcast room on which the view will settle after the sliding animation ends includes:

acquiring the sliding direction, sliding distance, sliding speed, or acceleration of the user's finger within a preset time period before the finger leaves the touch screen;

and determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends includes:

if it is determined that the sliding direction is not opposite to the position of the second live broadcast room relative to the first live broadcast room, determining that the target live broadcast room is the first live broadcast room;

and if the sliding direction is opposite to the position of the second live broadcast room relative to the first live broadcast room and the sliding distance exceeds a first threshold, determining that the target live broadcast room is the second live broadcast room.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends further includes:

if the sliding direction is opposite to the position of the second live broadcast room relative to the first live broadcast room and the sliding distance does not exceed the first threshold, further determining whether the sliding speed or acceleration exceeds a second threshold;

and if it is determined that the sliding speed or acceleration exceeds the second threshold, determining that the target live broadcast room is the second live broadcast room.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends further includes:

if it is determined that the sliding speed or acceleration does not exceed the second threshold, further determining whether the product of the sliding distance and the sliding speed or acceleration exceeds a third threshold;

if it is determined that the product of the sliding distance and the sliding speed or acceleration does not exceed the third threshold, determining that the target live broadcast room is the first live broadcast room;

and if it is determined that the product of the sliding distance and the sliding speed or acceleration exceeds the third threshold, determining that the target live broadcast room is the second live broadcast room.

In some embodiments, the live page switching method further comprises:

in the second sliding stage, if the target live broadcast room is determined to be the second live broadcast room, enabling the sound of the second stream data.

In some embodiments, the live page switching method further comprises:

in the second sliding stage, if the target live broadcast room is determined to be the second live broadcast room, muting the first stream data, stopping pulling the first stream data, and leaving the live broadcast page of the first live broadcast room on the last frame of the first stream data pulled.

In some embodiments, the live page switching method further comprises:

in a third sliding stage after the sliding animation ends, if the target live broadcast room is determined to be the second live broadcast room, fully displaying the live broadcast page of the second live broadcast room and loading the other business processes of the second live broadcast room.

In some embodiments, the live page switching method further comprises:

in the third sliding stage, if the target live broadcast room is determined to be the second live broadcast room, exiting the first live broadcast room and clearing and resetting the live broadcast page of the first live broadcast room.

In some embodiments, the live page switching method further comprises:

in the second sliding stage, if the target live broadcast room is determined to be the first live broadcast room, muting the second stream data and leaving the live broadcast page of the second live broadcast room on the last frame of the second stream data pulled.

In some embodiments, the live page switching method further comprises:

in a third sliding stage after the sliding animation ends, if the target live broadcast room is determined to be the first live broadcast room, exiting the second live broadcast room and clearing and resetting the live broadcast page of the second live broadcast room.

In a second aspect, a video page switching method is provided, including:

playing a first video in a first page;

determining whether a second video to be played in a second page is streaming media, wherein the second page is adjacent to the first page;

if the second video is not streaming media, preloading the second video;

if the second video is streaming media, not preloading the second video, and

defining a plurality of sliding phases within a sliding cycle in response to a sliding touch event of a user,

in a first sliding stage, in which the user's finger slides on the touch screen while remaining in contact with it, keeping playing the first video in the first page, pulling the second video, which is streaming media, and playing the second video in the second page,

in a second sliding stage, after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target page on which the view will settle after the sliding animation ends:

if the target page is determined to be the second page, maintaining the pulling and playing of the second video, which is streaming media,

and if the target page is determined to be the first page, stopping pulling the second video.

This second aspect is particularly suited to situations where non-streaming videos, such as short videos, are intermixed with streaming videos, such as live broadcasts, among the slidably switchable videos.

In some embodiments, the streaming media is a live broadcast.

In some embodiments, if the second video is streaming media, the method further comprises:

in a third sliding stage after the sliding animation ends, if the target page is determined to be the second page, fully displaying the second page and loading the other business processes of the live broadcast.

In some embodiments, if the second video is streaming media, the method further comprises:

in a third sliding stage after the sliding animation ends, if the target page is determined to be the first page, exiting the live broadcast and clearing and resetting the second page.

In some embodiments, determining, in the second sliding stage after the user's finger leaves the touch screen and before the sliding animation finishes, the target page on which the view will settle after the sliding animation ends includes:

acquiring the sliding direction, sliding distance, sliding speed, or acceleration of the user's finger within a preset time period before the finger leaves the touch screen;

and determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target page on which the view will settle after the sliding animation ends.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target page on which the view will settle after the sliding animation ends includes:

if it is determined that the sliding direction is not opposite to the position of the second page relative to the first page, determining that the target page is the first page;

and if the sliding direction is opposite to the position of the second page relative to the first page and the sliding distance exceeds a first threshold, determining that the target page is the second page.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target page on which the view will settle after the sliding animation ends further includes:

if the sliding direction is opposite to the position of the second page relative to the first page and the sliding distance does not exceed the first threshold, further determining whether the sliding speed or acceleration exceeds a second threshold;

and if it is determined that the sliding speed or acceleration exceeds the second threshold, determining that the target page is the second page.

In a further embodiment, determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target page on which the view will settle after the sliding animation ends further includes:

if it is determined that the sliding speed or acceleration does not exceed the second threshold, further determining whether the product of the sliding distance and the sliding speed or acceleration exceeds a third threshold;

if it is determined that the product of the sliding distance and the sliding speed or acceleration does not exceed the third threshold, determining that the target page is the first page;

and if it is determined that the product of the sliding distance and the sliding speed or acceleration exceeds the third threshold, determining that the target page is the second page.

In a third aspect, an electronic device is provided, comprising: a processor and a memory storing a computer program, the processor being configured to perform the method of any of the embodiments of the invention when the computer program is run.

In a fourth aspect, there is provided a storage medium storing a computer program configured when executed to perform the method of any of the embodiments of the present invention.

The inventors are aware of two existing schemes for the sliding switchover of live broadcasts.

The first scheme starts pulling the stream and entering the room only after the slide animation has completely finished. However, this scheme can play the video stream of only one live broadcast room at a time, and the stream data of the next live broadcast room is pulled only after the sliding animation ends.

The second scheme, upon entering a live broadcast room, simultaneously pulls and buffers the stream data of the current live broadcast room and of the adjacent live broadcast rooms, for example the rooms above and below. The buffered segments of the adjacent rooms can then be shown while sliding. However, this scheme must pull the stream data of several live broadcast rooms at once when entering a room, which wastes considerable bandwidth. As mentioned above, streaming media playback places high demands on the processing power and communication bandwidth of the mobile terminal; if the user's terminal or network is poor, the picture of the current live broadcast room can be seriously affected. Moreover, if the user performs no sliding operation after entering the room, the pulled video stream segments of the other rooms are wasted entirely.

Embodiments of the invention organize stream pulling around streaming media playback such as live broadcasting, and abstract the sliding events of a touch screen terminal: a complete slide is decomposed into the distinct stages of the screen slide, and according to the characteristics of each stage, different operations are performed on the live broadcast rooms. The scheme of embodiments of the invention thus uses the time taken by the page slide itself to perform staged preloading optimization of the live broadcast picture.

Compared with the first existing scheme, the scheme of embodiments of the invention moves the moment of pulling the live stream picture earlier, reducing the loading time of the first frame and the waiting time after the slide ends, and so optimizing the user's live viewing experience; the loading time of the first live frame when sliding between pictures is thus greatly improved. Compared with the second known scheme, the scheme of embodiments of the invention pulls the picture only when the user has shown a clear sliding intention, so no extra bandwidth is wasted; moreover, because the time spent pulling the stream data coincides with the user's sliding time, the experience of switching between live broadcast rooms is not impaired relative to the second scheme.

In short, according to some embodiments of the present invention, the time consumed by the user's page-sliding operation can be fully exploited: while the page slides, the stream picture of the next live broadcast room is already being pulled, muted, and displayed in advance, so that the user can see the first frame of the next live broadcast room as soon as, or even before, the slide completes. At the same time, the large amount of memory space required to pre-pull streaming media or fragments thereof, which may adversely affect the user experience, is avoided.

Additional optional features and technical effects of embodiments of the invention are set forth, in part, in the description which follows and, in part, will be apparent from the description.

Drawings

Embodiments of the invention will hereinafter be described in detail with reference to the accompanying drawings, in which the elements shown are not necessarily drawn to scale, and in which like or similar reference numerals denote like or similar elements:

FIG. 1 illustrates an exemplary live system capable of implementing methods of embodiments of the present invention;

FIG. 2 shows a first schematic flow chart of a method according to an embodiment of the invention;

FIGS. 3A and 3B respectively illustrate exemplary flow diagrams of methods according to embodiments of the invention;

FIG. 4A illustrates an example of a method of implementing an embodiment of the invention;

FIG. 4B illustrates an example of a method of implementing an embodiment of the invention;

FIG. 5 shows a second schematic flow chart of a method according to an embodiment of the invention;

FIGS. 6A, 6B, and 6C illustrate a user sliding on the touch screen of a touch screen terminal;

FIG. 7A shows a schematic structural diagram of an apparatus according to an embodiment of the invention;

FIG. 7B shows a schematic structural diagram of an apparatus according to an embodiment of the invention;

FIG. 8 shows a hardware configuration diagram of an electronic device according to an embodiment of the invention;

FIG. 9 shows a first operating system diagram of an electronic device, according to an embodiment of the invention;

FIG. 10 shows a second operating system diagram of an electronic device, according to an embodiment of the invention.

Detailed Description

In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following detailed description and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.

In embodiments of the present invention, streaming media has its conventional meaning in the art: a technology for transmitting video and audio over the internet for immediate viewing, in which data packets are delivered as a continuous stream rather than downloaded in full before playback.

In embodiments of the present invention, "live broadcast" refers to live webcasting.

In embodiments of the present invention, a live application (APP) is to be interpreted broadly, covering both APPs whose main function is live broadcasting and other types of APPs that include live functions or modules. For example, in some embodiments of the present invention, such applications may include, but are not limited to, music APPs, social APPs, shopping APPs, and video APPs.

In embodiments of the present invention, a "live room" may be broadly interpreted as an independent space provided in a live application in which live audio and video streams are provided by an anchor user to audience users. In embodiments of the invention, "live room" covers both an ordinary live broadcast room and similar live spaces with the character of a live room, such as a "song room". In some embodiments of the invention, a live room may also be referred to simply as a room.

Referring to FIG. 1, an exemplary live system 100 is shown that can be used to implement the methods of embodiments of the present invention. The live system 100 may include a server 101, and may further include a first terminal and a second terminal. Specifically, the first terminal may generate and send live streaming media (live frames) to the server 101; the second terminal may receive and play live streaming media from the server 101.

In some embodiments, the first terminals may be referred to as push-stream ends 111, 112, also called broadcast ends or anchor ends; the second terminals may be called play ends 121, 122, 123 or pull ends, also known as viewer ends or simply user ends. In the embodiment shown in FIG. 1, a plurality of push-stream ends 111, 112 and a plurality of play ends 121, 122, 123 are shown.

In some embodiments, the server 101, the stream pushing end 111, 112 and the playing end 121, 122, 123 of the live system may include various functional modules.

The push-stream ends 111, 112 of the live broadcast system 100, for example mobile terminals such as smartphones, may be configured with functional modules including, but not limited to, an acquisition module, a preprocessing module, an encoder, an encapsulation module, and a push-stream module. In a smartphone implementation, the push-stream end, that is, the anchor end, can acquire video data through the phone's camera and audio data through its microphone; the video and audio data undergo a series of preprocessing, encoding, and encapsulation steps and are then pushed to the server end for distribution. These functional modules of the push-stream end can be integrated in the (anchor-side) live application (APP) or implemented by that application invoking the underlying interfaces of the mobile terminal.

In some embodiments, the SDK of the live APP can directly collect video and audio data through the phone's camera and microphone; video sample data may use RGB or YUV formats, and audio sample data may use the PCM format.

In some embodiments, the pre-processing may include, but is not limited to, processing effects of filters, beauty, watermarking, blurring, and the like. In order to facilitate push streaming, pull streaming and storage of mobile phone video, video coding compression technology is usually adopted to reduce the volume of the video.

An exemplary video codec is the HEVC/H.265 coding format; an exemplary audio codec is the AAC encoding format, with MP3 and WMA as alternatives. Encoding and compressing the streaming video (including audio) greatly improves storage and transmission efficiency; accordingly, the encoded streaming video must be decoded when played.

One key function of the push stream terminals 111, 112 is push stream (streaming data push). Before stream pushing, the audio and video data can be encapsulated by using a transmission protocol to become stream data. Exemplary streaming protocols include, but are not limited to, RTSP, HLS, RTMP, FLV (HTTP-FLV).

RTSP (Real Time Streaming Protocol) is an application-layer protocol jointly proposed by RealNetworks and Netscape for efficiently transmitting streaming media data over IP networks. In some embodiments, RTSP may be employed in an Android-based touch screen terminal.

HLS (HTTP Live Streaming) is an HTTP-based streaming media network transfer protocol proposed by Apple Inc., and is part of Apple's QuickTime X and iPhone software systems. In some embodiments, HLS may be employed in an iOS-based touch screen terminal or another touch screen terminal provided by Apple Inc.

RTMP (Real-Time Messaging Protocol) is a protocol developed by Adobe Systems for audio, video, and data transmission between Flash players and servers. In some embodiments, RTMP may be employed in a touch screen terminal that supports Flash.

FLV (Flash Video) is another video format introduced by Adobe; it is a streaming media container format for storage and transmission over a network.

In some embodiments, the push-stream end may push the audio/video stream data to the server end over the network based on a QoS algorithm, and the server end may distribute it, for example, through a CDN.

The server 101 of the live system 100 may be configured with functional modules including, but not limited to, a streaming media processing system 102 and a business system 103.

The streaming media processing system 102 may include, for example, a plurality of processing modules, such as a transcoding module, a recording module, a watermarking module, a screenshot module, and an explicit-content detection module.

In some embodiments, the transcoding module may be configured to process stream data to adapt it to various terminals and platforms, for example transcoding stream data transmitted in RTMP, HLS, FLV, or other formats, or supporting one-to-many adaptation of a single stream to the different networks and resolutions of play ends. In some embodiments, real-time transcoding can convert a high-bitrate stream pushed by a user (e.g., 720p) into a lower-definition stream (e.g., 360p) in real time to meet the needs of the play end.

The server may be implemented, for example, by one or more servers and one or more databases of various architectures. The service system 103 may be a system for processing other services than live streaming, including for example an interactive system.

Interactive functions are a major attraction of live broadcasting for both anchors and viewers. The interactive system may include, but is not limited to, a chat (room) module, a follow subsystem, a gift subsystem, a statistics subsystem, a leaderboard subsystem, a mic-connect (co-hosting) subsystem, a PK subsystem, and the like. In some embodiments, the push-stream end and the play end may be configured with service modules corresponding to the service system of the server end and may execute the corresponding business processes. The service system can run independently of streaming media playback; accordingly, a live broadcast room at the pull end or play end can provide other business processes independent of streaming media (live) playback.

The playbacks 121, 122, 123 of the live system 100 may be configured with functional modules including, but not limited to, a pull stream (stream data pull) module, a decoder, and a rendering module.

Accordingly, stream pulling (stream data pull) is a key function of the play end. In some embodiments, it may use protocols similar to those described above for stream pushing, for example corresponding to the protocol used at the push-stream end, which is not repeated here.

Accordingly, a decoder may be provided at the play ends 121, 122, 123, which decodes, for example, the coding format described above, such as H.265, corresponding to that used at the push-stream end, which is not repeated here. In some embodiments, the encoder and decoder are both hardware codecs; in other embodiments, they may be implemented in software or in a combination of software and hardware.

The embodiment of the invention provides a live broadcast page switching method. The embodiment of the invention also relates to a corresponding device, electronic equipment for implementing the method and a storage medium for storing a program capable of executing the method correspondingly. In some embodiments, an apparatus, component, unit or model may be implemented by software, hardware or a combination of software and hardware.

Referring to FIG. 2, a live page switching method according to an embodiment of the present invention is shown. The method may be implemented by a touch screen terminal, such as a mobile terminal, e.g., a smartphone. The touch screen terminal is, for example, one of the play ends 121, 122, 123, or has the functional modules of a play end.

The method may include the following steps S201 to S206.

S201: playing the first stream data in a live broadcast page of the first live broadcast room.

In some embodiments, playing the first streaming data in the live page of the first live broadcast room may be implemented based on the live broadcast system in the above embodiments or related functions thereof. For example, the touch screen terminal may pull the first streaming data, decode it, and play it as described above.

In some embodiments, the live page may be implemented in an operating-system module or framework related to display windows, for example the UIKit framework of iOS or the View System framework of Android, as described with reference to FIG. 9 or FIG. 10.
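As an illustration only, and not the patent's own implementation: at a play end built on iOS, pulling, decoding, and rendering an HLS live stream can be sketched in Swift with AVFoundation's AVPlayer. The function and URL names below are assumptions.

    import AVFoundation
    import UIKit

    // Minimal sketch of S201: pull and play the first live room's stream.
    // `firstStreamURL` is a hypothetical HLS pull URL issued by the server.
    func playFirstStream(in containerView: UIView, firstStreamURL: URL) -> AVPlayer {
        let player = AVPlayer(url: firstStreamURL)   // pulls and decodes the stream
        let layer = AVPlayerLayer(player: player)    // renders into the live page's view
        layer.frame = containerView.bounds
        layer.videoGravity = .resizeAspectFill
        containerView.layer.addSublayer(layer)
        player.play()
        return player
    }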

S202: in response to a sliding touch event by a user, a plurality of sliding phases within a sliding cycle are defined.

In some embodiments, the sliding touch event is one type of touch event; a touch event is, for example, an event defined in the operating system of the touch screen terminal. In some embodiments, upon invocation, the touch event may return touch event data to the live APP, including, for example, the touch event type (i.e., the touch gesture) and the touch coordinates.

In embodiments of the present invention, touch event data obtained directly from, for example, the operating system can include the type of the touch event, such as a touch or gesture, and may also include the coordinates of the respective touch or gesture.

In some embodiments, the touch event data further includes touch time characteristics, such as a touch start time, a touch duration, and/or a touch end time. In an embodiment of the present invention, the touch event data may be data directly obtained by a live application from an operating system layer interface of the touch screen terminal.

For example, in some embodiments, in a touch screen terminal under the iOS operating system platform, the live application may obtain the touch event data through the UIKit framework of the iOS touch layer, that is, by calling for the returned touch event data. FIG. 9 illustrates a schematic diagram of an exemplary iOS operating system platform, described further below. The acquisition of touch event data may be implemented, for example, using various instructions under iOS, particularly under the UIKit framework. In some embodiments, touch event data may be retrieved in the form of gestures, for example using the UIKit class UIGestureRecognizer and its subclass UISwipeGestureRecognizer (swipe). In other embodiments, touch event data may be obtained in the form of touches, for example using UITouch with its phases and properties: UITouchPhaseBegan (touch start), UITouchPhaseMoved (touch point moved), UITouchPhaseStationary (touch point not moving), UITouchPhaseEnded (touch end), UITouchPhaseCancelled (touch cancelled), tapCount (number of screen taps), and timestamp (touch time). In some embodiments, the touch event and its related data may be obtained in iOS, for example, by calling the UIScrollView component of the UIKit framework.
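To make the touch-based route concrete, the following Swift sketch records the raw coordinates and timestamps of a slide as UIKit reports them; the class name is illustrative, and this is a minimal sketch rather than the patent's code.

    import UIKit

    // Minimal sketch: a view that records the raw samples of a sliding
    // touch event (coordinates and timestamps) via the UITouch phases.
    final class SlideTrackingView: UIView {
        private(set) var samples: [(point: CGPoint, time: TimeInterval)] = []

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            samples.removeAll()
            record(touches)
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)
            record(touches)
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesEnded(touches, with: event)
            record(touches)
            // samples now holds [(X0, Y0, t0) ... (X1, Y1, t_off)] for this slide.
        }

        private func record(_ touches: Set<UITouch>) {
            if let t = touches.first {
                samples.append((t.location(in: self), t.timestamp))
            }
        }
    }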

In some embodiments, for example in a touch screen terminal under the Android operating system platform, the touch event data may be acquired by the live application through the View System framework in Android's framework layer. FIG. 10 illustrates a schematic diagram of an exemplary Android operating system platform, described further below. In some embodiments, the touch event and its related data may be obtained in Android, for example, by calling a paging view component such as ViewPager2, built on the View System framework.

Those skilled in the art will appreciate that other commands may be employed to extract touch event data under iOS, Android, or other operating systems.

In some embodiments of the present invention, for example under iOS or another operating system platform, the touch event type may be a swipe (Swipe) with corresponding coordinates. For example, for a swipe, the touch event data may take the form [Swipe, (X0, Y0) ... (X1, Y1)], where (X0, Y0) is the starting-point coordinate, (X1, Y1) is the end-point coordinate, and, optionally, a plurality of intermediate coordinates may be included.

In some embodiments of the present invention, the touch event data may also include other characteristics, such as time or pressure characteristics. For example, dragging and sliding may be distinguished by their time characteristics; long presses and taps may be distinguished based on time and/or pressure characteristics.

Other touch event types, combinations of the above, other coordinate forms, or other features obtained from the operating system interface are also contemplated in some embodiments of the invention as corresponding to the slide action, and all fall within the scope of the invention.

Also illustratively, in the iOS operating system, when an application responds to an event, the event is handled through a responder chain: all event-response classes are subclasses of UIResponder (as described above for UIGestureRecognizer), and the responder chain is a hierarchy of objects, each of which in turn gets an opportunity to respond to the event message. When an event occurs, it is first sent to the first responder, which is usually the view in which the event occurred, i.e., where the user touched the screen. The event passes down the responder chain until it is accepted and processed.

A typical responder chain is:

First Responder --> The Window --> The Application --> App Delegate

Those skilled in the art will appreciate that in the functional use of the application, for example when a user operates a page of the application by touch, processing proceeds along a similar responder chain, and touch event data can be retrieved from the operating system using touch-response instructions similar to those described above, with the application's functional modules, such as pages and their elements, responding accordingly. It will also be apparent, in light of the teachings of the present invention, that methods according to embodiments of the invention can detect touch events and extract touch event data either independently of the functional modules' touch-response instructions or by sharing those instructions, and that this detection and extraction does not affect the functional modules' response to touch events or normal functional use.

In some embodiments of the present invention, a plurality of sliding phases within a sliding cycle are defined for a sliding touch event, which may include a first sliding phase, in which the user's finger slides on the touch screen while remaining in contact with it; a second sliding phase, after the user's finger has left the touch screen and before the end of the slide animation; and an optional third sliding phase, after the end of the slide animation.
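These phases map naturally onto the standard UIScrollView delegate callbacks on iOS. The Swift sketch below shows one possible mapping, assuming the live pages are hosted in a paging UIScrollView; the enum and the `onPhaseChange` hook are illustrative names, not an API of the patent or of UIKit.

    import UIKit

    // The three sliding phases abstracted by the method.
    enum SlidePhase {
        case dragging      // phase 1: finger on the screen and moving
        case decelerating  // phase 2: finger lifted, slide animation running
        case settled       // phase 3: slide animation finished
    }

    final class SlidePhaseObserver: NSObject, UIScrollViewDelegate {
        var onPhaseChange: ((SlidePhase) -> Void)?

        func scrollViewWillBeginDragging(_ scrollView: UIScrollView) {
            onPhaseChange?(.dragging)          // first sliding stage begins
        }

        func scrollViewDidEndDragging(_ scrollView: UIScrollView,
                                      willDecelerate decelerate: Bool) {
            // the finger has left the screen
            onPhaseChange?(decelerate ? .decelerating : .settled)
        }

        func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
            onPhaseChange?(.settled)           // third sliding stage: animation ended
        }
    }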

S203: in a first stage, in which the user's finger slides on the touch screen while remaining in contact with it, maintaining playback of the live broadcast page of the first live broadcast room, pulling second stream data for an adjacent second live broadcast room, and playing the second stream data in the live broadcast page of the second live broadcast room.

In an embodiment of the present invention, whether the user's finger is sliding on the touch screen while remaining in contact with it may be determined based on the above-mentioned sliding touch event and its data, for example by calling UIScrollView (iOS) or ViewPager2 (Android).

In some embodiments of the present invention, the second live broadcast room is a live broadcast room adjacent to the first live broadcast room. In some embodiments, the second live room is determined by the sliding direction: with up-down switching, it is the room below or above; with left-right switching, the room to the right or left. Those skilled in the art will understand that in the first stage the second live room may change as the user's finger slides up and down (while still touching): for example, early in the slide the finger slides up (keeping contact) and the stream data of the room below is pulled; as the finger then slides down, the stream data of the room above is pulled.

Optionally, in some embodiments, the method may further include:

A1: in the first stage, muting the second stream data that is pulled and played.
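A minimal Swift sketch of this muted preload, assuming AVPlayer-based playback as in the earlier sketch; the class and method names are illustrative:

    import AVFoundation

    // Minimal sketch of the first sliding stage: pull the adjacent second
    // room's stream and play it muted while the first room keeps playing.
    final class AdjacentRoomPreloader {
        private(set) var player: AVPlayer?

        // Phase 1: start pulling the second stream, muted (step A1).
        // `secondStreamURL` is a hypothetical pull URL of the second room.
        func startMutedPull(from secondStreamURL: URL) {
            let p = AVPlayer(url: secondStreamURL)
            p.isMuted = true
            p.play()
            player = p
        }

        // Phase 2, target = first room: stop pulling and discard the player.
        func cancel() {
            player?.pause()
            player = nil
        }

        // Phase 2, target = second room: enable the sound of the second stream.
        func unmute() {
            player?.isMuted = false
        }
    }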

S204: in a second stage, after the user's finger leaves the touch screen and before the sliding animation finishes, determining the target live broadcast room on which the view will settle after the sliding animation ends.

In an embodiment of the present invention, whether the user's finger has left the touch screen and whether the slide animation has finished may likewise be determined based on the above-mentioned sliding touch event and its data, for example by calling UIScrollView (iOS) or ViewPager2 (Android).

In embodiments of the present invention, the target live broadcast room on which the view will settle after the sliding animation ends may likewise be determined based on the sliding touch event and its data, for example by calling UIScrollView (iOS) or ViewPager2 (Android). As an illustrative example, the touch event data and values derived from it (such as coordinates, direction, distance, speed, or acceleration) may be obtained directly from the return values of UIScrollView (iOS) or ViewPager2 (Android).

In some embodiments, as shown in FIG. 3A, determining, in the second sliding stage after the user's finger leaves the touch screen and before the sliding animation finishes, the target live broadcast room on which the view will settle after the sliding animation ends includes:

S301: acquiring the sliding direction, sliding distance, sliding speed, or acceleration of the user's finger within a preset time period before the finger leaves the touch screen.

in some embodiments, the predetermined time period before the user's finger leaves the touch screen is a time period before the time when the user's finger leaves the touch screen, such as T ═ T1,toff]Wherein t isoffFor example, the moment when the finger leaves the touch screen in the touch event, t1Is a time period start node. In some embodiments, the length of the time period T may be set as desired.

In some embodiments, the sliding direction, sliding distance, sliding velocity, or acceleration may be derived (e.g., calculated) directly from the touch event data.

In some embodiments, the sliding direction may be determined according to the extension direction of the live or video stream, i.e., one of two directions along it. For example, where the live or video streams are arranged up-down or left-right, the sliding direction may be one of slide-up and slide-down, or one of slide-left and slide-right. Those skilled in the art will therefore appreciate that the sliding direction may be the overall direction of the slide and need not be strictly axis-aligned. FIG. 6A shows sliding directions d1 and d2 of a user on the touch screen of a touch screen terminal 600, where d1 is an upward slide and d2 is a downward slide.

In some embodiments, the sliding direction may be determined, for example, from coordinate data in the directly acquired touch event data.
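As an illustration, the Swift sketch below derives the quantities of step S301 (direction, distance, and average speed) from the raw samples collected during the slide (see the SlideTrackingView sketch above). The window length and all names are assumptions rather than values from the source.

    import UIKit

    // Metrics over the preset period T = [t1, t_off] before lift-off.
    struct SlideMetrics {
        let direction: CGFloat  // sign of vertical displacement: < 0 up, > 0 down
        let distance: CGFloat   // |displacement| within the window, in points
        let velocity: CGFloat   // average speed within the window, in points/s
    }

    func metrics(from samples: [(point: CGPoint, time: TimeInterval)],
                 window: TimeInterval = 0.1) -> SlideMetrics? {
        guard let last = samples.last else { return nil }
        // keep only the samples inside T = [t1, t_off]
        let recent = samples.filter { $0.time >= last.time - window }
        guard let first = recent.first, last.time > first.time else { return nil }
        let dy = last.point.y - first.point.y   // vertical feed assumed
        let dt = CGFloat(last.time - first.time)
        return SlideMetrics(direction: dy == 0 ? 0 : (dy < 0 ? -1 : 1),
                            distance: abs(dy),
                            velocity: abs(dy) / dt)
    }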

S302: determining, based on the sliding direction, sliding distance, sliding speed, or acceleration, the target live broadcast room on which the view will settle after the sliding animation ends.

In some embodiments, the target live room may be determined based on multiple thresholds.

In some embodiments, determining the target live broadcast room based on the sliding direction, sliding distance, sliding speed, or acceleration may involve determinations along several dimensions, such as a sliding-direction determination, a sliding-distance determination, a sliding speed or acceleration determination, and a combined determination based on distance and speed (or acceleration).

For example, it may be determined whether the sliding direction during the time period is opposite to the position of the second live room relative to the first live room. As mentioned previously, the sliding direction may be the overall sliding direction, ignoring skew or subtle back-and-forth fluctuations.

For example, the step S302 may include:

if it is determined that the sliding direction is not opposite to the position of the second live broadcast room relative to the first live broadcast room, determining that the target live broadcast room is the first live broadcast room;

and if the sliding direction is opposite to the position of the second live broadcast room relative to the first live broadcast room and the sliding distance exceeds a first threshold, determining that the target live broadcast room is the second live broadcast room.

As shown in FIG. 3B, at S3021 it is determined whether the sliding direction is opposite; if not, the target live broadcast room is determined to be the first live broadcast room at S3022; if so, it is further determined at S3023 whether the sliding distance exceeds the first threshold. If the sliding distance exceeds the first threshold, the target live broadcast room is determined to be the second live broadcast room at S3024.

Here, with combined reference to FIGS. 2, 3A-3B, and 6A, in an embodiment of the present invention the position of the second live broadcast room being pulled is opposite to the direction of the slide at the start of the slide. For example, as shown in FIG. 6A, an upward slide d1 at the beginning of the slide pulls the adjacent second live room below the first live room; accordingly, in an embodiment not shown, a downward slide at the beginning of the slide pulls an adjacent second live room above the first live room.

Here, by way of explanation and not limitation: when the user is more interested in the first live room, or not interested in the second, the user may slide back toward the first live room within the predetermined time period before the finger leaves the touch screen; that is, the (overall) sliding direction in that period is not opposite to the position of the second live room. As shown in FIG. 6A, a downward slide d2 within that period (not opposite to the second live room below) leads to the target live room being determined as the first live room, so that the corresponding live room processing can be performed in the second and/or third sliding stages.

In a further embodiment of the invention, the user may continue to slide in the direction that pulls the second live broadcast room within the predetermined time period before the finger leaves the touch screen, i.e., the (overall) sliding direction in this period is opposite to the relative position of the second live broadcast room, in which case a further determination may be made.

Here, for example, it may be determined whether the sliding distance within the time period exceeds the predetermined first threshold, a distance threshold.

By way of explanation and not limitation: when the user is more interested in the second live broadcast room, or not interested in the first, the user will typically continue to slide a noticeable distance in the direction that pulls the second live broadcast room within the predetermined time period before the finger leaves the touch screen. The user's intention can therefore be inferred from the sliding distance within that period, so that the corresponding live room processing can be performed. As shown in FIG. 6B, when the sliding distance l within the period exceeds a predetermined distance threshold lt, the target live room may be determined to be the second live broadcast room.

Optionally, as further shown in FIG. 3B, step S302 may further include:

if it is determined that the sliding direction is opposite to the relative position of the second live broadcast room with respect to the first live broadcast room and the sliding distance does not exceed the first threshold, further determining whether the sliding speed or acceleration exceeds a second threshold;

and if it is determined that the sliding speed or acceleration exceeds the second threshold, determining that the target live broadcast room is the second live broadcast room.

With continued reference to fig. 3B, if the sliding distance does not exceed the first threshold, it is determined at S3025 whether the sliding speed (or acceleration) exceeds the second threshold, and if it does, the target live broadcast room is determined to be the second live broadcast room at S3026.

By way of explanation and not limitation, the distance that the user continues to slide in the direction that pulled in the second live broadcast room within a predetermined time period before the finger leaves the touch screen may not be significant, in which case a further determination may be made.

For example, as shown in fig. 3B, it may be determined whether the sliding speed or acceleration within the time period exceeds a predetermined second (speed or acceleration) threshold.

By way of explanation and not limitation, when the user is more interested in the second live broadcast room or not interested in the first live broadcast room, the end-stage sliding speed (or acceleration) in the direction that pulled in the second live broadcast room within a predetermined time period before the finger leaves the touch screen may be high. The user's intention may thus be inferred from the sliding speed (or acceleration) within the time period so as to perform corresponding live broadcast room processing. As shown in fig. 6C, the target live broadcast room may be determined to be the second live broadcast room when the sliding speed v within the time period is greater than a predetermined speed threshold vt and/or the sliding acceleration a is greater than a predetermined acceleration threshold at.

By way of explanation and not limitation, the end-stage sliding speed (or acceleration) of the user in the direction that pulled in the second live broadcast room within a predetermined time period before the finger leaves the touch screen may not be significant enough, e.g. it does not exceed the predetermined speed (or acceleration) threshold. In that case, it may be determined whether the combined value (product) of the distance and the speed (or acceleration) exceeds a predetermined third threshold.

Optionally, as further shown in fig. 3B, the step S302 may further include:

if it is determined that the sliding speed or acceleration does not exceed the second threshold, further determining whether the product of the sliding distance and the sliding speed or acceleration exceeds a third threshold;

if it is determined that the product of the sliding distance and the sliding speed or acceleration does not exceed the third threshold, determining that the target live broadcast room is the first live broadcast room;

and if it is determined that the product of the sliding distance and the sliding speed or acceleration exceeds the third threshold, determining that the target live broadcast room is the second live broadcast room.

With continued reference to fig. 3B, if the sliding speed (or acceleration) does not exceed the second threshold, it is determined at S3027 whether the product of the sliding distance and the sliding speed (or acceleration) exceeds the third threshold; if so, the target live broadcast room is determined to be the second live broadcast room at S3028, and if not, the target live broadcast room is determined to be the first live broadcast room at S3029.

Here, by way of explanation and not limitation, misjudgment of the user's intention can be effectively reduced by judging the combined value (product) of the distance and the speed (or acceleration), whereby the method according to the present invention can perform corresponding live broadcast room processing according to the determination of the target live broadcast room.
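By way of illustration and not limitation, the decision tree of S3021-S3029 may be sketched as follows in Kotlin. The SlideMetrics structure, all names, the direction sign convention and the threshold values are assumptions of this sketch, not part of the disclosed method:

```kotlin
// Minimal sketch of the S3021-S3029 decision tree. All names and the
// direction convention are illustrative assumptions.
data class SlideMetrics(
    val direction: Int,     // sign of the overall movement in the window before finger-up
    val pullDirection: Int, // direction that pulled in the second room at slide start
    val distance: Float,    // absolute sliding distance within the window (px)
    val speed: Float        // absolute end-stage sliding speed (or acceleration magnitude)
)

enum class TargetRoom { FIRST, SECOND }

fun decideTargetRoom(
    m: SlideMetrics,
    firstThreshold: Float,  // distance threshold (S3023)
    secondThreshold: Float, // speed/acceleration threshold (S3025)
    thirdThreshold: Float   // product threshold (S3027)
): TargetRoom {
    // S3021/S3022: the user slid back, i.e. the direction is not opposite
    // to the second room's relative position -> stay in the first room.
    if (m.direction != m.pullDirection) return TargetRoom.FIRST
    // S3023/S3024: a clearly long continued slide -> second room.
    if (m.distance > firstThreshold) return TargetRoom.SECOND
    // S3025/S3026: a clearly fast end-stage slide -> second room.
    if (m.speed > secondThreshold) return TargetRoom.SECOND
    // S3027-S3029: combined value (product) as the final tie-breaker.
    return if (m.distance * m.speed > thirdThreshold) TargetRoom.SECOND
    else TargetRoom.FIRST
}
```

Under these assumptions, for example, decideTargetRoom(SlideMetrics(direction = -1, pullDirection = -1, distance = 120f, speed = 900f), 100f, 1200f, 150_000f) returns TargetRoom.SECOND via the distance branch (S3023/S3024).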

S205: if the target live broadcast room is determined to be the second live broadcast room, maintaining the pulling and playing of the second stream data of the second live broadcast room in the second stage.

Optionally, the method may further include:

b1: if the target live broadcast room is determined to be the second live broadcast room, turning on the sound of the second stream data in the second stage.

Optionally, the method may further include:

b2: in the second stage, if the target live broadcast room is determined to be the second live broadcast room, muting the first stream data, stopping the pulling of the first stream data, and leaving the live broadcast page of the first live broadcast room on the last frame of the pulled first stream data.

S206: if the target live broadcast room is determined to be the first live broadcast room, keeping playing the first stream data in the live broadcast page of the first live broadcast room and stopping pulling the second stream data in the second stage.

Optionally, in some embodiments, the method may further include:

c1: if the target live broadcast room is determined to be the first live broadcast room, muting the second stream data in the second stage, so that the live broadcast page of the second live broadcast room stays on the last frame of the pulled second stream data.
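By way of illustration and not limitation, the second-stage actions of S205/S206 together with options b1, b2 and c1 may be sketched as follows, reusing the TargetRoom type from the sketch above; the Player interface is a hypothetical wrapper over the actual stream player, not an API of the disclosure:

```kotlin
// Hypothetical wrapper over the actual stream player of a live room.
interface Player {
    fun setMuted(muted: Boolean)
    fun stopPulling() // stop pulling stream data; the page keeps the last frame
}

// Second-stage actions (S205/S206 with options b1, b2, c1).
fun onSecondStage(target: TargetRoom, firstRoom: Player, secondRoom: Player) {
    when (target) {
        TargetRoom.SECOND -> {
            // S205: keep pulling and playing the second room's stream.
            secondRoom.setMuted(false) // b1: turn the second room's sound on
            firstRoom.setMuted(true)   // b2: mute the first room...
            firstRoom.stopPulling()    // ...stop its pull; stay on the last frame
        }
        TargetRoom.FIRST -> {
            // S206: keep the first room playing and stop the second room's pull.
            secondRoom.setMuted(true)  // c1: mute the second room
            secondRoom.stopPulling()   // its page stays on the last pulled frame
        }
    }
}
```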

Optionally, in some embodiments, the slide may further include a third stage as described above. In an embodiment of the present invention, the third stage has no specified end point and may be regarded as ended once the related actions are completed. In some embodiments, the third stage may have different end points for different live broadcast rooms. Here, the method may further include S207 to S208.

S207: if the target live broadcast room is determined to be the second live broadcast room, completely displaying the live broadcast page of the second live broadcast room and loading the other business processes of the second live broadcast room in the third stage after the sliding animation ends.

Optionally, the method may further comprise:

d1: in the third stage, if the target live broadcast room is determined to be the second live broadcast room, performing room-exit processing on the first live broadcast room and clearing and resetting the live broadcast page of the first live broadcast room.

S208: if the target live broadcast room is determined to be the first live broadcast room, performing room-exit processing on the second live broadcast room and clearing and resetting the live broadcast page of the second live broadcast room in the third stage after the sliding animation ends.

In the embodiment of the present invention, it will be understood that in the first and second stages the live broadcast pages of the first and second live broadcast rooms are both partially displayed (partial windows); their display may, for example, be implemented based on the partial-view mechanism of the operating system (iOS or android) of the mobile terminal.

In some embodiments of the invention, different processes may be provided for different live broadcast rooms, and optionally a respective view may also be provided for each process; the slide and its stages apply to both live broadcast room processes. Thus, by combining the slide with independent processes and views, a live broadcast (playing) framework can be provided that is simple in structure and further reduces resource occupation.

Reference is now made to figs. 4A and 4B, which illustrate examples of practicing an embodiment of the present invention. In these examples, corresponding execution actions can be configured for each stage of the slide, obtaining further beneficial effects.

In the example shown in fig. 4A, in stage 0 (where stage 0 merely indicates, by way of explanation and not limitation, that a slide event has not yet been triggered and thus does not constitute part of the slide), live broadcast room A (the first live broadcast room) plays the live streaming video in the foreground while its playing page/view is fully displayed; this is performed, for example, in a first process (the main process). In stage 0, the following live broadcast room B (the second live broadcast room) has not yet been created. For example, a second process (sub-process) for the second live broadcast room has not been created, or another process has not loaded the relevant data of the second live broadcast room; similarly, a second playing page (view) for the second live broadcast room has not been created, or it has been cleared/reset without loading the pictures of the second live broadcast room.

With continued reference to fig. 4A, the user slides the screen up, and in the second stage it is determined that the live broadcast room to finally stay on is the adjacent second live broadcast room (the live broadcast room below). This describes one example of live page switching, including a number of slide-related execution actions:

Show (onAppear): this action is triggered when the user starts to slide and the view of the next live broadcast room B starts to be displayed. In the first stage, the stream picture of live broadcast room B starts to be pulled, and live broadcast room B is muted so that two sound streams are not played simultaneously.

Page activated (onPageActivated): when the user lifts the finger and the sliding animation has not yet finished, i.e. in the second stage, this action is executed on the corresponding live broadcast room according to the determination result. In the example shown in fig. 4A, live broadcast room B is determined to be the room stayed on after the slide ends, so the action is executed on live broadcast room B; the stream data of live broadcast room B continues to be pulled and played. At this stage (when the action is executed), the sound playing of live broadcast room B is turned on, thereby achieving full entry into the room.

Page deactivated (onPageUnactivated): this corresponds to onPageActivated; when the user lifts the finger and the sliding animation has not yet finished, i.e. in the second stage, this action is executed on the corresponding live broadcast room according to the determination result. In the example shown in fig. 4A, the system determines that live broadcast room A will not be displayed after the slide ends, so the action is executed on live broadcast room A. At this stage (when the action is executed), the unselected live broadcast room A is muted, the pulling of its stream data is stopped, and the last frame of its picture is retained.

Complete display (onCompleteShow): in the third stage after the sliding animation of the sliding component stops, this action is executed, according to the determination result, on live broadcast room B, whose view is completely displayed. At this stage (when the action is executed), processing of the other business processes of live broadcast room B is started, including bullet-screen comments, anchor information and the like.

Disappear (onDisappear): in the third stage after the sliding animation of the sliding component stops, this action is executed, according to the determination result, on live broadcast room A, whose view has completely moved off the screen. At this stage (when the action is executed), room-exit processing is performed on live broadcast room A and the view corresponding to live broadcast room A is cleared/reset. As previously described, this cleared/reset view (and possibly process) may be reused for another live broadcast room at the next page switch.

In the example shown in fig. 4B, similarly to fig. 4A, in stage 0 (where stage 0 merely indicates, by way of explanation and not limitation, that a slide event has not yet been triggered and thus does not constitute part of the slide), live broadcast room A (the first live broadcast room) plays the live streaming video in the foreground while its playing page/view is fully displayed; this is performed, for example, in a first process (a sub-process). In stage 0, the following live broadcast room B (the second live broadcast room) has not yet been created. For example, a second process (sub-process) for the second live broadcast room has not been created, or another process has not loaded the relevant data of the second live broadcast room; similarly, a second playing page (view) for the second live broadcast room has not been created, or it has been cleared/reset without loading the pictures of the second live broadcast room.

With continued reference to fig. 4B, the user slides the screen up, but in the second stage it is determined that the live broadcast room to finally stay on is the original first live broadcast room (live broadcast room A). This describes another example of live page switching, including a number of slide-related execution actions:

Show (onAppear): when the user starts to slide, similarly to fig. 4A, the view of the next live broadcast room B starts to be displayed; in the first stage, the stream picture of live broadcast room B starts to be pulled, and live broadcast room B is muted so that two sound streams are not played simultaneously.

Page activated (onPageActivated): when the user lifts the finger and the sliding animation has not yet finished, i.e. in the second stage, this action is executed on the corresponding live broadcast room according to the determination result. In the example shown in fig. 4B, live broadcast room A is determined to be the room stayed on after the slide ends, so the action is executed on live broadcast room A; the stream data of live broadcast room A, located in the foreground, continues to be pulled and played. At this stage (when the action is executed), live broadcast room A is simply kept in the foreground, without any further action.

Page deactivated (onPageUnactivated): this corresponds to onPageActivated; when the user lifts the finger and the sliding animation has not yet finished, i.e. in the second stage, this action is executed on the corresponding live broadcast room according to the determination result. In the example shown in fig. 4B, the system determines that live broadcast room B will not be displayed after the slide ends, so the action is executed on live broadcast room B. At this stage (when the action is executed), the unselected live broadcast room B is muted, the pulling of its stream data is stopped, and the last frame of its picture is retained.

Complete display (onCompleteShow): in the third stage after the sliding animation of the sliding component stops, this action is executed, according to the determination result, on live broadcast room A, whose view is completely displayed. At this stage (when the action is executed), no further processing is required, since live broadcast room A, originally in the foreground, has already loaded all of its business.

Disappear (onDisappear): in the third stage after the sliding animation of the sliding component stops, this action is executed, according to the determination result, on live broadcast room B, whose view has completely moved off the screen. At this stage (when the action is executed), room-exit processing is performed on live broadcast room B and the view corresponding to live broadcast room B is cleared/reset. As previously described, this cleared/reset view (and possibly process) may be reused for another live broadcast room at the next page switch.

In connection with the examples shown in figs. 4A and 4B, it will be appreciated that sub-processes and/or views may be allocated separately for live broadcast rooms A and B, and the same set of staged action logic (i.e. onAppear, onPageActivated, onPageUnactivated, onCompleteShow, onDisappear) may be executed on them selectively; such an architecture is clear and minimizes system resource consumption. A minimal sketch of this staged action logic is given below.
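By way of illustration and not limitation, the staged action logic shared by figs. 4A and 4B may be sketched as follows, reusing the TargetRoom type from the earlier sketch; the LiveRoomPage interface and the dispatch functions are assumptions of this sketch, not a definitive implementation:

```kotlin
// Staged actions shared by both live rooms; the names follow figs. 4A-4B.
interface LiveRoomPage {
    fun onAppear()          // first stage: start pulling the stream, muted
    fun onPageActivated()   // second stage, selected room: keep pulling, unmute
    fun onPageUnactivated() // second stage, unselected room: mute, stop pulling, keep last frame
    fun onCompleteShow()    // third stage, selected room: load remaining business (danmaku etc.)
    fun onDisappear()       // third stage, unselected room: room-exit, clear/reset the view
}

// Selectively executes the same action set on rooms A and B according to
// the second-stage determination result.
fun dispatchSecondStage(target: TargetRoom, roomA: LiveRoomPage, roomB: LiveRoomPage) {
    val (selected, unselected) =
        if (target == TargetRoom.SECOND) roomB to roomA else roomA to roomB
    selected.onPageActivated()
    unselected.onPageUnactivated()
}

fun dispatchThirdStage(target: TargetRoom, roomA: LiveRoomPage, roomB: LiveRoomPage) {
    val (selected, unselected) =
        if (target == TargetRoom.SECOND) roomB to roomA else roomA to roomB
    selected.onCompleteShow()
    unselected.onDisappear()
}
```

In fig. 4A the determination selects room B, so onPageActivated and onCompleteShow run on room B while onPageUnactivated and onDisappear run on room A; in fig. 4B the roles are reversed.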

Those skilled in the art will appreciate that in particular examples, sliding down or sliding left or right may be appropriate as the case may be.

In other embodiments of the present invention, a video page switching method may also be provided, for example a method for switching video pages mixed with live pages, such as short video pages. In some video applications, such as short video apps, live pages may appear within the short-video waterfall feed. The scheme according to embodiments of the present invention can be applied to such a scenario.

As shown in fig. 5, the video page switching method may include the following steps (a minimal sketch of the preload decision is given after the list):

S501: playing a first video in a first page;

s502: judging whether a second video to be played in a second page is streaming media or not, wherein the second page is adjacent to the first page:

s503: if the second video is not streaming media, preloading the second video;

s504: if the second video is streaming media, the second video is not preloaded, and

s505: defining a swipe comprising a plurality of stages in response to a swipe touch event of a user,

s506: at a first stage of sliding and keeping contact of a finger of a user on the touch screen, keeping playing the first video in the first page, pulling the second video which is streaming media, and playing the second video in the second page,

s507: in a second stage after the user finger leaves the touch screen and before the sliding animation is finished, judging a target page to be stopped after the sliding animation is finished:

s508: if the stopped target page is judged to be a second page, the second video which is the streaming media is kept to be pulled and played,

s509: and if the stopped target page is judged to be the first page, stopping pulling the second video.
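By way of illustration and not limitation, the preload decision of S503/S504 may be sketched as follows; the VideoPage type and the preload callback are assumptions of this sketch:

```kotlin
// Sketch of the S503/S504 preload decision: non-streaming neighbors (e.g.
// short videos) are preloaded; streaming neighbors (e.g. live) are not, and
// are pulled only during the first sliding stage (S506).
data class VideoPage(val url: String, val isStreaming: Boolean)

fun prepareAdjacentPage(second: VideoPage, preload: (String) -> Unit) {
    if (!second.isStreaming) {
        preload(second.url) // S503: bounded resource cost, preload now
    }
    // S504: streaming media is intentionally not preloaded here; its pull
    // starts in the first sliding stage and is cancelled if the slide
    // returns to the first page (S509).
}
```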

In some embodiments, the streaming media is a live stream, such as the aforementioned live broadcast.

In some embodiments, if the second video is streaming media such as a live broadcast, the method further comprises:

e1: in a third stage after the sliding animation ends, if the target page stayed on is determined to be the second page, completely displaying the second page and loading the other live broadcast business processes.

In some embodiments, if the second video is streaming media such as a live broadcast, the method further comprises:

f1: in a third stage after the sliding animation ends, if the target page stayed on is determined to be the first page, performing room-exit processing on the live broadcast and clearing and resetting the second page.

Similar to the embodiments shown in figs. 2 and 3A-3B, in some embodiments, determining, in the second sliding stage after the user's finger leaves the touch screen and before the sliding animation ends, a target page to be stayed on after the sliding animation ends may include:

g1: acquiring the sliding direction, the sliding distance, and the sliding speed or acceleration of the user's finger within a predetermined time period before the finger leaves the touch screen;

g2: determining, based on the sliding direction, the sliding distance, and the sliding speed or acceleration, a target page to be stayed on after the sliding animation ends. An Android-based sketch of acquiring these quantities is given below.
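By way of illustration and not limitation, on Android the quantities of step g1 can be collected with the platform VelocityTracker; the class below is a sketch under that assumption, with the surrounding view and gesture plumbing omitted:

```kotlin
import android.view.MotionEvent
import android.view.VelocityTracker

// Collects the g1 quantities: signed displacement (direction and distance)
// and end-stage speed at the moment the finger lifts.
class SlideMetricsCollector(
    private val onMetrics: (distance: Float, speed: Float) -> Unit
) {
    private var tracker: VelocityTracker? = null
    private var downY = 0f

    fun onTouchEvent(event: MotionEvent) {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                tracker = VelocityTracker.obtain().also { it.addMovement(event) }
                downY = event.y
            }
            MotionEvent.ACTION_MOVE -> tracker?.addMovement(event)
            MotionEvent.ACTION_UP -> {
                tracker?.let {
                    it.addMovement(event)
                    it.computeCurrentVelocity(1000) // px per second
                    // Signed values: the sign carries the sliding direction.
                    onMetrics(event.y - downY, it.yVelocity)
                    it.recycle()
                }
                tracker = null
            }
        }
    }
}
```

The signs of the two reported values carry the sliding direction of g1; their absolute values are the distance and end-stage speed compared against the thresholds of g21-g27.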

In a further embodiment, the step G2 may include:

g21: if the sliding direction is determined not to be opposite to the relative position of the second page relative to the first page, determining that the target page is the first page;

g22: and if the sliding direction is opposite to the relative position of the second page with respect to the first page and the sliding distance exceeds a first threshold, determining that the target page is the second page.

In a further embodiment, the step G2 may further include:

g23: if the sliding direction is opposite to the relative position of the second page relative to the first page and the sliding distance does not exceed the first threshold value, further determining whether the sliding speed or the acceleration exceeds a second threshold value;

g24: and if the sliding speed or the acceleration is determined to exceed the second threshold, determining that the target page is a second page.

In a further embodiment, the step G2 may further include:

g25: if it is determined that the sliding speed or acceleration does not exceed the second threshold, further determining whether the product of the sliding distance and the sliding speed or acceleration exceeds a third threshold;

g26: if the product of the sliding distance and the sliding speed or the acceleration is determined not to exceed a third threshold value, determining that the target page is a first page;

g27: and if the product of the sliding distance and the sliding speed or the acceleration is determined to exceed the third threshold, determining that the target page is a second page.

In this scheme, the present application preloads adjacent videos that occupy few resources, such as short videos, during foreground playing, and performs slide-driven loading for adjacent streaming media that occupy many resources, such as live broadcasts, so that a good user experience is obtained while resource occupation is kept to a minimum.

In some embodiments of the present invention, as shown in fig. 7A, a live page switching apparatus 700 is further provided, which may be used, for example, in a touch screen terminal. The live page switching apparatus 700 may include a playing unit 710, a slide manager 720, a first executor 730, a determiner 740 and a second executor 750.

In the embodiment shown in fig. 7A, the playing unit 710 is configured to play the first stream data in the live broadcast page of the first live broadcast room. The slide manager 720 is configured to define, in response to a sliding touch event of the user, a slide comprising a plurality of stages. The first executor 730, for executing first-stage actions, is configured to maintain the playing of the live broadcast page of the first live broadcast room, pull the second stream data for the adjacent second live broadcast room, and play the second stream data in the live broadcast page of the second live broadcast room in the first stage in which the user's finger slides on the touch screen and keeps contact. The determiner 740 is configured to determine, in the second stage after the user's finger leaves the touch screen and before the sliding animation ends, a target live broadcast room to be stayed on after the sliding animation ends. The second executor 750, for executing second-stage actions, may be configured to maintain the pulling and playing of the second stream data of the second live broadcast room if the target live broadcast room is determined to be the second live broadcast room, and to keep playing the first stream data in the live broadcast page of the first live broadcast room and stop pulling the second stream data if the target live broadcast room is determined to be the first live broadcast room.

In some embodiments of the present invention, as shown in fig. 7B, a video page switching apparatus 700' is further provided, which may also be used, for example, in a touch screen terminal. The video page switching apparatus 700' may be suitable, for example, for a video play switching solution compatible with live broadcast, such as short video play switching. The video page switching apparatus 700' may include, for example, a playing unit 710', a first determiner 720', a loading unit 730', a slide manager 740', a first executor 750', a second determiner 760' and a second executor 770'.

In some embodiments, the playing unit 710' is configured to play the first video in the first page. The first determiner 720' is configured to determine whether a second video to be played in a second page, adjacent to the first page, is streaming media. The loading unit 730' is configured to preload the second video if it is not streaming media, and not to preload it if it is streaming media. The slide manager 740' is configured to define, in response to a sliding touch event of the user, a slide comprising a plurality of stages. The first executor 750', for executing first-stage actions, is configured to keep playing the first video in the first page, pull the second video that is streaming media, and play it in the second page in the first stage in which the user's finger slides on the touch screen and keeps contact. The second determiner 760' is configured to determine, in the second stage after the user's finger leaves the touch screen and before the sliding animation ends, a target page to be stayed on after the sliding animation ends. The second executor 770', for executing second-stage actions, is configured to keep pulling and playing the second video that is streaming media if the target page is determined to be the second page, and to stop pulling the second video if the target page is determined to be the first page.

It will be clear to a person skilled in the art that without causing a contradiction, the device of the present embodiment may incorporate method features described in other embodiments, and vice versa.

In an embodiment of the present invention, an electronic device, such as a touch screen terminal, is provided. In a preferred embodiment of the present invention, the touch screen terminal is a mobile terminal, preferably a mobile phone. By way of exemplary implementation only, fig. 8 shows a hardware structure diagram of a specific embodiment of a touch screen terminal, such as a mobile terminal 800, and figs. 9 and 10 show schematic system configurations of embodiments of a touch screen device, such as a mobile terminal.

In the illustrated embodiment, the mobile terminal 800 may include a processor 801, an external memory interface 812, an internal memory 810, a Universal Serial Bus (USB) interface 813, a charge management module 814, a power management module 815, a battery 816, a mobile communication module 840, a wireless communication module 842, antennas 839 and 841, an audio module 834, a speaker 835, a receiver 836, a microphone 837, an earphone interface 838, keys 809, a motor 808, an indicator 807, a Subscriber Identity Module (SIM) card interface 811, a display 805, a camera 806, a sensor module 820, and the like.

It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile terminal 800. In other embodiments of the present application, mobile terminal 800 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

In some embodiments, processor 801 may include one or more processing units. In some embodiments, the processor 801 may include one or a combination of at least two of the following: an Application Processor (AP), a modem processor, a baseband processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, a neural Network Processor (NPU), and so forth. The different processing units may be separate devices or may be integrated in one or more processors.

The controller may be a neural center and a command center of the mobile terminal 800. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.

A memory may also be provided in the processor for storing instructions and data. In some embodiments, the memory in the processor is a cache. The memory may hold instructions or data that the processor has just used or uses cyclically; if the processor needs to reuse such an instruction or data, it can be fetched directly from this memory. Repeated accesses are thereby avoided, reducing the waiting time of the processor 801 and improving system efficiency.

The NPU is a Neural Network (NN) computational processor that processes input information quickly by referencing a biological neural network structure, such as by referencing transfer patterns between human brain neurons, and may also be continuously self-learning.

The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor may include one or more GPUs that execute program instructions to generate or alter display information.

The digital signal processor (DSP) is used to process digital signals; in addition to digital image signals, it may process other digital signals.

In some embodiments, the processor 801 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and so forth.

It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not constitute a limitation to the structure of the mobile terminal. In other embodiments of the present application, the mobile terminal may also adopt different interface connection manners or a combination of multiple interface connection manners in the foregoing embodiments.

The wireless communication function of the mobile terminal 800 may be implemented by the antennas 839 and 841, the mobile communication module 840, the wireless communication module 842, a modem processor or a baseband processor, etc.

The mobile terminal 800 may implement audio functions, such as music playing and recording, through the audio module, the speaker, the receiver, the microphone, the earphone interface, the application processor, and the like.

The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals.

The microphone is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone by speaking close to it.

The sensor module 820 may include one or more of the following sensors:

the pressure sensor 823 is configured to sense a pressure signal and convert the pressure signal into an electrical signal.

The air pressure sensor 824 is used to measure air pressure.

The magnetic sensor 825 includes a hall sensor.

The gyro sensor 827 may be used to determine a motion gesture of the mobile terminal 800.

The acceleration sensor 828 may detect the magnitude of acceleration of the mobile terminal 800 in various directions.

The distance sensor 829 may be configured to measure distance.

The proximity light sensor 821 may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.

The ambient light sensor 822 is for sensing ambient light level.

The fingerprint sensor 831 may be configured to capture a fingerprint.

The touch sensor 832 may be disposed on the display screen, and the touch sensor and the display screen together form what is called a touch screen. The touch sensor is used to detect a touch operation applied on or near it, and may pass the detected touch operation to the application processor to determine the type of touch event, such as a single click, a double click, a long press, a rotation, a swipe or a zoom, in accordance with embodiments of the present invention.

The bone conduction sensor 833 can acquire a vibration signal.

A software operating system of an electronic device (computer), such as a mobile terminal, may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.

The embodiments illustrated herein exemplify the software structure of a mobile terminal, taking the iOS and android operating system platforms as examples of a layered architecture, respectively. It is contemplated that embodiments herein may be implemented on different software operating systems.

In the embodiment shown in fig. 9, the solution of the embodiment of the present invention may employ the iOS operating system. The iOS operating system adopts a four-layer architecture comprising, from top to bottom, a touchable layer (Cocoa Touch layer) 910, a media layer (Media layer) 920, a core services layer (Core Services layer) 930 and a core operating system layer (Core OS layer) 940. The touchable layer 910 provides various common frameworks for application development, most of which are interface-related; it is responsible for touch interaction operations of users on iOS devices. The media layer provides the audio-visual technologies in applications, such as graphics and images, sound technologies, and frameworks related to video and audio-video transmission. The core services layer provides the underlying system services required by applications. The core operating system layer contains most of the low-level, hardware-near functionality.

In an embodiment of the present invention, UIKit is the user interface framework of the touchable layer 910.

Fig. 10 is a schematic structural diagram of an android operating system, which may be adopted in the solution of the embodiment of the present invention. The layered architecture divides the software into several layers, which communicate via software interfaces. In some embodiments, the android system is divided into four layers, from top to bottom, an application layer 1010, an application framework layer 1020, an android Runtime (Runtime) and system library 1030, and a kernel layer 1040.

The application layer 1010 may include a series of application packages.

The application framework layer 1020 provides an Application Programming Interface (API) and a programming framework for applications of the application layer. The application framework layer includes a number of predefined functions.

The window manager is used for managing window programs.

The content provider is used to store and retrieve data and make it accessible to applications.

The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.

The phone manager is used to provide a communication function of the mobile terminal.

The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.

The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction.

The android Runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the android system. The core library comprises two parts: one part is the functions to be called by the Java language, and the other part is the core library of android. The application layer and the framework layer run in the virtual machine.

The system library may include a plurality of functional modules. The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.

The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, among others.

The kernel layer 1040 is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio interface, a sensor driver, power management, and a GPS interface. In some embodiments of the present invention, the display of the frame animation may invoke a display driver.

In some embodiments of the present invention, there may also be provided an electronic device, comprising: a processor and a memory storing a computer program, the processor being configured to perform the method of any of the embodiments of the invention when the computer program is run.

The systems, devices, modules or units described in the above or below embodiments of the present invention may be implemented by a computer or its associated components. The computer may be, for example, a mobile terminal, a smart phone, a Personal Computer (PC), a laptop, a vehicle-mounted human interaction device, a personal digital assistant, a media player, a navigation device, a game console, a tablet, a wearable device, a smart television, an internet of things system, a smart home, an industrial computer, a server, or a combination thereof, as the case may be.

In some embodiments of the present invention, a storage medium may also be provided. In some embodiments, the storage medium stores a computer program configured to perform the method of any of the embodiments of the present invention when executed.

Storage media in embodiments of the invention include permanent and non-permanent, removable and non-removable articles of manufacture in which information storage may be accomplished by any method or technology. Examples of storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.

The methods, programs, systems, apparatuses, etc., in embodiments of the present invention may be performed or implemented in a single or multiple networked computers, or may be practiced in distributed computing environments. In the described embodiments, tasks may be performed by remote processing devices that are linked through a communications network in such distributed computing environments.

As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, a system or a computer program product. Thus, the functional modules/units or controllers and the associated method steps set forth in the above embodiments may be implemented in software, in hardware, or in a combination of software and hardware.

Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with an embodiment of the present invention need not be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel/combined processing of the steps may also be possible or may be advantageous.

In this document, "first" and "second" are used to distinguish different elements in the same embodiment, and do not denote any order or relative importance.

While various embodiments of the invention have been described herein, the description of the various embodiments is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and features and components that are the same or similar to one another may be omitted for clarity and conciseness. As used herein, "one embodiment," "some embodiments," "examples," "specific examples," or "some examples" are intended to apply to at least one embodiment or example, but not to all embodiments, in accordance with the present invention. The above terms are not necessarily meant to refer to the same embodiment or example. Various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.

Exemplary systems and methods of the present invention have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the systems and methods described herein may be made in practicing the systems and/or methods without departing from the spirit and scope of the invention as defined in the appended claims.
