Display processing method for panoramic video live-streaming co-hosting, electronic device, and storage medium

Document No.: 1966130    Publication date: 2021-12-14

Note: This technology, 全景视频直播连麦的显示处理方法、电子设备及存储介质 (display processing method for panoramic video live-streaming co-hosting, electronic device and storage medium), was created by Chen Ke (陈科) on 2021-08-11. The application discloses a display processing method for panoramic video live-streaming co-hosting, an electronic device and a storage medium. The display processing method comprises: displaying a live interface corresponding to at least two panoramic video streams after mixed-stream processing is performed on them; displaying, on the live interface, the live pictures respectively corresponding to the at least two panoramic video streams; and, in response to a roaming operation instruction, performing roaming display corresponding to the roaming operation instruction on at least one of the live pictures displayed on the live interface. In this way, the application can effectively realize display interaction for panoramic video co-hosting and improve the interaction effect.

1. A display processing method for panoramic video live-streaming co-hosting, characterized by comprising:

displaying a live interface corresponding to at least two panoramic video streams after mixed-stream processing is performed on the at least two panoramic video streams;

displaying, on the live interface, the live pictures respectively corresponding to the at least two panoramic video streams;

in response to a roaming operation instruction, performing roaming display corresponding to the roaming operation instruction on at least one of the live pictures displayed on the live interface.

2. The display processing method according to claim 1, wherein the displaying, on the live interface, the live pictures respectively corresponding to the at least two panoramic video streams comprises:

displaying the live picture corresponding to at least one of the panoramic video streams in a first picture area of the live interface, and displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area of the live interface;

and the performing roaming display corresponding to the roaming operation instruction on at least one of the live pictures displayed on the live interface comprises:

performing roaming display corresponding to the roaming operation instruction on the live picture displayed in the first picture area.

3. The display processing method according to claim 2, wherein after the displaying the live picture corresponding to at least one of the panoramic video streams in a first picture area of the live interface and displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area of the live interface, the method comprises:

in response to a picture switching instruction, switching the display of the live picture in the second picture area that corresponds to the picture switching instruction with the live picture currently displayed in the first picture area.

4. The display processing method according to claim 2, wherein the displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area of the live interface comprises:

displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area that is spaced apart from the first picture area, or floats over the first picture area, and whose size is smaller than that of the first picture area.

5. The display processing method according to claim 2, wherein the displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area of the live interface comprises:

displaying, in one-to-one correspondence, the live pictures corresponding to the remaining panoramic video streams in the display windows of the second picture area.

6. The display processing method according to claim 2, wherein the displaying the live pictures corresponding to the remaining panoramic video streams in a second picture area of the live interface comprises:

displaying, in the second picture area, the live picture corresponding to one viewing angle of each of the remaining panoramic video streams.

7. The display processing method according to claim 6, wherein the displaying, in the second picture area, the live picture corresponding to one viewing angle of each of the remaining panoramic video streams comprises:

automatically switching the viewing angle of at least one of the remaining panoramic video streams presented in the second picture area at a preset time interval, so as to display in the second picture area the live picture corresponding to the switched viewing angle.

8. The display processing method according to claim 1, wherein the displaying, on the live interface, the live pictures respectively corresponding to the at least two panoramic video streams comprises:

detecting a current display mode;

if the current display mode is a direct display mode, directly rendering and displaying the respective live pictures of the at least two panoramic video streams;

if the current display mode is a VR display mode, rendering the respective live pictures of the at least two panoramic video streams into left and right pictures corresponding to the left and right eyes for display.

9. The display processing method according to claim 8, wherein before the performing, in response to the roaming operation instruction, roaming display corresponding to the roaming operation instruction on at least one of the live pictures displayed on the live interface, the method comprises:

in the direct display mode, receiving the roaming operation instruction input through at least one of a touch operation on a screen, a rotation operation of a gyroscope, a click operation of a preset key, a gesture operation and a voice recognition operation;

in the VR display mode, receiving the roaming operation instruction input through at least one of a key click operation, a gyroscope rotation operation, a gesture operation and a voice recognition operation on a VR device.

10. The display processing method according to claim 1, wherein before the displaying a live interface corresponding to the at least two panoramic video streams after mixed-stream processing, the method comprises:

acquiring the at least two panoramic video streams;

performing mixed-stream processing on the at least two panoramic video streams;

or,

acquiring a data stream formed after mixed-stream processing is performed on the at least two panoramic video streams.

11. An electronic device, comprising a processor, a communication circuit, a display screen and a memory, wherein the communication circuit, the display screen and the memory are respectively coupled to the processor, the communication circuit is configured to communicate with other devices, the display screen is configured to display a live interface, the memory stores a computer program, and the processor is configured to execute the computer program to implement the display processing method according to any one of claims 1 to 10.

12. A computer-readable storage medium, in which a computer program is stored, the computer program being executable by a processor to implement the display processing method according to any one of claims 1 to 10.

Technical Field

The present application relates to the field of live-streaming technology, and in particular to a display processing method, an electronic device and a storage medium for panoramic video live-streaming co-hosting.

Background

With the popularization of smart devices and the development of communication technology, society has entered the era of intelligent interconnection. Network speeds keep increasing, and people can conveniently browse the internet on smart devices. Live-streaming technology has enriched the usage scenarios of smart devices: people can watch or host live streams anytime and anywhere, which enriches their lives.

Current live-streaming technology generally supports only 2D video, and a host typically uses a wide-angle camera to present the surrounding scene on screen so as to show viewers a richer live scene. However, video co-hosting in current live-streaming technology offers a poor interaction effect.

Disclosure of Invention

The technical problem mainly solved by the present application is to provide a display processing method, an electronic device and a storage medium for panoramic video live-streaming co-hosting, so that panoramic video co-hosting can be realized and the interaction effect improved.

In order to solve the above technical problem, one technical solution adopted by the present application is to provide a display processing method for panoramic video live-streaming co-hosting, comprising: displaying a live interface corresponding to at least two panoramic video streams after mixed-stream processing is performed on them; displaying, on the live interface, the live pictures corresponding to the at least two panoramic video streams; and, in response to a roaming operation instruction, performing roaming display corresponding to the roaming operation instruction on at least one of the live pictures displayed on the live interface.

In order to solve the above technical problem, another technical solution adopted by the present application is to provide an electronic device comprising a processor, a communication circuit, a display screen and a memory, wherein the communication circuit, the display screen and the memory are respectively coupled to the processor, the communication circuit is used for communicating with other devices, the display screen is used for displaying a live interface, the memory stores a computer program, and the processor is used for executing the computer program to implement the above display processing method for panoramic video live-streaming co-hosting.

In order to solve the above technical problem, a further technical solution adopted by the present application is to provide a computer-readable storage medium storing a computer program executable by a processor to implement the above display processing method for panoramic video live-streaming co-hosting.

The beneficial effects of the present application are as follows. Unlike the prior art, the present application displays, during co-hosting in a live stream, the live interface corresponding to at least two panoramic video streams after mixed-stream processing, and displays the live pictures corresponding to those panoramic video streams on the live interface, so that panoramic video live-streaming co-hosting is realized. Through a roaming operation instruction, roaming display corresponding to the instruction can be performed on at least one of the live pictures displayed on the live interface, so that the interactive functions of panoramic video co-hosting are realized effectively, the live pictures of the panoramic video streams are roamed well, and a better display effect is achieved. The user can interact with a live picture by roaming it, so that the picture changes with the roaming instruction; this makes the application convenient for the user to operate and watch, improves viewing convenience, and adds interactive functions to live-streaming co-hosting.

Drawings

Fig. 1 is a schematic block diagram of a live system in an embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application;

Fig. 2 is a schematic flowchart of an embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application;

Fig. 3 is a schematic view of a first interface of an electronic device in an embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application;

Fig. 4 is a schematic view of a second interface of an electronic device in an embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application;

Fig. 5 is a schematic view of a third interface of an electronic device in an embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application;

Fig. 6 is a schematic block diagram of the structure of a first embodiment of the electronic device of the present application;

Fig. 7 is a schematic structural diagram of a computer-readable storage medium of the present application.

Detailed Description

The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.

Through long-term research, the inventor of this application has found that, in live streaming, 2D video co-hosting cannot give users an immersive viewing experience; the display effect and interactive functions of a 2D live picture are relatively limited, and the co-hosting process can be rather monotonous, so the appeal of the video picture accounts for only a small share of the user's viewing experience and the interaction effect is poor. To improve upon or solve the above technical problems, the present application proposes at least the following embodiments.

Referring to fig. 1, the following embodiments may be applied to a live system 1. By way of example, the live system 1 comprises at least a plurality of viewer ends 20, anchor ends 30 and a server 10. The viewer ends 20 and the anchor ends 30 may be electronic devices, specifically a mobile terminal, a computer, a server or another terminal; a mobile terminal may be a mobile phone, a notebook computer, a tablet computer, a smart wearable device or the like, and a computer may be a desktop computer or the like. The server 10 may pull the live data stream, which may include video stream data, text and picture data, etc., from an anchor end 30 and push the obtained live data stream to the viewer ends 20 and, at least in part, to the anchor ends 30. After acquiring the live data stream, a viewer end 20 can watch the live process of the anchor or a guest. A connection, for example a video connection or a voice connection, may be made between anchor ends 30, and between an anchor end 30 and a viewer end 20.

Optionally, the server 10 may perform the mixed-stream processing of the live data streams. After acquiring the live data streams from the anchor ends 30, the server 10 performs mixed-stream processing on them and pushes the mixed data stream to the anchor ends 30 and the viewer ends 20.

Optionally, an anchor end 30 may perform the mixed-stream processing of the live data streams. The server 10 sends the live data streams from the other co-hosting anchor ends 30 to the anchor end 30 that performs the mixing; after that anchor end 30 performs mixed-stream processing on the live data streams, it sends the mixed data stream to the server 10, which pushes it to the other co-hosting anchor ends 30 and the related viewer ends 20.

Optionally, the anchor ends 30 and the viewer ends 20 each perform mixed-stream processing of the live data streams. The server 10 obtains the live data stream of each co-hosting anchor end 30, sends the live data streams of the other anchor ends 30 to the current anchor end 30, and that anchor end 30 performs mixed-stream processing on them. At the same time, the server 10 pushes the live data streams of the anchor ends 30 to the viewer ends 20, which perform their own mixed-stream processing. A sketch of these three arrangements is given below.
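
For illustration only, the three arrangements above can be summarized as a routing choice. The sketch below uses hypothetical names (MixLocation, routeStreams) that are not part of any real streaming SDK; it is a sketch under those assumptions, not an implementation of the application.

```typescript
// Hedged sketch of the three mixing arrangements described above. All names are illustrative.

type MixLocation = "server" | "anchor" | "anchorAndViewer";

// Decide who performs mixed-stream processing and who receives the mixed stream.
function routeStreams(location: MixLocation): { mixers: string[]; receiveMixed: string[] } {
  switch (location) {
    case "server":          // server 10 mixes, then pushes to anchor ends 30 and viewer ends 20
      return { mixers: ["server"], receiveMixed: ["anchor", "viewer"] };
    case "anchor":          // one anchor end 30 mixes, server 10 forwards the mixed stream
      return { mixers: ["anchor"], receiveMixed: ["otherAnchors", "viewer"] };
    case "anchorAndViewer": // each end mixes the raw streams it receives from the server 10
      return { mixers: ["anchor", "viewer"], receiveMixed: [] };
  }
}
```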

Referring to fig. 3, for video co-hosting, the live interface 100 corresponding to the mixed data stream can be displayed in the corresponding live room; the live interface 100 can display a plurality of picture areas 110, and these picture areas 110 can display the live video pictures of the respective co-hosting parties. A user enters the live room of an anchor from his or her viewer end 20 and can watch the live video and the co-hosting process of that anchor. In the live system 1, the video co-hosting may be panoramic video co-hosting, and the video stream data may include panoramic video stream data. Embodiments of the present application may also be applied to other types of live systems 1 and are not limited to the above exemplary description.

An embodiment of the display processing method for panoramic video live-streaming co-hosting of the present application comprises the following steps:

S100: displaying a live interface corresponding to at least two panoramic video streams after mixed-stream processing is performed on the at least two panoramic video streams.

Taking video co-hosting between anchors as an example, each co-hosting anchor end 30 can capture its own panoramic video stream through its panoramic camera and push the stream to the server 10. The server 10 may perform mixed-stream processing on the panoramic video streams and then send the result to the anchor ends 30 and the viewer ends 20; alternatively, the viewer ends 20 and the anchor ends 30 may perform the mixed-stream processing, or an anchor end 30 may perform the mixed-stream processing and send the result to the server 10, which pushes the mixed data stream to the viewer ends 20. A viewer end 20 may therefore either acquire the panoramic video streams of the respective co-hosting parties and mix them itself, or receive a data stream obtained by mixing at least two panoramic video streams. The panoramic video streams can present panoramic video pictures, i.e., 3D video pictures, at the viewer end 20 and the anchor end 30. Of course, the same applies to video co-hosting between an anchor and a viewer.

Based on the above, the mixed-stream processing may involve the following steps, performed before step S100:

S110: acquiring at least two panoramic video streams.

For the viewer end 20, at least two panoramic video streams corresponding to at least two co-hosting parties may be acquired.

S111: and performing mixed flow processing on at least two panoramic video streams.

After the at least two panoramic video streams are received, mixed stream processing can be carried out to form a data stream, and a live interface corresponding to the data stream is displayed on a display interface.

In addition to the cases described in S110 and S111, the mixed flow processed data stream may also be directly acquired. The method comprises the following specific steps:

s120: and acquiring a data stream formed after the mixed flow processing is carried out on the at least panoramic video stream.

For the viewer end 20, it may receive a data stream formed by the server 10 mixing at least two panoramic video streams, or a data stream formed by an anchor end 30 mixing at least two panoramic video streams and forwarded by the server 10. A live interface corresponding to the mixed data stream can then be displayed on the display interface of the viewer end 20.
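
As a rough illustration of the two acquisition paths in S110/S111 and S120, the following sketch shows a client that either pulls each co-host's panoramic stream and mixes locally, or pulls a single pre-mixed stream. PanoramicStream, pullStream, mixStreams and pullMixed are hypothetical names, not an actual API.

```typescript
// Illustrative sketch of the two acquisition paths; all names are assumptions.

interface PanoramicStream {
  hostId: string;
  frames: AsyncIterable<ArrayBuffer>; // e.g. equirectangular video frames
}

type MixedStream = PanoramicStream; // a mixed stream is itself treated as a panoramic stream

// Path A (S110 + S111): the client pulls each co-host's stream and mixes locally.
async function acquireAndMixLocally(
  pullStream: (hostId: string) => Promise<PanoramicStream>,
  mixStreams: (streams: PanoramicStream[]) => MixedStream,
  hostIds: string[],
): Promise<MixedStream> {
  const streams = await Promise.all(hostIds.map(pullStream)); // at least two panoramic streams
  return mixStreams(streams);                                  // client-side mixed-stream processing
}

// Path B (S120): the server (or an anchor end) has already mixed the streams;
// the client simply pulls the single mixed data stream.
async function acquireMixedStream(
  pullMixed: (roomId: string) => Promise<MixedStream>,
  roomId: string,
): Promise<MixedStream> {
  return pullMixed(roomId);
}
```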

S200: and displaying the live broadcast pictures corresponding to the at least two panoramic video streams on a live broadcast interface.

The live interface 100 refers to, for example, a screen of a live room showing a live process. After at least two panoramic video streams are acquired, the live broadcast pictures corresponding to the panoramic video streams can be displayed in the corresponding picture area 110. Taking three anchor feeds for video-on-wheat as an example, where at least two anchor feeds are live in the form of panoramic video, the viewer side 20 can acquire at least two panoramic video streams. For live video broadcasting and live broadcasting, each party of live broadcasting can be live broadcasting in a panoramic video mode, or part of the parties of live broadcasting can be live broadcasting in a panoramic video mode, and the other part of the parties of live broadcasting can be live broadcasting in a common video mode. Certainly, the live video microphone and the live voice microphone are compatible, and part of microphone connecting parties are allowed to participate in the live video microphone and the microphone connecting parties of other microphone connecting parties in a voice mode.

A panoramic video stream is panoramic video stream data; the live picture it presents on the live interface is a 3D video picture whose viewing angle can be switched. The live picture of a panoramic video may have a panoramic viewing angle and azimuth viewing angles. The panoramic viewing angle presents all angles or orientations from a global perspective, for example in a spherical form. An azimuth viewing angle presents the picture seen from a particular orientation or angle. The panoramic viewing angle can be switched to a certain azimuth viewing angle, and one azimuth viewing angle can be switched to another.
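
The viewing-angle model can be pictured as below; this is a minimal sketch assuming an equirectangular panorama viewed through a virtual camera, and every name in it is illustrative rather than taken from the application.

```typescript
// Minimal sketch of a viewing-angle model for a panoramic live picture. All names are illustrative.

type ViewAngle =
  | { kind: "panoramic" }                                                  // global, e.g. spherical overview
  | { kind: "azimuth"; yawDeg: number; pitchDeg: number; fovDeg: number }; // one particular orientation

// Switching from the panoramic view to an azimuth view, or between azimuth views,
// is just replacing the ViewAngle the renderer currently uses.
function switchView(_current: ViewAngle, next: ViewAngle): ViewAngle {
  return next;
}

// Example: jump from the global view to a front-facing azimuth view with a 90° field of view.
const front = switchView({ kind: "panoramic" }, { kind: "azimuth", yawDeg: 0, pitchDeg: 0, fovDeg: 90 });
```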

Optionally, the picture areas 110 on the live interface may include a first picture area 111 and a second picture area 112. The live pictures displayed on the live interface may be shown separately in the first picture area 111 and the second picture area 112, which may be implemented through the following steps included in step S200.

S210: displaying the live picture corresponding to at least one of the panoramic video streams in the first picture area of the live interface, and displaying the live pictures corresponding to the remaining panoramic video streams in the second picture area of the live interface.

At least one of the live pictures corresponding to the panoramic video streams is displayed in the first picture area 111, and the rest are displayed in the second picture area 112. Of course, if panoramic and ordinary video co-hosting coexist, the session also includes ordinary video streams in addition to the panoramic video streams, and the live pictures corresponding to the ordinary video streams may be displayed in the second picture area 112. By displaying the live pictures corresponding to the panoramic video streams in separate sub-areas, mutual interference between the live pictures is reduced, the display effect is improved, and the live-streaming effect is improved in turn.

Optionally, the size of the first picture area 111 is larger than that of the second picture area 112. The first picture area 111 and the second picture area 112 may be spaced apart, or the second picture area 112 may float over the first picture area 111. Optionally, the second picture area 112 may include at least one display window, and the number of display windows is related to the number of co-hosting parties. For example, when five anchors carry out panoramic video co-hosting, one live picture is displayed in the first picture area 111, the remaining four live pictures are displayed in the second picture area 112, and the number of display windows of the second picture area 112 may be four. The display windows are arranged at intervals. Since the first picture area 111 is larger than the second picture area 112, the first picture area 111 serves as the main picture area and the second picture area 112 as the secondary picture area, creating a main-secondary distinction. The shapes and specific sizes of the first picture area 111, the second picture area 112 and the display windows are not particularly limited.

When the second picture area 112 is spaced apart from the first picture area 111, the display windows of the second picture area 112 may be arranged around the periphery of the first picture area 111, adjacent to its edge.

When the second picture area 112 floats over the first picture area 111, each display window of the second picture area 112 may be suspended within the first picture area 111, adjacent to its edge.

Based on the above description, step S210 may include the steps of:

s211: and displaying the live broadcast picture corresponding to the rest panoramic video stream in a second picture area which is arranged at an interval with the first picture area or is suspended on the first picture area and has a size smaller than that of the first picture area.

By setting the size of the first screen region 111 to be larger than that of the second screen region 112, and setting or suspending the second screen region 112 and the first screen region 111 on the first screen region 111 at intervals, the main-sub relationship between the first screen region 111 and the second screen region 112 can be highlighted, so that the live broadcast picture of the first screen region 111 can be presented more and better, a user can watch the live broadcast picture of the first screen region 111 conveniently, the emphasis is highlighted, and the display effect is improved.

S212: and displaying at least one live broadcast picture corresponding to the rest panoramic video streams in each display window of the second picture area in a one-to-one correspondence manner.

The number of display windows may be greater than or equal to the number of live pictures corresponding to the remaining panoramic video streams. If the number of each display window is equal to the number of the live broadcast pictures corresponding to the remaining panoramic video streams, the live broadcast pictures corresponding to the remaining panoramic video streams are displayed in the display windows of the first screen area 111 in a one-to-one correspondence manner. If the number of each display window is greater than the number of the live broadcast pictures corresponding to the remaining panoramic video streams, the live broadcast pictures corresponding to the remaining panoramic video streams are respectively and correspondingly displayed outside one display window, and the remaining display windows can be used for displaying the live broadcast pictures of the common video stream or the patterns of the voice microphone, such as the head portrait or other identification symbols. In this way, at least one live view corresponding to the remaining panoramic video stream is displayed in each display window of the second screen area 112.

By displaying the live pictures of the remaining panoramic videos in the corresponding display windows of the second picture area 112, mutual interference between those live pictures is reduced and each co-hosting party is easy to distinguish, which facilitates gift interaction, communication and the like; it is also convenient to switch live pictures between the first picture area 111 and the second picture area 112, thereby realizing switched display. A sketch of such a layout is given below.
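
The layout just described might be represented as follows; CoHost, Layout and buildLayout are assumed names used only to sketch the idea of one main picture area plus one window per remaining co-host.

```typescript
// Illustrative sketch of assigning co-host streams to the main (first) picture area and the
// display windows of the secondary (second) picture area. Field names are assumptions.

interface CoHost {
  hostId: string;
  kind: "panoramic" | "ordinary" | "voice"; // voice-only co-hosts show an avatar instead of video
}

interface Layout {
  firstArea: CoHost;           // first picture area 111: one live picture, roamable
  secondAreaWindows: CoHost[]; // second picture area 112: one window per remaining co-host
}

function buildLayout(coHosts: CoHost[], mainHostId: string): Layout {
  const main = coHosts.find(h => h.hostId === mainHostId);
  if (!main) throw new Error("main co-host not found");
  return {
    firstArea: main,
    secondAreaWindows: coHosts.filter(h => h.hostId !== mainHostId),
  };
}
```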

The live picture of a panoramic video on the live interface can be presented by a VR device or directly on the screen of the electronic device. These two display modes can be realized through the following steps included in step S200.

S221: detecting the current display mode.

For example, the current display mode is determined by detecting whether a VR device is connected: if no VR device is connected, the current display mode is the direct display mode; if a VR device is connected, the current display mode is the VR display mode.

Of course, the display mode may also be determined by a user operation; for example, a button is displayed on the live interface, the current display mode is switched by clicking the button, and the corresponding state can then be read to detect the current display mode.

S222: if the current display mode is the direct display mode, directly rendering and displaying the respective live pictures of the at least two panoramic video streams.

If the direct display mode is detected, the respective live pictures of the at least two panoramic video streams are rendered and displayed directly on the screen of the electronic device. In this case there may be one first picture area 111 and one second picture area 112, the number of display windows of the second picture area 112 may be determined by the number of co-hosting parties, and the pictures are displayed directly on the live interface.

S223: if the current display mode is the VR display mode, rendering the live pictures of the at least two panoramic video streams into left and right pictures corresponding to the left and right eyes for display.

If the VR display mode is detected, the live pictures of the at least two panoramic video streams are rendered into left and right pictures corresponding to the left and right eyes. The slight difference between the left and right pictures produces binocular disparity, creating a sense of space, i.e., 3D vision.

Specifically, the first picture area 111 and the second picture area 112 are each rendered into a left picture and a right picture for display. For example, the left picture is rendered to contain the first picture area 111 and the three display windows of the second picture area 112, and the right picture is rendered to contain the same first picture area 111 and three display windows; the left and right pictures correspond to the left and right eyes respectively, and, owing to the characteristics of the VR device, they are fused into a 3D virtual display effect.

The VR device may have its own display screen, or it may use the screen of the electronic device as its display. If the VR device has its own display screen, the left and right pictures are rendered on the VR device's screen; if the VR device displays through the screen of the electronic device, the left and right pictures are rendered on the electronic device's screen. A sketch of this mode selection is given below.
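
A hedged sketch of S221 to S223 follows: the display mode is chosen from whether a VR device is connected, and the renderer is invoked once for direct display or once per eye for VR display. detectDisplayMode, renderDirect and renderEye are illustrative helpers, not a real rendering API.

```typescript
// Sketch of S221–S223; all names are illustrative assumptions.

type DisplayMode = "direct" | "vr";

interface LivePicture { streamId: string; yawDeg: number; pitchDeg: number; }

// S221: the mode follows whether a VR device is detected as connected.
function detectDisplayMode(vrConnected: boolean): DisplayMode {
  return vrConnected ? "vr" : "direct";
}

function renderFrame(
  mode: DisplayMode,
  pictures: LivePicture[],
  renderDirect: (pics: LivePicture[]) => void,
  renderEye: (pics: LivePicture[], eye: "left" | "right") => void,
): void {
  if (mode === "direct") {
    renderDirect(pictures);       // S222: render each live picture once, straight to the screen
  } else {
    renderEye(pictures, "left");  // S223: render the same scene twice with per-eye camera
    renderEye(pictures, "right"); // offsets, producing the left and right pictures
  }
}
```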

Optionally, the live pictures of the first picture area 111 and the second picture area 112 can be switched. Specifically, the following step may be performed after step S200.

S230: in response to a picture switching instruction, switching the display of the live picture in the second picture area corresponding to the picture switching instruction with the live picture currently displayed in the first picture area.

The live picture in the second picture area 112 that corresponds to the picture switching instruction and the live picture currently displayed in the first picture area 111 swap places: the picture currently in the first picture area 111 moves into the second picture area 112, and the selected picture from the second picture area 112 moves into the first picture area 111. In short, the display positions of the two live pictures are exchanged between the first picture area 111 and the second picture area 112. Specifically, the live picture shown in a display window of the second picture area 112 and the live picture currently shown in the first picture area 111 may be switched, as sketched below.
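
A minimal sketch of the swap in S230, assuming the layout is tracked as a main stream id plus a list of window stream ids (both names hypothetical):

```typescript
// Illustrative sketch of S230: swapping the live picture in a chosen window of the second
// picture area with the picture in the first picture area. The layout shape is an assumption.

interface SwitchableLayout {
  firstArea: string;           // streamId shown in the first picture area 111
  secondAreaWindows: string[]; // streamIds shown in the display windows of the second picture area 112
}

function switchPictures(layout: SwitchableLayout, windowIndex: number): SwitchableLayout {
  const target = layout.secondAreaWindows[windowIndex];
  if (target === undefined) return layout;  // ignore an instruction that points at no window
  const windows = [...layout.secondAreaWindows];
  windows[windowIndex] = layout.firstArea;  // the current main picture moves into the window
  return { firstArea: target, secondAreaWindows: windows }; // the selected picture becomes the main one
}
```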

The picture switching instruction may be generated automatically by the electronic device, for example at predetermined time intervals, so that the live pictures of the first picture area 111 and the second picture area 112 are switched periodically.

The picture switching instruction may also be produced by a corresponding user operation.

In the direct display mode, the electronic device may receive a picture switching instruction input through at least one of a touch operation on the screen, a rotation operation of a gyroscope, a click operation of a preset key, a gesture operation and a voice recognition operation.

For example, by directly tapping a live picture of the second picture area 112 displayed on the screen, that live picture can be switched into the first picture area 111 for display.

For example, the picture switching instruction is input through a click operation on a physical key or a virtual button of the electronic device, such as a volume key, a power key or an on-screen virtual button.

For example, voice input is captured by a microphone of the electronic device, and the picture switching instruction is input through voice recognition.

For example, the picture switching instruction is input through a rotation operation of the electronic device, using the orientation and gravity sensing provided by a gyroscope.

In the VR display mode, the electronic device may receive a picture switching instruction input through at least one of a key click operation, a gyroscope rotation operation, a gesture operation and a voice recognition operation on the VR device.

For example, the VR device provides physical keys, such as on a handle, through which a picture switching instruction can be input.

For example, the VR device may construct a virtual ray and virtual keys: after the user puts on the VR device, a ray is emitted from the virtual-world position corresponding to the handle or headset; by turning the head or moving the handle, the direction of the ray is adjusted so that it points at a virtual key, and if the ray is held still for a certain time (e.g., 1 to 3 seconds) or a physical key on the headset, handle or another device is pressed, a picture switching instruction is triggered.

For example, the VR device or the electronic device may use a sensor such as a gyroscope to map specific actions to specific commands, such as nodding quickly twice, shaking the head quickly twice, or jumping once, to trigger a picture switching instruction.

For example, the VR device captures images through its built-in camera and performs gesture recognition to input a picture switching instruction, or uses gesture recognition to interact with a virtual menu to input a picture switching instruction.

For example, the VR device collects audio through its built-in microphone and inputs a picture switching instruction through voice recognition.
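
Purely as an illustration, the input channels listed above can be normalized into one picture switching command; the event shapes and thresholds below are assumptions, not part of the application.

```typescript
// Illustrative sketch: normalizing touch, key, voice and gyroscope input into a single
// picture-switching command. Event names and thresholds are assumptions.

type InputEvent =
  | { source: "touch"; windowIndex: number }          // tap on a window in the second picture area
  | { source: "key"; key: "volumeUp" | "volumeDown" }
  | { source: "voice"; phrase: string }
  | { source: "gyro"; yawDeltaDeg: number };

type Command = { type: "switchPicture"; windowIndex: number } | { type: "none" };

function toCommand(ev: InputEvent, focusedWindow: number): Command {
  switch (ev.source) {
    case "touch": return { type: "switchPicture", windowIndex: ev.windowIndex };
    case "key":   return { type: "switchPicture", windowIndex: focusedWindow };
    case "voice": return /switch/i.test(ev.phrase)
      ? { type: "switchPicture", windowIndex: focusedWindow }
      : { type: "none" };
    case "gyro":  return Math.abs(ev.yawDeltaDeg) > 30  // a sharp turn selects the focused window
      ? { type: "switchPicture", windowIndex: focusedWindow }
      : { type: "none" };
  }
}
```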

S300: and responding to the roaming operation instruction, and performing roaming display corresponding to the roaming operation instruction on at least one live broadcast picture displayed on the live broadcast interface.

The electronic device or the VR device may automatically generate the roaming operation instruction, or the server may transmit the roaming operation instruction, or the user may input the roaming operation instruction by performing a corresponding operation on the electronic device or the VR device.

And responding to the roaming operation instruction, and performing corresponding roaming display on at least one of the at least two panoramic video streams according to the roaming operation instruction. In the process of roaming display, the visual angle of the live broadcast picture is correspondingly changed, and the content of the live broadcast picture can be switched.
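
A minimal sketch of such a roaming update, assuming the roaming operation instruction is reduced to yaw and pitch deltas applied to the current azimuth viewing angle (all names illustrative):

```typescript
// Sketch of roaming display: a roaming instruction carries yaw/pitch deltas and the renderer's
// viewing angle is updated before the next frame is drawn.

interface AzimuthView { yawDeg: number; pitchDeg: number; fovDeg: number; }
interface RoamInstruction { yawDeltaDeg: number; pitchDeltaDeg: number; }

const clamp = (v: number, lo: number, hi: number) => Math.min(hi, Math.max(lo, v));

function roam(view: AzimuthView, instr: RoamInstruction): AzimuthView {
  return {
    yawDeg: ((view.yawDeg + instr.yawDeltaDeg) % 360 + 360) % 360, // wrap around horizontally
    pitchDeg: clamp(view.pitchDeg + instr.pitchDeltaDeg, -90, 90), // don't flip over the poles
    fovDeg: view.fovDeg,
  };
}

// Example: a rightward swipe of the live picture in the first picture area pans the view 15°.
const next = roam({ yawDeg: 0, pitchDeg: 0, fovDeg: 90 }, { yawDeltaDeg: 15, pitchDeltaDeg: 0 });
```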

The roaming operation instruction can be generated in various ways; the following steps may be implemented before step S300:

S301: in the direct display mode, receiving the roaming operation instruction input through at least one of a touch operation on the screen, a rotation operation of a gyroscope, a click operation of a preset key, a gesture operation and a voice recognition operation.

S302: in the VR display mode, receiving the roaming operation instruction input through at least one of a key click operation, a gyroscope rotation operation, a gesture operation and a voice recognition operation on the VR device.

By acquiring at least two panoramic video streams, displaying the corresponding live pictures on the live interface, and performing roaming display corresponding to the roaming operation instruction on at least one of those live pictures, panoramic video co-hosting in live streaming is realized effectively. Because at least one of the live pictures can be roamed through the roaming operation instruction, the user gains an immersive experience and can see more detail within the live picture; the roaming display also strengthens the interactivity of the live stream, improves the interactive experience, and thereby achieves a good display effect. Moreover, panoramic video co-hosting improves the user's viewing convenience and the richness of the content.

Based on the above description of the first picture area 111 and the second picture area 112, step S300 may include the following step:

S310: performing roaming display corresponding to the roaming operation instruction on the live picture displayed in the first picture area.

The live pictures of the at least two panoramic video streams are displayed separately in the first picture area 111 and the second picture area 112, and roaming display corresponding to the roaming operation instruction is performed on the live picture in the first picture area 111, so roaming can be applied precisely to the first picture area 111. Optionally, the first picture area 111 is larger than the second picture area 112, forming a main-secondary hierarchy. The first picture area usually shows the party the user most wants to watch, such as a particular anchor; performing roaming display on the larger first picture area 111 therefore makes operation and viewing easier, improves viewing convenience, highlights the display priority of the first picture area 111 and improves the display effect. Moreover, roaming the live picture in the first picture area 111 does not disturb the live pictures in the second picture area 112, which further improves the effect of the roaming display.

In the direct display mode, the roaming operation instruction may be input as follows:

For example, the user may swipe the live picture of the first picture area 111 displayed on the screen to change its viewing angle, thereby implementing roaming display of the live picture.

For example, the roaming operation instruction is input through a click operation on a physical key or a virtual button of the electronic device, such as a volume key, a power key or an on-screen virtual button.

For example, voice input is captured by a microphone of the electronic device, and the roaming operation instruction is input through voice recognition.

For example, the roaming operation instruction is input through a rotation operation of the electronic device, using the orientation and gravity sensing provided by the gyroscope.

In the VR display mode, the input of the roaming operation instruction may be implemented as follows:

for example, the VR device provides physical keys, such as a handle, through which a roaming operation command can be input.

For example, the VR device may construct a virtual ray and a virtual key, after the user wears the VR device, the ray may be emitted from a virtual world position corresponding to a handle or a helmet, and by rotating a head or moving the handle, the direction of the ray is adjusted, the virtual ray is directed to the virtual key (the virtual key is, for example, a left upper key and a right lower key), and the ray directed to the corresponding key triggers a corresponding roaming operation instruction.

For example, a VR device or an electronic device may be controlled by a sensor, such as a gyroscope, to set a specific action corresponding to a specific command, such as a left-right shake, a movement, or a tilt, to trigger a roaming operation command.

For example, the VR device is provided with a camera, and performs gesture recognition to input a roaming instruction by capturing a picture through the camera, or inputs a picture switching instruction by interacting with a virtual menu through gesture recognition.

For example, the VR device is provided with a microphone, the microphone collects audio, and the roaming operation instruction is input through voice recognition.
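
For the gyroscope-based inputs in particular, a rotation sample might be converted into a roaming delta roughly as follows; the sample shape and integration step are assumptions for illustration.

```typescript
// Illustrative sketch: turning gyroscope rotation (direct mode) or VR headset rotation into a
// roaming delta of the kind sketched earlier. Sampling details are assumptions.

interface GyroSample { yawRateDegPerS: number; pitchRateDegPerS: number; }
interface RoamInstruction { yawDeltaDeg: number; pitchDeltaDeg: number; }

// Integrate the angular rate over the time since the last sample to get a viewing-angle delta.
function gyroToRoam(sample: GyroSample, dtSeconds: number): RoamInstruction {
  return {
    yawDeltaDeg: sample.yawRateDegPerS * dtSeconds,
    pitchDeltaDeg: sample.pitchRateDegPerS * dtSeconds,
  };
}

// Example: holding a 20°/s rightward turn for half a second pans the view 10° to the right.
const delta = gyroToRoam({ yawRateDegPerS: 20, pitchRateDegPerS: 0 }, 0.5);
```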

Optionally, roaming display corresponding to the roaming operation instruction is performed only on the live picture in the first picture area 111, while the live pictures in the second picture area 112 are not roamed. In this way the roaming display is focused on the first picture area 111, which enhances the display effect and avoids confusing the viewer, as would happen if all live pictures were roamed at once.

Although the live pictures in the second picture area 112 are not roamed, their viewing angles may still be set, which may be implemented through the following steps included in step S210.

S213: and respectively displaying the live broadcast pictures corresponding to one visual angle of the rest panoramic video streams in a second picture area.

Specifically, an initial viewing angle is selected for each of the panoramic video streams other than the panoramic video stream corresponding to the first screen section 111, and a live view corresponding to the initial viewing angle is displayed in the second screen section 112. Alternatively, the initial viewing angle may be determined randomly. The initial viewing angles of the respective panoramic video streams in the second screen section 112 may be the same or different. The initial viewing angle may be a panoramic viewing angle or may be an azimuthal viewing angle.

By displaying the live broadcast picture corresponding to one view angle of each of the other panoramic video streams in the second screen area 112, the second screen area 112 is not affected by the roaming display performed by the first screen area 111, so that the display effect of the roaming display is improved, and the processing pressure of the device can be relieved.

The viewing angle of the live view of the panoramic video stream displayed in the second view region 112 may change during the live view, and may be specifically implemented as follows after step S213.

S214: and automatically switching the view angle of at least one of the rest panoramic video streams presented in the second picture area according to a preset time interval, so as to correspondingly display the live broadcast picture corresponding to the switched view angle in the second picture area.

On the premise that the second screen region 112 cannot be displayed in a roaming manner, the live view of the second screen region 112 may be switched to the first screen region 111 for displaying in a roaming manner. In addition, in order to increase the display richness of the second screen section 112 and enhance the display function and the interactive function, the viewing angle of the live broadcast screen corresponding to the panoramic video stream in the second screen section 112 may be automatically switched according to a preset time interval. The viewing angle interval is, for example, 5 minutes, 10 minutes, or 20 minutes, etc. The preset time interval may be set automatically by the electronic device or set by the user himself.

For example, for the live picture displayed in a certain display window of the second picture area 112, the initial viewing angle is the panoramic viewing angle; after 5 minutes it switches to a first azimuth viewing angle, after another 5 minutes to a second azimuth viewing angle, and so on. A sketch of such timed switching is given below.
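
A sketch of this timed switching, assuming a hypothetical setViewAngle callback into the renderer and the 5-minute interval from the example above:

```typescript
// Illustrative sketch of S214: cycling the viewing angle of a window in the second picture area
// at a preset interval. setViewAngle is an assumed callback, not a real API.

type WindowView = "panoramic" | "azimuth1" | "azimuth2" | "azimuth3";

const cycle: WindowView[] = ["panoramic", "azimuth1", "azimuth2", "azimuth3"];

function startAutoSwitch(
  setViewAngle: (view: WindowView) => void,
  intervalMs: number = 5 * 60 * 1000, // e.g. switch every 5 minutes
): () => void {
  let index = 0;
  setViewAngle(cycle[index]); // start from the panoramic viewing angle
  const timer = setInterval(() => {
    index = (index + 1) % cycle.length; // panoramic -> azimuth1 -> azimuth2 -> ...
    setViewAngle(cycle[index]);
  }, intervalMs);
  return () => clearInterval(timer);    // stop switching, e.g. when the window is closed
}
```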

Switching the viewing angle of the panoramic video streams in the second picture area 112 at a time interval makes the live stream richer and more interesting, further enhances the interactive functions and improves the display effect. The roaming display in the first picture area 111 is not disturbed, yet the viewing angles in the second picture area 112 are no longer fixed; this mitigates the reduced interactivity caused by the second picture area 112 not supporting roaming display, and strikes a good balance between roaming display and the device's processing capacity.

For the electronic device mentioned above, reference may be made to the following embodiment of the electronic device of the present application.

Referring to fig. 6, the electronic device according to the embodiment of the present application includes a processor 211, a communication circuit 212, a display screen 213 and a memory 214. The communication circuit 212, the display screen 213 and the memory 214 are respectively coupled to the processor 211. The processor 211 is configured to execute a computer program to implement the display processing method for panoramic video live-streaming co-hosting described in the above method embodiments of the present application.

The communication circuit 212 is used for the electronic device of this embodiment to communicate with external devices; the electronic device can send data to or receive data from external devices through the communication circuit 212. The display screen 213 is used to display the live interface. The memory 214 is used for storing program data, and may be a RAM, a ROM or another type of storage device.

The processor 211 is used for controlling the operation of the electronic device, and may also be referred to as a CPU (Central Processing Unit). The processor 211 may be an integrated circuit chip with signal processing capability. The processor 211 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor 211 may be any conventional processor or the like.

In the several embodiments provided in the present application, it should be understood that the disclosed electronic device and the display processing method for panoramic video live-streaming co-hosting may be implemented in other ways. For example, the above-described embodiments of the electronic device are merely illustrative: the division into modules or units is only a logical functional division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in another form.

Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.

In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.

Referring to fig. 7, the integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium 220. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions (a computer program) for causing a computer device (which may be a personal computer, a server or a network device) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, as well as electronic devices containing such a storage medium, such as a computer, a mobile phone, a notebook computer, a tablet computer or a camera.

For the execution process of the computer program in the computer-readable storage medium, reference may be made to the foregoing embodiments of the display processing method for panoramic video live-streaming co-hosting, and details are not repeated here.

To sum up, the above embodiments acquire at least two panoramic video streams during live streaming, display their live pictures on the live interface, and perform roaming display on at least one of those live pictures, thereby effectively realizing panoramic video live-streaming co-hosting, enhancing the interactivity of the live stream, improving the display effect, and enriching the display content and the live-streaming functions.

The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.
