Stage lighting element design method and system based on virtual reality

Document No.: 1782704    Publication date: 2019-12-06

Abstract: This technology, "Stage lighting element design method and system based on virtual reality", was designed and created by ***, Wei Tao and Ren Wei on 2018-05-28. The invention provides a design method and a design system for stage lighting elements based on virtual reality. The method comprises: displaying a virtual stage scene to a user through a virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through augmented reality technology; capturing, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and laying out or adjusting the position and posture of stage lighting elements in the virtual stage scene according to the interface instruction; and displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through a display device. The invention effectively avoids the stage designer's repeated cycles of conception and debugging, reduces the labor intensity and working hours of scene-arrangement personnel, and greatly improves stage-layout efficiency and enterprise benefit.

1. A virtual-reality-based design method for stage lighting elements, characterized in that the method is applied to a terminal device, wherein the terminal device is communicatively connected to a virtual reality wearable device and a display device; the method comprises the following steps:

displaying a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through augmented reality technology;

capturing, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and

displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.

2. The method of claim 1, further comprising:

displaying each pre-stored stage lighting element through the virtual user interface;

capturing first gesture action information of the user through the virtual reality wearable device, and identifying therefrom a target stage lighting element selected by the user; and

acquiring voice input information through the virtual reality wearable device, and modifying a parameter value of the target stage lighting element according to the voice input information after a preset time interval.

3. The method of claim 2, further comprising: capturing second gesture action information of the user through the virtual reality wearable device, and arranging the target stage lighting element at a corresponding position in the virtual stage scene.

4. The method of claim 2, further comprising:

capturing third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is simultaneously highlighted; and

capturing fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or to rotate its angle.

5. The method of claim 1, further comprising:

capturing head movements of the user through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene; and

capturing hand actions of the user through the virtual reality wearable device so as to synchronously change the user's movement state in the virtual stage scene.

6. A virtual-reality-based stage lighting element design system, characterized in that the system is applied to a terminal device, wherein the terminal device is communicatively connected to a virtual reality wearable device and a display device; the system comprises:

a virtual scene display module, configured to display a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through augmented reality technology;

a lighting element design module, configured to capture, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and to lay out or adjust the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and

a lighting element display module, configured to display the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.

7. The system of claim 6, wherein:

the virtual scene display module is further configured to display each pre-stored stage lighting element through the virtual user interface; and

the lighting element design module is further configured to capture first gesture action information of the user through the virtual reality wearable device and identify therefrom a target stage lighting element selected by the user, and to acquire voice input information through the virtual reality wearable device and modify a parameter value of the target stage lighting element according to the voice input information after a preset time interval.

8. The system of claim 7, wherein the lighting element design module is further configured to capture second gesture action information of the user through the virtual reality wearable device, and to arrange the target stage lighting element at a corresponding position in the virtual stage scene.

9. The system of claim 7, wherein the lighting element design module is further configured to capture third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is simultaneously highlighted; and to capture fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or to rotate its angle.

10. The system of claim 6, wherein the virtual scene display module is further configured to capture head movements of the user through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene, and to capture hand actions of the user through the virtual reality wearable device so as to synchronously change the user's movement state in the virtual stage scene.

11. A terminal device, comprising: a processor and a memory; wherein:

the memory is configured to store a computer program; and

the processor is configured to load and execute the computer program so that the terminal device performs the virtual-reality-based design method for stage lighting elements according to any one of claims 1 to 5.

12. A stage lighting element design system, comprising: the terminal device of claim 11, and a virtual reality wearable device and a display device communicatively connected to the terminal device.

Technical Field

The invention relates to the field of stage modeling software, and in particular to a virtual-reality-based design method and design system for stage lighting elements.

Background

The traditional stage lighting operation system relies mainly on manual, experience-based configuration of the stage lighting rendering effect: stage elements are laid out in stage lighting software with a mouse or touch screen, interface command information is then sent to a computer control center, and the control center issues instructions to the stage lighting elements to move them and adjust them to the positions and postures corresponding to the instructions. This conventional stage layout technology has existed for many years and is increasingly unable to meet the application requirements of stage layout in the new era. Because the traditional approach does not let a designer personally perceive the stereoscopic quality of three-dimensional space, the designer can only imagine the stage layout on a two-dimensional plane, and the sense of space and the real-time rendering effect are difficult to grasp during the design process.

Disclosure of Invention

In view of the above shortcomings of the prior art, an object of the present invention is to provide a virtual-reality-based method and system for designing stage lighting elements that solve the above problems, allow a designer to perceive the three-dimensional effect of a virtual stage in real time, and give the designer an immersive stage lighting design process.

To achieve the above and other related objects, the present invention provides a virtual-reality-based design method for stage lighting elements, applied to a terminal device that is communicatively connected to a virtual reality wearable device and a display device. The method comprises: displaying a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through augmented reality technology; capturing, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.

In an embodiment of the present invention, the method further includes: displaying each pre-stored stage lighting element through the virtual user interface; capturing first gesture action information of the user through the virtual reality wearable device, and identifying therefrom a target stage lighting element selected by the user; and acquiring voice input information through the virtual reality wearable device, and modifying a parameter value of the target stage lighting element according to the voice input information after a preset time interval.

In an embodiment of the present invention, the method further includes: capturing second gesture action information of the user through the virtual reality wearable device, and arranging the target stage lighting element at a corresponding position in the virtual stage scene.

In an embodiment of the present invention, the method further includes: capturing third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is simultaneously highlighted; and capturing fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or to rotate its angle.

In an embodiment of the present invention, the method further includes: capturing head movements of the user through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene; and capturing hand actions of the user through the virtual reality wearable device so as to synchronously change the user's movement state in the virtual stage scene.

To achieve the above and other related objects, the present invention further provides a virtual-reality-based stage lighting element design system, applied to a terminal device that is communicatively connected to a virtual reality wearable device and a display device. The system comprises: a virtual scene display module, configured to display a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through augmented reality technology; a lighting element design module, configured to capture, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and to lay out or adjust the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and a lighting element display module, configured to display the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.

In an embodiment of the present invention, the virtual scene display module is further configured to display each pre-stored stage lighting element through the virtual user interface, and the lighting element design module is further configured to capture first gesture action information of the user through the virtual reality wearable device and identify therefrom a target stage lighting element selected by the user, and to acquire voice input information through the virtual reality wearable device and modify a parameter value of the target stage lighting element according to the voice input information after a preset time interval.

In an embodiment of the present invention, the lighting element design module is further configured to capture second gesture action information of the user through the virtual reality wearable device, and to arrange the target stage lighting element at a corresponding position in the virtual stage scene.

In an embodiment of the present invention, the lighting element design module is further configured to capture third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is simultaneously highlighted, and to capture fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or to rotate its angle.

In an embodiment of the present invention, the virtual scene display module is further configured to capture head movements of the user through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene, and to capture hand actions of the user through the virtual reality wearable device so as to synchronously change the user's movement state in the virtual stage scene.

To achieve the above and other related objects, the present invention further provides a terminal device, comprising a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to load and execute the computer program so that the terminal device performs the above virtual-reality-based design method for stage lighting elements.

To achieve the above and other related objects, the present invention further provides a stage lighting element design system, comprising: the above terminal device, and a virtual reality wearable device and a display device communicatively connected to the terminal device.

As described above, in the virtual-reality-based stage lighting element design method and system, the user is brought into a virtual three-dimensional world through devices such as VR glasses, the user's movement in the virtual scene is driven by gesture actions, the user's operations on the UI interface in the virtual scene are captured through gestures, and the position and posture of stage lighting elements in the three-dimensional scene are adjusted accordingly. This achieves the effect of placing stage elements and perceiving the stage lighting rendering in real time, greatly reduces the designer's labor intensity, and helps improve the completeness of the stage scene design.

Drawings

Fig. 1 is a schematic structural diagram of a stage lighting element design system according to an embodiment of the present invention.

Fig. 2 is a schematic diagram illustrating a design method of stage lighting elements based on virtual reality according to an embodiment of the present invention.

Fig. 3A is a schematic diagram illustrating an interaction effect between a user and a virtual scene according to an embodiment of the present invention.

Fig. 3B is a schematic diagram illustrating the effect of a user interacting with a virtual scene according to another embodiment of the present invention.

Fig. 4 is a schematic diagram illustrating a design system of stage lighting elements based on virtual reality according to an embodiment of the present invention.

Fig. 5 is a schematic view illustrating the imaging of a virtual stage scene by a virtual reality wearable device according to an embodiment of the invention.

Detailed Description

The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.

It should be noted that the drawings provided in the following embodiments only illustrate the basic idea of the present invention; they show only the components related to the invention and are not drawn according to the number, shape and size of components in an actual implementation. The type, quantity and proportion of the components, as well as their layout, may differ or be more complicated in practice.

The invention provides a novel virtual-reality-based stage lighting element design method, a corresponding design system, a terminal device, and a stage lighting element design system comprising that terminal device. The invention uses VR (Virtual Reality) and AR (Augmented Reality) technology to adjust the position and posture of stage lighting elements in a virtual stage scene, and thereby to design the simulated rendering effect of the stage lighting.

The present invention will be described in detail below with reference to examples and the accompanying drawings.

Fig. 1 shows a design system for stage elements. The stage element design system mainly comprises a terminal device 1 (such as a desktop computer, a portable computer, a tablet computer, a smartphone, a network cloud, or the like), a virtual reality wearable device 2 (such as a VR helmet, VR glasses, VR gloves, a VR handle, or the like) and a display device 3 (such as an LED display screen or the like), wherein the virtual reality wearable device 2 and the display device 3 are each connected to the terminal device 1 by, for example, a wired communication connection or a wireless communication connection.

As shown in Fig. 2, the virtual-reality-based method for designing stage lighting elements according to this embodiment is applied to the terminal device 1 shown in Fig. 1 and mainly includes the following steps:

S21: A virtual stage scene is presented to the user through the virtual reality wearable device 2 shown in Fig. 1, wherein the virtual stage scene includes a virtual user interface established through augmented reality technology.

In detail, after putting on the virtual reality wearable device 2, the user can "enter" a pre-established virtual stage scene, in which a virtual user interface (UI interface) is provided to realize real-time interaction between the user and the scene. The terminal device 1 stores a plurality of predefined gesture actions and the instruction meanings corresponding to them; when a gesture action captured by the virtual reality wearable device 2 is successfully matched with a predefined gesture action, the terminal device 1 recognizes the operation to be executed for the captured gesture.

To prevent the virtual user interface from interfering with the overall display of the virtual stage scene, in one embodiment the virtual user interface pops up when the virtual reality wearable device 2 captures one gesture action of the user and hides when another gesture action is captured. For example: the right hand extends to the right front of the head and makes a knocking action three times to pop up the UI interface; the palm faces downward with five fingers extended to pop up the UI interface; the left hand extends to the left front of the head and makes a knocking action three times to hide the UI interface.
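As an illustration of how the terminal device 1 might match a captured gesture against its stored predefined gestures and dispatch the corresponding instruction, a minimal sketch follows; the gesture names, the feature encoding, the similarity threshold and the ui object are hypothetical assumptions made for this sketch, not part of the original disclosure.

    # Sketch only: matching a captured gesture against predefined gesture templates
    # stored on the terminal device and dispatching the corresponding UI command.
    # Gesture names, the 32-dimensional feature encoding and the 0.9 threshold are
    # illustrative assumptions.
    import numpy as np

    # Instruction meaning associated with each predefined gesture (cf. the examples above).
    GESTURE_COMMANDS = {
        "right_hand_knock_three": "SHOW_UI",   # right hand knocks three times in front of the head
        "palm_down_five_fingers": "SHOW_UI",   # palm down, five fingers extended
        "left_hand_knock_three":  "HIDE_UI",   # left hand knocks three times in front of the head
    }

    # One feature vector per predefined gesture; in practice these would be calibrated
    # from recorded hand trajectories rather than random placeholders.
    GESTURE_TEMPLATES = {name: np.random.rand(32) for name in GESTURE_COMMANDS}

    def match_gesture(captured, threshold=0.9):
        """Return the command of the best-matching predefined gesture, or None."""
        best_name, best_score = None, 0.0
        for name, template in GESTURE_TEMPLATES.items():
            score = float(np.dot(captured, template) /
                          (np.linalg.norm(captured) * np.linalg.norm(template)))
            if score > best_score:
                best_name, best_score = name, score
        return GESTURE_COMMANDS[best_name] if best_score >= threshold else None

    def handle_gesture(captured, ui):
        command = match_gesture(captured)
        if command == "SHOW_UI":
            ui.show()      # pop up the virtual user interface
        elif command == "HIDE_UI":
            ui.hide()      # hide the virtual user interface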

The virtual reality wearable device 2 captures the user's head and hand motions and transmits the motion information to the terminal device 1. The terminal device 1 processes the motion information and then synchronously presents the updated virtual stage scene to the user through the virtual reality wearable device 2, so that the user obtains an immersive spatial experience.

For example, when a sensor of the VR helmet captures left-right rotation or up-down movement of the user's head, the virtual stage scene rotates and moves up and down synchronously with the head. Hand gestures control movement in the scene: both hands extended horizontally in front of the chest indicates that the scene moves away, and both hands moved forward from the sides of the body indicates that the scene moves closer; the right hand extended forward at 90° to the body indicates walking forward along the current facing direction, as does extending both hands forward at 90° to the body; the right arm extended to the right at 90° to the body indicates walking to the right, or walking fast to the right; the left arm extended to the left at 90° to the body indicates walking fast to the left; the right arm extended forward at 45° to the body indicates walking fast forward, as does extending both hands backward at 45° to the body; the left arm extended to the left front, at 45° to the straight-ahead direction and 90° to the body, indicates rotating to the left, and both hands extended to the left front in the same way indicates rotating to the left fast; the right arm extended to the right front, at 45° to the straight-ahead direction and 90° to the body, indicates rotating to the right, and both hands extended to the right front in the same way indicates rotating to the right fast; and so on.
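The head- and gesture-driven navigation described above could be organized, in rough outline, as follows; the Camera class, the walking speeds, the angular speed and the gesture identifiers are assumptions made only for this sketch and are not taken from the original disclosure.

    # Sketch only: synchronizing the virtual camera with head motion and translating
    # recognized arm gestures into movement of the viewpoint in the virtual stage scene.
    import math
    from dataclasses import dataclass

    WALK_SPEED = 1.5    # metres per second (assumed)
    FAST_FACTOR = 2.0   # "fast" gestures multiply the speed (assumed)

    @dataclass
    class Camera:
        x: float = 0.0
        z: float = 0.0
        yaw: float = 0.0           # horizontal look direction, degrees
        pitch: float = 0.0
        height_offset: float = 0.0

        def move_local(self, right, forward):
            # move in the camera's own frame, then express it in world coordinates
            rad = math.radians(self.yaw)
            self.x += right * math.cos(rad) + forward * math.sin(rad)
            self.z += -right * math.sin(rad) + forward * math.cos(rad)

    def update_view(camera, head_pose):
        """Head tracking: the scene view rotates and shifts synchronously with the head."""
        camera.yaw, camera.pitch = head_pose.yaw, head_pose.pitch
        camera.height_offset = head_pose.vertical_offset

    def update_position(camera, gesture, dt):
        """Arm gestures move or rotate the user's viewpoint, as in the examples above."""
        moves = {
            "right_arm_forward_90": (0.0, WALK_SPEED),                # walk forward
            "both_arms_forward_90": (0.0, WALK_SPEED * FAST_FACTOR),  # walk forward, faster
            "right_arm_right_90":   (WALK_SPEED, 0.0),                # walk to the right
            "left_arm_left_90":     (-WALK_SPEED, 0.0),               # walk to the left
        }
        if gesture in moves:
            right, forward = moves[gesture]
            camera.move_local(right * dt, forward * dt)
        elif gesture == "left_arm_front_45":
            camera.yaw -= 30.0 * dt      # rotate to the left (angular speed assumed)
        elif gesture == "right_arm_front_45":
            camera.yaw += 30.0 * dt      # rotate to the right (angular speed assumed)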

S22: an interface instruction issued by a user operating the virtual user interface in the virtual stage scene is captured through the virtual reality wearable device 2 shown in fig. 1, and the position and the posture of the stage lighting element in the virtual stage scene are laid or adjusted according to the interface instruction. The method comprises the following steps of capturing an interface instruction issued by a user operating the virtual user interface in the virtual stage scene, wherein two different implementation modes can be adopted: captured through VR gloves, captured through VR handles.

Specific implementations of capture through VR gloves and capture through a VR handle are set forth below.

For capturing user instructions through VR gloves:

Referring to Fig. 3A, the virtual user interface displays to the user the pre-stored stage lighting elements and the operation function keys required during design. When the virtual reality wearable device 2 captures first gesture action information of the user, for example the right hand moving to the virtual user interface and the index finger clicking a certain stage lighting element, the terminal device 1 considers that the user has selected that element as the target stage lighting element. At this point the user may modify the parameter values of the target stage lighting element: the virtual reality wearable device 2 acquires voice input information, and after a preset time interval (for example, 1.5 seconds) the parameter value of the target stage lighting element is modified according to the voice input.

It should be noted that most of the voice input used to modify parameter values is numeric, and if the user speaks continuously the terminal device 1 may find it difficult to tell whether the input modifies one parameter value or several. This embodiment therefore separates the numeric voice input by the preset time interval, so that the terminal device 1 can accurately recognize each modified value and the modified items are not confused with one another.
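One way to realize this preset-interval separation of spoken numeric input is sketched below; the token format and timestamps are assumed to come from the wearable device's speech recognizer, and the example parameters (an angle and a brightness value) are purely illustrative.

    # Sketch only: grouping spoken digits into separate parameter values using the
    # preset time interval (1.5 s in the example above). A pause shorter than the
    # interval continues the current value; a longer pause starts the next value.
    PRESET_INTERVAL = 1.5  # seconds

    def split_values(tokens):
        """tokens: list of (timestamp_in_seconds, spoken_digit_string) pairs."""
        values, current, last_t = [], "", None
        for t, digit in tokens:
            if last_t is not None and (t - last_t) >= PRESET_INTERVAL:
                values.append(current)   # the long pause closes the current value
                current = ""
            current += digit
            last_t = t
        if current:
            values.append(current)
        return [int(v) for v in values]

    # "3 0" <pause> "2 5 5" is read as two separate values, 30 and 255,
    # e.g. an angle and a brightness level for the selected lighting element.
    print(split_values([(0.0, "3"), (0.4, "0"), (2.5, "2"), (2.9, "5"), (3.3, "5")]))
    # -> [30, 255]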

After recognizing the target stage lighting element selected by the user, if second gesture action information of the user is captured through the virtual reality wearable device 2, for example the right hand grasping within the virtual stage scene and dragging to a certain position, the terminal device 1 arranges the target stage lighting element at the corresponding position in the virtual stage scene; the stage lighting element may also be added at a preset default position or at a position selected in the UI interface.

To help the user intuitively know where they are "touching" the virtual user interface, a virtual mouse is displayed on the interface and moves with the user's limb. For example, if the right hand hovers over a certain operation function key and, after hovering for 1.5 seconds, the index finger performs a tapping action (i.e., clicks the operation function key), the corresponding operation function is executed.
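The hover-and-tap confirmation for the virtual mouse could be tracked roughly as shown below; the key objects with contains()/execute() methods, the timing source and the tap signal are assumptions of this sketch rather than details from the original text.

    # Sketch only: the virtual mouse follows the hand; an operation function key is
    # executed only after the cursor has hovered over it for 1.5 s and an index-finger
    # tap is then detected.
    import time

    HOVER_TIME = 1.5  # seconds, as in the example above

    class VirtualMouse:
        def __init__(self):
            self.hover_key = None     # operation function key currently under the cursor
            self.hover_since = None   # time at which the cursor entered that key

        def update(self, cursor_pos, keys, tap_detected, now=None):
            now = time.monotonic() if now is None else now
            key = next((k for k in keys if k.contains(cursor_pos)), None)
            if key is not self.hover_key:          # cursor entered a new key or left one
                self.hover_key, self.hover_since = key, now
                return None
            if key is not None and tap_detected and now - self.hover_since >= HOVER_TIME:
                return key.execute()               # run the selected operation function
            return None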

After the terminal device 1 identifies the target stage lighting element selected by the user, if the virtual reality wearable device 2 captures third gesture action information of the user, the terminal device 1 displays, through the virtual reality wearable device 2, a laser ray pointing to the target stage lighting element in the virtual stage scene, and the target stage lighting element is highlighted. Subsequently, if the virtual reality wearable device 2 captures fourth gesture action information of the user, the terminal device 1 shows, through the virtual reality wearable device 2, the adjusted posture of the target stage lighting element in the virtual stage scene, including the position after movement, the angle after rotation, and the like. Further, if the virtual reality wearable device 2 captures fifth gesture action information of the user (for example, the little finger extended after selection), the whole stage enters the rendering state.

For example: the right hand is extended forward and a point at the center of the palm emits a laser ray; when the laser ray intersects a stage lighting element, that element is selected and is highlighted; the right hand then clenches into a fist and drags the object left, right, forward or backward to place the selected stage lighting element at a suitable position; when the right hand is opened again, the selection of the stage lighting element is released. In another example, the left hand is extended forward and a point at the center of the palm emits a laser ray; the laser ray intersects a stage lighting element, which is selected and highlighted; the right hand is placed on the Leap Motion somatosensory controller and makes a rotating action, which rotates the selected stage lighting element to a suitable angular position; extending the left hand releases the selection. In yet another example, once a stage lighting element is selected, swinging the arm to the left, right, up or down moves the element to the left, right, up or down respectively; extending only the middle finger after selection rotates the stage lighting element to the right; extending only the thumb after selection rotates it to the left.
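The palm-ray selection in these examples amounts to a ray-casting pick. A minimal sketch follows, approximating each stage lighting element by a bounding sphere; the scene representation and the bounding-sphere simplification are assumptions made for illustration only.

    # Sketch only: a laser ray cast from the palm selects and highlights the first stage
    # lighting element it intersects; later gestures translate or rotate the selection.
    import math

    def ray_sphere_hit(origin, direction, center, radius):
        """Distance along the (unit-length) ray to the sphere, or None if it misses."""
        ox, oy, oz = (origin[i] - center[i] for i in range(3))
        b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
        c = ox * ox + oy * oy + oz * oz - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t >= 0.0 else None

    def pick_element(palm_pos, palm_dir, elements):
        """elements: dicts with 'center', 'radius', 'highlighted' (and optionally 'yaw')."""
        best, best_t = None, float("inf")
        for elem in elements:
            t = ray_sphere_hit(palm_pos, palm_dir, elem["center"], elem["radius"])
            if t is not None and t < best_t:
                best, best_t = elem, t
        if best is not None:
            best["highlighted"] = True             # highlight the selected element
        return best

    def move_element(elem, delta):                 # drag: translate the selected element
        elem["center"] = tuple(c + d for c, d in zip(elem["center"], delta))

    def rotate_element(elem, delta_yaw_degrees):   # rotate the selected element
        elem["yaw"] = elem.get("yaw", 0.0) + delta_yaw_degrees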

For capturing user instructions through the VR handle:

The controls of the VR handle typically include a modify key, a track pad, a trigger key and a side key. Referring to Fig. 3B, unlike capturing user commands through VR gloves, when the user presses a certain button of the VR handle the handle emits a beam of light, and the color of the beam represents the handle's interaction mode, for example orange in the standard interaction mode, green in the selection mode, and yellow in the movement mode. Clicking the track pad displays or hides the opened window. The virtual user interface is displayed in a floating state, and when the user presses the key of the VR handle, the selection of the operation function key is confirmed.

For a typical VR handle, pressing a side key and moving the controller moves the world scene, which feels to the user like grasping the world with a hand and pushing or pulling it; pressing the side key, aiming the controller, and pressing the trigger key moves the current position to the location the controller is aimed at; pressing the side keys on both controllers and moving them alternately rotates the world scene around the user, as if grasping the world with both hands and turning it; and pressing the side keys on both controllers while moving them toward or away from each other enlarges or reduces the world scene, and so on.
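The world-grab interaction of the VR handles can be summarized, under simplifying assumptions (a planar scene transform, a vertical rotation axis, and a hypothetical WorldTransform structure), as in the sketch below.

    # Sketch only: one held side key drags the world scene with the controller; two held
    # side keys rotate the scene about the vertical axis and rescale it according to how
    # the two controllers move relative to each other.
    import math
    from dataclasses import dataclass, field

    @dataclass
    class WorldTransform:
        offset: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
        yaw: float = 0.0      # rotation about the vertical axis, degrees
        scale: float = 1.0

    def one_hand_grab(world, controller_delta):
        """Side key held on one controller: push or pull the scene with that hand."""
        world.offset = [o + d for o, d in zip(world.offset, controller_delta)]

    def two_hand_grab(world, prev_left, prev_right, left, right):
        """Side keys held on both controllers: rotate and rescale the scene."""
        def span(a, b):  # horizontal distance and bearing between the two controllers
            dx, dz = b[0] - a[0], b[2] - a[2]
            return math.hypot(dx, dz), math.degrees(math.atan2(dz, dx))
        prev_len, prev_ang = span(prev_left, prev_right)
        cur_len, cur_ang = span(left, right)
        if prev_len > 1e-6:
            world.scale *= cur_len / prev_len   # scale follows the change in separation
        world.yaw += cur_ang - prev_ang         # alternating hand motion rotates the scene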

It goes without saying that the correspondence between keys or key combinations and the first to fifth gesture action information can be set by those skilled in the art in view of existing controller technology, and is not expanded on in detail here.

S23: the position and the posture of the virtual stage scene and the stage lighting elements thereof are shown in real time by the display device 3 shown in fig. 1.

Every action by which the user interacts with the UI interface in the virtual stage scene, and the changes each action causes to the virtual stage scene, are displayed on the display device in real time.

Fig. 4 shows a virtual-reality-based design system 400 for stage lighting elements, where the system 400 is implemented as software installed in the terminal device 1 of Fig. 1 and, when running, executes the virtual-reality-based design method for stage lighting elements of the foregoing embodiments. Since the principle of this embodiment is the same as that of the foregoing method embodiment, the shared technical details are not repeated.

The virtual-reality-based stage lighting element design system 400 of this embodiment includes a virtual scene display module 401, a lighting element design module 402, and a lighting element display module 403.

The virtual scene display module 401 displays a virtual stage scene to the user through the virtual reality wearable device, wherein the virtual stage scene includes a virtual user interface established through augmented reality technology. In an embodiment, the virtual scene display module 401 captures the user's head movements through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene, and captures the user's hand actions through the virtual reality wearable device so as to synchronously change the user's movement state in the virtual stage scene.

The lighting element design module 402 captures, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and lays out or adjusts the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction. In an embodiment, the lighting element design module 402 captures second gesture action information of the user through the virtual reality wearable device and lays the target stage lighting element at a corresponding position in the virtual stage scene. In an embodiment, the lighting element design module 402 captures third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the element is simultaneously highlighted, and captures fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or to rotate its angle.

In an embodiment, the virtual scene display module 401 displays the pre-stored stage lighting elements through the virtual user interface. The lighting element design module 402 captures first gesture action information of the user through the virtual reality wearable device so as to identify the target stage lighting element selected by the user, acquires voice input information through the virtual reality wearable device, and modifies a parameter value of the target stage lighting element according to the voice input information after a preset time interval.

The lighting element display module 403 displays the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.
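As a rough sketch of how the three modules of system 400 could be organized as software in the terminal device 1, the outline below may help; the class and method names paraphrase the module functions described above, and the vr_device, ui, scene and display_device interfaces are hypothetical.

    # Sketch only: the three modules of the design system 400 as classes.
    class VirtualSceneDisplayModule:
        def __init__(self, vr_device):
            self.vr_device = vr_device

        def show_scene(self, scene):
            self.vr_device.render(scene)            # present the virtual stage scene

        def show_element_library(self, ui, elements):
            ui.list_elements(elements)              # pre-stored stage lighting elements

    class LightingElementDesignModule:
        def __init__(self, scene):
            self.scene = scene

        def apply_interface_command(self, command):
            # lay out or adjust position and posture according to the UI instruction
            element = self.scene.elements[command.target_id]
            element.position = command.position
            element.posture = command.posture

    class LightingElementDisplayModule:
        def __init__(self, display_device):
            self.display_device = display_device

        def refresh(self, scene):
            self.display_device.render(scene)       # real-time view on the display device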

In addition, the present invention further provides a storage medium and a terminal device; the technical features of the foregoing embodiments also apply to the storage medium embodiment and the terminal device embodiment, so repeated descriptions are omitted.

The storage medium may be any medium capable of storing program code, such as a ROM, a RAM, a magnetic disk or an optical disk. A computer program is stored on it, and when the computer program is loaded and executed by a processor it implements all or part of the steps of the virtual-reality-based design method for stage lighting elements in the foregoing embodiments.

The terminal device is a device comprising a processor (CPU/MCU/SoC), a memory (ROM/RAM), a communication module (wired/wireless network) and a display module, and is preferably a desktop computer. In particular, the memory stores a computer program, and the processor, when loading and executing the computer program, implements all or part of the steps of the virtual-reality-based design method for stage lighting elements in the foregoing embodiments.

In summary, the virtual-reality-based stage lighting element design method and system of the present invention let the user perceive and adjust the three-dimensional rendered scene while immersed in the virtual stage scene. As shown in Fig. 5, stage lighting elements are selected through gestures, a UI interface pops up after an element is selected, the UI interface captures operation commands to adjust the posture of the stage lighting elements in the virtual stage scene, the adjustment process is displayed on a display in real time, and the designed stage lighting rendering effect is output. This avoids the stage designer's repeated cycles of conception and debugging, reduces the labor intensity and working hours of scene-arrangement personnel, and greatly improves stage-layout efficiency and enterprise benefit. Compared with the traditional stage design approach, the stage lighting rendering effect is more intuitive; the invention thus effectively overcomes various shortcomings of the prior art and has high value for industrial use.

The foregoing embodiments merely illustrate the principles and utility of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.
