Game animation processing method and device and electronic device

Document No.: 427738 · Publication date: 2021-12-24

Reading note: this technology, Game animation processing method and device and electronic device, was designed and created by Zhu Jiahao on 2021-08-27. Abstract: The invention discloses a game animation processing method and apparatus, and an electronic device. The method comprises: in response to a received first control parameter, converting the first control parameter into a second control parameter, wherein the first control parameter is used to control a motion state of a virtual character in a game scene, the second control parameter is used to control playing of a target game animation, and the target game animation at least comprises a motion inertia superposition animation of the virtual character; and determining a playing mode of the target game animation by using the second control parameter. The invention solves the technical problem in the related art that the unnatural inertia expression of a virtual character in an FPS game impairs the realism of the overall inertia expression of the virtual character and the virtual firearm model.

1. A method for processing a game animation, characterized by comprising the following steps:

in response to a received first control parameter, converting the first control parameter into a second control parameter, wherein the first control parameter is used to control a motion state of a virtual character in a game scene, the second control parameter is used to control playing of a target game animation, and the target game animation at least comprises: a motion inertia superposition animation of the virtual character;

and determining a playing mode of the target game animation by using the second control parameter.

2. The method of claim 1, wherein converting the first control parameter into the second control parameter comprises:

converting the first control parameter into a motion trend value, wherein the motion trend value is used to represent a motion direction trend of the virtual character;

performing data smoothing on the motion trend value to obtain an inertia force parameter;

and performing linear mapping on the inertia force parameter to obtain the second control parameter.

3. The method of claim 2, wherein performing data smoothing on the motion trend value to obtain the second control parameter comprises:

performing multi-order half-life smoothing on the motion trend value to obtain the second control parameter.

4. The method of claim 1, wherein the target game animation further comprises: a motion inertia superposition animation of a virtual weapon model corresponding to the virtual character; and the method further comprises the following steps:

when it is detected that the virtual character stops moving in the game scene, determining a first rebound duration corresponding to the virtual weapon model;

and acquiring a third control parameter corresponding to the first rebound duration based on a first preset mapping relationship, wherein the first preset mapping relationship is a mapping between the rebound duration and a rebound trajectory of the virtual weapon model, and the third control parameter is used to control an inertial rebound trajectory of the virtual weapon model.

5. The method of claim 4, further comprising:

converting the second control parameter and the third control parameter into a fourth control parameter;

and determining a playing mode of the target game animation by using the fourth control parameter.

6. The method of claim 5, wherein converting the second control parameter and the third control parameter into the fourth control parameter comprises:

multiplying the second control parameter by the third control parameter to obtain a calculation result;

and performing linear mapping on the calculation result to obtain the fourth control parameter.

7. The method of claim 1, further comprising:

when it is detected that the virtual character stops moving in the game scene, determining a second rebound duration corresponding to a virtual target part of the virtual character;

and acquiring a fifth control parameter corresponding to the second rebound duration based on a second preset mapping relationship, wherein the second preset mapping relationship is a mapping between the rebound duration of the virtual target part and a rebound trajectory, and the fifth control parameter is used to control an inertial rebound trajectory of the virtual target part.

8. The method of claim 7, further comprising:

converting the second control parameter and the fifth control parameter into a sixth control parameter;

and determining a playing mode of the target game animation by using the sixth control parameter.

9. The method of claim 1, wherein when the virtual character is in a walking state, the target game animation comprises: a first frame image, an intermediate frame image, and a last frame image, wherein the first frame image describes a first walking posture of the virtual character, the last frame image describes a second walking posture of the virtual character, and the intermediate frame image describes a transition posture between the first walking posture and the second walking posture.

10. The method of claim 1, wherein when the virtual character is in a rotating state, the target game animation comprises: deflection images of the virtual character at a plurality of angles.

11. The method of claim 10, wherein when the target game animation comprises a motion inertia superposition animation of the virtual character and a motion inertia superposition animation of the virtual weapon model, the deflection images at each angle respectively comprise: a deflection image with the virtual weapon model in an aiming state and a deflection image with the virtual weapon model in a non-aiming state.

12. A game animation processing apparatus, comprising:

a conversion module, configured to convert, in response to a received first control parameter, the first control parameter into a second control parameter, wherein the first control parameter is used to control a motion state of a virtual character in a game scene, the second control parameter is used to control playing of a target game animation, and the target game animation at least comprises: a motion inertia superposition animation of the virtual character;

and a processing module, configured to determine a playing mode of the target game animation by using the second control parameter.

13. A computer-readable storage medium storing a computer program, wherein the computer program, when executed, is configured to perform the method for processing a game animation according to any one of claims 1 to 11.

14. An electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method of processing a game animation according to any one of claims 1 to 11.

Technical Field

The invention relates to the field of computers, and in particular to a game animation processing method and device and an electronic device.

Background

The handling feel and expression of a virtual weapon model (such as a virtual firearm model) in a first-person shooter (FPS) game are key to the game experience. Given the control inputs of different players to a virtual character, a virtual character whose actions exhibit a certain hysteresis can be regarded as expressing motion inertia, and a good motion inertia expression can greatly improve the realism, handling feel, and in-game expressiveness of the virtual firearm model.

In the FPS games provided in the related art, the motion of the firearm is generally expressed by superimposing animation poses. First, the movement direction of the virtual character and the motion posture in the corresponding direction are determined from the player's control input; then a basic motion animation is superimposed, and animation fusion transitions are carried out through interpolation to express motion inertia in different directions.

However, this solution has drawbacks. Because the superimposed animation poses are fused and transitioned through interpolation, the motion inertia is expressed unnaturally and the animation becomes stiff during transitions. In addition, the solution can only show the current motion direction of the virtual character through posture; it does not further show the rebound caused by the inertia of a specific part of the virtual character (for example, a hand) at the end of the motion, which impairs the realism of the overall inertia expression of the virtual character and the virtual firearm model.

In view of the above problems, no effective solution has been proposed.

Disclosure of Invention

At least some embodiments of the invention provide a game animation processing method and apparatus and an electronic device, so as to at least solve the technical problem that, in FPS games provided in the related art, the unnatural inertia expression of the virtual character impairs the realism of the overall inertia expression of the virtual character and the virtual firearm model.

According to an embodiment of the present invention, there is provided a method for processing game animation, including:

in response to a received first control parameter, converting the first control parameter into a second control parameter, wherein the first control parameter is used to control a motion state of the virtual character in the game scene, the second control parameter is used to control playing of a target game animation, and the target game animation at least comprises: a motion inertia superposition animation of the virtual character; and determining a playing mode of the target game animation by using the second control parameter.

Optionally, converting the first control parameter into the second control parameter comprises: converting the first control parameter into a motion trend value, wherein the motion trend value is used to express the motion direction trend of the virtual character; performing data smoothing on the motion trend value to obtain an inertia force parameter; and performing linear mapping on the inertia force parameter to obtain the second control parameter.

Optionally, performing data smoothing on the motion trend value to obtain the second control parameter includes: performing multi-order half-life smoothing on the motion trend value to obtain the second control parameter.
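The multi-order half-life smoothing described above can be sketched as cascaded first-order smoothers whose remaining gap to the target halves every fixed interval. This is a minimal illustration only; the function names, the number of orders, and the half-life constant are assumptions, not values from the patent.

```python
def half_life_smooth(previous, target, dt, half_life):
    """One smoothing step: move `previous` toward `target` so that the
    remaining gap halves every `half_life` seconds."""
    if half_life <= 0.0:
        return target
    retained = 0.5 ** (dt / half_life)  # fraction of the old gap kept this frame
    return target + (previous - target) * retained

def multi_order_smooth(states, target, dt, half_life):
    """Cascade several first-order smoothers: the output of each stage is
    the target of the next, giving a softer, inertia-like response.
    `states` holds one value per order and is updated in place."""
    signal = target
    for i in range(len(states)):
        states[i] = half_life_smooth(states[i], signal, dt, half_life)
        signal = states[i]
    return signal
```

Frame-rate independence falls out of the half-life formulation: halving the frame time `dt` simply doubles the number of steps needed to cover the same gap, so the curve over wall-clock time is unchanged.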

Optionally, the target game animation further includes: a motion inertia superposition animation of a virtual weapon model corresponding to the virtual character. In this case, the processing method further includes: when it is detected that the virtual character stops moving in the game scene, determining a first rebound duration corresponding to the virtual weapon model; and acquiring a third control parameter corresponding to the first rebound duration based on a first preset mapping relationship, wherein the first preset mapping relationship is a mapping between the rebound duration and the rebound trajectory of the virtual weapon model, and the third control parameter is used to control the inertial rebound trajectory of the virtual weapon model.
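As one possible illustration of the preset mapping between rebound duration and rebound trajectory, the trajectory could be a damped oscillation that starts at the full inertial offset and settles to zero by the end of the rebound. The curve shape, parameter names, and constants below are hypothetical; the method only requires that some preset duration-to-trajectory mapping exist.

```python
import math

def rebound_curve(t, duration, amplitude=1.0, oscillations=2.0):
    """Hypothetical rebound mapping: evaluate the weapon's inertial offset
    at time `t` within a rebound lasting `duration` seconds.
    A damped cosine fades quadratically to zero, so the weapon overshoots,
    swings back, and comes to rest by the end of the rebound."""
    if t >= duration:
        return 0.0
    progress = t / duration                       # 0 -> 1 over the rebound
    decay = (1.0 - progress) ** 2                 # quadratic fade-out
    return amplitude * decay * math.cos(2.0 * math.pi * oscillations * progress)
```

Sampling this curve each frame yields the third control parameter's contribution; authoring it as data (for example, an engine animation curve asset) rather than a formula would serve equally well.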

Optionally, the processing method further includes: converting the second control parameter and the third control parameter into a fourth control parameter; and determining a playing mode of the target game animation by using the fourth control parameter.

Optionally, converting the second control parameter and the third control parameter into the fourth control parameter comprises: multiplying the second control parameter by the third control parameter to obtain a calculation result; and performing linear mapping on the calculation result to obtain the fourth control parameter.
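A minimal sketch of this multiply-then-linearly-map conversion, assuming both input parameters are normalized to [0, 1] and the output is a playback-frame parameter; the ranges and names are illustrative, not from the patent.

```python
def linear_map(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    x = max(in_lo, min(in_hi, x))
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def combine_parameters(second, third):
    """Multiply the movement-derived parameter by the rebound parameter,
    then remap the product into an assumed playback-frame range.
    The [0, 1] -> [0, 359] ranges are illustrative only."""
    product = second * third
    return linear_map(product, 0.0, 1.0, 0.0, 359.0)
```

Multiplying (rather than adding) means the rebound term scales the movement term, so the combined animation can never exceed either contribution's envelope.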

Optionally, the processing method further includes: when it is detected that the virtual character stops moving in the game scene, determining a second rebound duration corresponding to a virtual target part of the virtual character; and acquiring a fifth control parameter corresponding to the second rebound duration based on a second preset mapping relationship, wherein the second preset mapping relationship is a mapping between the rebound duration of the virtual target part and its rebound trajectory, and the fifth control parameter is used to control the inertial rebound trajectory of the virtual target part.

Optionally, the processing method further includes: converting the second control parameter and the fifth control parameter into a sixth control parameter; and determining a playing mode of the target game animation by using the sixth control parameter.

Optionally, when the virtual character is in a walking state, the target game animation comprises: a first frame image, an intermediate frame image, and a last frame image, wherein the first frame image describes a first walking posture of the virtual character, the last frame image describes a second walking posture of the virtual character, and the intermediate frame image describes a transition posture from the first walking posture to the second walking posture.

Optionally, when the virtual character is in a rotating state, the target game animation includes: deflection images of the virtual character at a plurality of angles.

Optionally, when the target game animation includes a motion inertia superposition animation of the virtual character and a motion inertia superposition animation of the virtual weapon model, the deflection images at each angle respectively include: a deflection image with the virtual weapon model in an aiming state and a deflection image with the virtual weapon model in a non-aiming state.

According to an embodiment of the present invention, there is also provided a game animation processing apparatus, including:

the conversion module is configured to convert, in response to a received first control parameter, the first control parameter into a second control parameter, wherein the first control parameter is used to control a motion state of a virtual character in a game scene, the second control parameter is used to control playing of a target game animation, and the target game animation at least includes: a motion inertia superposition animation of the virtual character; and the processing module is configured to determine a playing mode of the target game animation by using the second control parameter.

Optionally, the conversion module is configured to convert the first control parameter into a motion trend value, where the motion trend value is used to represent a motion direction trend of the virtual character; perform data smoothing on the motion trend value to obtain an inertia force parameter; and perform linear mapping on the inertia force parameter to obtain the second control parameter.

Optionally, the conversion module is configured to perform multi-order half-life smoothing on the motion trend value to obtain the second control parameter.

Optionally, the target game animation further includes: a motion inertia superposition animation of a virtual weapon model corresponding to the virtual character. In this case, the processing apparatus further includes: a determining module, configured to determine a first rebound duration corresponding to the virtual weapon model when it is detected that the virtual character stops moving in the game scene; and an acquisition module, configured to acquire a third control parameter corresponding to the first rebound duration based on a first preset mapping relationship, wherein the first preset mapping relationship is a mapping between the rebound duration and the rebound trajectory of the virtual weapon model, and the third control parameter is used to control the inertial rebound trajectory of the virtual weapon model.

Optionally, the conversion module is further configured to convert the second control parameter and the third control parameter into a fourth control parameter; and the processing module is further configured to determine a playing mode of the target game animation by using the fourth control parameter.

Optionally, the conversion module is configured to multiply the second control parameter by the third control parameter to obtain a calculation result, and perform linear mapping on the calculation result to obtain the fourth control parameter.

Optionally, the determining module is configured to determine, when it is detected that the virtual character stops moving in the game scene, a second rebound duration corresponding to the virtual target part of the virtual character; and the acquisition module is configured to acquire a fifth control parameter corresponding to the second rebound duration based on a second preset mapping relationship, wherein the second preset mapping relationship is a mapping between the rebound duration of the virtual target part and its rebound trajectory, and the fifth control parameter is used to control the inertial rebound trajectory of the virtual target part.

Optionally, the conversion module is further configured to convert the second control parameter and the fifth control parameter into a sixth control parameter; and the processing module is further configured to determine a playing mode of the target game animation by using the sixth control parameter.

Optionally, when the virtual character is in a walking state, the target game animation comprises: a first frame image, an intermediate frame image, and a last frame image, wherein the first frame image describes a first walking posture of the virtual character, the last frame image describes a second walking posture of the virtual character, and the intermediate frame image describes a transition posture from the first walking posture to the second walking posture.

Optionally, when the virtual character is in a rotating state, the target game animation includes: deflection images of the virtual character at a plurality of angles.

Optionally, when the target game animation includes a motion inertia superposition animation of the virtual character and a motion inertia superposition animation of the virtual weapon model, the deflection images at each angle respectively include: a deflection image with the virtual weapon model in an aiming state and a deflection image with the virtual weapon model in a non-aiming state.

According to an embodiment of the present invention, there is further provided a computer-readable storage medium in which a computer program is stored, wherein the computer program is configured to execute the processing method of the game animation in any one of the above items when running.

There is further provided, according to an embodiment of the present invention, a processor for executing a program, where the program is configured to execute, when running, the processing method of the game animation in any one of the above.

There is further provided, according to an embodiment of the present invention, an electronic apparatus including a memory and a processor, the memory storing therein a computer program, the processor being configured to execute the computer program to perform the method for processing a game animation in any one of the above.

In at least some embodiments of the invention, in response to a received first control parameter, the first control parameter is converted into a second control parameter, where the first control parameter is used to control the virtual character to move in a game scene, the second control parameter is used to control the playing of a target game animation, and the target game animation at least includes a motion inertia superposition animation of the virtual character; the playing mode of the target game animation is then determined by the second control parameter. This achieves the aim of expressing the motion inertia of the virtual character through a complete, continuous superposition animation, improves the overall motion inertia expression of the virtual character and the virtual weapon model while the character is moving, and enhances the smoothness of the motion expression. It thereby solves the problem in the related art that the unnatural inertia expression of the virtual character in an FPS game impairs the realism of the overall inertia expression of the virtual character and the virtual firearm model.

Drawings

The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:

fig. 1 is a block diagram of a hardware configuration of a mobile terminal of a method of processing a game animation according to an embodiment of the present invention;

FIG. 2 is a flow diagram of a method of processing a game animation according to one embodiment of the invention;

FIG. 3 is a schematic illustration of a mapping between rebound trajectory and rebound duration in accordance with an alternative embodiment of the present invention;

FIG. 4 is a flow chart of a firearm movement inertia performance optimization process according to an alternative embodiment of the invention;

FIG. 5 is a block diagram of a game animation processing apparatus according to an embodiment of the present invention;

fig. 6 is a block diagram of a game animation processing apparatus according to an alternative embodiment of the present invention.

Detailed Description

In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.

In accordance with one embodiment of the present invention, there is provided an embodiment of a method for processing game animation, wherein the steps shown in the flowchart of the figure may be performed in a computer system such as a set of computer-executable instructions, and wherein although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different than here.

The method embodiments may be performed in a mobile terminal, a computer terminal, or a similar computing device. Taking running on a mobile terminal as an example, the mobile terminal may be a terminal device such as a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or a game console. Fig. 1 is a block diagram of a hardware configuration of a mobile terminal for a method of processing a game animation according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more processors 102 (only one is shown in fig. 1; the processors 102 may include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processor (NPU), a tensor processor (TPU), an artificial intelligence (AI) processor, etc.) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106, an input/output device 108, and a display device 110 for communication functions. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.

The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the processing method of the game animation in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, that is, implements the processing method of the game animation. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.

The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.

The inputs to the input/output device 108 may come from a plurality of human interface devices (HIDs), for example: a keyboard and mouse, a gamepad, or other special game controllers (such as a steering wheel, fishing rod, dance mat, or remote controller). Some human interface devices may provide output functions in addition to input, such as force feedback and vibration of a gamepad, or audio output of a controller.

The display device 110 may be, for example, a head-up display (HUD), a touch-screen liquid crystal display (LCD), or a touch display (also referred to as a "touch screen" or "touch display screen"). The liquid crystal display may enable a user to interact with the user interface of the mobile terminal. In some embodiments, the mobile terminal has a graphical user interface (GUI) with which the user can interact through finger contacts and/or gestures on a touch-sensitive surface. The human-machine interaction functions optionally include interactions such as creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, emailing, call interfacing, playing digital video, playing digital music, and/or web browsing; the executable instructions for performing these human-computer interaction functions are configured/stored in one or more processor-executable computer program products or readable storage media.

In this embodiment, a method for processing a game animation running on the mobile terminal is provided, and fig. 2 is a flowchart of a method for processing a game animation according to an embodiment of the present invention, as shown in fig. 2, the method includes the following steps:

step S20, in response to the received first control parameter, converting the first control parameter into a second control parameter, where the first control parameter is used to control a motion state of the virtual character in the game scene, the second control parameter is used to control the playing of the target game animation, and the target game animation at least includes: a motion inertia superposition animation of the virtual character;

the first control parameter (i.e., the control input to the virtual character by the game player) is used to control the motion state of the virtual character in the game scene, which is usually represented by the movement of the virtual character in a plurality of different directions (e.g., front, back, left, right, etc.) and the rotation of the field of view caused by the rotation of the virtual character. The second control parameter (i.e., the frame control parameter of the animation play frame) is used to control the playing of the target game animation, which is a continuous superimposed animation previously created by the animator to represent the inertia of the virtual character.

When the virtual character is in a walking state, the target game animation may include: a first frame image, an intermediate frame image, and a last frame image. The first frame image describes a first walking pose of the virtual character (e.g., a pose in which the virtual character walks to the left). The last frame image describes a second walking pose of the virtual character (e.g., a pose in which the virtual character walks to the right). The intermediate frame image describes a transition pose between the first walking pose and the second walking pose. For the intermediate frames, the animator makes a natural transition from the first frame to the last frame based on the still posture (i.e., the reference frame of the superimposed animation) to ensure the expressiveness of the animation. When the virtual character is in a rotating state, the target game animation includes: deflection images of the virtual character at a plurality of angles. That is, the animator provides a 360-frame set of virtual character deflection images representing 360 degrees of rotation.
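The frame selection described above can be sketched as mapping a smoothed direction value onto the superimposed walk clip and a yaw angle onto the 360 deflection frames. The [-1, 1] direction convention, the frame counts, and the function names are assumptions for illustration.

```python
def walk_pose_frame(direction, total_frames):
    """Map a smoothed left/right direction value in [-1, 1] to a frame of
    the superimposed walk clip: frame 0 is the left pose, the middle frame
    the neutral transition, and the last frame the right pose."""
    direction = max(-1.0, min(1.0, direction))
    t = (direction + 1.0) / 2.0            # normalize to [0, 1]
    return round(t * (total_frames - 1))

def rotation_frame(yaw_degrees):
    """Map a character yaw angle to one of 360 deflection frames,
    wrapping angles outside [0, 360)."""
    return int(yaw_degrees % 360)
```

Because the frame index is driven by a smoothed value rather than the raw input, the pose lags the input slightly, which is exactly the hysteresis that reads as inertia.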

Further, the target game animation may also include: a motion inertia superposition animation of the virtual weapon model corresponding to the virtual character. When the target game animation includes both a motion inertia superposition animation of the virtual character and a motion inertia superposition animation of the virtual weapon model, the deflection images at each angle respectively include: a deflection image with the virtual weapon model in the aiming state and a deflection image with the virtual weapon model in the non-aiming state. That is, the animator also needs to provide 360-degree rotation images of the virtual character with the virtual firearm model in the non-aiming state and with the virtual firearm model in the open-sight aiming state, respectively.

In order to optimize the transition of the superposition animation, a complete superposition animation made in advance by the animator is adopted, the game player's control input to the virtual character undergoes multi-stage mapping conversion, and the frame control parameters of the corresponding animation playing frames are output, so as to control each frame image played and play a continuous inertial motion superposition animation. The multi-stage mapping conversion can be understood as first converting the game player's control input to the virtual character into a motion trend value, and then performing half-life smoothing on the motion trend value. That is, the control input is first mapped to the animation playing speed and then further mapped to an acceleration value.

In the process of converting the first control parameter into the second control parameter, firstly, the first control parameter needs to be converted into a motion trend numerical value, and the motion trend numerical value is used for representing the motion direction trend of the virtual character; secondly, carrying out data smoothing processing on the motion trend numerical value to obtain an inertia force parameter; and then, carrying out linear mapping processing on the inertia force parameter to obtain a second control parameter.

In an alternative embodiment, after the animator makes the continuous superposition animation for the motion inertia representation, the game player's control input to the virtual character can first be converted into a motion trend value to determine the motion direction trend of the virtual character; secondly, multi-order half-life smoothing is performed on the motion trend value to convert it into an inertia force parameter; and then linear mapping is performed on the inertia force parameter to obtain the playing frame control parameter of the motion inertia superposition animation, so as to control the playing progress of the whole superposition animation.
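An illustrative sketch of the three-stage conversion above (the direction names, the half-life constant, and the [-1, 1] to [0, 1] mapping are assumptions for the example, not values fixed by this embodiment):

```python
def motion_trend(direction: str) -> float:
    # Stage 1: map the player's control input to a motion trend value.
    return {"left": -1.0, "right": 1.0, "idle": 0.0}[direction]

def half_life_step(current: float, target: float, half_life: float, dt: float) -> float:
    # Stage 2: half-life smoothing - the remaining gap to the target halves
    # every `half_life` seconds, yielding the inertia force parameter.
    return target + (current - target) * 0.5 ** (dt / half_life)

def frame_param(move_dir: float) -> float:
    # Stage 3: linearly map the inertia force parameter from [-1, 1]
    # onto the [0, 1] playing-frame percentage of the superposition animation.
    return (max(-1.0, min(1.0, move_dir)) + 1.0) / 2.0
```

For example, with a half-life of 0.1 s, one 0.1 s step from rest toward a leftward trend moves the parameter only halfway, so the playback progress changes continuously instead of jumping between frames.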

And step S21, determining the playing mode of the target game animation by adopting the second control parameter.

The playing of the motion inertia superposition animation can be controlled by the playing frame control parameter of the motion inertia superposition animation. Representing motion inertia through a complete and continuous superposition animation makes the animation more natural, thereby improving the consistency and authenticity of the animation representation. Meanwhile, since the whole motion inertia animation is a superposition animation, when the virtual character performs actions in other states (such as shooting, reloading, running, etc.), the inertia actions can still be superposed, which improves the motion inertia representation of the virtual weapon model under the different behaviors of the virtual character.

Through the above steps, in response to the received first control parameter, the first control parameter is converted into a second control parameter, where the first control parameter is used to control the motion state of the virtual character in the game scene, the second control parameter is used to control the playing of the target game animation, and the target game animation at least includes a motion inertia superposition animation of the virtual character; the playing mode of the target game animation is then determined by the second control parameter. This achieves the purpose of representing the motion inertia of the virtual character through a complete and continuous superposition animation, thereby improving the overall motion inertia performance of the virtual character and the virtual weapon model in the motion state and enhancing the fluency of the motion representation, and thus solving the technical problem that the inertia representation of the virtual character in FPS games provided by the related art is unnatural, which affects the authenticity of the overall inertia representation of the virtual character and the virtual firearm model.

Optionally, the processing method of the game animation may further include the following steps:

step S22, when the virtual character is detected to stop moving in the game scene, determining a first rebounding time length corresponding to the virtual weapon model;

and step S23, acquiring a third control parameter corresponding to the first rebound duration based on a first preset mapping relation, wherein the first preset mapping relation is a mapping relation between the rebound duration and the rebound trajectory of the virtual weapon model, and the third control parameter is used for controlling the inertia rebound trajectory of the virtual weapon model.

When it is detected that the virtual character stops moving in the game scene, in addition to optimizing the superposition animation transition, the inertial rebound performance at the end of the movement can also be optimized. When the movement of the virtual character ends, a curve relationship between the rebound duration and the rebound trajectory is established as the inertial rebound trajectory parameter (i.e., the third control parameter), and the complete rebound action is controlled in combination with the inertial dynamics parameter. Different types of virtual weapon models have different weights and therefore correspond to different rebound durations. For example, a heavy virtual firearm model and a light virtual firearm model correspond to different rebound durations. In this way, the game player is provided with the inertial deflection effect caused by the weight of the virtual weapon model and with the sense of force under different action forces.

Fig. 3 is a schematic diagram of a mapping relationship between a rebound trajectory and a rebound duration according to an alternative embodiment of the present invention. As shown in fig. 3, by establishing a curve mapping relationship between the rebound trajectory and the rebound duration, when the movement of the virtual character ends, the inertial rebound trajectory parameter of each frame image is obtained, where the abscissa represents the change of the rebound duration and the ordinate represents the change of the rebound trajectory value. A spring-back correction effect is thus generated in combination with the damping coefficient calculated when the movement stops, which can provide game players with the inertial deflection effect brought by the weight of the virtual weapon model; meanwhile, combined with different inertial dynamics, inertial rebound representations of different degrees can be realized, thereby improving the authenticity of the action representation.
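The curve of fig. 3 is not reproduced here; a damped oscillation is one plausible shape for it. The following sketch assumes an exponentially damped sine and a linear weight-to-duration rule, both modeling assumptions for illustration rather than values from this embodiment:

```python
import math

def rebound_offset(t: float, damping: float = 6.0, freq_hz: float = 2.0) -> float:
    # Rebound trajectory value at time t after the character stops:
    # an oscillation whose amplitude decays with the damping coefficient.
    return math.exp(-damping * t) * math.sin(2.0 * math.pi * freq_hz * t)

def rebound_duration(weapon_mass_kg: float, base_s: float = 0.25) -> float:
    # Heavier virtual weapon models are given a longer rebound duration.
    return base_s * (1.0 + 0.1 * weapon_mass_kg)
```

Sampling `rebound_offset` once per frame over `rebound_duration` seconds yields the per-frame inertial rebound trajectory parameter the text describes.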

Optionally, the processing method of the game animation may further include the following steps:

step S24, converting the second control parameter and the third control parameter into a fourth control parameter;

and step S25, determining the playing mode of the target game animation by adopting the fourth control parameter.

In the process of converting the second control parameter and the third control parameter into the fourth control parameter, the second control parameter and the third control parameter may be multiplied to obtain a calculation result; linear mapping is then performed on the calculation result to obtain the fourth control parameter, and the fourth control parameter is used to determine the playing mode of the target game animation. That is, the inertial rebound trajectory parameter and the inertia force parameter can be multiplied, and the product mapped to a percentage of the animation playing progress, to obtain the playing frame control parameter of the motion inertia superposition animation, so as to control the playing progress of the whole superposition animation and achieve the effect of motion inertia rebound. Meanwhile, different rebound force representations can be realized under different inertia forces.
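A minimal sketch of this combination step (the clamping and the [-1, 1] to [0, 1] mapping are assumptions chosen to be consistent with the 0-1 frame-percentage range described elsewhere in the text):

```python
def play_frame_param(move_coef: float, move_dir: float) -> float:
    # Multiply the inertial rebound trajectory parameter by the inertia force
    # parameter, then linearly map the product from [-1, 1] onto the
    # [0, 1] playing-progress percentage of the superposition animation.
    product = max(-1.0, min(1.0, move_coef * move_dir))
    return (product + 1.0) / 2.0
```

A product of 0 lands on the middle frame (the static reference pose), while products of -1 and 1 select the two extreme deflection frames.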

Optionally, the processing method of the game animation may further include the following steps:

step S26, when the virtual character is detected to stop moving in the game scene, determining a second rebound duration corresponding to the virtual target part of the virtual character;

and step S27, acquiring a fifth control parameter corresponding to the second springback time length based on a second preset mapping relation, wherein the second preset mapping relation is the mapping relation between the springback time length of the virtual target portion and the springback trajectory, and the fifth control parameter is used for controlling the inertial springback trajectory of the virtual target portion.

The virtual target portion may be a hand of the virtual character. When the virtual character does not hold the virtual weapon model and stops moving in the game scene, the second rebound duration corresponding to the virtual target portion of the virtual character needs to be determined. Then, the fifth control parameter corresponding to the second rebound duration is acquired based on the second preset mapping relationship. The second preset mapping relationship is a mapping relationship between the rebound duration of the virtual target portion and the rebound trajectory. The fifth control parameter is used to control the inertial rebound trajectory of the virtual target portion. Since different types of virtual characters differ in stature and weight, they correspond to different rebound durations. For example, a tall, heavy virtual character and a thin, light virtual character correspond to different rebound durations. In this way, game players are provided with the inertial deflection effects brought by different types of virtual characters and the sense of force under different action forces.

In addition, the above-described mapping relationship between the rebound trajectory and the rebound duration shown in fig. 3 is also applicable to the second preset mapping relationship. And will not be described in detail herein.

Optionally, the processing method of the game animation may further include the following steps:

step S28, converting the second control parameter and the fifth control parameter into a sixth control parameter;

and step S29, determining the playing mode of the target game animation by adopting the sixth control parameter.

In the process of converting the second control parameter and the fifth control parameter into the sixth control parameter, the second control parameter and the fifth control parameter may be multiplied to obtain a calculation result; linear mapping is then performed on the calculation result to obtain the sixth control parameter, and the sixth control parameter is used to determine the playing mode of the target game animation. That is, the inertial rebound trajectory parameter and the inertia force parameter can be multiplied, and the product mapped to a percentage of the animation playing progress, to obtain the playing frame control parameter of the motion inertia superposition animation, so as to control the playing progress of the whole superposition animation and achieve the effect of motion inertia rebound. Meanwhile, different rebound force representations can be realized under different inertia forces.

The above implementation will be described in further detail below with reference to an alternative embodiment shown in fig. 4.

Fig. 4 is a flowchart of a firearm movement inertia representation optimization process according to an alternative embodiment of the invention, and as shown in fig. 4, the optimization process may include the following processing steps, taking a left-right walking firearm movement inertia representation as an example:

step S402, the animation producing personnel produces continuous superposition animation for expressing the movement inertia of the virtual weapon model, namely the virtual character with the preset frame number and the complete animation of the virtual weapon model from left deviation to right deviation. The first frame of image is used for representing the posture of the virtual character moving leftwards, the last frame of image is used for representing the posture of the virtual character moving rightwards, and the middle frame is a static posture (namely a reference frame for overlaying the animation), so that not only is the animation maker conveniently make continuous overlaying animation, but also the parameter control of animation playing in a process sequence is facilitated. A natural transition will be made by the animator from the first frame image to the last frame image to ensure the expressiveness of the animation.

In step S404, the game player's control input to the virtual character is obtained, which is generally expressed as the movement of the virtual character in a plurality of different directions (such as forward, backward, left, right) and the rotation change of the field of view caused by the rotation of the virtual character, so as to map the control input to the motion trend value of the virtual character. For example: controlling the virtual character to move leftwards corresponds to a motion trend value of -1, controlling the virtual character to move rightwards corresponds to a motion trend value of 1, and a still virtual character corresponds to a motion trend value of 0. In addition, view rotation is mapped to angles, for example: controlling the virtual character to rotate left and right is mapped to the horizontal rotation angle of the virtual character (0 to 360 degrees), and controlling the virtual character to rotate up and down is mapped to the vertical rotation angle (-90 to 90 degrees).
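The mappings above can be sketched as follows (function names and the wrap/clamp conventions are assumptions for illustration):

```python
def trend_value(direction: str) -> float:
    # Leftward movement -> -1, rightward movement -> 1, standing still -> 0.
    return {"left": -1.0, "right": 1.0, "still": 0.0}[direction]

def yaw_degrees(turns: float) -> float:
    # Horizontal rotation wraps into the 0-360 degree range.
    return (turns * 360.0) % 360.0

def pitch_degrees(fraction: float) -> float:
    # Vertical rotation is clamped to the -90 to 90 degree range.
    return max(-1.0, min(1.0, fraction)) * 90.0
```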

Step S406, determining the motion state of the virtual character, and when the virtual character does not stop moving, continuing to execute step S408, and when the virtual character stops moving, continuing to execute step S412.

Step S408, performing smoothing with hysteresis on the motion trend value to obtain the inertia force parameter move_dir. The data smoothing mentioned here may be implemented using different interpolation calculations, for example half-life smoothing or linear interpolation; in an alternative embodiment, second-order half-life smoothing may be employed.
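One possible form of second-order half-life smoothing is two cascaded first-order half-life smoothers, so that a step in the input first changes the output's acceleration rather than its value; the following is a sketch under that assumption (class name and constants are illustrative):

```python
class HalfLifeSmoother2:
    # Two cascaded half-life (exponential) smoothers: the motion trend value
    # feeds the first stage, whose output feeds the second. A sudden change
    # in the trend therefore produces no jump in the output move_dir.
    def __init__(self, half_life: float):
        self.half_life = half_life
        self.stage1 = 0.0
        self.value = 0.0

    def update(self, target: float, dt: float) -> float:
        k = 0.5 ** (dt / self.half_life)  # fraction of the gap remaining per step
        self.stage1 = target + (self.stage1 - target) * k
        self.value = self.stage1 + (self.value - self.stage1) * k
        return self.value
```

With a 0.1 s half-life and 0.1 s steps, a unit step in the trend moves the output to 0.25 after one update and 0.5 after two, illustrating the gradual ramp-up.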

Step S410, the inertia force parameter move_dir is converted through linear mapping to obtain the frame control parameter smooth_move_frame of the animation playing frame, whose interval is 0-1, i.e., the percentage of the animation playing progress.

Step S412, when the virtual character stops moving, a mapping relationship between the rebound trajectory and the rebound duration is established. The virtual firearm model starts a spring-back correction of fixed duration, and the movement trend of the firearm body during the correction is controlled by this mapping relationship, so as to increase the inertia effect when the correction action ends.

In step S414, after the virtual firearm model starts the spring-back correction, the inertial rebound trajectory parameter move_coef of each frame image is obtained according to the rebound duration; meanwhile, numerical smoothing is performed to solve the problem of animation smoothness in motion states such as sudden starts and sudden stops.

In step S416, while the inertial rebound trajectory parameter move_coef changes, the inertia force parameter move_dir continuously transitions toward the static value 0.

Step S418, combining the inertial rebound trajectory parameter move_coef and the inertia force parameter move_dir, the animation frame control parameter smooth_move_frame (whose value range is 0-1) of each frame image can be obtained; this parameter is affected simultaneously by the inertial rebound trajectory parameter and the inertia force parameter.

In an optional embodiment, the computation smooth_move_frame = move_coef * move_dir is adopted to embody the motion effect of the motion inertia force and the spring-back correction.

In step S420, the playing of the motion inertia superposition animation is controlled by the animation frame control parameter smooth_move_frame of each frame image.

In addition, the action requirements for the firearm-body inertia of the virtual firearm model under rotation can be more complex. Compared with the inertia representation of walking left and right, the main differences are:

(1) The deflection of the virtual character under 360 degrees of rotation is represented by 360 frames of deflection images provided by the animator; moreover, 360-degree rotation images need to be provided separately for the virtual character with the virtual firearm model in the non-aiming state and with the virtual firearm model in the open-sight aiming state.

(2) Before the playing of the superposition animation is controlled, a selected frame image is fused with the static posture of the virtual character in a specific proportion, so that the angle of the virtual firearm model deflects, and a motion inertia effect is generated through the same animation playing control logic.

In this way, the firearm-body inertia of the virtual firearm model under rotation can be realized. At the same time, the two-dimensional rotation animation of the virtual firearm model can be split into two one-dimensional rotation inertia animations, corresponding to the horizontal and vertical directions respectively; an effect equivalent to the two-dimensional rotation animation can be achieved by animation fusion between the two one-dimensional rotation inertia animations, and the approach can be selected flexibly according to the actual application scenario.
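A rough sketch of the fusion described above, representing a pose as a (yaw offset, pitch offset) tuple; the proportional blend and the additive combination of the two one-dimensional animations are assumptions for illustration, not the embodiment's exact blending rule:

```python
def blend_with_static(static_pose, frame_pose, proportion: float):
    # Fuse a selected deflection frame with the static posture in a given
    # proportion; proportion 0 keeps the static posture unchanged.
    return tuple(s + (f - s) * proportion for s, f in zip(static_pose, frame_pose))

def fuse_axes(horizontal_offset, vertical_offset):
    # Combine the horizontal and vertical one-dimensional inertia offsets
    # into one pose, approximating the two-dimensional rotation animation.
    return tuple(h + v for h, v in zip(horizontal_offset, vertical_offset))
```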

Combining the above optional embodiments, the game player's control input to the virtual character is mapped into inertia force parameters and given multi-stage smoothing, and the inertia animation of the corresponding frame is superposed, so that when the virtual character moves or rotates, the virtual weapon model and the arm deflect as a whole. This improves the fluency of the action representation and enhances the motion inertia representation of the virtual weapon model. Meanwhile, different degrees of deflection can be realized under different inertia forces, improving the authenticity of the action representation. Through multi-stage smoothing, the game player's control input can be numerically mapped to the playing speed and acceleration of the inertia superposition animation: control input in different directions (for example, controlling the virtual character to walk leftwards or rightwards) is not immediately converted into a change of the playing speed of the inertia animation, but into a change of the acceleration, so as to ensure the continuity of the inertia animation. This solves the problem of discontinuous motion of the virtual character under operations such as sudden starts and sudden stops, and at the same time brings a realistic sense of dynamics.

Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.

In this embodiment, a processing device for game animation is further provided. The device is used to implement the foregoing embodiments and preferred embodiments, and what has already been described is not repeated here. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.

Fig. 5 is a block diagram of a game animation processing apparatus according to an embodiment of the present invention. As shown in fig. 5, the apparatus includes: a conversion module 10, configured to convert a received first control parameter into a second control parameter, where the first control parameter is used to control the motion state of a virtual character in a game scene, and the second control parameter is used to control the playing of a target game animation, the target game animation at least including: a motion inertia superposition animation of the virtual character; and a processing module 20, configured to determine the playing mode of the target game animation by using the second control parameter.

Optionally, the conversion module 10 is configured to convert the first control parameter into a motion trend value, where the motion trend value is used to represent a motion direction trend of the virtual character; carrying out data smoothing processing on the motion trend numerical value to obtain an inertia force parameter; and carrying out linear mapping processing on the inertia force parameter to obtain a second control parameter.

Optionally, the conversion module 10 is configured to perform multi-step half-life smoothing on the motion trend value to obtain a second control parameter.

Optionally, the target game animation further includes: a motion inertia superposition animation of the virtual weapon model corresponding to the virtual character. Fig. 6 is a block diagram of a game animation processing apparatus according to an alternative embodiment of the present invention. As shown in fig. 6, the apparatus includes, in addition to all modules shown in fig. 5: a determining module 30, configured to determine a first rebound duration corresponding to the virtual weapon model when it is detected that the virtual character stops moving in the game scene; and an obtaining module 40, configured to obtain a third control parameter corresponding to the first rebound duration based on a first preset mapping relationship, where the first preset mapping relationship is a mapping relationship between the rebound duration and the rebound trajectory of the virtual weapon model, and the third control parameter is used to control the inertial rebound trajectory of the virtual weapon model.

Optionally, the conversion module 10 is further configured to convert the second control parameter and the third control parameter into a fourth control parameter; the processing module 20 is further configured to determine a playing mode of the target game animation by using the fourth control parameter.

Optionally, the conversion module 10 is configured to perform multiplication calculation on the second control parameter and the third control parameter to obtain a calculation result; and performing linear mapping processing on the calculation result to obtain a fourth control parameter.

Optionally, the determining module 30 is configured to determine, when it is detected that the virtual character stops moving in the game scene, a second springback time length corresponding to the virtual target portion of the virtual character; the obtaining module 40 is configured to obtain a fifth control parameter corresponding to the second springback time length based on a second preset mapping relationship, where the second preset mapping relationship is a mapping relationship between the springback time length of the virtual target portion and the springback trajectory, and the fifth control parameter is used to control the inertial springback trajectory of the virtual target portion.

Optionally, the conversion module 10 is further configured to convert the second control parameter and the fifth control parameter into a sixth control parameter; the processing module 20 is further configured to determine a playing mode of the target game animation by using the sixth control parameter.

Optionally, when the virtual character is in a walking state, the target game animation comprises: the virtual character moving method comprises a first frame image, an intermediate frame image and a last frame image, wherein the first frame image is used for describing a first walking posture of the virtual character, the last frame image is used for describing a second walking posture of the virtual character, and the intermediate frame image is used for describing a transition posture changing from the first walking posture to the second walking posture.

Optionally, when the virtual character is in a rotated state, the target game animation includes: a deflection image of the virtual character at a plurality of angles.

Alternatively, when the target game animation includes a motion inertia superposition animation of the virtual character and a motion inertia superposition animation of the virtual weapon model, the deflection images at each angle respectively include: the deflection image of the virtual weapon model in the aiming state and the deflection image of the virtual weapon model in the non-aiming state.

It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are respectively located in different processors in any combination.

Embodiments of the present invention also provide a non-volatile storage medium having a computer program stored therein, wherein the computer program is configured to perform the steps of any of the above method embodiments when executed.

Alternatively, in the present embodiment, the above-mentioned nonvolatile storage medium may be configured to store a computer program for executing the steps of:

S1, in response to the received first control parameter, converting the first control parameter into a second control parameter, where the first control parameter is used to control the motion state of the virtual character in the game scene, the second control parameter is used to control the playing of the target game animation, and the target game animation at least includes: a motion inertia superposition animation of the virtual weapon model of the virtual character;

and S2, determining the playing mode of the target game animation by adopting the second control parameter.

Optionally, in this embodiment, the nonvolatile storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.

Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.

Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.

Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:

S1, in response to the received first control parameter, converting the first control parameter into a second control parameter, where the first control parameter is used to control the motion state of the virtual character in the game scene, the second control parameter is used to control the playing of the target game animation, and the target game animation at least includes: a motion inertia superposition animation of the virtual character;

and S2, determining the playing mode of the target game animation by adopting the second control parameter.

Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.

The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.

In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.

In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.

The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.

The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.
