Shooting method, shooting device, electronic equipment and storage medium

Document No.: 576951  Publication date: 2021-05-25

Reading note: this technology, "Shooting method, shooting device, electronic equipment and storage medium", was designed and created by 胡婷婷, 赵男, 包炎, 刘超, 施一东, 李鑫培, 师锐 and 董一夫 on 2020-12-31. Its main content is as follows:

The embodiment of the invention discloses a shooting method, a shooting device, an electronic device and a storage medium. The method includes: acquiring current animation data corresponding to a current user at a current moment; determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data; and shooting at least one target object in the current animation data based on the target shooting parameters. According to the technical scheme of the embodiment of the invention, the target shooting parameters for shooting the current animation data are determined from the historical shooting parameters of the historical shooting animation data, and a target image including the target object is shot based on those parameters. The current animation data can thus be shot efficiently, pictures the user may be interested in are captured automatically, highlight moments of the user are recorded in time, the personalized needs of the user are met, and the user experience is improved.

1. A photographing method, characterized by comprising:

acquiring current animation data corresponding to a current user at a current moment;

determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;

and shooting at least one target object in the current animation data based on the target shooting parameters.

2. The method of claim 1, wherein determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data comprises:

training an original machine learning model based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model;

and determining target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.

3. The method of claim 1, wherein determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data comprises:

determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data;

and determining target shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data.

4. The method of claim 3, wherein the determining at least one set of target historical captured animation data corresponding to the current animation data from the historical captured animation data comprises:

and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the scene information of the historical shooting animation data and the current animation data.

5. The method of claim 3, wherein the determining at least one set of target historical captured animation data corresponding to the current animation data from the historical captured animation data comprises:

acquiring historical user operation information corresponding to the at least one group of target historical shooting animation data;

and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the historical user operation information.

6. The method of claim 3, wherein determining target shooting parameters based on historical shooting parameters corresponding to the at least one set of target historical shooting animation data comprises:

and if the historical shooting parameters corresponding to the target historical shooting animation data are two or more groups, determining the target shooting parameters based on the shooting time information corresponding to the historical shooting parameters.

7. The method of claim 3, wherein determining target shooting parameters based on historical shooting parameters corresponding to the at least one set of target historical shooting animation data comprises:

and respectively determining the historical shooting parameters corresponding to the at least one group of target historical shooting animation data as target shooting parameters.

8. A shooting apparatus, comprising:

the animation data acquisition module is used for acquiring current animation data corresponding to a current user at the current moment;

the shooting parameter determining module is used for determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;

and the shooting module is used for shooting at least one target object in the current animation data based on the target shooting parameters.

9. An electronic device, characterized in that the electronic device comprises:

one or more processors;

a storage device for storing one or more programs,

wherein, when the one or more programs are executed by the one or more processors, the one or more programs cause the one or more processors to implement the shooting method according to any one of claims 1-7.

10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the shooting method according to any one of claims 1-7.

Technical Field

The embodiment of the invention relates to the technical field of game development, in particular to a shooting method, a shooting device, electronic equipment and a storage medium.

Background

In order to record animation such as storyline, player interaction and the like in a network game, the conventional game is provided with a function of recording animation or taking pictures. When a game player wants to record a certain animation or a certain frame of picture, the shooting button can be triggered to achieve the purpose of recording.

At present, when a game player wants to capture a game picture during play, the player usually takes a screenshot manually. With such manual screenshots, the player's screenshot operation may come too late, or the player may simply forget to take one, so various key frames cannot be obtained in time, for example an interaction frame with a Non-Player Character (NPC), a battle frame against a BOSS, or a special-effect frame when a skill is released during battle, and it is difficult for the player to capture pictures that are fleeting during the game. In addition, network delay or device lag may make the timing of a manual screenshot inaccurate, so that a suitable game picture is missed.

Disclosure of Invention

The embodiment of the invention provides a shooting method, a shooting device, electronic equipment and a storage medium, and aims to realize automatic shooting of current animation data.

In a first aspect, an embodiment of the present invention provides a shooting method, where the method includes:

acquiring current animation data corresponding to a current user at a current moment;

determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;

and shooting at least one target object in the current animation data based on the target shooting parameters.

In a second aspect, an embodiment of the present invention further provides a shooting apparatus, where the shooting apparatus includes:

the animation data acquisition module is used for acquiring current animation data corresponding to a current user at the current moment;

the shooting parameter determining module is used for determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;

and the shooting module is used for shooting at least one target object in the current animation data based on the target shooting parameters.

In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:

one or more processors;

a storage device for storing one or more programs,

when the one or more programs are executed by the one or more processors, the one or more processors implement the photographing method according to any of the embodiments of the present invention.

In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the shooting method according to any one of the embodiments of the present invention.

According to the technical scheme of the embodiment of the invention, the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data are used to determine the target shooting parameters for shooting the current animation data at the current moment, and a target image including the target object is then shot based on the target shooting parameters. This solves the technical problem in the prior art that the corresponding picture cannot be shot automatically, which leads to a poor user experience, and achieves the technical effect of shooting the corresponding picture automatically. When corresponding animation data is to be shot, the shooting parameters associated with the corresponding picture can be retrieved to shoot it, which improves how well the shot image matches the user, improves the shooting effect and shooting flexibility, and greatly improves the user experience.

Drawings

In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, a brief description of the drawings used in describing the embodiments is given below. It should be clear that the described figures show only some of the embodiments of the invention, not all of them, and that a person skilled in the art can derive other figures from these figures without inventive effort.

Fig. 1 is a schematic flowchart of a shooting method according to an embodiment of the present invention;

Fig. 2 is a schematic flow chart of a shooting method according to a second embodiment of the present invention;

Fig. 3 is a schematic flowchart of a shooting method according to the second embodiment of the present invention;

Fig. 4 is a schematic structural diagram of a shooting device according to a third embodiment of the present invention;

fig. 5 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.

Detailed Description

The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.

Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe operations (or steps) as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, and the like.

Before the embodiments of the present invention are described, an application scenario is described. The shooting method provided by the embodiment of the present invention is applicable to scenarios in which animation data in multimedia resources is shot automatically, and is particularly applicable to automatically shooting animation data in game animation. For ease of understanding, the embodiments of the present invention take a game scene as the example application scenario when describing the shooting method. Animation data in a game animation may include environments, characters, monsters, and the like.

Example one

Fig. 1 is a flowchart of a shooting method according to an embodiment of the present invention. This embodiment is applicable to the case where current animation data is shot automatically based on animation history shooting data. The method may be executed by a shooting apparatus, which may be configured in a terminal or a server; the terminal and the server may execute the shooting method of the embodiment of the present invention independently or in cooperation.

As shown in fig. 1, the shooting method in this embodiment may specifically include:

and S110, acquiring current animation data corresponding to the current user at the current moment.

Note that an animation is often composed of many frames of images. An animation usually has a theme and a storyline, and as the storyline progresses, the content of each frame changes accordingly. While the animation is playing, the user continuously sees new animation data, and the playing progress of the same animation may differ between users. Therefore, in the embodiment of the present invention, the current animation data corresponding to the current user at the current time may be obtained.

The current animation data may be the data corresponding to the animation picture currently being played. Taking a game animation as an example, the current animation data may be the data of a scene picture that includes one or more objects such as a character, a monster, weather, a tree, or a building. In an animation, these objects are generally realized through information groups corresponding to the objects. For example, the information group corresponding to each object in the current animation data may be rendered from a preset game-player viewing angle to obtain the current animation picture. That is, the current animation data is determined based on the information groups of the objects included in the current animation picture.

Specifically, each video frame in the game process may be acquired, a target video frame corresponding to the terminal device at the current time may be determined, scene picture data such as a character, a monster, and weather included in the target video frame may be determined, and the data determined at this time may be used as the current animation data.

And S120, determining target shooting parameters corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data.

The historical shooting animation data can be understood as the animation data that was shot by other terminals or by the current terminal when shooting was triggered before the current time. That is, the historical shooting animation data is animation data that has already been shot once, and it may or may not include the current animation data. For example, the historical shooting animation data may include shooting data of a historical user for a preset animation event, while the current animation data may be the video frame data of the current user playing the same preset animation event; alternatively, the current animation data may be video frame data other than the historical shooting animation data. The content of the historical animation data may be the same as or different from the content of the current animation data explained above; the difference arises because different users prefer different pictures, so the specific content of the historical shooting animation data varies accordingly. The historical user mentioned above is a user for whom animation history shooting data exists; the historical users may or may not include the current user, and there may be one or more of them.

The historical shooting parameters can be understood as the camera shooting parameters, camera attribute parameters and/or scene parameters used when the historical animation data was shot. The camera shooting parameters are the parameters in effect when the historical shooting animation data was shot, and may include, but are not limited to, at least one of the following: the shooting angle, the shooting position of the camera, and the wide-angle setting of the camera at the time of shooting. The camera attribute parameters are the specific parameters used during the shooting process itself, such as shutter, aperture, field of view, exposure, whether the flash is on, and the shooting light and/or shooting angle corresponding to the current user. The scene parameters may be the light-source parameters of the current scene, such as the number of light sources and their intensity. The target shooting parameters can be understood as the camera shooting parameters, camera attribute parameters and scene parameters to be used for shooting the current animation data; they are determined from the historical shooting parameters corresponding to the historical shooting animation data.

It should be noted that, for each piece of animation data, the shooting parameters may be one or more of the specific angle at which a picture is shot, the specific position at which the camera is placed, and the wide angle used for the shot.
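The parameter groups described above could be represented as a simple record type. The following sketch is illustrative only; the field names and default values are assumptions, not taken from the disclosure, but they group the three kinds of data named above (camera shooting parameters, camera attribute parameters, and scene parameters):

```python
from dataclasses import dataclass


@dataclass
class ShootingParameters:
    """One group of shooting parameters (illustrative fields, hypothetical names)."""
    # Camera shooting parameters: pose of the virtual camera at capture time.
    angle_deg: float = 0.0                  # shooting angle
    position: tuple = (0.0, 0.0, 0.0)       # camera placement in the scene
    wide_angle_deg: float = 60.0            # wide-angle (field-of-view) setting
    # Camera attribute parameters used during the capture itself.
    shutter: float = 1 / 60
    aperture: float = 2.8
    exposure: float = 0.0
    flash_on: bool = False
    # Scene parameters: lighting in the current scene.
    light_sources: int = 1
    light_intensity: float = 1.0
```

A history record would then pair one such group with the animation data captured under it, which is the association the next steps rely on.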

In this embodiment, the target shooting parameters corresponding to the current animation data may be determined from the historical shooting animation data as follows: determine, among the historical animation data, the pieces that are highly associated with the current animation data, and retrieve the historical shooting parameters associated with them; all of the historical shooting parameters obtained in this way may be used as target shooting parameters for the current animation data, so that a target image including the target shooting object is shot based on the determined parameters.

For example, a correspondence between the historical shooting animation data and the historical shooting parameters may be established in advance. After the current animation data is obtained, the similarity between the current animation data and each piece of historical shooting animation data is computed; at least one piece of historical shooting animation data associated with the current animation data is determined according to the similarity; and the historical shooting parameters corresponding to that historical shooting animation data are retrieved through the correspondence and used as the target shooting parameters.
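The similarity-based retrieval above can be sketched as follows. The disclosure does not specify how similarity is computed, so cosine similarity over feature vectors and the 0.8 threshold are assumptions for illustration:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (an assumed metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def target_shooting_parameters(current_features, history, threshold=0.8):
    """history: list of (features, shooting_params) pairs, one per past capture.
    Returns the parameter group of every historical entry whose similarity to
    the current animation data exceeds the threshold; all of them become
    target shooting parameters, as described above."""
    return [params
            for features, params in history
            if cosine_similarity(current_features, features) > threshold]
```

With several matches, this returns several parameter groups, which is consistent with claim 7 (each retrieved historical parameter group is used as a target parameter group).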

And S130, shooting at least one target object in the current animation data based on the target shooting parameters.

Before at least one target object is shot, shooting of at least one target object in the current animation data may be triggered based on the animation history shooting data of the current user, as follows: determine, based on the animation history shooting data of the current user, whether the current animation data is target shooting animation data; if so, trigger the shooting of at least one target object in the current animation data.

For example, it may be determined whether the current animation data is the target photographing animation data based on a picture type of the history photographing animation data among animation history photographing data of the current user.

Specifically, determining whether the current animation data is the target shooting animation data based on the picture type of the historical shooting animation data in the animation history shooting data of the current user may include: determining the historical shooting picture type of the historical shooting animation data in the animation history shooting data of the current user; determining the current picture type of the current animation data; and if the historical shooting picture type is the same as the current picture type, determining that the current animation data is the target shooting animation data.

The picture type may be determined based on the scene type of the current animation data, for example a battle scene, a re-equipping scene, or an upgrading scene. The picture type may also be determined based on the picture style, for example a fierce style, an aesthetic style, or a fresh style. The picture type can also be determined based on the picture colors: for example, it may be determined from the number of color types in the picture against a preset color-type threshold, dividing pictures into gorgeous or simple according to their colors; or it may be determined from the tone of the picture colors, specifically from the proportion of a preset color in the picture. It should be noted that the picture type may be determined in many ways, and the specific division basis may be set according to actual requirements; it is not specifically limited here.
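The color-based classification above can be sketched as follows. The quantization step, the threshold value, and the "gorgeous"/"simple" labels follow the text, but the concrete encoding (RGB tuples, dividing by 64) is an assumption for illustration:

```python
from collections import Counter


def picture_color_type(pixels, color_type_threshold=16):
    """Classify a frame as 'gorgeous' or 'simple' from the number of distinct
    (coarsely quantized) colors, compared against a preset color-type threshold."""
    # Quantize each RGB pixel so near-duplicate shades count as one color type.
    quantized = [(r // 64, g // 64, b // 64) for r, g, b in pixels]
    distinct = len(Counter(quantized))
    return "gorgeous" if distinct >= color_type_threshold else "simple"


def preset_color_ratio(pixels, preset_color):
    """Proportion of pixels matching a preset color, the tone-based basis above."""
    matches = sum(1 for p in pixels if p == preset_color)
    return matches / len(pixels) if pixels else 0.0
```

Either function yields a value that a picture-type rule can threshold on, matching the "actual requirements" caveat in the text.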

Alternatively, this operation can also be realized through artificial intelligence. For example, a preset machine learning model may be trained based on the animation history shooting data of the current user to obtain an animation shooting prediction model, and the shooting of at least one target object in the current animation data is then triggered based on the prediction result of the animation shooting prediction model for the current animation data.

Optionally, triggering the shooting of at least one target object in the current animation data based on the animation history shooting data of the current user includes: determining, based on the historical shooting data that corresponds to at least one historical user and to the current animation data in the animation history shooting data of the at least one historical user, whether the current animation data is target shooting animation data; if so, triggering the shooting of at least one target object in the current animation data based on the animation history shooting data of the current user.

It is understandable that the historical shooting data that corresponds to at least one historical user and to the current animation data, for example the historical animation data recorded when a historical user was at the same game stage as the current user, tends to have a high similarity to the current animation data. In a game animation, the animation data includes the game characters, the game scene, and the like. The game scene usually contains fixed objects that do not change with the game player; such data can be treated as static data, that is, the inherent object data in the animation data. Examples include inanimate objects such as buildings, plants, and small props in the game scene, or ambience data such as time, weather, wind conditions, and tide in the environment.

Optionally, the historical static data in the historical shooting animation data of at least one historical user and the current static data of the current animation data corresponding to the current user are obtained respectively, and the historical shooting data corresponding to the at least one historical user and to the current animation data is determined based on the similarity between the current static data and the historical static data. Specifically, if the similarity between the current static data and the historical static data is greater than a preset static-similarity threshold, the historical shooting animation data may be determined to be the historical shooting data corresponding to the at least one historical user and to the current animation data.
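The static-data threshold check above can be sketched as follows. Encoding the static data as a set of object identifiers and measuring similarity with the Jaccard index are assumptions; the disclosure only requires some similarity measure and a preset static-similarity threshold:

```python
def matching_historical_captures(current_static, history, static_threshold=0.9):
    """Return the historical captures whose static data (buildings, weather, ...)
    is similar enough to the current animation data's static data.
    Each history entry is a dict with a 'static' set of object identifiers
    (an illustrative encoding); similarity is the Jaccard index of two sets."""
    def jaccard(a, b):
        union = a | b
        return len(a & b) / len(union) if union else 1.0

    return [entry for entry in history
            if jaccard(current_static, entry["static"]) > static_threshold]
```

Only entries above the threshold survive, mirroring the "greater than a preset static-similarity threshold" condition in the text.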

Of course, the historical shooting data corresponding to the at least one historical user and to the current animation data may also be determined based on the picture types of the historical shooting animation data in the animation history shooting data of the at least one historical user and of the current animation data of the current user. For the specific implementation, reference may be made to the explanation of picture types in the embodiment of the present invention, which is not repeated here.

As described above, the current animation data may include one or more objects such as characters, monsters, weather, trees, or buildings. A target object can be understood as a key object to be shot among the objects contained in the current animation data. The number of target objects may be one, two, or more. Target objects include, but are not limited to, player-controlled characters, game monsters, game NPCs, and scene buildings. Note that player-controlled characters include, but are not limited to, human characters and animals, and that scene buildings include, but are not limited to, natural scenery such as mountains, sky, and grass, as well as actual buildings such as churches and arenas.

In one embodiment, the target object may be determined based on the picture type of the current animation data. The association between each picture type and its corresponding target objects may be stored in advance. After the picture type of the current animation data is identified, the target objects corresponding to the current animation data can be determined based on the pre-stored association. For example, when the current animation data is a battle scene, the at least two target objects to be shot include at least one player-controlled character and the game monster battling each controlled character, or include player-controlled characters battling one another. That is, the at least two target objects corresponding to the current animation data are determined by recognizing the picture type of the current animation data of the target player.

In another embodiment, at least one of the target objects is the character controlled by the target player, and the remaining target objects may be the interactive objects of the target player's character in the current animation data. Specifically, the target player's character in the current animation data can be monitored in real time, so that when the character engages in interactive behavior, such as a battle, each object interacting with it is determined to be one of the remaining target objects to be shot.

In another embodiment, the target objects may also be determined based on the attribute information of each object. The attribute information includes, but is not limited to, object types such as player character, monster, NPC, and building. Specifically, when it is detected that the attribute information of the objects in the current picture data includes a preset object-type combination, the objects matching the preset object types are selected from among the objects as the target objects. The preset combination may be, for example, player character plus monster, or player character plus building.
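The attribute-based selection above can be sketched as follows. The object schema (dicts with `name` and `type`) and the type labels are illustrative assumptions; the disclosure only requires that objects carry type attributes and that a preset type combination be detected:

```python
def select_target_objects(objects, preset_types=("player_character", "monster")):
    """Pick target objects from the current frame's objects by attribute type.
    Each object is a dict with 'name' and 'type' (hypothetical schema).
    Objects are selected only when every type in the preset combination is
    actually present in the frame, matching the detection step above."""
    present = {obj["type"] for obj in objects}
    if not set(preset_types) <= present:
        return []  # preset combination not detected in this frame
    return [obj for obj in objects if obj["type"] in preset_types]
```

Swapping `preset_types` for `("player_character", "building")` gives the other preset combination mentioned in the text.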

Specifically, after the target object is determined, the target object may be photographed based on the target photographing parameters determined in S120 to obtain the target image.

It should be noted that, if multiple groups of target shooting parameters are determined, the at least one target shooting object may be shot once with each group of shooting parameters, so the number of target images obtained equals the number of target shooting-parameter groups determined.

Optionally, after the shooting of at least one target object in the current animation data is triggered, the method further includes: shooting the at least one target object in the current animation data. Shooting the at least one target object may mean taking a screenshot of the at least one target object in the current animation data, or recording a video of the at least one target object in the current animation data.

It is to be understood that the target object is a main subject, and is not a limitation on the contents of photographing. When at least one target object in the current animation data is shot, objects except the target object can be included in the shot data.

According to the technical scheme of the embodiment of the invention, the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data are used to determine the target shooting parameters for shooting the current animation data at the current moment, and a target image including the target object is then shot based on the target shooting parameters. This solves the technical problem in the prior art that the corresponding picture cannot be shot automatically, which leads to a poor user experience, and achieves the technical effect of shooting the corresponding picture automatically. When corresponding animation data is to be shot, the shooting parameters associated with the corresponding picture can be retrieved to shoot it, which improves how well the shot image matches the user, improves the shooting effect and shooting flexibility, and greatly improves the user experience.

Example two

Fig. 2 is a schematic flowchart of a shooting method according to a second embodiment of the present invention. This embodiment further refines the optional technical solutions above. Optionally, determining the target shooting parameters corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data includes: training an original machine learning model based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model; and determining the target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model. Terms that are the same as or correspond to those in the above embodiments are not explained in detail here.

As shown in fig. 2, the shooting method in this embodiment may specifically include:

S210, current animation data corresponding to the current user at the current moment are obtained.

S220, training the original machine learning model based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model.

The shooting parameter prediction model is obtained by training an original machine learning model after sample data is acquired, and is used for predicting the shooting parameters corresponding to a corresponding picture when that picture is shot. The original machine learning model may be a deep learning model or a reinforcement learning model. In order to improve the accuracy of the shooting parameter prediction model, as much training sample data as possible can be obtained. Each piece of training sample data includes historical shooting animation data and the historical shooting parameters corresponding to that historical shooting animation data. The historical shooting animation data may be used as the input of the original machine learning model, and the corresponding historical shooting parameters as the expected output of the original machine learning model.

The shooting parameter prediction model may be obtained by training as follows: for each piece of training sample data, the historical shooting animation data in the current training sample data is input into the original machine learning model, and the original machine learning model outputs a corresponding actual output result; the model parameters are then corrected according to the loss between the actual output result and the historical shooting parameters in the current training sample data. Convergence of the loss function can be used as the training target, so that the original machine learning model is trained to obtain the shooting parameter prediction model.
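As an illustration only, the training loop described above can be sketched as follows. The feature representation of the historical shooting animation data, the single numeric shooting parameter (a zoom factor), and the linear model trained by stochastic gradient descent are all assumptions made for the sketch; the embodiment itself leaves the choice between a deep learning and a reinforcement learning model open.

```python
# Hypothetical sketch of the training step: each piece of historical shooting
# animation data is assumed to be reduced to a small feature vector, and a
# linear model is fitted by stochastic gradient descent until the squared
# loss converges. A real implementation would use the deep learning or
# reinforcement learning model mentioned in the text.

def train_parameter_model(samples, lr=0.05, epochs=2000):
    """samples: list of (feature_vector, shooting_parameter) pairs."""
    n_features = len(samples[0][0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for features, target in samples:
            pred = sum(w * x for w, x in zip(weights, features)) + bias
            err = pred - target  # gradient of 0.5 * (pred - target) ** 2
            for i, x in enumerate(features):
                weights[i] -= lr * err * x
            bias -= lr * err
    return weights, bias

def predict(model, features):
    """Predict a shooting parameter for new animation-data features."""
    weights, bias = model
    return sum(w * x for w, x in zip(weights, features)) + bias

# Toy training samples: feature = (brightness, subject_distance),
# target = a zoom factor recorded as the historical shooting parameter.
history = [((0.2, 1.0), 0.9), ((0.8, 0.5), 1.25),
           ((0.5, 0.2), 0.8), ((0.9, 0.9), 1.55)]
model = train_parameter_model(history)
```

On such noise-free toy data the fit essentially recovers the generating relation, so predictions on previously seen feature vectors land close to the recorded parameters.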

It should be noted that the model parameters in the original machine learning model may be set to default parameters, and the specific contents included in the shooting parameters are described in the first embodiment and are not explained again here.

And S230, determining target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.

In this embodiment, after training of the shooting parameter prediction model is completed, the current animation data may be input into the shooting parameter prediction model; the shooting parameter prediction model outputs the shooting parameters matched with the current animation data, and the obtained shooting parameters are used as the target shooting parameters.

And S240, shooting at least one target object in the current animation data based on the target shooting parameters.

According to the technical scheme of this embodiment, the original machine learning model is trained with the historical shooting animation data and the shooting parameters corresponding to the historical shooting animation data to obtain the shooting parameter prediction model, and the shooting parameters for the current animation data of the current user are then predicted based on the shooting parameter prediction model. The shooting parameters of the current animation data can thus be determined quickly and effectively, and the model can be continuously optimized as the historical shooting animation data increases, so that the shooting parameters of the current animation data are determined accurately and effectively while the user experience is improved.

Example three

Fig. 3 is a schematic flow chart of a shooting method according to a third embodiment of the present invention, which is further refined on the basis of the foregoing optional technical solutions. Optionally, the determining a target shooting parameter corresponding to the current animation data based on historical shooting animation data and a historical shooting parameter corresponding to the historical shooting animation data includes: determining, in the historical shooting animation data, at least one group of target historical shooting animation data corresponding to the current animation data; and determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one group of target historical shooting animation data. Terms that are the same as or correspond to those in the above embodiments are not explained in detail herein.

S310, current animation data corresponding to the current user at the current time are obtained.

And S320, determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data.

Here, the historical shooting animation data may be frame images each including a plurality of elements, and the current animation data may be the current frame image corresponding to the current picture. An image similarity algorithm may be used to determine the similarity between the current frame image and each historical frame image, and at least one group of historical frame images related to the current frame image is determined from the historical frame images based on the similarity. That is, the target historical shooting animation data is the shooting data, screened out from the historical shooting animation data, that matches the current animation data.
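A minimal sketch of this screening step, assuming each frame has already been reduced to a simple feature histogram (a stand-in for whatever image similarity algorithm is actually used, such as perceptual hashing or deep features):

```python
import math

# Hypothetical sketch: frames are compared via cosine similarity of feature
# histograms, and the historical frames most similar to the current frame
# are returned as the target historical shooting animation data.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def select_target_history(current_hist, history, top_k=2):
    """history: list of (frame_id, histogram); returns the top_k most similar ids."""
    scored = sorted(history,
                    key=lambda item: cosine_similarity(current_hist, item[1]),
                    reverse=True)
    return [frame_id for frame_id, _ in scored[:top_k]]

# Toy histograms for three historical frames and the current frame.
history = [("frame_a", [9, 1, 0]), ("frame_b", [1, 9, 0]), ("frame_c", [8, 2, 1])]
current = [10, 1, 0]
```

The two historical frames whose histograms point in nearly the same direction as the current frame's are selected; the similarity threshold or `top_k` value would be a tuning choice in practice.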

In this embodiment, the target historical shooting animation data may also be determined as follows: after the historical shooting animation data is acquired, it may be classified and stored according to its picture type. After the target picture type of the current animation data is determined, a plurality of pieces of historical shooting animation data matching the current animation data may be determined, from the database corresponding to the target picture type, as the target historical shooting animation data. Alternatively, the target historical shooting animation data may be determined in combination with the similarity.

On the basis of the above technical scheme, in order to determine the group of target historical shooting animation data matched with the current animation data quickly and accurately, one or more of the following manners may be adopted.

In one embodiment, at least one group of target historical shooting animation data corresponding to the current animation data is determined from the historical shooting animation data based on the scene information of the historical shooting animation data and of the current animation data.

The historical shooting animation data may be animation data shot at a certain level; the scene identifiers corresponding to different levels are different, and the scene identifier may be used as the scene information. Different users may prefer different video frames within the same level, and the historical shooting animation data corresponding to the same level may include a plurality of video frames; that is, the historical shooting animation data associated with the same scene identifier may include a plurality of video frames. For example, level 1 may include three scenes, in which the target object performs a task in a snow scene, in a rain scene, and in a dark scene respectively, and the corresponding scene identifiers may be 1-1, 1-2, and 1-3. The scene information may also be determined according to the subjects included in the frame picture: if the subjects included in certain scenes are completely the same, those scenes share the same scene identifier. For example, if the scene information includes trees, mountains, and streams, all the frame pictures (the historical shooting animation data) whose scenes include these elements may be marked with one scene identifier.

It should be noted that any one of the foregoing manners may be adopted to determine the scene identifier corresponding to each piece of animation data; the main requirement is that the scene identifiers be determined according to a uniform principle.

Specifically, when each piece of historical shooting animation data is stored, its scene identifier may be determined according to the content it includes, and the corresponding historical shooting animation data is marked with that identifier. Alternatively, the scene identifier is determined according to the level corresponding to each piece of historical shooting animation data, or according to the different sub-scenes of the historical shooting animation data within the same level. After the scene identifiers are determined, the target scene identifier corresponding to the current animation data can be determined according to the same principle. The historical scene identifiers matching the target scene identifier can then be determined, and the corresponding historical shooting animation data can be called according to those identifiers; the called historical shooting animation data is the target historical shooting animation data.
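The scene-identifier lookup described above can be sketched as follows; the storage layout and the identifier format (level-subscene strings such as "1-1") are assumptions made for the sketch:

```python
from collections import defaultdict

# Hypothetical sketch: historical shooting animation data is stored keyed by
# scene identifier, and the entries whose identifier matches the target scene
# identifier of the current animation data are called back as the target
# historical shooting animation data.

class HistoryStore:
    def __init__(self):
        self._by_scene = defaultdict(list)

    def add(self, scene_id, animation_data):
        """Store one piece of historical shooting animation data under its scene id."""
        self._by_scene[scene_id].append(animation_data)

    def target_history(self, target_scene_id):
        """Return all historical entries whose scene id matches the target."""
        return list(self._by_scene.get(target_scene_id, []))

store = HistoryStore()
store.add("1-1", {"clip": "snow_task", "params": {"angle": 30}})
store.add("1-2", {"clip": "rain_task", "params": {"angle": 45}})
store.add("1-1", {"clip": "snow_task_2", "params": {"angle": 60}})
```

A lookup for target scene identifier "1-1" would return both snow-scene entries, each carrying the historical shooting parameters that can then be called.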

Alternatively, if there is historical shooting animation data whose scene information coincides with that of the current animation data, the proportion of such data in the total historical shooting animation data is determined, and the historical shooting animation data corresponding to the current animation data is determined based on that proportion.

In another embodiment, the determining, in the historical shooting animation data, at least one group of target historical shooting animation data corresponding to the current animation data includes: acquiring historical user operation information corresponding to the historical shooting animation data; and determining, in the historical shooting animation data, at least one group of target historical shooting animation data corresponding to the current animation data based on the historical user operation information.

The historical user operation information corresponding to the historical shooting animation data can be understood as the operation information of the historical user on the historical shooting animation data, and may include one, two or more items. Optionally, the historical user operation information may include, but is not limited to, at least one of the following operations: clipping, beautifying, deleting, saving, sharing, and/or manually re-shooting. In this embodiment of the invention, the historical preference of the historical user for the historical shooting animation data can be determined from the historical user operation information. The advantage of using the operation information is that the historical shooting animation data associated with the same scene information may include a plurality of pieces, and the operation information makes it possible to determine which of them is the animation data the user is actually interested in.
Illustratively, when the historical user shares and saves a piece of historical shooting animation data, its historical shooting preference is higher than when the historical user only saves it; when the historical user saves it, its historical shooting preference is higher than when the historical user beautifies it; when the historical user beautifies it, its historical shooting preference is higher than when the historical user deletes it and manually re-shoots; and when the historical user deletes it and manually re-shoots, its historical shooting preference is higher than when the historical user only deletes it. After the historical shooting preferences of the historical shooting animation data are determined, a preset number of pieces of target historical shooting animation data can be determined according to the historical shooting preferences, and the corresponding historical shooting parameters are called.
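Purely as an illustration, the ordering above can be encoded as numeric scores; the score values themselves are assumptions chosen only to reproduce the stated ranking (share-and-save > save > beautify > delete-and-reshoot > delete):

```python
# Hypothetical preference scores reproducing the ranking in the text; the
# numbers carry no meaning beyond their relative order.
PREFERENCE_SCORES = {
    "share_and_save": 5,
    "save": 4,
    "beautify": 3,
    "delete_and_reshoot": 2,
    "delete": 1,
}

def rank_history(entries, top_n=2):
    """entries: list of (animation_id, operation); returns the top_n ids by preference."""
    ranked = sorted(entries, key=lambda e: PREFERENCE_SCORES[e[1]], reverse=True)
    return [animation_id for animation_id, _ in ranked[:top_n]]

# Toy operation log: one recorded operation per piece of historical data.
history_ops = [
    ("clip_1", "delete"),
    ("clip_2", "share_and_save"),
    ("clip_3", "beautify"),
    ("clip_4", "save"),
]
```

Selecting the top two entries here would keep the shared-and-saved and the saved clips, whose historical shooting parameters would then be called.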

That is, after the historical shooting preference of the historical user for each piece of historical shooting animation data is determined in the above manner, the historical shooting preference may be bound to the corresponding historical shooting animation data. When the target historical shooting animation data corresponding to the current animation data is determined, the to-be-processed historical shooting animation data whose scene identifier is consistent with the target scene identifier of the current animation data is screened out, and, according to the historical shooting preference of each piece of to-be-processed historical shooting animation data, the historical shooting animation data whose historical shooting preference is higher than a preset preference threshold is selected as the target historical shooting animation data.

It can be understood that the historical shooting preference corresponding to each piece of historical shooting animation data is first determined according to the historical operation information of the user; the target historical shooting animation data may then be determined by screening out part of the to-be-processed historical shooting animation data in combination with the scene identifier, and screening the target historical shooting animation data according to the historical shooting preference of the to-be-processed historical shooting animation data.

It should be noted that the preference ranking above, and the specific user operation information it relies on, are only an exemplary illustration of how the historical shooting preference may be determined, and are not a limitation.

For example, the historical shooting preference for a piece of historical shooting animation data may also be determined by the number of times it has been shared, or the like.

S330, determining target shooting parameters based on the historical shooting parameters corresponding to the at least one group of target historical shooting animation data.

In this embodiment, the target shooting parameters are determined based on the historical shooting parameters corresponding to at least one set of target historical shooting animation data, and at least two embodiments may be adopted.

In the first embodiment, when the determined target historical shooting animation data includes a plurality of groups, each of the plurality of groups of historical shooting parameters may be taken as a target shooting parameter, and a plurality of target images including the target object may be shot based on those shooting parameters.

That is, the history photographing parameters corresponding to the at least one set of the target history photographing animation data are respectively set as the target photographing parameters.

In the second embodiment, the determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data includes: if there are two or more groups of historical shooting parameters corresponding to the target historical shooting animation data, determining the target shooting parameters based on the shooting time information corresponding to the historical shooting parameters.

Specifically, the determined target historical shooting animation data may include a plurality of groups, and the target shooting parameters may be determined according to the shooting time of each group. For example, the historical shooting parameters closest to the current time may be used as the target shooting parameters: because the appreciation level and appreciation ability of the user improve continuously, historical shooting parameters far from the current time may no longer match the user's current appreciation level, while historical shooting parameters close to the current time match the user better. The historical shooting parameters closest to the current time are therefore used as the target shooting parameters.
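This selection rule can be sketched as follows; representing the shooting times as Unix timestamps is an assumption made for the sketch:

```python
# Hypothetical sketch: among several candidate groups of historical shooting
# parameters, the group whose shooting time is closest to the current time
# is taken as the target shooting parameters.

def pick_target_parameters(candidates, current_time):
    """candidates: list of (shooting_time, parameters); times are timestamps."""
    return min(candidates, key=lambda c: abs(current_time - c[0]))[1]

candidates = [
    (1_600_000_000, {"zoom": 1.0}),
    (1_660_000_000, {"zoom": 1.5}),
    (1_690_000_000, {"zoom": 2.0}),
]
target_params = pick_target_parameters(candidates, current_time=1_700_000_000)
```

The most recently recorded group wins, which matches the reasoning that newer parameters better reflect the user's current taste.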

S340, shooting at least one target object in the current animation data based on the target shooting parameters.

According to the technical scheme of this embodiment of the invention, the target shooting parameters for shooting the current animation data at the current moment are determined from the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data, and the target image including the target object is then shot based on the target shooting parameters. This solves the technical problem in the prior art that the corresponding picture cannot be shot automatically, which results in a poor user experience. The corresponding picture is shot automatically, and the shooting parameters matched with the corresponding picture can be called when the corresponding animation data is shot, which improves the matching degree between the shot image and the user, improves the shooting effect and the shooting flexibility, and greatly improves the user experience.

Example four

Fig. 4 is a schematic structural diagram of a shooting apparatus according to a fourth embodiment of the present invention, which can be used to execute the shooting method according to any embodiment of the present invention, and the apparatus can be implemented by software and/or hardware.

The photographing apparatus of an embodiment of the present invention may include: an animation data acquisition module 410, a photographing parameter determination module 420, and a photographing module 430.

The animation data obtaining module 410 is configured to obtain current animation data corresponding to a current user at a current time; the shooting parameter determining module 420 is configured to determine a target shooting parameter corresponding to the current animation data based on historical shooting animation data and a historical shooting parameter corresponding to the historical shooting animation data; and the shooting module 430 is configured to shoot at least one target object in the current animation data based on the target shooting parameters. According to the technical scheme of this embodiment, shooting of the current animation data is triggered based on the historical shooting animation data of the current user, so that the current user is analyzed in a personalized manner and the current animation data can be shot by automatic triggering. The current animation data can thus be shot efficiently, and pictures that may interest the user are shot automatically, so that the user's highlight moments are recorded in time, the user's personalized requirements are met, and the user experience is improved.

On the basis of the above technical solutions, the shooting parameter determining module includes:

the shooting parameter prediction model determining unit is used for training an original machine learning model based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model; and the target shooting parameter determining unit is used for determining the target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.

On the basis of the above technical solution, the shooting parameter determining module further includes:

the shooting animation data determining unit is used for determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data; a target photographing parameter determining unit for determining a target photographing parameter based on a history photographing parameter corresponding to the at least one set of target history photographing animation data.

On the basis of the above technical solutions, the photographed moving image data determining unit is further configured to: and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the scene information of the historical shooting animation data and the current animation data.

On the basis of the above technical solutions, the photographed moving image data determining unit is further configured to: acquiring historical user operation information corresponding to the at least one group of target historical shooting animation data; and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the historical user operation information.

On the basis of the above technical solutions, the shooting parameter determining unit is further configured to: and if the historical shooting parameters corresponding to the target historical shooting animation data are two or more groups, determining the target shooting parameters based on the shooting time information corresponding to the historical shooting parameters.

On the basis of the above technical solutions, the shooting parameter determining unit is further configured to: and respectively determining the historical shooting parameters corresponding to the at least one group of target historical shooting animation data as target shooting parameters.

The shooting device provided by the embodiment of the invention can execute the shooting method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.

It should be noted that, the units and modules included in the above-mentioned shooting device are merely divided according to functional logic, but are not limited to the above-mentioned division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.

Example five

Fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. FIG. 5 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 5 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.

As shown in FIG. 5, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.

Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.

Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.

The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.

A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.

Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.

The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement a photographing method provided by the present embodiment.

Example six

An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a photographing method, including:

acquiring current animation data corresponding to a current user at a current moment;

determining target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;

and shooting at least one target object in the current animation data based on the target shooting parameters.

Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).

It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
