Submarine combat command and control system interaction method and device based on gesture operation

Document No. 19624 · Published 2021-09-21

Note: this technique, "Submarine combat command and control system interaction method and device based on gesture operation", was created by 崔佳航, 李江红, 朱正一, 魏浩正, 刘科峰, 单航, 马鹏刚, 曹铸, 李晓宇 and 李伟鹏 on 2021-05-25. Abstract: The invention discloses a gesture-based interaction method for a submarine combat command and control system, comprising the following steps: acquiring the electromyographic (EMG) signals generated at the arm by hand movements, together with the motion information of the arm; recognizing the gesture corresponding to the EMG signals and the displacement information of the arm movement; associating the recognized gesture and arm motion information with the corresponding instructions of the operating system, so that the system performs the corresponding operation; and feeding the result back to the user's hand when the system performs an operation according to the gesture, so that the user knows whether the gesture operation succeeded; if gesture recognition fails and the operation is unsuccessful, the system returns an error prompt and vibrates to alert the user of the recognition failure. The invention also includes a gesture-based interaction device for the submarine combat command and control system. The invention solves the problems of cumbersome operation and poor interaction adaptability of the traditional keyboard-mouse and button interaction in the prior art, meets the integration and experimental requirements of the representative prototype system, and provides modular, composable task-scenario control support.

1. A gesture-based interaction method for a submarine combat command and control system, characterized by comprising the following steps:

acquiring the electromyographic (EMG) signals generated at the arm by hand movements, together with the motion information of the arm;

recognizing the gesture corresponding to the EMG signals and the displacement information of the arm movement;

associating the recognized gesture and arm motion information with the corresponding instructions of the operating system, so that the system performs the corresponding operation;

and feeding the result back to the user's hand when the system performs an operation according to the gesture, so that the user knows whether the gesture operation succeeded; if gesture recognition fails and the operation is unsuccessful, the system returns an error prompt and vibrates to alert the user of the recognition failure.

2. The gesture-based submarine combat command and control system interaction method according to claim 1, wherein acquiring the EMG signals generated at the arm by hand movements and the motion information of the arm specifically comprises:

accurately identifying the corresponding gesture signals according to the gesture results matched to the individual user;

acquiring multi-channel EMG signals.

3. The gesture-based submarine combat command and control system interaction method according to claim 2, wherein the gesture corresponding to the EMG signals specifically comprises:

hand gesture actions, including making a fist, extending the palm, waving the palm inward, waving the palm outward, air-pinching and the finger-gun (shooting) gesture, which are associated with different operations of the operating system, specifically: making a fist is associated with a mouse press-and-hold event; extending the palm is associated with cancelling the press-and-hold; waving the palm inward selects the left mouse button; waving the palm outward selects the right mouse button; air-pinching selects the middle mouse button; the finger-gun gesture is associated with a mouse click event; two finger-gun gestures within a short time are associated with a mouse double-click event; the user may also define gesture associations, so that freely defined gesture operation is achieved;

gestures corresponding to multi-channel EMG signal combinations, including combinations of the EMG signals of both arms; the EMG signal combinations can take two forms: one is the combination of the EMG signals when the left and right arms act simultaneously, and the other is a sequential combination of the gesture actions of the two arms; in the first form, for example the combination of the EMG signals of a left-arm fist and a right-arm fist, or of a left-arm fist and a right-arm extended palm, i.e. 6 left-hand gestures combined with 6 right-hand gestures, at most 36 groups can be set; in the second form, for example the left arm makes a fist first and the right arm makes a fist afterwards, or the right arm extends the palm first and the left arm then waves inward, so that in theory countless gesture combinations are possible; to avoid recognition errors caused by the intervals between sequential gesture actions, the second form of gesture combination increases the time span of gesture recognition in actual operation, that is, the system waits for a period after the previous gesture recognition has finished before converting it into an operation instruction; however, the waiting time should not be too long, so as to avoid slow response in actual operation.

4. The gesture-based submarine combat command and control system interaction method according to claim 2, wherein before the gesture corresponding to the EMG signals is recognized, the EMG signals of each channel are further rectified, filtered and normalized respectively to obtain the processed EMG signals.

5. The gesture-based submarine combat command and control system interaction method according to claim 1, wherein making the system perform the corresponding operation specifically comprises:

simulating the click, double-click, select and drag functions of a mouse, quickly realizing interface zooming and page turning as well as the confirm, cancel, return and exit operations, and allowing the user to define complex instruction operations corresponding to combined gestures.

6. A gesture-based interaction device for a submarine combat command and control system, characterized by comprising:

an acquisition module, used for acquiring the EMG signals and the arm motion information;

an interaction module, used for recognizing the gestures and arm motion information corresponding to the EMG signals and feeding back the recognition results;

and a display module, used for displaying the operation interface of the command and control system.

7. The gesture-based submarine combat command and control system interaction device according to claim 6, wherein the acquisition module is a wearable EMG sensing wristband, which internally contains eight-channel EMG sensors or a nine-axis motion sensor for acquiring arm EMG nerve signals and arm displacement information, and is connected to a computer via Bluetooth, WiFi or infrared sensing.

8. The gesture-based submarine combat command and control system interaction device according to claim 6, wherein the interaction module comprises:

a memory, used for storing computer-executable instructions, such as the preset operation instructions corresponding to the different gestures;

and a processor: when the executable instructions are executed, they cause the processor to perform any of the methods described above.

9. The gesture-based submarine combat command and control system interaction device according to claim 6, wherein the display module is a plurality of large wall-mounted screens surrounding the user.

Technical Field

The invention belongs to the technical field of intelligent human-computer interaction, and particularly relates to a gesture-based interaction method for a submarine combat command and control system.

Background

The submarine command and control system, short for the submarine combat command and weapon control system, serves as the information, command and control center of the submarine combat system and is the binding element that organically combines the sensors with the weapon systems. Because the submarine command and control system is not an autonomous system, how well its functions and effectiveness are realized depends mainly on the operators, aside from the capability of the system itself. As new sensors, new weapons and formation-coordination functions are added, and as new characteristics such as system integration, flexible recombination of equipment functions and crew reduction become prominent, the workload and pressure on operators of the submarine command and control system will keep increasing. Simply adding new equipment and integrating new functions within the existing human-computer interaction environment falls far short of the operational requirements of the personnel who use the command and control system. Non-contact human-computer interaction technology has therefore received great attention and development, and gesture-recognition interaction occupies an important place within it.

Disclosure of Invention

The invention aims to provide a gesture-based interaction method for a submarine combat command and control system, which solves the problems of cumbersome operation and poor interaction adaptability of the traditional keyboard-mouse and button interaction in the prior art, meets the integration and experimental requirements of the representative prototype system, and provides modular, composable task-scenario control support.

Another object of the invention is to provide a gesture-based interaction device for the submarine combat command and control system.

The first technical scheme adopted by the invention is a gesture-based interaction method for a submarine combat command and control system, comprising the following steps:

acquiring the electromyographic (EMG) signals generated at the arm by hand movements, together with the motion information of the arm;

recognizing the gesture corresponding to the EMG signals and the displacement information of the arm movement;

associating the recognized gesture and arm motion information with the corresponding instructions of the operating system, so that the system performs the corresponding operation;

and feeding the result back to the user's hand when the system performs an operation according to the gesture, so that the user knows whether the gesture operation succeeded; if gesture recognition fails and the operation is unsuccessful, the system returns an error prompt and vibrates to alert the user of the recognition failure.

The first technical aspect of the present invention is also characterized in that,

the method for acquiring the myoelectric signals generated by the hand actions at the arms and the motion information of the arm motions comprises the following steps:

accurately identifying corresponding gesture signals according to the gesture result matched by the person;

acquiring a multi-channel electromyographic signal.

The gesture corresponding to the EMG signals specifically includes:

hand gesture actions, including making a fist, extending the palm, waving the palm inward, waving the palm outward, air-pinching and the finger-gun (shooting) gesture, which are associated with different operations of the operating system, specifically: making a fist is associated with a mouse press-and-hold event; extending the palm is associated with cancelling the press-and-hold; waving the palm inward selects the left mouse button; waving the palm outward selects the right mouse button; air-pinching selects the middle mouse button; the finger-gun gesture is associated with a mouse click event; two finger-gun gestures within a short time are associated with a mouse double-click event; the user may also define gesture associations, so that freely defined gesture operation is achieved;

gestures corresponding to multi-channel EMG signal combinations, including combinations of the EMG signals of both arms; the EMG signal combinations can take two forms: one is the combination of the EMG signals when the left and right arms act simultaneously, and the other is a sequential combination of the gesture actions of the two arms; in the first form, for example the combination of the EMG signals of a left-arm fist and a right-arm fist, or of a left-arm fist and a right-arm extended palm, i.e. 6 left-hand gestures combined with 6 right-hand gestures, at most 36 groups can be set; in the second form, for example the left arm makes a fist first and the right arm makes a fist afterwards, or the right arm extends the palm first and the left arm then waves inward, so that in theory countless gesture combinations are possible; to avoid recognition errors caused by the intervals between sequential gesture actions, the second form of gesture combination increases the time span of gesture recognition in actual operation, that is, the system waits for a period after the previous gesture recognition has finished before converting it into an operation instruction; however, the waiting time should not be too long, so as to avoid slow response in actual operation.

Before the gesture corresponding to the EMG signals is recognized, the EMG signals of each channel are respectively rectified, filtered and normalized to obtain the processed EMG signals.

Making the system perform the corresponding operation specifically includes:

simulating the click, double-click, select and drag functions of a mouse, quickly realizing interface zooming and page turning as well as the confirm, cancel, return and exit operations, and allowing the user to define complex instruction operations corresponding to combined gestures.

The second technical scheme adopted by the invention is a gesture-based interaction device for a submarine combat command and control system, comprising:

an acquisition module, used for acquiring the EMG signals and the arm motion information;

an interaction module, used for recognizing the gestures and arm motion information corresponding to the EMG signals and feeding back the recognition results;

and a display module, used for displaying the operation interface of the command and control system.

The second technical aspect of the present invention is also characterized in that,

The acquisition module is a wearable EMG sensing wristband, which internally contains eight-channel EMG sensors or a nine-axis motion sensor for acquiring arm EMG nerve signals and arm displacement information, and is connected to the computer via Bluetooth, WiFi or infrared sensing.

The interaction module comprises:

a memory, used for storing computer-executable instructions, such as the preset operation instructions corresponding to the different gestures;

and a processor: when the executable instructions are executed, they cause the processor to perform any of the methods described above.

The display module is a plurality of large wall-mounted screens surrounding the user.

The gesture-based interaction method and device for the submarine combat command and control system recognize hand actions from the EMG signals they generate at the arm, convert them through the processor into operation instructions for the system so as to complete the operation of the command and control system, and at the same time feed the operation results back to the user through the display module and the interaction module, achieving fast gesture-based human-computer interaction and giving the user a better interactive experience.

Drawings

FIG. 1 is the interaction process of the gesture interaction module;

FIG. 2 is the pattern recognition process of gestures during interaction;

FIG. 3 is a flowchart of the process that realizes the interactive association during interaction.

Detailed Description

The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.

The invention relates to a gesture-based interaction method for a submarine combat command and control system, comprising the following steps:

acquiring the electromyographic (EMG) signals generated at the arm by hand movements, together with the motion information of the arm;

recognizing the gesture corresponding to the EMG signals and the displacement information of the arm movement;

associating the recognized gesture and arm motion information with the corresponding instructions of the operating system, so that the system performs the corresponding operation;

and feeding the result back to the user's hand when the system performs an operation according to the gesture, so that the user knows whether the gesture operation succeeded; if gesture recognition fails and the operation is unsuccessful, the system returns an error prompt and vibrates to alert the user of the recognition failure.

Acquiring the EMG signals generated at the arm by hand movements and the motion information of the arm specifically comprises:

accurately identifying the corresponding gesture signals according to the gesture results matched to the individual user;

acquiring multi-channel EMG signals.

The gesture corresponding to the EMG signals specifically includes:

hand gesture actions, including making a fist, extending the palm, waving the palm inward, waving the palm outward, air-pinching and the finger-gun (shooting) gesture, which are associated with different operations of the operating system, specifically: making a fist is associated with a mouse press-and-hold event; extending the palm is associated with cancelling the press-and-hold; waving the palm inward selects the left mouse button; waving the palm outward selects the right mouse button; air-pinching selects the middle mouse button; the finger-gun gesture is associated with a mouse click event; two finger-gun gestures within a short time are associated with a mouse double-click event; the user may also define gesture associations, so that freely defined gesture operation is achieved;

gestures corresponding to multi-channel EMG signal combinations, including combinations of the EMG signals of both arms; the EMG signal combinations can take two forms: one is the combination of the EMG signals when the left and right arms act simultaneously, and the other is a sequential combination of the gesture actions of the two arms; in the first form, for example the combination of the EMG signals of a left-arm fist and a right-arm fist, or of a left-arm fist and a right-arm extended palm, i.e. 6 left-hand gestures combined with 6 right-hand gestures, at most 36 groups can be set; in the second form, for example the left arm makes a fist first and the right arm makes a fist afterwards, or the right arm extends the palm first and the left arm then waves inward, so that in theory countless gesture combinations are possible; to avoid recognition errors caused by the intervals between sequential gesture actions, the second form of gesture combination increases the time span of gesture recognition in actual operation, that is, the system waits for a period after the previous gesture recognition has finished before converting it into an operation instruction; however, the waiting time should not be too long, so as to avoid slow response in actual operation.

Before the gesture corresponding to the EMG signals is recognized, the EMG signals of each channel are respectively rectified, filtered and normalized to obtain the processed EMG signals.
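As an illustration of the rectification, filtering and normalization named above, the sketch below processes one EMG channel with full-wave rectification, a moving-average filter and normalization to the channel maximum. The 32-sample window and the max-based normalization scheme are assumptions introduced here; the application does not specify a particular filter or normalization.

```cpp
// Minimal sketch, under assumed parameters, of per-channel EMG preprocessing:
// full-wave rectification, moving-average (low-pass) filtering to form a smooth
// envelope, and normalization to the channel's maximum value.
#include <algorithm>
#include <cmath>
#include <vector>

std::vector<double> preprocessChannel(const std::vector<double>& raw) {
    // 1. Rectification: take the absolute value of every sample.
    std::vector<double> rectified(raw.size());
    std::transform(raw.begin(), raw.end(), rectified.begin(),
                   [](double s) { return std::fabs(s); });

    // 2. Filtering: simple moving average over an assumed window of 32 samples.
    const std::size_t win = 32;
    std::vector<double> filtered(rectified.size(), 0.0);
    double sum = 0.0;
    for (std::size_t i = 0; i < rectified.size(); ++i) {
        sum += rectified[i];
        if (i >= win) sum -= rectified[i - win];
        filtered[i] = sum / static_cast<double>(std::min(i + 1, win));
    }

    // 3. Normalization: scale the envelope into [0, 1] by its maximum value.
    if (!filtered.empty()) {
        double maxVal = *std::max_element(filtered.begin(), filtered.end());
        if (maxVal > 0.0)
            for (double& v : filtered) v /= maxVal;
    }
    return filtered;
}
```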

Making the system perform the corresponding operation specifically includes:

simulating the click, double-click, select and drag functions of a mouse, quickly realizing interface zooming and page turning as well as the confirm, cancel, return and exit operations, and allowing the user to define complex instruction operations corresponding to combined gestures.

The invention relates to a gesture-based interaction device for a submarine combat command and control system, comprising:

an acquisition module, used for acquiring the EMG signals and the arm motion information;

an interaction module, used for recognizing the gestures and arm motion information corresponding to the EMG signals and feeding back the recognition results;

and a display module, used for displaying the operation interface of the command and control system.

The second technical aspect of the present invention is also characterized in that,

The acquisition module is a wearable EMG sensing wristband, which internally contains eight-channel EMG sensors or a nine-axis motion sensor for acquiring arm EMG nerve signals and arm displacement information, and is connected to the computer via Bluetooth, WiFi or infrared sensing.

The interaction module comprises:

a memory, used for storing computer-executable instructions, such as the preset operation instructions corresponding to the different gestures;

and a processor: when the executable instructions are executed, they cause the processor to perform any of the methods described above.

The display module is a plurality of large wall-mounted screens surrounding the user, giving the user a more immersive interactive experience.

Most concrete submarine operations are currently carried out by operators through a mouse, keyboard and buttons, which is cumbersome and tedious. To provide commanders with faster and more convenient integrated command means, improve command efficiency and shorten the combat response time of the system, the present application operates the command and control system through EMG-sensing-based gesture operation, reducing operational complexity and improving efficiency.

The gesture-based human-computer interaction process shown in FIG. 1 includes:

Step 1: when the EMG wristband is used for the first time, per-user matching of gesture acquisition is required. The user must wear the wristband correctly, fitted closely against the skin on the middle section of the forearm. The EMG wristband is connected to the computer wirelessly via Bluetooth; when the connection succeeds, the user can see this on the display interface and the wristband vibrates to confirm the successful connection.

Step 2: before gesture recognition, when no gesture action is being performed, the hand is in a naturally relaxed state. During gesture acquisition, each gesture action should be crisp, without hesitation or slow movement, and the hand must return to the naturally relaxed state between two gesture actions.

Step 3: after the EMG signals have been collected, the computer performs pattern recognition on the corresponding gestures, so that the gesture corresponding to each EMG signal is recognized, and the data are stored in the computer.

Step 4: simulated mouse driver programs corresponding to the different operation instructions are written for the different gestures and connected to the hardware. At the same time, a gyroscope built into the EMG wristband continuously acquires, in real time, the angular velocity on the coordinate axes of the arm; an internal program calculates the direction and distance of the arm movement, converts the three-dimensional movement of the arm into two-dimensional cursor movement data, and finally sends these data to the computer to control the cursor in the computer's operation interface (a sketch of this angular-velocity-to-cursor conversion is given after these steps). The combination of gesture operations and arm displacement data can be configured by program and mapped to complex instructions of the submarine combat command and control system, so that the user can operate the command and control system in mid-air through gestures, realizing the interactive association between gesture actions and control instructions.

Step 5: during mid-air operation, the EMG wristband acquires the EMG signals and the arm displacement information from the gyroscope, processes them and sends them to the computer; the computer identifies the corresponding instruction through program judgment and sends the corresponding operation instruction information to the command and control system.

Step 6: if the computer cannot recognize the user's gesture during mid-air operation, it feeds error information back to the wristband, and the wristband vibrates upon receiving it to remind the user to repeat the gesture operation or to redo gesture recognition.
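The following is a minimal sketch, for illustration only, of the conversion described in Step 4 from the wristband gyroscope's angular velocity to two-dimensional cursor movement. The axis assignment (yaw to horizontal, pitch to vertical), the sensitivity factor and the sampling period are assumptions introduced here and are not specified by this application.

```cpp
// Illustrative conversion of gyroscope angular velocity into a cursor delta.
// Axis mapping and sensitivity are assumed values, not parameters from the text.
#include <cmath>

struct CursorDelta { int dx; int dy; };

// gx, gy, gz: angular velocity about the arm's coordinate axes (deg/s);
// dt: sampling period in seconds; sensitivity: pixels per degree of rotation.
CursorDelta angularVelocityToCursor(double gx, double gy, double gz,
                                    double dt, double sensitivity) {
    (void)gx;  // roll about the forearm axis is ignored in this sketch
    // Integrate angular velocity over one sample to get an angle increment,
    // then scale it into a pixel offset: yaw moves the cursor horizontally,
    // pitch moves it vertically (screen y grows downward).
    double dxPixels = gz * dt * sensitivity;
    double dyPixels = -gy * dt * sensitivity;
    return { static_cast<int>(std::lround(dxPixels)),
             static_cast<int>(std::lround(dyPixels)) };
}
```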

The gesture pattern recognition process shown in FIG. 2, i.e. the pattern recognition of Step 3, is implemented as follows: conventional gesture recognition is based on pattern-recognition methods from machine learning and comprises data acquisition, preprocessing, feature extraction and pattern classification. The surface electromyographic signal (sEMG) on the skin is acquired by the sensors; amplification and filtering preprocessing removes noise from the sEMG; the data used for classification are then extracted manually, capturing the time-domain and frequency-domain characteristics of the sEMG; and these features serve as the input of a machine learning classifier, thereby realizing gesture classification and recognition from the sEMG.
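As an illustration of the feature extraction stage, the sketch below reduces each EMG channel's analysis window to four common time-domain features (mean absolute value, root mean square, zero-crossing count and waveform length) and concatenates them into a feature vector for the classifier. The window layout, the zero-crossing dead band and this particular choice of features are assumptions; the application does not prescribe a specific feature set.

```cpp
// Illustrative time-domain feature extraction for sEMG gesture classification.
// The feature set and the dead-band threshold are assumed, not taken from the text.
#include <cmath>
#include <vector>

std::vector<double> extractFeatures(const std::vector<std::vector<double>>& window) {
    // window[c] holds one analysis window of samples for channel c.
    const double zcThreshold = 0.01;  // assumed dead band against noise-induced crossings
    std::vector<double> features;
    for (const auto& ch : window) {
        if (ch.empty()) continue;
        double mav = 0.0, rms = 0.0, wl = 0.0;
        int zc = 0;
        for (std::size_t i = 0; i < ch.size(); ++i) {
            mav += std::fabs(ch[i]);
            rms += ch[i] * ch[i];
            if (i > 0) {
                wl += std::fabs(ch[i] - ch[i - 1]);                 // waveform length
                if (ch[i] * ch[i - 1] < 0.0 &&
                    std::fabs(ch[i] - ch[i - 1]) > zcThreshold)
                    ++zc;                                           // zero crossing
            }
        }
        mav /= static_cast<double>(ch.size());
        rms = std::sqrt(rms / static_cast<double>(ch.size()));
        features.push_back(mav);
        features.push_back(rms);
        features.push_back(static_cast<double>(zc));
        features.push_back(wl);
    }
    return features;  // fed to the machine learning classifier
}
```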

The interactive association in Step 4 is mainly realized in two ways:

1) simulating mouse action events at the operating-system level through a Human Interface Device (HID) driver;

2) obtaining the related action events through programming and binding the association logic at the application layer.

A Human Interface Device (HID) is a device class definition originally intended to replace PS/2-style connectors with a generic USB driver supporting HID devices (e.g. keyboard, mouse, touchpad). Hardware innovation otherwise requires either reloading data over existing protocols or creating non-standard hardware with its own dedicated driver; HID adds support for hardware innovation through an extensible, standardized and easily programmable interface. An API (Application Programming Interface) function is provided by the various dynamic link libraries; an API is the set of function definitions, parameter definitions and message formats supported by the operating system, mostly written in the C and C++ languages. When calling such a function, only the exposed interface, such as the declared function, needs to be used. Cursor control can be implemented with the mouse_event function of the Windows user interface manager dynamic link library (user32.dll).
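For reference, the sketch below shows how the mouse_event function of user32.dll named above can be used to simulate cursor movement and a click. mouse_event is a real Windows API, but the helper names and example values here are illustrative and are not code from this application.

```cpp
// Minimal sketch (Windows, link against user32): simulating mouse actions
// with the user32.dll mouse_event API named in the text.
#include <windows.h>

// Move the cursor by a relative offset in pixels.
void moveCursor(int dx, int dy) {
    mouse_event(MOUSEEVENTF_MOVE, dx, dy, 0, 0);
}

// Press and hold the left mouse button (e.g. on a "fist" gesture).
void leftButtonDown() {
    mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0);
}

// Release the left mouse button (e.g. on an "extend palm" gesture).
void leftButtonUp() {
    mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0);
}

// Single click = press followed by release.
void leftClick() {
    leftButtonDown();
    leftButtonUp();
}

int main() {
    moveCursor(50, 20);  // nudge the cursor as an arm movement would
    leftClick();         // simulate a single-click gesture
    return 0;
}
```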

The setting of the simulated mouse action events is as follows:

the recognized gestures mainly comprise six types: fist, extended palm, inward wave, outward wave, air pinch and finger-gun, plus two groups of user-defined actions and the movement of the arm.

Approach 1) is written as an HID driver and associates the gestures and the arm movement with operating-system instructions as follows: when the computer recognizes a fist signal, the program calls a function to execute a mouse press-and-hold command, equivalent to holding down a physical mouse button; when it recognizes an extended-palm signal, the program calls a function to release the mouse button, equivalent to releasing the physical mouse button; when it recognizes an inward-wave signal, the program calls a function to select the left button, equivalent to placing a finger on the left mouse button; when it recognizes an outward-wave signal, the program calls a function to select the right button, equivalent to placing a finger on the right mouse button; when it recognizes an air-pinch signal, the program calls a function to select the middle button, equivalent to placing a finger on the middle mouse button; when it recognizes a finger-gun signal, the program calls a function to execute a single-click command, equivalent to clicking the currently selected mouse button; when it recognizes two finger-gun signals within a short interval, the program calls a function to execute a double-click command, equivalent to double-clicking the currently selected mouse button; and when it recognizes an arm-movement signal, the program calls a function to execute a mouse-move command, equivalent to moving the mouse. A sketch of such a dispatch is given below.
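The sketch below illustrates one possible shape of this gesture-to-mouse dispatch on Windows. The Gesture and SelectedButton names and the global selection state are assumptions introduced for illustration; the underlying mouse_event calls are real Windows API calls, but this is not the driver code of this application.

```cpp
// Illustrative gesture-to-mouse dispatch (Windows, user32). The gesture names
// and the "currently selected button" state are assumptions, not patent code.
#include <windows.h>

enum class Gesture { Fist, ExtendPalm, WaveIn, WaveOut, AirPinch, FingerGun };
enum class SelectedButton { Left, Right, Middle };

static SelectedButton g_selected = SelectedButton::Left;  // button targeted by press/click

// Press (down == true) or release (down == false) the currently selected button.
static void selectedButtonEvent(bool down) {
    DWORD flag = 0;
    switch (g_selected) {
        case SelectedButton::Left:   flag = down ? MOUSEEVENTF_LEFTDOWN   : MOUSEEVENTF_LEFTUP;   break;
        case SelectedButton::Right:  flag = down ? MOUSEEVENTF_RIGHTDOWN  : MOUSEEVENTF_RIGHTUP;  break;
        case SelectedButton::Middle: flag = down ? MOUSEEVENTF_MIDDLEDOWN : MOUSEEVENTF_MIDDLEUP; break;
    }
    mouse_event(flag, 0, 0, 0, 0);
}

// Map a recognized gesture to the corresponding simulated mouse action.
void dispatchGesture(Gesture g) {
    switch (g) {
        case Gesture::Fist:       selectedButtonEvent(true);  break;           // press and hold
        case Gesture::ExtendPalm: selectedButtonEvent(false); break;           // release the hold
        case Gesture::WaveIn:     g_selected = SelectedButton::Left;   break;  // select left button
        case Gesture::WaveOut:    g_selected = SelectedButton::Right;  break;  // select right button
        case Gesture::AirPinch:   g_selected = SelectedButton::Middle; break;  // select middle button
        case Gesture::FingerGun:  selectedButtonEvent(true);                   // single click:
                                  selectedButtonEvent(false); break;           // press then release
    }
}
```

Two finger-gun gestures arriving within a short interval would simply drive dispatchGesture twice, which the operating system interprets as a double click.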

FIG. 3 is a flowchart of the program and of the user's mid-air operation mode in approach 2); the program is written according to the following steps:

The recognized gesture data are stored in the computer, and a graphical interface is generated for convenient selection, allowing the user to select different gesture combinations: the designated gesture data are read, arranged into a new data combination according to the user's intent, stored in the computer and associated with an operation in the submarine command and control system (a sketch of such a combination-to-command association is given below). The gesture operation mode, whether simulated mouse or combined operation, can be changed at any time during operation; besides being changed on the operation interface, the operation mode can also be changed by a gesture operation, and that gesture should not resemble the other gestures, in order to avoid misoperation.
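A minimal sketch of storing an association between a two-arm gesture combination and a command of the command and control system follows. The command names, the pair-keyed map and the enum values are assumptions for illustration; the application only states that combinations are stored and associated with operations.

```cpp
// Illustrative association of a simultaneous two-arm gesture combination with
// a command string. Command names and map layout are assumed for illustration.
#include <iostream>
#include <map>
#include <string>
#include <utility>

enum class Gesture { Fist, ExtendPalm, WaveIn, WaveOut, AirPinch, FingerGun };

// Key: (left-arm gesture, right-arm gesture) performed simultaneously.
using Combo = std::pair<Gesture, Gesture>;

int main() {
    std::map<Combo, std::string> comboToCommand = {
        {{Gesture::Fist, Gesture::Fist},       "confirm"},
        {{Gesture::Fist, Gesture::ExtendPalm}, "cancel"},
        {{Gesture::WaveIn, Gesture::WaveOut},  "zoom_interface"},
    };

    Combo recognized{Gesture::Fist, Gesture::Fist};
    auto it = comboToCommand.find(recognized);
    if (it != comboToCommand.end())
        std::cout << "send command: " << it->second << "\n";  // forward to the C2 system
    else
        std::cout << "unrecognized combination\n";
    return 0;
}
```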

The gesture combinations from which the user selects include EMG signal combinations in which both arms act simultaneously and EMG signal combinations in which the two arms act in sequence;

theoretically there can be 6 × 6 = 36 simultaneous two-arm EMG signal combinations, while the number of sequential two-arm EMG signal combinations is theoretically unlimited;

to avoid incomplete gesture recognition caused by the intervals within sequential two-arm combinations, gesture recognition is performed after a delay following the acquisition of each gesture (a sketch of this delayed recognition follows this list);

the delay should not be too long, so as to avoid slow response in actual operation.
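A minimal sketch of the delayed recognition of a sequential two-gesture combination is given below. The 500 ms settle delay and the class structure are assumptions; the application only requires waiting for a period that should not be too long.

```cpp
// Illustrative recognizer for a sequential two-gesture combination. The settle
// delay of 500 ms is an assumed value, not a parameter from the text.
#include <chrono>
#include <optional>
#include <utility>

enum class Gesture { Fist, ExtendPalm, WaveIn, WaveOut, AirPinch, FingerGun };

class SequentialComboRecognizer {
public:
    // Feed each recognized single gesture with its timestamp; returns the
    // completed (first, second) combination once the settle delay has passed.
    std::optional<std::pair<Gesture, Gesture>>
    onGesture(Gesture g, std::chrono::steady_clock::time_point now) {
        if (!first_) {                       // first gesture of the pair
            first_ = g;
            firstTime_ = now;
            return std::nullopt;
        }
        if (now - firstTime_ < settle_) {    // too soon: still within the interval, ignore
            return std::nullopt;
        }
        auto combo = std::make_pair(*first_, g);
        first_.reset();                      // ready for the next combination
        return combo;
    }

private:
    std::optional<Gesture> first_;
    std::chrono::steady_clock::time_point firstTime_{};
    std::chrono::milliseconds settle_{500}; // assumed delay; must not be too long
};
```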

The main points of the gesture interaction are as follows:

support for recognizing the specific gestures is required; gestures are associated with designated commands, operations or shortcuts according to the usage scenario, realizing a fast mid-air operating experience with no restriction on the user's location, standing posture or sitting posture during the whole operation; and a few minutes of individual gesture enrollment before use is required for the system to work properly.

The innovations of the gesture interaction module are as follows: through an integrated, unified and wearable design close to the actual combat environment, commanders and other watch-standing personnel obtain rapid recognition capability in the actual in-cabin usage scenario, gain more operational reaction time, are unaffected by the setting or by lighting, and are better matched to the interaction and recognition conditions of both day and night missions.

As a prospect, with the future maturation of multichannel human-computer interaction technology, gesture interaction can be extended into a new generation of interaction that applies multichannel human-computer interaction design and integrates technologies such as gaze tracking, VR display, speech recognition, gesture control and sensory feedback, including: head-mounted dual displays, in which the two eyes observe the left and right display screens respectively to obtain a three-dimensional wide-angle panorama, combined with the movement and posture of the user's head and the change of eye gaze to obtain a simulated or remotely controlled 360-degree field of view; voice command response, in which the commander issues voice commands and modifies and confirms the speech-recognition results via a touch screen, three-dimensional sound indicates the target bearing, information is displayed graphically, and vibration combined with sound is used for emergency alerts; and, as in this application, gesture operation that assists or even replaces keyboard, mouse and button operation to control the issuing of commands by the command and control system, freeing both hands, offering great convenience to prospective operation commanders and improving the naturalness, efficiency and applicability of human-computer interaction.

The display module mainly adopts a two-view interaction mode with a table page as the main view and graphics as the auxiliary view. The table-page view mainly provides the display of, and operation entry for, information such as integrated information processing, command decision support and target motion analysis. The amount of information the table view presents to the operator is large and complex, but with multi-table sorted display the operator can more easily obtain the desired information from it. The graphical view mainly provides the graphical display of the electronic chart and the comprehensive battlefield situation (2D), supporting the commander's decisions.

In summary, the present invention is implemented by a combination of hardware and software modules running on one or more processors. The software may be written in any combination of one or more programming languages, including an object-oriented language such as C++ and conventional procedural languages such as C, and may execute entirely on the user's computing device.
