Multi-mode man-machine hybrid interaction aircraft cabin

Document No.: 235847    Publication date: 2021-11-12

Note: This invention, "A multi-modal human-machine hybrid interaction aircraft cockpit" (Multi-mode man-machine hybrid interaction aircraft cabin), was designed and created by Li Jiaqi, Wu Xiaoli, Chen Qiang, Li Jiahui, Li Zeheng, Li Mengniu, Jiang Yi and Wang Zhiliang on 2021-07-01. Its main content is as follows: the invention discloses a multi-mode man-machine hybrid interaction aircraft cabin comprising a windshield, a main display screen, a launch button, a battlefield situation information secondary display screen, a gesture recognition camera, a cabin cover keel, a helmet, an adjustable seat, a movable computer, an airborne equipment compartment, a cabin shuttle body, a camera, an integral escape pull rod, an avionics equipment configurator, a turbojet engine and a cabin bus. The cabin shuttle body and the helmet interact through three mixed channels: brain-computer, eye movement and voice. The helmet collects the pilot's electroencephalogram (EEG) signals, eye movement signals and voice commands through its sensors and transmits the data to the movable computer of the cabin shuttle body; the movable computer denoises the data and then recognizes the pilot's intention so as to carry out the control task. Multi-channel hybrid interaction improves combat efficiency and enhances the pilot's control capability and interaction efficiency during task execution.

1. A multi-mode man-machine hybrid interaction aircraft cabin, characterized by comprising a windshield (1), a main display screen (2), a launch button (3), a battlefield situation information secondary display screen (4), a gesture recognition camera (5), a cabin cover keel (6), a helmet (7), an adjustable seat (8), a movable computer (9), an airborne equipment compartment (10), a cabin shuttle body (11), a direction controller (12), an integral escape pull rod (13), an avionics equipment configurator (17) and a cabin bus (19);

the windshield (1) is connected with the back of the main display screen (2); one end of the launch button (3) is connected with the battlefield situation information secondary display screen (4); the U-shaped end of the battlefield situation information secondary display screen (4) is connected with one end of the adjustable seat (8) through a seat armrest keel; the other end of the adjustable seat (8) is connected with the movable computer (9); and the airborne equipment compartment (10) is arranged between the lower end of the adjustable seat (8) and the cabin shuttle body (11); the gesture recognition camera (5) is fixed at one end of the cabin cover keel (6); the helmet (7) is arranged in the middle of the cabin cover keel (6); the adjustable seat (8) is connected with the cabin shuttle body (11) through the avionics equipment configurator (17); the integral escape pull rod (13) is connected with the bottom end of the cabin shuttle body (11) through the adjustable seat (8); the avionics equipment configurator (17) is fixed on the cabin shuttle body (11) and connected with the cabin bus (19); and the direction controller (12) is connected with the windshield (1);

the helmet (7) is internally provided with a helmet display (14), an eye movement signal receiving sensor (15), a brain-computer interface (16) and a breathing device (20).

2. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, wherein the eye movement signal receiving sensor (15) is provided with an infrared camera and an infrared LED lamp, the brain-computer interface (16) is provided with a plurality of groups of brain wave sensors, and the breathing device (20) is provided with pressure, flow, temperature and sound sensors.

3. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 2, wherein the helmet (7) collects electroencephalogram signals, eye movement signals and voice commands of the pilot through the eye movement signal receiving sensor (15), the brain wave sensors and the pressure, flow, temperature and sound sensors, and transmits the data to the movable computer (9) of the cabin shuttle body (11); the movable computer (9) denoises the data and then recognizes the pilot's intention so as to carry out a control task.

4. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, wherein the launch button (3), the battlefield situation information secondary display screen (4) and the movable computer (9) are each connected with the adjustable seat (8) through a seat armrest keel in a nested assembly manner, a telescopic gesture recognition camera (5) is arranged at the top end of the cabin cover keel (6), and a power device (18) for integral ejection escape is arranged at the tail ends of the avionics equipment configurator (17) and the cabin shuttle body (11).

5. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, wherein the distance and angle between the main display screen (2) and the adjustable seat (8) are adjustable.

6. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, wherein the windshield (1), the adjustable seat (8) and the integral escape pull rod (13) are in threaded connection with the cabin shuttle body (11); external threads are provided at the top ends of the straight pipe ends of the windshield (1), the adjustable seat (8) and the integral escape pull rod (13), and internal threads are provided on the cabin shuttle body (11).

7. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, wherein the gesture recognition camera (5) is used to recognize hand movements, and micro sensors for multi-channel sensing interaction are embedded in the direction controller (12).

8. The multi-mode man-machine hybrid interaction aircraft cabin according to claim 1, further comprising a cabin cover (23), the cabin cover (23) being of an integrated anti-bird-strike stealth design.

Technical Field

The invention relates to an aircraft cockpit, in particular to an aircraft cockpit capable of multi-mode man-machine hybrid interaction.

Background

Human-machine fusion is the main direction of intelligent cockpit development. Its core is to construct a multi-modal human-machine interaction paradigm using novel interaction technologies such as brain-computer interfaces, eye tracking and gestures, thereby realizing an integrated, collaborative information circulation mechanism. On the theoretical side, new-generation artificial intelligence brings human action into the system: the deep ternary fusion of human, machine and environment allows human intelligence and machine intelligence to inform each other, so that intelligence diffuses and aggregates throughout the battlefield situation, causal value is evaluated across the full space-time dimensions, rules and intelligent networks are jointly optimized, and human-machine cooperation and mutual trust are achieved, ultimately forming a novel observe-orient-decide-act (OODA) loop. In aircraft currently in service, the control system is still in the "machine-centric" phase, while some test aircraft are shifting toward a "human-centric" interaction mode. Compared with relying on a single interaction mode, multi-modal hybrid interaction can significantly improve the accuracy of human-machine interaction intention recognition. To adapt to the digital battlefield environment, a human-centered interactive control mode integrating visual, auditory, tactile and brain channels in multi-modal human-machine hybrid interaction is of great value for the next-generation aircraft cockpit. Therefore, aiming at human-machine fusion intelligent control technology, a design model of the aircraft cabin is proposed from the interaction channels of gesture, voice, eye control and brain control.

Disclosure of Invention

To address the above problems, the invention provides a multi-mode man-machine hybrid interaction aircraft cabin.

The solution adopted by the invention to achieve this purpose is as follows:

a multi-mode man-machine hybrid interaction aircraft cabin comprises a windshield, a main display screen, a launch button, a battlefield situation information secondary display screen, a gesture recognition camera, a cabin cover keel, a helmet, an adjustable seat, a movable computer, an airborne equipment compartment, a cabin shuttle body, a direction controller, an integral escape pull rod, an avionics equipment configurator and a cabin bus;

the windshield is connected with the back of the main display screen; one end of the launch button is connected with the battlefield situation information secondary display screen; the U-shaped end of the battlefield situation information secondary display screen is connected with one end of the adjustable seat through a seat armrest keel; the other end of the adjustable seat is connected with the movable computer; and the airborne equipment compartment is arranged between the lower end of the adjustable seat and the cabin shuttle body; the gesture recognition camera is fixed at one end of the cabin cover keel; the helmet is arranged in the middle of the cabin cover keel; the adjustable seat is connected with the cabin shuttle body through the avionics equipment configurator; the integral escape pull rod is connected with the bottom end of the cabin shuttle body through the adjustable seat; the avionics equipment configurator is fixed on the cabin shuttle body and connected with the cabin bus; and the direction controller is connected with the windshield;

the helmet is provided with a helmet display, an eye movement signal receiving sensor, a brain-computer interface and a breathing device.

Furthermore, an infrared camera and an infrared LED lamp are arranged in the eye movement signal receiving sensor, a plurality of groups of brain wave sensors are arranged in the brain-computer interface, and pressure, flow, temperature and sound sensors are arranged in the breathing device.

Further, the helmet collects electroencephalogram signals, eye movement signals and voice commands of the pilot through the eye movement signal receiving sensor, the brain wave sensors and the pressure, flow, temperature and sound sensors, and transmits the data to the movable computer of the cabin shuttle body; the movable computer denoises the data and then recognizes the pilot's intention so as to carry out the control task.
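The patent does not disclose the specific noise-reduction or intent-recognition algorithm running on the movable computer. Purely as an illustrative sketch, with all function names, thresholds and the synthetic signal invented for illustration, a moving-average filter plus an amplitude threshold could stand in for the denoise-then-recognize step described above:

```python
import numpy as np

def denoise(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth a raw sensor trace with a centered moving-average filter."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def detect_intent(trace: np.ndarray, threshold: float = 0.5) -> bool:
    """Flag a control intent when the denoised amplitude crosses a threshold."""
    return bool(np.max(denoise(trace)) > threshold)

# Synthetic EEG-like trace: Gaussian noise with an embedded event pulse.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 200)
trace[100:110] += 1.0  # the "intent" event

print(detect_intent(trace))                      # pulse present -> True
print(detect_intent(rng.normal(0.0, 0.1, 200)))  # noise only -> False
```

A real system would of course use channel-specific filtering and a trained classifier per modality; the point here is only the two-stage structure (denoise, then classify) that the text describes.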

Further, the launch button, the battlefield situation information secondary display screen and the movable computer are each connected with the adjustable seat through a seat armrest keel in a nested assembly manner; a telescopic gesture recognition camera is arranged at the top end of the cabin cover keel; and a power device (such as a turbojet engine or a motor) for integral ejection escape is arranged at the tail ends of the avionics equipment configurator and the cabin shuttle body.

Further, the main display screen and the adjustable seat can be adjusted in distance and angle.

Furthermore, the windshield, the adjustable seat and the integral escape pull rod are in threaded connection with the cabin shuttle body, external threads are arranged at the top ends of the straight pipe ends of the windshield, the adjustable seat and the integral escape pull rod, and internal threads are arranged on the cabin shuttle body.

Furthermore, the gesture recognition camera is used for recognizing hand motions, and a micro sensor (such as a pressure sensor) is embedded in the direction controller and used for multi-channel sensing interaction.

Further, the cabin cover is of an integrated anti-bird-strike stealth design.

Compared with the prior art, the invention has the following beneficial effects:

the helmet is provided with a helmet display, an eye movement signal receiving sensor, a brain-computer interface and a breathing device; the eye movement signal receiving sensor is provided with an infrared camera and an infrared LED lamp, the brain-computer interface is provided with a plurality of groups of brain wave sensors, and the breathing device is provided with pressure, flow, temperature and sound sensors. The cabin shuttle body and the helmet interact through three mixed channels: brain-computer, eye movement and voice. The helmet collects the pilot's electroencephalogram signals, eye movement signals and voice commands through the above sensors and transmits the data to the movable computer of the cabin shuttle body; the movable computer denoises the data and then recognizes the pilot's intention so as to carry out the control task. Multi-channel hybrid interaction improves combat efficiency and enhances the pilot's control capability and interaction efficiency during task execution.

Drawings

Fig. 1 is a schematic perspective view of the multi-mode man-machine hybrid interaction aircraft cabin of the invention.

Fig. 2 is a front view of the multi-mode man-machine hybrid interaction aircraft cabin of the invention.

Fig. 3 is a schematic view of the structure of the helmet of the invention.

Fig. 4 is a schematic view of the cabin cover structure of the invention.

Fig. 5 is an exploded view of the multi-mode man-machine hybrid interaction aircraft cabin of the invention.

Detailed Description

In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.

As shown in figs. 1-5, a multi-mode man-machine hybrid interaction aircraft cabin comprises a windshield 1, a main display screen 2, a launch button 3, a battlefield situation information secondary display screen 4, a gesture recognition camera 5, a cabin cover keel 6, a helmet 7, an adjustable seat 8, a movable computer 9, an airborne equipment compartment 10, a cabin shuttle body 11, a direction controller 12, an integral escape pull rod 13, an avionics equipment configurator 17 and a cabin bus 19;

the windshield 1 is connected with the back of the main display screen 2; one end of the launch button 3 is connected with the battlefield situation information secondary display screen 4; the U-shaped end of the battlefield situation information secondary display screen 4 is connected with the adjustable seat 8 through a seat armrest keel; the other end of the adjustable seat 8 is connected with the movable computer 9; and the airborne equipment compartment 10 is arranged between the lower end of the adjustable seat 8 and the cabin shuttle body 11; the gesture recognition camera 5 is fixed at one end of the cabin cover keel 6; the helmet 7 is arranged in the middle of the cabin cover keel 6; the adjustable seat 8 is connected with the cabin shuttle body 11 through the avionics equipment configurator 17; the integral escape pull rod 13 is connected with the bottom end of the cabin shuttle body 11 through the adjustable seat 8; the avionics equipment configurator 17 is fixed on the cabin shuttle body 11 and connected with the cabin bus 19; the direction controller 12 is connected with the windshield 1; and the turbojet engine 18 is arranged inside the cabin shuttle body 11;

the helmet 7 is provided with a helmet display 14, an eye movement signal receiving sensor 15, a brain-computer interface 16 and a breathing device 20, the eye movement signal receiving sensor 15 is provided with an infrared camera and an infrared LED lamp, the brain-computer interface 16 is provided with a plurality of groups of brain wave sensors, and the breathing device 20 is provided with pressure, flow, temperature and sound sensors.

Preferably, the cabin shuttle body 11 and the helmet 7 interact through three mixed channels: brain-computer, eye movement and voice. The helmet 7 collects the pilot's electroencephalogram signals, eye movement signals and voice commands through the above sensors and transmits the data to the movable computer 9 of the cabin shuttle body 11; the movable computer 9 denoises the data and then recognizes the pilot's intention so as to carry out the control task.
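The patent states only that the three channels are "mixed", without specifying a fusion rule. One common approach, sketched here under the assumption of a weighted-sum vote (the channel weights and command names are invented for illustration), is to let each channel score every candidate command and pick the highest combined score:

```python
# Hypothetical per-channel weights; the patent does not specify how the
# brain-computer, eye-movement and voice channels are combined.
CHANNEL_WEIGHTS = {"eeg": 0.5, "eye": 0.3, "voice": 0.2}

def fuse_intents(channel_scores: dict) -> str:
    """Weighted-sum fusion: each channel votes a confidence per command."""
    totals = {}
    for channel, scores in channel_scores.items():
        weight = CHANNEL_WEIGHTS.get(channel, 0.0)
        for command, confidence in scores.items():
            totals[command] = totals.get(command, 0.0) + weight * confidence
    return max(totals, key=totals.get)

decision = fuse_intents({
    "eeg":   {"fire": 0.7, "climb": 0.3},
    "eye":   {"fire": 0.4, "climb": 0.6},
    "voice": {"fire": 0.9, "climb": 0.1},
})
print(decision)  # fire: 0.65 vs climb: 0.35 -> "fire"
```

Weighting the channels lets a noisy modality (for example eye movement under high g-load) be outvoted by the others, which matches the stated goal of improving recognition accuracy over any single channel.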

Preferably, the launch button 3, the battlefield situation information secondary display screen 4 and the movable computer 9 are each connected with the adjustable seat 8 through a seat armrest keel in a nested assembly manner; a telescopic gesture recognition camera 5 is arranged at the top end of the cabin cover keel 6; and a power device 18 (such as a turbojet engine or a motor) for integral ejection escape is arranged at the tail ends of the avionics equipment configurator 17 and the cabin shuttle body 11.

Preferably, the main display 2 and the adjustable seat 8 are adjustable in distance and angle.

Preferably, the windshield 1, the adjustable seat 8 and the integral escape pull rod 13 are in threaded connection with the cabin shuttle body 11, the top ends of the straight pipe ends of the windshield 1, the adjustable seat 8 and the integral escape pull rod 13 are provided with external threads, and the cabin shuttle body 11 is provided with internal threads.

Preferably, the gesture recognition camera 5 is used for recognizing hand movements, and a micro sensor (e.g. a pressure sensor) is embedded in the direction controller 12 for multi-channel sensing interaction.
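The gesture recognition algorithm itself is not described in the patent. As a minimal, purely hypothetical sketch, frame differencing between consecutive camera frames is one way such a camera could detect hand motion before any gesture classification:

```python
import numpy as np

def motion_mask(prev_frame: np.ndarray, curr_frame: np.ndarray,
                thresh: int = 30) -> np.ndarray:
    """Mark pixels whose grayscale value changed by more than `thresh`."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return diff > thresh

# Two synthetic 8x8 grayscale frames: a bright "hand" enters the right half.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[:, 4:] = 200

print(motion_mask(prev, curr).sum())  # 32 pixels changed (8 rows x 4 cols)
```

A production system would feed the moving region into a hand-pose classifier; the differencing step only localizes candidate motion cheaply.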

Preferably, the cabin cover 23 is of an integrated anti-bird-strike stealth design.

The specific disassembly, assembly and working processes of the invention are as follows:

when the cabin is separated from the aircraft into an independent unit body:

As shown in fig. 1, the windshield 1, the adjustable seat 8 and the integral escape pull rod 13 engage with the cabin shuttle body 11 through internal and external threads. The back ends of the launch button 3, the battlefield situation information secondary display screen 4 and the movable computer 9 are connected with the adjustable seat 8 through a seat armrest keel; by rotating the joints in the sliding keel to the bottom of their travel, the launch button 3, the battlefield situation information secondary display screen 4 and the movable computer 9 can be completely separated from the adjustable seat 8. The other parts of the cockpit (helmet, seat, displays and cabin cover) likewise separate by means of sliding blocks, which keeps each part independent and facilitates reassembly.

As shown in fig. 2, the rotary sliders on both sides of the helmet display 14 and the breathing device 20 are slid upward along the notches to free the pilot's head; the eye movement signal receiving sensor 15, the microphone 21 and the headset 22 are identical in structure on the left and right sides;

As shown in fig. 3, the whole cabin structure has an integrated water-drop shape, and the cabin cover 23 is connected to the cabin shuttle body 11 by rivets and bolts;

As shown in fig. 4, when the integral escape pull rod 13 is pulled down, the turbojet engine 18 starts and the whole cabin flies and glides to escape; the main display screen 2, the battlefield situation information secondary display screen 4, the movable computer 9 and the helmet display 14 provide command and control for the ejected, escaping cabin.

The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
