Mobile terminal and electronic device with same

Document No.: 890085 · Publication date: 2021-02-23

Abstract: This technology, "Mobile terminal and electronic device with same," was designed and created by 闵惠铃, 李静彬, and 赵玟行 on 2019-05-09. The mobile terminal of the electronic device according to the present invention includes: a terminal body coupled to a housing; and a first display unit coupled to the housing. In addition, the housing includes: a first body configured to receive at least a portion of the terminal body; a second body in which a second display unit is disposed; a wiring unit electrically coupling the first body and the second body so as to transmit data received from the mobile terminal to the second display unit; and a wireless communication unit configured to transmit signals to and receive signals from the mobile terminal. A controller of the mobile terminal controls the wiring unit, the wireless communication unit, and the first and second display units such that, while first screen information including an input area is displayed on the first display unit and second screen information is displayed on the second display unit, the second screen information displayed on the second display unit is captured in response to a touch input received in the input area, and the captured second screen information is inserted into the input area of the first screen information.

1. An electronic device comprising a mobile terminal and a housing, the mobile terminal being combined with the housing,

wherein the mobile terminal includes:

a terminal body combined with the housing, and

a first display combined with the housing,

wherein the housing includes:

a first body configured to accommodate at least a portion of the terminal body,

a second body in which a second display is disposed,

a wiring unit electrically coupling the first body and the second body to transmit data received from the mobile terminal to the second display, and

a wireless communication unit configured to transmit signals to and receive signals from the mobile terminal, and

wherein, in a state in which first screen information including an input area is displayed on the first display and second screen information is displayed on the second display, in response to receiving a predetermined touch input in the input area, a controller of the mobile terminal controls the wiring unit, the wireless communication unit, the first display, and the second display so as to capture the second screen information displayed on the second display and insert the captured second screen information into the input area of the first screen information.

2. The electronic device as set forth in claim 1,

wherein an icon for performing the capturing of the second screen information displayed on the second display is displayed on the input area, and the capturing of the second screen information is performed at a point of time when a touch input is applied to the icon.

3. The electronic device as set forth in claim 2,

wherein the input area comprises: a keyboard region that receives a key input; and a display area on which a result of the received key input is displayed, and

wherein, when a touch input is applied to the icon, the controller of the mobile terminal performs control so that the second screen information is captured and the captured second screen information is attached to the display area after a fixed time elapses.

4. The electronic device as set forth in claim 1,

wherein the first screen information is a screen corresponding to execution of a message application, and the second screen information is a screen or a home screen corresponding to execution of an application different from the first screen information, and

wherein the second screen information remains displayed on the second display after the captured second screen information is inserted into the input area of the first screen information.

5. The electronic device as set forth in claim 1,

wherein switching from the second screen information displayed on the second display to the captured image of the second screen information occurs in response to receiving a touch input in the input area, and

wherein a tool area for image editing is displayed on one area of the captured image of the second screen information for a predetermined time.

6. The electronic device as set forth in claim 5,

wherein editing of the captured image of the second screen information is performed based on a touch input applied to the tool area, and

wherein the edited image is appended to the first display in response to an edit completion input applied to the second display.

7. The electronic device as set forth in claim 1,

wherein, in a state in which the input area is displayed, when it is determined that the second screen information displayed on the second display cannot be captured, the controller of the mobile terminal displays, in an inactive state, an icon included in the input area for performing the capture of the second screen information.

8. The electronic device as set forth in claim 7,

wherein, when a touch input is continuously applied to the icon displayed in the inactive state, alarm information indicating that the capturing cannot be performed is output on the second screen information on the second display.

9. The electronic device as set forth in claim 1,

wherein, in a case where the second screen information is displayed on the first display, in response to receiving a screen output request in the first display, the controller of the mobile terminal performs control such that the second screen information and data associated with execution of an application corresponding to the second screen information are transmitted to the second display.

10. The electronic device as set forth in claim 9,

wherein, when the transmitted second screen information is displayed on the second display, the transmitted second screen information is captured and attached to an input area of the first screen information based on a touch input applied to the input area of the first screen information corresponding to a message application executed on the first display.

11. A method of operating an electronic device, the electronic device comprising a mobile terminal and a housing, the mobile terminal being combined with the housing, the mobile terminal comprising a first display combined with the housing, and a second display being disposed in the housing, the method comprising the steps of:

detecting a touch input applied to an input area in a state where first screen information including the input area is displayed on the first display and second screen information is displayed on the second display;

transmitting a command to capture second screen information to the second display in response to the touch input; and

receiving data corresponding to the captured second screen information and inserting the received data into an input area of the first screen information on the first display.

12. The method of claim 11, further comprising the steps of:

displaying an icon for capturing second screen information displayed on the second display on the input area, and generating a command to capture the second screen information at a point of time when a touch input is applied to the displayed icon.

13. The method of claim 11,

wherein the first screen information is a screen corresponding to execution of a message application, and the second screen information is a screen or a home screen corresponding to execution of an application different from the first screen information, the method further comprising the steps of:

continuing to display the second screen information on the second display after inserting the captured second screen information into the input area.

14. The method of claim 11, further comprising the steps of:

switching, in response to receiving a touch input in the input area, from the second screen information displayed on the second display to an image of the captured second screen information; and

displaying a tool area for image editing on an area of the captured image of the second screen information for a predetermined time.

15. The method of claim 14, further comprising the steps of:

performing editing of the captured image of the second screen information based on the touch input applied to the tool area within the predetermined time; and

in response to receiving an edit completion input in the second display, inserting an edited image into the input area on the first display.

Technical Field

The present disclosure relates to an electronic device including a mobile terminal and a case combined with the mobile terminal, and a method of controlling the electronic device.

Background

Terminals can be classified into mobile/portable terminals and stationary terminals according to mobility. In addition, the mobile terminal may be classified into a handheld type and a vehicle mount type according to whether it can be directly carried by a user.

Mobile terminals have become more and more versatile. Examples of such functions include data and voice communication, capturing images and video via a camera, recording audio, playing music files via a speaker system, and displaying images and video on a display. Some mobile terminals include additional functionality that supports the playing of electronic games, while other terminals are configured as multimedia players. In particular, more recently, mobile terminals may receive broadcast and multicast signals to allow viewing of video or television programs.

As mobile terminals become multifunctional, the mobile terminals may be allowed to capture still images or moving pictures, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated multimedia player.

On the other hand, a recent trend is to implement these various functions using an external device that operates in cooperation with a mobile terminal, to more effectively utilize the mobile terminal and to further improve the usability of the mobile terminal. In this case, it is desirable that the mobile terminal and the external device cooperating with the mobile terminal operate independently or in conjunction with each other as necessary in order to improve user convenience and usability.

Disclosure of Invention

Technical problem

Therefore, in order to obviate these problems, an aspect of the detailed description is to provide an electronic apparatus including a mobile terminal combined with a housing, the electronic apparatus being capable of expanding the display area of the mobile terminal by combining with the mobile terminal a housing that includes an additional display operating in cooperation with the mobile terminal.

Another aspect of the detailed description is to provide an electronic apparatus including a mobile terminal and a housing, the mobile terminal being combined with the housing, the electronic apparatus being capable of sharing the screen currently being viewed in an easier and faster manner using an extended display area.

Still another aspect of the detailed description is to provide an electronic apparatus including a mobile terminal combined with a housing, the electronic apparatus being capable of performing message input using a plurality of independent display areas while simultaneously searching for necessary information, and immediately sharing search results using the expanded display area.

Another aspect of the detailed description is to provide an electronic apparatus including a mobile terminal combined with a housing, the electronic apparatus being capable of capturing the picture currently being viewed with a single input using the extended display area, without going through multiple steps, and inserting the captured picture as message content.

Technical scheme

To achieve these and other advantages and in accordance with the purpose of this disclosure, as embodied and broadly described herein, there is provided an electronic device including a mobile terminal and a case, the mobile terminal being combined with the case, wherein the mobile terminal includes a terminal body combined with the case and a first display combined with the case, wherein the case includes: a first body configured to accommodate at least a portion of the terminal body; a second body in which a second display is disposed; a wiring unit electrically coupling the first body and the second body such that data received from the mobile terminal is transmitted to the second display; and a wireless communication unit configured to transmit and receive signals to and from the mobile terminal, and wherein, in a case where first screen information including an input area is displayed on the first display and second screen information is displayed on the second display, in response to receiving a predetermined touch input in the input area, the controller of the mobile terminal controls the wiring unit, the wireless communication unit, the first display, and the second display so as to capture the second screen information displayed on the second display and insert the captured second screen information into the input area of the first screen information.

In the electronic apparatus, an icon for performing capturing of second screen information displayed on the second display may be displayed on the input area, and capturing of the second screen information may be performed at a point of time when the touch input is applied to the icon.

In the electronic device, the input area may include: a keyboard region that receives a key input; and a display area on which a result of the received key input is displayed, and when the touch input is applied to the icon, the controller of the mobile terminal may perform control so that the second screen information is captured and attached to the display area after a fixed time elapses.

In the electronic apparatus, the first screen information may be a screen corresponding to execution of a message application, and the second screen information may be a screen or a home screen corresponding to execution of an application different from the first screen information, and the second screen information may also maintain a state of being displayed on the second display after inserting the captured second screen information into an input area of the first screen information.

In the electronic apparatus, in response to receiving a touch input in the input area, switching from the second screen information displayed on the second display to the captured image of the second screen information may occur, and a tool area for image editing may be displayed on one area of the captured image of the second screen information for a predetermined time.

In the electronic apparatus, editing of the captured image of the second screen information may be performed based on a touch input applied to the tool area, and the edited image may be attached to the first display in response to an editing completion input applied to the second display.

In the electronic apparatus, when it is determined that the second screen information displayed on the second display is not capturable in a state in which the input area is displayed, the controller of the mobile terminal may display an icon for performing capture of the second screen information included in the input area in an inactive state.

In the electronic apparatus, when a touch input is continuously applied to the icon displayed in the inactive state, alarm information indicating that capturing cannot be performed may be output on the second screen information on the second display.

In the electronic apparatus, in a case where the second screen information is displayed on the first display, in response to receiving a screen output request in the first display, the controller of the mobile terminal may perform control such that the second screen information and data associated with execution of an application corresponding to the second screen information are transmitted to the second display.

In the electronic device, when the transmitted second screen information is displayed on the second display, the transmitted second screen information may be captured and attached to an input area of the first screen information based on a touch input applied to the input area of the first screen information corresponding to the message application executed on the first display.

To achieve these and other advantages and in accordance with the purpose of this disclosure, as embodied and broadly described herein, there is provided a method of operating an electronic device including a mobile terminal and a housing, the mobile terminal being combined with the housing, the mobile terminal including a first display combined with the housing, and a second display being disposed in the housing, the method including the steps of: detecting a touch input applied to an input area in a state where first screen information including the input area is displayed on the first display and second screen information is displayed on the second display; transmitting a command to capture the second screen information to the second display in response to the touch input; and receiving data corresponding to the captured second screen information and inserting the received data into the input area of the first screen information on the first display.
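The capture-and-insert control flow summarized above can be modeled with a short sketch. This is purely illustrative, not the disclosed implementation; all class and method names (`SecondDisplay`, `Controller`, `on_touch_in_input_area`) are hypothetical:

```python
class SecondDisplay:
    """Minimal stand-in for the display disposed in the second body of the case."""

    def __init__(self, screen_info):
        self.screen_info = screen_info  # what is currently shown

    def capture(self):
        # Respond to a capture command with a snapshot of the current screen.
        return f"captured:{self.screen_info}"


class Controller:
    """Models the mobile terminal's controller for the claimed method steps."""

    def __init__(self, second_display):
        self.second_display = second_display
        self.input_area = []  # contents of the message input area on the first display

    def on_touch_in_input_area(self):
        # Step 1: a touch input is detected in the input area of the first display.
        # Step 2: a capture command is transmitted to the second display.
        data = self.second_display.capture()
        # Step 3: the received capture data is inserted into the input area.
        self.input_area.append(data)
        return data


ctrl = Controller(SecondDisplay("map_app_screen"))
ctrl.on_touch_in_input_area()
print(ctrl.input_area)  # ['captured:map_app_screen']
```

Note that the second display's own content is left untouched, matching the step in which the second screen information remains displayed after insertion.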

The method may further comprise the steps of: an icon for performing capture of the second screen information displayed on the second display is displayed on the input area, and a command to capture the second screen information is generated at a point in time when the touch input is applied to the displayed icon.

In the method, the first screen information may be a screen corresponding to execution of the message application, and the second screen information may be a screen or a home screen corresponding to execution of an application different from the first screen information, and the method may further include the steps of: after the captured second screen information is inserted into the input area, the second screen information continues to be displayed on the second display.

The method may further comprise the steps of: in response to receiving the touch input in the input area, generating a switch from the second screen information displayed on the second display to the captured image of the second screen information; and displaying a tool area for image editing on an area of the captured image of the second screen information for a predetermined time.

The method may further comprise the steps of: performing editing of the captured image of the second screen information based on a touch input applied to the tool area for a predetermined time; and inserting the edited image into the input area on the first display in response to the edit completion input received in the second display.

Advantageous effects

According to the detailed description, in the mobile terminal and the electronic device including the same according to the present disclosure, screen information output on different screens can be shared immediately as message content while a message conversation is in progress, using a single extended-display combining function. With one touch input, the user can perform operations that would otherwise require several separate steps. Further, the display state of the keyboard region for message input and the display state of the screen information output on the other screen are maintained as before. Therefore, user convenience and usability are further improved.

Drawings

Fig. 1a and 1b are conceptual diagrams describing an electronic device according to the present disclosure.

Fig. 2a, 2b, and 2c are conceptual diagrams describing a basic structure of an electronic device according to the present disclosure.

Fig. 3a, 3b, and 3c are conceptual diagrams describing examples of a mobile terminal according to the present disclosure.

Fig. 4 is a conceptual diagram describing a method of controlling between a display provided in a mobile terminal and a display provided in a case, respectively, in an electronic device according to the present disclosure.

Fig. 5a to 5h are conceptual diagrams describing various embodiments of a method of controlling screens of a plurality of displays using a first display provided on a mobile terminal side in an electronic device according to the present disclosure.

Fig. 6 is an exemplary flowchart describing steps of a method of operating an electronic device according to the present disclosure.

Fig. 7a and 7b are diagrams illustrating the operation method in Fig. 6.

Fig. 8a, 8b, and 8c are conceptual diagrams describing steps of capturing a screen of the second display, then editing it, and inserting it into the first display in the electronic device according to the present disclosure.

Fig. 9a and 9b are exemplary conceptual diagrams describing a processing operation in a case where a screen of the second display is not capturable in the electronic apparatus according to the present disclosure.

Fig. 10a, 10b, 10c, 10d, and 10e are exemplary conceptual diagrams describing an operation in an electronic device according to the present disclosure in which a screen being viewed on a display of a mobile terminal is transmitted and then the transmitted screen is captured and inserted into a message.

Detailed Description

A description will now be given in detail according to the exemplary embodiments disclosed herein, with reference to the accompanying drawings. For brevity of description with reference to the drawings, the same or equivalent parts will be provided with the same or similar reference numerals, and description thereof will not be repeated. In general, suffixes such as "module" and "unit" may be used to refer to elements or components. The use of such suffixes herein is intended merely to aid description of the specification, and the suffixes themselves are not intended to give any special meaning or function. In the description of the present disclosure, if a detailed description of related known functions or configurations is considered to unnecessarily obscure the gist of the present disclosure, such description is omitted but will be understood by those skilled in the art.

It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.

It will be understood that when an element is referred to as being "connected" to another element, the element can be directly connected to the other element, or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected" to another element, there are no intervening elements present.

A singular expression may include a plural expression unless the context clearly indicates otherwise.

It is to be understood that terms such as "including" or "having," when used herein, are intended to indicate the presence of the components, functions, or steps disclosed in the specification, and that a greater or fewer number of components, functions, or steps may likewise be utilized.

Fig. 1a and 1b are conceptual diagrams describing an electronic device according to the present disclosure.

Referring to the drawings, the mobile terminal 100 is combined with the housing 200, and the mobile terminal 100 is combined with the housing 200 to constitute one electronic device 300.

In this case, the mobile terminal may include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, a high-end notebook computer, a wearable device (e.g., a smart watch, smart glasses, or a head mounted display (HMD)), and the like. The mobile terminal will be described in detail below with reference to Figs. 3a and 3c.

The case 200 may be a pouch that protects the exterior of the mobile terminal 100, or may cover or accommodate at least one surface of the mobile terminal 100 as an accessory of the mobile terminal 100. The case 200 is configured to be combined with the mobile terminal in order to extend the functional range of the mobile terminal 100.

On the other hand, according to the present disclosure, information output in the mobile terminal is processed in association with the structure or function of the case 200. As an example of this case, referring to fig. 1a, a display (hereinafter, referred to as a second display 250) cooperating with a display (hereinafter, referred to as a first display 151) of the mobile terminal is provided in the case 200.

The housing includes a first body 210 and a second body 220 rotatably coupled with respect to each other, and a second display 250 is disposed on any one of the first body 210 and the second body 220.

For example, the first body 210 is formed to accommodate at least a portion of a body of the mobile terminal. The rear side of the mobile terminal is accommodated in the first body 210, and the first display 151 disposed on the front side of the mobile terminal is exposed to the outside.

In this case, the mobile terminal 100 is detachably combined with the first body 210. In addition, the mobile terminal is formed to detect whether it is combined with the first body 210. For this detection, a magnet 245 is provided on one surface of the first body 210 that faces the mobile terminal 100. Further, a Hall sensor 143, formed to sense a magnetic field corresponding to the magnet 245 when the main body of the mobile terminal is combined with the first body, is included on the rear side of the mobile terminal. When the Hall sensor senses the magnetic field, the mobile terminal recognizes that it is combined with the case and performs a predetermined control.

For example, when the Hall sensor 143 senses a magnetic field, the controller 180 of the mobile terminal 100 controls the power supply unit 190 such that current for operation is supplied to the second display 250 provided in the second body 220.

That is, the second display 250 provided in the second body 220 is configured to operate with power supplied from the mobile terminal 100.
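The coupling-detection and power-control behavior described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the threshold value and all names (`HALL_THRESHOLD`, `PowerSupplyUnit`, `on_hall_sensor_reading`) are assumptions:

```python
HALL_THRESHOLD = 1.0  # hypothetical field-strength threshold; not specified in the disclosure


class PowerSupplyUnit:
    """Stand-in for power supply unit 190 of the mobile terminal."""

    def __init__(self):
        self.second_display_powered = False

    def power_second_display(self, on):
        self.second_display_powered = on


def on_hall_sensor_reading(field_strength, power_unit):
    """When the Hall sensor senses the magnet of the first body, treat the
    terminal as combined with the case and supply current to the second display."""
    combined = field_strength >= HALL_THRESHOLD
    power_unit.power_second_display(combined)
    return combined


psu = PowerSupplyUnit()
assert on_hall_sensor_reading(2.5, psu) is True   # magnet sensed: combined with case
assert psu.second_display_powered is True          # current supplied to second display
assert on_hall_sensor_reading(0.0, psu) is False   # detached: power withdrawn
```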

On the other hand, the second display 250 is disposed in the second body 220, and thus performs a function of expanding a display area of the first display 151 or is driven independently of the first display 151. For example, content associated with information output to the first display 151 is mirrored and output on the second display 250.

In addition, execution screens of different applications are output on the first display 151 and the second display 250, respectively. As another example, it is also possible to divide an execution screen of one application into a plurality of areas, and to display the areas on the first display 151 and the second display 250, respectively.

On the other hand, the first display 151 and the second display 250 are exposed to the outside together in an open state, and the open state is defined with reference to fig. 1 b.

Further, the mobile terminal 100 is configured to control screen information output on the second display 250. To this end, at least one of a wired communication link and a wireless communication link is established between the mobile terminal 100 and the second display 250.

In addition, the mobile terminal 100 is configured such that current for operation is supplied to the second display 250, and current for operation is supplied from the mobile terminal 100 to the second display 250 through a wiring provided in the case 200.

Referring to fig. 1b, the first and second bodies 210 and 220 of the housing 200 are rotated relative to each other between a closed state and a fully opened state.

The closed state refers to a state in which the first body 210 of the case 200 covers the first display 151 of the mobile terminal 100, as shown in (a) of Fig. 1b, so that the first display 151 is hidden by the first body 210. That is, the state in which the first display 151 is covered by the second display 250 is the closed state. In the closed state, the mobile terminal 100 and the case 200 overlap each other in the thickness direction of the mobile terminal, taking the form of a diary and thereby improving portability for the user.

In addition, in this case, the front surfaces of the first and second displays 151 and 250 face each other. The front surface is the outer surface on which visual information is displayed and touch input is received.

The second body 220 is rotated with respect to the first body 210 out of the closed state, thereby switching to the open state. The open state refers to a state in which the angle formed between the first display 151 and the second display 250 is a specific angle other than 0 degrees, that is, a state in which the first display 151 is not hidden by the second display 250.

Fig. 1b (b) illustrates a state in which the first display 151 and the second display 250 are at an angle of 180 degrees with respect to each other in the open state. Fig. 1a, referred to above, also illustrates a state in which the first display 151 and the second display 250 are at an angle of 180 degrees with respect to each other. In addition, the first and second bodies 210 and 220 can be fixed at a certain angle in the open state, and for this purpose a fixing means is provided in the case 200.

As shown in (b) of Fig. 1b, the first body 210 and the second body 220 can be further rotated in the direction A with respect to each other. Accordingly, as shown in (c) of Fig. 1b, the first and second bodies 210 and 220 are rotated up to 360 degrees with respect to each other. This state, among the open states, is defined as the "fully open state." In this state, the outer surfaces of the first and second bodies 210 and 220 contact each other, and the first and second displays 151 and 250 are disposed to face outward. That is, the first display 151 and the second display 250 face opposite directions.

On the other hand, the mobile terminal is formed to detect a closed state and an open state. As an example thereof, the mobile terminal includes an illuminance sensor configured to sense ambient illuminance, and the controller 180 of the mobile terminal 100 senses any one of the closed state and the open state according to the illuminance sensed by the illuminance sensor.

Further, the controller 180 separately detects a full open state among the open states.

The electronic device 300 according to the present disclosure performs operations of controlling the first display 151 and the second display 250 in conjunction with the open state and the closed state. As an example, the first display 151 and the second display 250 are kept in an inactive state in the closed state, and at least one of the first display 151 and the second display 250 is activated when switching from the closed state to the open state occurs.

As one example, when switching to the open state occurs, both the first display 151 and the second display 250 switch to the active state. At this time, different home screen pages are output on the first display 151 and the second display 250, respectively, or the same home screen page is displayed in a divided manner across the first display 151 and the second display 250. In addition, various information may be output on the first display 151 and the second display 250 according to circumstances.

As another example, when switching to the open state occurs, the first display 151 switches to the active state while the second display 250 remains in the inactive state.

The second display 250 includes a touch sensor configured to sense a touch applied to the second display 250.

The second display 250 is configured to sense a touch even in an inactive state.

Regarding touch sensing by the touch sensor, when a touch applied to the second display 250 in the open state corresponds to a predetermined type of touch, the second display 250 is driven in the active state.

On the other hand, when a touch is applied to the second display 250, the second display 250 transmits a touch signal corresponding to the touch to the mobile terminal 100. Then, when the touch corresponding to the received touch signal is the predetermined type of touch, the mobile terminal 100 transmits a signal corresponding to a control command for activating the second display 250 to the second display 250.

Then, the controller of the second display 250 activates the second display 250 based on the signal received from the mobile terminal 100.
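The activation handshake described in the last three paragraphs can be sketched as follows. The choice of a double tap as the predetermined touch type is purely an assumption for illustration:

```python
# Sketch of the activation handshake: the second display forwards every
# touch to the mobile terminal; the terminal replies with an activation
# command only when the touch matches a predetermined type.

class MobileTerminal:
    PREDETERMINED_TOUCH = "double_tap"  # assumed trigger gesture

    def handle_touch_signal(self, touch_type: str):
        """Return an activation command for the predetermined touch,
        otherwise None."""
        if touch_type == self.PREDETERMINED_TOUCH:
            return "ACTIVATE"
        return None

class SecondDisplay:
    def __init__(self, terminal: MobileTerminal):
        self.terminal = terminal
        self.active = False

    def on_touch(self, touch_type: str):
        # Touches are sensed even while inactive and forwarded as signals.
        command = self.terminal.handle_touch_signal(touch_type)
        if command == "ACTIVATE":
            self.active = True

terminal = MobileTerminal()
display = SecondDisplay(terminal)
display.on_touch("single_tap")   # does not match; display stays inactive
display.on_touch("double_tap")   # matches; terminal activates the display
```

Note the division of labor: the display only senses and forwards touches, while the decision to activate stays with the mobile terminal, matching the wireless control path described above.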

On the other hand, the housing adopts a new structure in order to realize the operation of the electronic device as described above. The new structure of the housing will be described in more detail below.

Fig. 2a, 2b, and 2c are conceptual diagrams describing a basic structure of an electronic device according to the present disclosure.

The first body 210 of the housing 200 has an accommodation space 211 configured to accommodate the rear surface of the body of the mobile terminal. The accommodation space 211 in the first body accommodates at least a portion of the mobile terminal, and the rear surface of the mobile terminal is disposed on the bottom surface of the accommodation space 211. However, the present disclosure is not necessarily limited thereto; for example, the first body may be formed as a plate combined with the rear surface of the mobile terminal, or may be configured to be combined with a side surface (flip surface) of the mobile terminal.

The second body 220, in which the second display 250 is disposed, is rotatably combined with the first body 210 by the coupling unit 230. That is, the coupling unit 230 is disposed between the first and second bodies 210 and 220, and combines the first and second bodies 210 and 220 such that they can rotate relative to each other.

Referring to fig. 2a, 2b and 2c, the second body 220 includes a first cover 221, a second cover 222, and a second display 250. An accommodation groove 221a is formed in the first cover 221, and at least a portion of the coupling unit 230 is accommodated in the accommodation groove 221a. The second cover 222 is a frame that is combined with the first cover 221 and on which various electronic components are mounted. As an example, a second circuit board, which will be described below, is mounted on the second cover 222.

The second cover 222 is rotatably combined with the coupling unit 230. A groove 222a is formed in the second cover 222 at a position corresponding to the accommodation groove 221a in the first cover 221. The coupling unit 230 is disposed in the groove 222a. In this case, the second display 250 is mounted on the second cover 222.

The coupling unit 230 includes a first hinge 231 and a second hinge 232, which are disposed apart from each other along a side surface of the first body 210. Each of the first hinge 231 and the second hinge 232 includes a hinge body 233 and a hinge shaft 234.

A hinge slot (not shown) is formed in the hinge body 233. The hinge shaft 234 is inserted into the hinge slot, so that the first and second bodies 210 and 220 can rotate relative to each other. A plurality of hinge shafts 234 are provided, and a combining unit 235, which is combined with each of the first and second bodies 210 and 220, is disposed on one side of the hinge shafts 234.

Here, the housing 200 includes the wireless communication unit 283 and the wiring unit 242 so that the mobile terminal 100 can control the second display 250.

The wireless communication unit 283 is disposed in the first body 210 and performs short-range wireless communication with the mobile terminal. A wireless communication unit that performs this short-range wireless communication (hereinafter referred to as the "first wireless communication unit") is arranged in the mobile terminal 100, and the wireless communication unit in the housing 200 is hereinafter referred to as the "second wireless communication unit".

The first wireless communication unit 116 (see fig. 3c) transmits a wireless signal toward the rear side of the mobile terminal 100. The second wireless communication unit 283 is disposed in the first body 210 so as to face the first wireless communication unit 116, thereby receiving the wireless signal. Each of the first wireless communication unit 116 and the second wireless communication unit 283 includes, for example, a Keyssa chip that transmits and receives wireless data. The Keyssa chips are spaced apart by a distance of several centimeters or less in the thickness direction of the mobile terminal. Therefore, the first wireless communication unit 116 and the second wireless communication unit 283 communicate using a short-range communication scheme having a transmission distance of about several centimeters.

As shown, the first body 210 includes a first circuit board 243, and the second wireless communication unit 283 is disposed on the first circuit board 243. The second body 220 includes a second circuit board 244, which is disposed under the second display 250 and electrically coupled to the first circuit board 243 through the wiring unit 242. The second circuit board 244 is coupled to the second display 250, and thus performs the function of transmitting control signals received from the mobile terminal 100 to the second display 250.

That is, the second circuit board 244 transmits data transmitted and received between the first wireless communication unit 116 and the second wireless communication unit 283 to the second display 250.

The wiring unit 242 is a component that electrically couples the first body 210 and the second body 220 through the coupling unit 230. Wireless signals (or data) received from the mobile terminal 100 through the short-range wireless communication are transmitted to the second display 250 through the wiring unit 242. To enable this coupling, a coupling path along which the wiring unit 242 passes is formed in the coupling unit 230.

As an example, an accommodation space that accommodates at least a part of the wiring unit 242 is formed in one of the first hinge 231 and the second hinge 232. In more detail, the first hinge 231 is closer to the upper side of the mobile terminal than the second hinge 232, and the second hinge 232 is closer to the lower side of the mobile terminal 100 than the first hinge 231. The second circuit board 244 is disposed adjacent to the lower end of the housing 200, and the first wireless communication unit 116 and the second wireless communication unit 283 are likewise disposed at the lower sides of the mobile terminal 100 and the housing 200, respectively.

In this structure, the accommodation space is formed in the second hinge 232. The second hinge 232 includes an extension portion 236 extending from the hinge body 233, and a cable 246 extending to each of the first and second bodies 210 and 220 is provided in the extension portion 236. That is, a structure is adopted in which an accommodation space is formed in the extension portion 236 and the cable 246 is accommodated in that space. First and second flexible printed circuit boards 247 and 248 are located at the two ends of the cable 246, respectively, and are electrically coupled to the first and second circuit boards 243 and 244, respectively. In this structure, a signal controlling the second display 250 is transmitted from the mobile terminal to the first body 210 in a wireless manner, and then to the second body 220 in a wired manner.

On the other hand, referring to fig. 2a, 2b and 2c, power terminals (e.g., pogo pins) 249, which contact power terminals (not shown) of the mobile terminal, are disposed on the first circuit board 243, thereby receiving power from the mobile terminal. The power terminals 249 are electrically coupled to the wiring unit 242 so that power is supplied to the second display 250. In this structure, power supplied to the second display 250 is transmitted over a wired path from the mobile terminal.

With the above-described structure, the electronic device performs operations of controlling the first display 151 and the second display 250 in conjunction with each other, using short-range wireless communication for data and a wired path for power. First, the structure and functions of the mobile terminal are described in detail below. Subsequently, the control operations are described.

Fig. 3a, 3b, and 3c are conceptual diagrams describing examples of a mobile terminal according to the present disclosure. The mobile terminal 100 according to the present disclosure is coupled to the housing of the electronic device as described above.

The mobile terminals presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head mounted displays (HMDs)), and the like.

By way of non-limiting example only, further description will be made with reference to a particular type of mobile terminal. However, such teachings apply equally to other types of terminals (such as those described above). In addition, these teachings can also be applied to fixed terminals (such as digital TVs, desktop computers, etc.).

Referring to fig. 3a to 3c, fig. 3a is a block diagram of a mobile terminal according to an exemplary embodiment of the present disclosure, and fig. 3b and 3c are conceptual views illustrating one example of the mobile terminal viewed from different directions.

The mobile terminal 100 may be shown with components such as: a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. It should be understood that implementing all of the illustrated components in fig. 3a is not necessary, and that more or fewer components may alternatively be implemented.

In more detail, the wireless communication unit 110 may generally include one or more modules that allow communication such as wireless communication between the mobile terminal 100 and a wireless communication system, communication between the mobile terminal 100 and another mobile terminal, or communication between the mobile terminal 100 and an external server. Further, the wireless communication unit 110 may generally include one or more modules that connect the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include: a camera 121 or an image input unit for obtaining an image or video; a microphone 122, which is one type of audio input means for inputting audio signals; and a user input unit 123 (e.g., a touch key, a mechanical key, etc.) for allowing a user to input information. Data (e.g., audio, video, images, etc.) may be obtained through the input unit 120 and may be analyzed and processed according to user commands.

The sensing unit 140 may be generally implemented using one or more sensors configured to sense internal information of the mobile terminal, a surrounding environment of the mobile terminal, user information, and the like. For example, the sensing unit 140 may include at least one of: a proximity sensor 141, an illuminance sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an Infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., camera 121), a microphone 122, a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a medical sensor, a biosensor, etc.). The mobile terminal disclosed herein may be configured to utilize information obtained from one or more sensors of the sensing unit 140 and combinations thereof.

The output unit 150 may be generally configured to output various types of information (such as audio, video, haptic output, etc.). The output unit 150 may be shown to have at least one of a display 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display 151 may have an interlayer structure or an integrated structure with a touch sensor in order to implement a touch screen. A touch screen may be used as the user input unit 123, and the user input unit 123 provides an input interface between the mobile terminal 100 and a user and simultaneously provides an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as an interface to various types of external devices coupled to the mobile terminal 100. For example, the interface unit 160 may include any wired or wireless port, an external power supply port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like. In some cases, the mobile terminal 100 may perform various control functions related to the connected external device in response to the external device connected to the interface unit 160.

The memory 170 is generally implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store applications executed in the mobile terminal 100, data or instructions for the operation of the mobile terminal 100, and the like. Some of these applications may be downloaded from an external server via wireless communication. Other applications may be installed in the mobile terminal 100 at the time of manufacture or shipment, which is typically the case for the basic functions of the mobile terminal 100 (e.g., receiving a call, making a call, receiving a message, sending a message, etc.). The application program may be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform operations (or functions) with respect to the mobile terminal 100.

The controller 180 is generally used to control the overall operation of the mobile terminal 100, in addition to the operation related to the application program. The controller 180 may provide or process information or functions suitable for a user by processing signals, data, information, etc. input or output by the aforementioned various components or activating an application program stored in the memory 170.

In addition, the controller 180 may control at least some of the components shown in fig. 3a to execute an application program that has been stored in the memory 170. In addition, the controller 180 may control at least two of those components included in the mobile terminal 100 to activate an application.

The power supply unit 190 may be configured to receive external power or provide internal power in order to supply appropriate power required to operate elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body or configured to be detachable from the terminal body.

At least some of the components may operate cooperatively to implement an operation, a control, or a control method of the mobile terminal according to various embodiments disclosed herein. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by activating at least one application program stored in the memory 170.

In the following, before describing various embodiments implemented by the mobile terminal 100, the aforementioned components will be described in more detail with reference to fig. 3a.

First, with respect to the wireless communication unit 110, the broadcast receiving module 111 is generally configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules may be utilized to facilitate simultaneous reception of two or more broadcast channels or to support switching between broadcast channels.

The mobile communication module 112 may transmit and/or receive wireless signals to and/or from one or more network entities. Typical examples of network entities include base stations, external mobile terminals, servers, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000 (Code Division Multiple Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), etc.).

The wireless signal may include various types of data according to transmission/reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless internet module 113 refers to a module for wireless internet access. The module may be internally or externally coupled to the mobile terminal 100. The wireless internet module 113 may transmit and/or receive wireless signals via a communication network according to a wireless internet technology.

Examples of such wireless internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like. The wireless internet module 113 may transmit/receive data according to one or more of such wireless internet technologies and other internet technologies as well.

When wireless internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A, the wireless internet module 113 performs such wireless internet access as part of a mobile communication network. As such, the wireless internet module 113 may cooperate with, or serve as, the mobile communication module 112.

The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like. The short-range communication module 114 generally supports wireless communication between the mobile terminal 100 and a wireless communication system, communication between the mobile terminal 100 and another mobile terminal 100, or communication between the mobile terminal and a network in which another mobile terminal 100 is located, via a wireless local area network. One example of a wireless local area network is a wireless personal area network.

Here, another mobile terminal (which may be configured similar to the mobile terminal 100) may be a wearable device (e.g., a smart watch, smart glasses, a Head Mounted Display (HMD)) capable of exchanging data with the mobile terminal 100 (or otherwise cooperating with the mobile terminal 100). The short-range communication module 114 may sense and identify the wearable device and allow communication between the wearable device and the mobile terminal 100. In addition, when the sensed wearable device is a device authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause at least a portion of the data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Accordingly, the user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received at the mobile terminal 100, the user may answer the call using the wearable device. In addition, when a message is received at the mobile terminal 100, the user may check the received message using the wearable device.

The location information module 115 is generally configured to detect, calculate, derive, or otherwise identify a location (or current location) of the mobile terminal. By way of example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. For example, when the mobile terminal uses a GPS module, the position of the mobile terminal may be acquired using signals transmitted from GPS satellites. As another example, when the mobile terminal uses a Wi-Fi module, the location of the mobile terminal may be obtained based on information about a wireless Access Point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. The location information module 115 may alternatively or additionally work with any other module of the wireless communication unit 110 to obtain data related to the location of the mobile terminal, if desired. The location information module 115 is a module used for acquiring the location (or current location), and is not limited to a module that directly calculates or acquires the location of the mobile terminal.

The input unit 120 is configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. One or more cameras 121 are often used to obtain image or video input. The camera 121 may process image frames of still pictures or video obtained by an image sensor in a video or image capturing mode. The processed image frames may be displayed on the display 151 or stored in the memory 170. In addition, the cameras 121 may be arranged in a matrix configuration to allow a plurality of images having various angles or focal points to be input to the mobile terminal 100. The camera 121 may also be located in a stereoscopic arrangement to obtain left and right images for implementing a stereoscopic image.

The microphone 122 processes the external audio signal into electrical audio (sound) data. The processed audio data may be processed in various ways according to functions performed in the mobile terminal 100. If desired, the microphone 122 may include various noise removal algorithms to remove unwanted noise generated during the reception of external audio.

The user input unit 123 is a component that allows input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (e.g., a mechanical key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.), a touch-sensitive input element, or the like. As one example, the touch-sensitive input element may be a virtual key, a soft key, or a visual key displayed by software processing on a touch screen, or a touch key located at a position other than the touch screen on the mobile terminal. On the other hand, virtual or visual keys may be displayed on the touch screen in various shapes (e.g., graphics, text, icons, video, or combinations thereof).

The sensing unit 140 is generally configured to sense one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, etc., and generate a corresponding sensing signal. The controller 180 generally cooperates with the sensing unit 140 to control an operation of the mobile terminal 100 or perform data processing, functions or operations related to an application installed in the mobile terminal based on the sensing signal. The sensing unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.

The proximity sensor 141 refers to a sensor that senses, without mechanical contact, whether an object is approaching a surface or located near a surface, by using an electromagnetic field, infrared rays, or the like. The proximity sensor 141 may be disposed at an inner area of the mobile terminal covered by the touch screen, or near the touch screen.

When the touch screen is implemented as a capacitive type, the proximity sensor 141 may sense the proximity of the pointer with respect to the touch screen by a change in an electromagnetic field in response to the proximity of a conductive object. In this case, the touch screen (touch sensor) may also be classified as a proximity sensor.

The term "proximity touch" will often be used herein to denote a scenario in which a pointer is positioned close to the touch screen without contacting it. The term "contact touch" will often be used herein to denote a scenario in which a pointer makes physical contact with the touch screen. The position corresponding to a proximity touch of the pointer relative to the touch screen is the position at which the pointer is perpendicular to the touch screen. The proximity sensor 141 may sense a proximity touch and a proximity touch pattern (e.g., distance, direction, speed, time, position, moving state, etc.). In general, the controller 180 processes data corresponding to the proximity touch and the proximity touch pattern sensed by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data (or information) according to whether a touch at a given point on the touch screen is a proximity touch or a contact touch.
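The idea of performing different operations for a proximity touch versus a contact touch at the same point can be sketched as a simple dispatcher; the handler behaviors ("preview" vs. "select") are illustrative assumptions, not behaviors stated in the disclosure:

```python
# Dispatch a touch event to a different operation depending on whether it
# is a proximity touch or a contact touch. The event is a plain dict with
# the fields the proximity sensor is said to sense.

def handle_touch(event: dict) -> str:
    if event["kind"] == "proximity":
        # e.g., preview the content under the pointer without committing
        return f"preview at {event['position']}"
    if event["kind"] == "contact":
        # e.g., select the item under the pointer
        return f"select at {event['position']}"
    raise ValueError("unknown touch kind")
```

A real controller would branch on far richer pattern data (distance, speed, moving state), but the two-way dispatch on touch kind is the core of the behavior described above.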

The touch sensor senses a touch (or a touch input) applied to the touch screen (or the display 151) using any of various touch methods. Examples of such a touch method include a resistance type, a capacitance type, an infrared type, a magnetic field type, and the like.

As one example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display 151, or a change in capacitance appearing at a specific portion of the display 151, into an electrical input signal. The touch sensor may also be configured to sense not only a touch position and a touch area, but also a touch pressure and/or a touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen or stylus pen, a pointer, and the like.

When the touch sensor senses a touch input, a corresponding signal may be sent to a touch controller. The touch controller may process the received signal and then transmit corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or a combination thereof.
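The signal flow just described (touch sensor raises a raw signal, the touch controller converts it to coordinate data, and the main controller decides which display region was touched) can be sketched as follows. The region boundaries are assumed purely for illustration:

```python
# Toy model of the touch signal path: raw signal -> touch controller ->
# main controller mapping coordinates to a display region.

class TouchController:
    def process(self, raw_signal: tuple) -> dict:
        """Convert a raw (x, y) signal into coordinate data."""
        x, y = raw_signal
        return {"x": x, "y": y}

class MainController:
    # Assumed region bounds as (left, top, right, bottom) in pixels.
    REGIONS = {
        "status_bar": (0, 0, 1080, 80),
        "content":    (0, 80, 1080, 2340),
    }

    def region_of(self, data: dict) -> str:
        """Decide which region of the display was touched."""
        for name, (l, t, r, b) in self.REGIONS.items():
            if l <= data["x"] < r and t <= data["y"] < b:
                return name
        return "outside"
```

Splitting coordinate extraction (touch controller) from region interpretation (main controller) mirrors the option, noted above, of implementing the touch controller as a component separate from the controller 180.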

In addition, the controller 180 may perform the same or different control according to the type of a touch object touching the touch screen or a touch key provided at a place other than the touch screen. For example, whether to perform the same or different control according to an object providing a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program.

The touch sensor and the proximity sensor may be implemented separately or in combination to sense various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.

If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, may calculate the position of a wave generation source based on information sensed by an illuminance sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. This fact may be used to calculate the position of the wave generation source. For example, the position of the wave generation source may be calculated using the time difference between the arrival of the ultrasonic wave at each sensor, with the light serving as a reference signal.
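The time-difference idea above admits a short worked example: because light reaches the optical sensor almost instantly, the ultrasonic time of flight alone gives the distance from each ultrasonic sensor, and two such distances locate the source in a plane. The sensor positions and the speed of sound used here are assumptions for illustration:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def distance_from_tof(t_ultrasound: float, t_light: float = 0.0) -> float:
    """Distance implied by the ultrasonic arrival time, using the light
    arrival (effectively zero delay) as the reference signal."""
    return SPEED_OF_SOUND * (t_ultrasound - t_light)

def locate(d1: float, d2: float, baseline: float) -> tuple:
    """2-D trilateration: intersect circles of radius d1 and d2 around
    sensors at (0, 0) and (baseline, 0); returns the solution with y >= 0."""
    x = (d1**2 - d2**2 + baseline**2) / (2 * baseline)
    y = math.sqrt(max(d1**2 - x**2, 0.0))
    return (x, y)
```

For a source at (0.3, 0.4) with sensors 1 m apart, the distances are 0.5 and √0.65 m, and `locate` recovers (0.3, 0.4); with more than two sensors, the extra distances can be used to reject noisy estimates.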

The camera 121, which has been described as a component of the input unit 120, generally includes at least one camera sensor (CCD, CMOS, etc.), a photosensor (or image sensor), and a laser sensor.

Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image. The photosensor may be laminated on, or overlapped with, the display device. The photosensor may be configured to scan movement of a physical object in proximity to the touch screen. In more detail, the photosensor may include photodiodes and transistors (TRs) in rows and columns to scan content received at the photosensor using an electrical signal that changes according to the amount of applied light. That is, the photosensor may calculate the coordinates of the physical object according to the variation of light, and thereby obtain position information of the physical object.

The display 151 is generally configured to output information processed in the mobile terminal 100. For example, the display 151 may display execution screen information of an application program executed at the mobile terminal 100, or User Interface (UI) and Graphical User Interface (GUI) information in response to the execution screen information.

In addition, the display 151 may be implemented as a stereoscopic display for displaying a stereoscopic image.

A typical stereoscopic display may employ a stereoscopic display scheme such as a stereoscopic scheme (glasses scheme), an autostereoscopic scheme (glasses-free scheme), a projection scheme (hologram scheme), and the like.

The audio output module 152 may receive audio data from the wireless communication unit 110, or output audio data stored in the memory 170, during a mode such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. The audio output module 152 may provide audio output related to a specific function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.

The haptic module 153 may be configured to generate various haptic effects that a user feels, perceives, or otherwise experiences. A typical example of the haptic effect generated by the haptic module 153 is vibration. The intensity, pattern, etc. of the vibration generated by the haptic module 153 may be selectively controlled by a user or set by a controller. For example, the haptic module 153 may output different vibrations in a combined manner or a sequential manner.

In addition to vibration, the haptic module 153 may generate various other haptic effects, including effects by stimulation such as a pin arrangement moving vertically against the skin, a spray or suction force of air through a jet orifice or a suction opening, a brush against the skin, contact of an electrode, an electrostatic force, and an effect of reproducing the sense of cold or warmth using an element that can absorb or generate heat.

The haptic module 153 may also be implemented to allow the user to feel a haptic effect through the muscular sensation of, for example, a finger or an arm, as well as to transfer the haptic effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.

The optical output module 154 may output light from a light source to indicate that an event has occurred. Examples of events generated in the mobile terminal 100 include message reception, call signal reception, a missed call, an alarm, a schedule notification, e-mail reception, information reception through an application, and the like.

The signal output by the optical output module 154 may be implemented such that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output may be terminated, for example, when the mobile terminal senses that the user has checked the generated event.
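The "light on until the event is checked" behavior above amounts to a small state machine. The following is only an illustrative sketch, not the patent's implementation; the class, method names, and event strings are hypothetical:

```python
# Illustrative sketch: a notification controller that keeps the light source
# active while any generated event remains unchecked, and terminates the
# signal output once the user has checked every pending event.

class OpticalOutput:
    def __init__(self):
        self.pending = set()   # events generated but not yet checked

    def notify(self, event):
        """An event (e.g. 'message', 'missed_call') has been generated."""
        self.pending.add(event)

    def check(self, event):
        """The user has checked the event; drop it from the pending set."""
        self.pending.discard(event)

    @property
    def light_on(self):
        # Light is emitted only while at least one event is unchecked.
        return bool(self.pending)
```

A real implementation would additionally select the emitted color or blink pattern per event type.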

The interface unit 160 serves as an interface to an external device connected with the mobile terminal 100. For example, the interface unit 160 may receive data transmitted from an external device, receive power for transmission to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to the external device. The interface unit 160 may include a wired or wireless headset port, an external power supply port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headset port, and the like.

The identification module may be a chip storing various information for authenticating an authority to use the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, the device having the identification module (hereinafter also referred to as "identification device") may take the form of a smart card. Accordingly, the identification device may be connected with the terminal 100 via the interface unit 160.

The interface unit 160 may serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100 when the mobile terminal 100 is connected with an external cradle, or may serve as a passage to allow various command signals input from the cradle by a user to be transmitted therethrough to the mobile terminal. Various command signals or power input from the cradle may be operated as signals for recognizing that the mobile terminal 100 is properly mounted to the cradle.

The memory 170 may store programs to support the operation of the controller 180 and store input/output data (e.g., phone books, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibration and audio output in response to touch input on the touch screen.

The memory 170 may include one or more types of storage media including a flash memory type, a hard disk type, a Solid State Disk (SSD) type, a Silicon Disk Drive (SDD) type, a multimedia mini card type, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in association with a network storage device that performs storage functions of the memory 170 over a network such as the internet.

The controller 180 may generally control operations related to application programs of the mobile terminal 100 and general operations. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to an application when the state of the mobile terminal satisfies a predetermined condition.

The controller 180 can also perform control and processing associated with voice calls, data communications, video calls, etc., or perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of those components in order to implement the various embodiments disclosed herein.

The power supply unit 190 receives external power or provides internal power, and supplies the power required for operating the various elements and components included in the mobile terminal 100 under the control of the controller 180. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging.

The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160, to which an external charger for supplying power to recharge the battery is electrically connected.

As another example, the power supply unit 190 may be configured to wirelessly recharge a battery without using a connection port. In this example, the power supply unit 190 may receive power transferred from the external wireless power transmitter using at least one of an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance.

The various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or the like, using, for example, software, hardware, or any combination thereof.

Referring to fig. 3b and 3c, the disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a number of different configurations. Examples of such a configuration include a watch type, a clip type, a glasses type or a folding type, a flip type, a slide type, a swing type, and a rotation type (in which two or more bodies are combined with each other in a relatively movable manner), and a combination thereof. The discussion herein will generally refer to a particular type of mobile terminal. However, the teachings regarding a particular type of mobile terminal will generally apply to other types of mobile terminals as well.

Here, the terminal body may be understood as a concept referring to the mobile terminal 100 regarded as at least one assembly.

The mobile terminal 100 generally includes a case (e.g., a frame, a case, a cover, etc.) forming an appearance of the terminal. In the present embodiment, the front case 101 and the rear case 102 are used to form a case. Various electronic components are incorporated in a space formed between the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.

The display 151 is shown on the front side of the terminal body to output information. As illustrated, the window 151a of the display 151 may be mounted to the front case 101 so as to form a front surface of the terminal body together with the front case 101.

In some embodiments, electronic components may also be mounted to the rear case 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. In this case, the rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is detached from the rear case 102, the electronic components mounted on the rear case 102 are exposed to the outside.

As illustrated, when the rear cover 103 is coupled to the rear case 102, a side surface of the rear case 102 may be partially exposed. In some cases, depending on the coupling, the rear case 102 may be completely concealed by the rear cover 103. In addition, the rear cover 103 may include an opening for externally exposing the camera 121b or the audio output module 152b.

The cases 101, 102, 103 may be formed by injection molding synthetic resin, or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.

As an alternative to the example in which a plurality of cases form an internal space for accommodating components, the mobile terminal 100 may be configured such that one case forms the internal space. In this case, the mobile terminal 100 having a uni-body is formed such that synthetic resin or metal extends from a side surface to a rear surface.

In addition, the mobile terminal 100 may include a waterproof unit for preventing water from being introduced into the terminal body. For example, the waterproof unit may include a waterproof member located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103 to hermetically seal the internal space when those cases are coupled.

The mobile terminal 100 may include a display 151, first and second audio output modules 152a and 152b, a proximity sensor 141, an illuminance sensor 142, an optical output module 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.

Hereinafter, as shown in fig. 3b and 3c, a description will be given of an exemplary mobile terminal 100 in which a front surface of a terminal body is shown to have a display 151, a first audio output module 152a, a proximity sensor 141, an illuminance sensor 142, an optical output module 154, a first camera 121a, and a first manipulation unit 123a, a side surface of the terminal body is shown to have a second manipulation unit 123b, a microphone 122, and an interface unit 160, and a rear surface of the terminal body is shown to have a second audio output module 152b and a second camera 121 b.

However, those components may not be limited to this arrangement. Some components may be omitted or rearranged or located on different surfaces. For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output unit 152b may be located on a side surface of the terminal body other than the rear surface of the terminal body.

The display 151 is generally configured to output information processed in the mobile terminal 100. For example, the display 151 may display execution screen information of an application program executed at the mobile terminal 100, or User Interface (UI) and Graphical User Interface (GUI) information in response to the execution screen information.

The display module 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, or a three-dimensional (3D) display and an electronic ink display.

The display 151 may be implemented using two display devices, depending on its configuration type. For example, a plurality of displays 151 may be arranged on one side, either spaced apart from each other or integrated with each other, or may be arranged on different surfaces.

The display 151 may include a touch sensor that senses a touch with respect to the display 151 to receive a control command in a touch manner. Accordingly, when a touch is applied to the display 151, the touch sensor may sense the touch, and the controller 180 may generate a control command corresponding to the touch. The contents input in a touch manner may be characters, numbers, instructions in various modes, or menu items that can be designated.

On the other hand, the touch sensor may be configured in the form of a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window, or may be a metal line directly patterned on the rear surface of the window. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or may be disposed within the display.

In this way, the display 151 may form a touch screen together with a touch sensor, and in this case, the touch screen may serve as a user input unit (123, see fig. 3 a). In some cases, the touch screen may replace at least some functions of the first manipulation unit 123 a.

The first audio output module 152a may be implemented as an earpiece for transmitting a call sound to a user's ear, and the second audio output module 152b may be implemented as a speaker for outputting various alarm sounds or multimedia reproduction request sounds.

The window 151a of the display 151 may include a sound hole for emitting sound generated by the first audio output module 152a. However, the present disclosure is not limited thereto, and the sound may be released along an assembly gap between structural bodies (e.g., a gap between the window 151a and the front case 101). In this case, a hole independently formed to output audio sounds is invisible or otherwise hidden in appearance, thereby further simplifying the appearance of the mobile terminal 100.

The optical output module 154 may be configured to output light indicative of event generation. Examples of such events include message reception, call signal reception, missed call, alarm, schedule reminder, e-mail reception, information reception by an application, and the like. When the user has checked the generated event, the controller 180 may control the optical output module 154 to stop the light output.

The first camera 121a may process image frames such as still images or moving pictures obtained by an image sensor in a capture mode or a video call mode. The processed image frames may then be displayed on the display 151 or stored in the memory 170.

The first and second manipulation units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulating units 123a and 123b may be collectively referred to as a manipulating portion. The first and second manipulating units 123a and 123b may employ any method as long as the method is a tactile manner allowing the user to perform manipulation with a tactile sensation such as touch, push, scroll, or the like. The first and second manipulation units 123a and 123b may also be manipulated by a proximity touch, a hover touch, or the like without a tactile sensation of the user.

The drawings illustrate the first manipulation unit 123a as a touch key, but the present disclosure is not necessarily limited thereto. For example, the first manipulation unit 123a may be configured with a push key (mechanical key) or a combination of a touch key and a push key.

The contents received by the first and second manipulation units 123a and 123b may be set in various ways. For example, the first manipulation unit 123a may be used by the user to input a command such as a menu, a home key, a cancel, a search, etc., and the second manipulation unit 123b may be used by the user to input a command such as controlling a volume level being output from the first audio output module 152a or the second audio output module 152b, switching to a touch recognition mode of the display 151, etc.

On the other hand, as another example of the user input unit 123, a rear input unit (not shown) may be located on a rear surface of the terminal body. The rear input unit may be manipulated by a user to input a command for controlling the operation of the mobile terminal 100. The content input may be set in various ways. For example, the rear input unit may be used by a user to input commands such as power on/off, start, end, scroll, etc., control a volume level being output from the first or second audio output module 152a or 152b, switch to a touch recognition mode of the display 151, etc. The rear input unit may be implemented in a form allowing a touch input, a push input, or a combination thereof.

The rear input unit may be disposed to overlap the display 151 of the front surface in a thickness direction of the terminal body. As one example, the rear input unit may be disposed on an upper end portion of the rear surface of the terminal body such that a user can easily manipulate it using an index finger when the user grips the terminal body with one hand. However, the present disclosure may not be limited thereto, and the position of the rear input unit may be changeable.

When the rear input unit is disposed on the rear surface of the terminal body, a new user interface may be implemented using the rear input unit. In addition, the above-described touch screen or rear input unit may replace at least a part of the functions of the first manipulation unit 123a located on the front surface of the terminal body. Accordingly, when the first manipulation unit 123a is not disposed on the front surface of the terminal body, the display 151 may be implemented to have a larger screen.

On the other hand, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller may use fingerprint information sensed by the finger scan sensor as an authentication means. The finger scan sensor may be installed in the display 151 or the user input unit 123.

The microphone 122 may be configured to receive a user's voice, other sounds, and the like. The microphone 122 may be disposed at a plurality of positions and configured to receive stereo sound.

The interface unit 160 may serve as a path allowing the mobile terminal 100 to exchange data with an external device. For example, the interface unit 160 may be at least one of a connection terminal for connecting to another device (e.g., an earphone, an external speaker, etc.), a port for near field communication (e.g., an infrared data association (IrDA) port, a bluetooth port, a wireless LAN port, etc.), or a power terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a Subscriber Identity Module (SIM), a User Identity Module (UIM), or a memory card for information storage.

The second camera 121b may also be mounted to the rear surface of the terminal body. The second camera 121b may have an image capturing direction substantially opposite to that of the first camera unit 121 a.

The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix form. The camera may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images may be captured in various ways using a plurality of lenses and images with better quality may be obtained.

The flash 124 may be disposed adjacent to the second camera 121 b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject.

The second audio output module 152b may also be disposed on the terminal body. The second audio output module 152b may implement a stereo function in combination with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication.

At least one antenna for wireless communication may be disposed on the terminal body. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna configuring a portion of the broadcast receiving module 111 (see fig. 3a) may be retracted into the terminal body. Alternatively, the antenna may be formed in the form of a film to be attached to the inner surface of the rear cover 103, or a case including a conductive material may be used as the antenna.

The terminal body is provided with a power supply unit 190 (see fig. 3a) for supplying power to the mobile terminal 100. The power supply unit 190 may include a battery 191 installed in the terminal body or detachably coupled to the outside of the terminal body.

The battery 191 may receive power via a power cable connected to the interface unit 160. In addition, the battery 191 may be (re) charged in a wireless manner using a wireless charger. Wireless charging may be achieved by magnetic induction or electromagnetic resonance.

On the other hand, the drawings illustrate that the rear cover 103 is coupled to the rear case 102 to shield the battery 191, so as to prevent separation of the battery 191 and to protect the battery 191 from external impact or foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.

Accessories to protect the appearance or to assist or extend the functionality of the mobile terminal 100 may also be provided on the mobile terminal 100. As one example of the accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display 151 to expand the functionality of the mobile terminal 100. Another example of an accessory may be a stylus for assisting or extending touch input to a touch screen.

As described above, the electronic device 300 according to the present disclosure is configured such that the first wireless communication unit 116 (see fig. 3c) provided on the rear side of the mobile terminal and the second wireless communication unit 283 provided in the first body 210 of the case 200 perform data communication with each other. A method of performing data communication between the first wireless communication unit 116 and the second wireless communication unit 283 will be described in more detail below with reference to the drawings. Fig. 4 is a conceptual diagram describing a control method between the display provided in the mobile terminal and the display provided in the case in the electronic device according to the present disclosure.

The mobile terminal 100 according to the present disclosure is combined with the first body 210, and in the case where the mobile terminal 100 is combined with the first body 210, the first wireless communication unit 116 and the second wireless communication unit 283 are configured to face each other.

The first wireless communication unit 116 provided in the mobile terminal 100 is included in the wireless communication unit 110 described above with reference to fig. 3a. The wireless communication unit 110 of the mobile terminal 100 may be composed of a plurality of wireless communication units, which are respectively arranged at different positions in the mobile terminal 100. In particular, in the mobile terminal 100 according to the present disclosure, the first wireless communication unit 116 is disposed on the rear side of the mobile terminal 100; therefore, when the mobile terminal 100 is combined with the first body 210, the first wireless communication unit 116 performs data communication with the second wireless communication unit 283 disposed in the first body 210.

The first wireless communication unit 116 and the second wireless communication unit 283 according to the present disclosure are each configured with a contactless connector for data communication. The contactless connectors are electromagnetic connectors that form an electromagnetic communication link; when they are respectively arranged in different devices, the electromagnetic communication link is established between those devices.

The first wireless communication unit 116 and the second wireless communication unit 283 are each configured with a transceiver for converting electrical signals into EM signals. The transceiver of one of the first wireless communication unit 116 and the second wireless communication unit 283 converts an electrical signal into an EM signal. The EM signal is received by the other transceiver, which converts the EM signal back into an electrical signal.

On the other hand, according to the present disclosure, the term "transceiver" refers to a device such as an integrated circuit (IC) that includes a transmitter (Tx) and a receiver (Rx), so that the IC is used to transmit and receive information (data). In general, a transceiver may operate in a half-duplex mode (alternately transmitting and receiving) or a full-duplex mode (simultaneously transmitting and receiving), or may be configured as only a transmitter or only a receiver. The transceiver may include separate integrated circuits for the transmit and receive functions. As used in this specification, the terms "contactless," "coupled pair," and "proximity combination" refer to an electromagnetic (EM) connection and signal transmission between the first wireless communication unit 116 and the second wireless communication unit 283, rather than an electrical connection (along a wire or through physical contact).

The term "contactless" as used in this specification refers to a dielectric coupling system capable of having an optimal range of carrier-assisted (carrier-assisted) within a distance range of 0 to 5 centimeters. The connection is verified by the proximity of one of the first wireless communication unit 116 and the second wireless communication unit 283 to the other. Many contactless transmitters and receivers occupy little space. Unlike wireless links over which conventional broadcasting to multiple points is performed, contactless links established in an Electromagnetic (EM) manner are point-to-point links.

Instead of a wired connection for transmitting data from one location to another, the first wireless communication unit 116 and the second wireless communication unit 283 each form a point-to-point contactless communication link, or coupled pair, that requires no physical wired connection. Each transceiver is an extremely high frequency (EHF) transceiver.

For example, in the case where the mobile terminal 100 is combined with the first body 210, the first wireless communication unit 116 of the mobile terminal 100 and the second wireless communication unit 283 in the first body 210 are configured to face each other and to be positioned within a predetermined distance. Thus, a contactless communication link is formed between the first wireless communication unit 116 and the second wireless communication unit 283.

Data is transferred between the mobile terminal 100 and the second display 250 through the EHF transceivers respectively included in the first wireless communication unit 116 and the second wireless communication unit 283.

As described with reference to figs. 2a, 2b, and 2c, the second wireless communication unit 283 (i.e., the EHF transceiver) that transmits and receives data to and from the second display 250 is provided in the first body 210.

The second wireless communication unit 283 provided in the first body 210 is configured to transmit and receive data to and from the second display 250 in a wired manner through the wiring unit 242 included in the coupling unit 230.

On the other hand, as described above, when the mobile terminal 100 is combined with the first body 210, the EHF transceivers respectively included in the first and second wireless communication units 116 and 283 are coupled to each other by their proximity.

The coupled pair of EHF transceivers of the first wireless communication unit 116 and the second wireless communication unit 283 provides a contactless data route, path, or channel. In some embodiments, the data path is unidirectional (e.g., data flows from the mobile terminal 100 to the second display 250 through a particular path) or bidirectional (e.g., data flows in both directions between the mobile terminal 100 and the second display 250 through a particular path).

The first wireless communication unit 116 and the second wireless communication unit 283 according to the present disclosure are configured in such a manner as to transmit and receive various types of data. For example, the various types of data include graphics data, audio data, video data, touch event data, and combinations thereof.

On the other hand, the second display 250 provided on the second body 220 is configured to operate with power supplied from the mobile terminal 100.

As described above, this power is supplied to the second display 250 through the wiring unit 242 disposed on the first circuit board 243 electrically connected to the mobile terminal 100, through the coupling unit 230, and along an electrical coupling path to the second circuit board 244 disposed in the second body 220.

That is, as shown in fig. 4, the power supply unit 191 of the mobile terminal 100 is configured to supply current (or power) for operation to the power supply unit 291 of the second display 250 through the wiring unit 242 provided on the first circuit board 243 and the coupling unit 230 and through an electrical coupling path to the second circuit board 244 provided in the second body 220.

On the other hand, as described above, the mobile terminal 100 is detachably combined with the first body 210. In addition, the mobile terminal is configured to detect whether it is combined with the first body 210. For this detection, a magnet 245 is provided on the surface of the first body 210 facing the mobile terminal 100. Further, a Hall sensor 143 is included on the rear side of the mobile terminal; the Hall sensor 143 senses the magnetic field corresponding to the magnet 245 when the main body of the mobile terminal is combined with the first body. When the Hall sensor senses the magnetic field, the mobile terminal recognizes that it is combined with the housing and performs a predetermined control.

For example, when the magnetic field is sensed by the hall sensor 143, the controller 180 of the mobile terminal 100 controls the power supply unit 190 such that current for operation is supplied to the second display 250 provided in the second body 220.
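The Hall-sensor-driven control described above can be sketched as a simple handler. This is an illustrative sketch only, with hypothetical class and attribute names; the actual controller 180 and power supply unit 190 are hardware components, not Python objects:

```python
# Illustrative sketch: when the Hall sensor senses the magnet in the first
# body, the controller recognizes the coupling and supplies operating current
# to the second display; when the field disappears, the power is cut again.

class DockController:
    def __init__(self):
        self.docked = False
        self.second_display_powered = False

    def on_hall_sensor(self, magnetic_field_sensed):
        """Called whenever the Hall sensor reading changes."""
        if magnetic_field_sensed and not self.docked:
            self.docked = True
            self.second_display_powered = True   # supply current via the wiring unit
        elif not magnetic_field_sensed and self.docked:
            self.docked = False
            self.second_display_powered = False  # cut power on detachment
```

Repeated identical readings leave the state unchanged, which matches the edge-triggered "attach/detach" behavior implied by the description.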

That is, the second display 250 provided in the second body 220 is configured to operate with power supplied from the mobile terminal 100.

In this manner, when current for operation is supplied to the second display 250, the system of the second display 250 boots and is initialized, entering a standby state in which the system is operable.

At this time, the second display 250 is in either an activated state or a deactivated state, and even while the second display 250 is deactivated, the touch sensor (or touch panel 252) provided in the second display 250 remains driven in an activated state and senses a touch applied to the second display 250.

On the other hand, when the second display 250 is activated, the mobile terminal 100 transmits screen information (or a digital image signal) to be output on the display 251 of the second display 250 to the second wireless communication unit 283 through the first wireless communication unit 116. As described above, the image signal is wirelessly transmitted as a signal in the 60 GHz band by the wireless connector.

As described above, the second display 250 receives data (e.g., a digital image signal) from the first wireless communication unit 116 through the second wireless communication unit 283 and the second circuit board 244. The digital image signal is then converted by the data conversion unit 282 into a format that can be output on the second display 250. For example, where the second display 250 is configured with an LCD panel, the digital image signal in DP format received from the mobile terminal 100 is converted by the data conversion unit 282 into a data format receivable by the LCD panel (MIPI format), transmitted to the display 251, and output on the display 251.
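The role of the data conversion unit 282 can be sketched as a simple format dispatch. This is an illustrative placeholder, not the actual conversion logic (real DP-to-MIPI conversion is a hardware bridge operation); the function name and string labels are assumptions for the sketch.

```python
def convert_for_panel(signal_format, panel_type):
    """Hypothetical stand-in for the data conversion unit 282: re-encode the
    incoming image signal into the format the attached panel accepts."""
    # The terminal transmits the image signal in DP format; an LCD panel
    # accepts MIPI, so the conversion unit re-encodes before output.
    if panel_type == "LCD" and signal_format == "DP":
        return "MIPI"
    return signal_format   # already in a format the panel accepts
```

In practice this step sits between the second circuit board 244 and the display 251, so the panel never sees the raw DP stream.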

On the other hand, the types of data transmitted and received by the first wireless communication unit 116 and the second wireless communication unit 283 are predetermined. For example, only data corresponding to the image signal is transmitted and received through the first wireless communication unit 116 and the second wireless communication unit 283.

At this time, in addition to the image signal, the signals that need to be exchanged between the mobile terminal 100 and the second display 250, such as a communication control signal, a touch signal, and a brightness control signal, pass through the first and second signal processing units 181 and 281 via a multi-input channel and are then transmitted and received through the first circuit board 243 and a power supply terminal (e.g., a pogo pin) 249. On the other hand, initialization and the like of the second display 250 may be controlled by a controller included in the second display 250.
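The split described above, in which only image data travels over the 60 GHz wireless link while control-type signals use the wired path, can be sketched as a small routing function. The function name and return labels are assumptions introduced for illustration.

```python
def route_signal(signal_type):
    """Hypothetical router for the transport scheme described above: image
    data uses the wireless units 116/283; control-type signals use the wired
    path (first circuit board 243 and pogo-pin terminal 249)."""
    wireless_types = {"image"}
    wired_types = {"communication_control", "touch", "brightness_control"}
    if signal_type in wireless_types:
        return "wireless"   # 60 GHz band via units 116 and 283
    if signal_type in wired_types:
        return "wired"      # circuit board 243 and power supply terminal 249
    raise ValueError(f"unexpected signal type: {signal_type}")
```

Keeping the high-bandwidth image stream on the 60 GHz link while low-rate control traffic stays wired is the design choice this sketch encodes.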

A method of screen control between a first display and a second display respectively provided in a mobile terminal and a housing of an electronic device 300 according to the present disclosure will be described in more detail below with reference to fig. 5a to 5 h.

In fig. 5a, the first display 151 provided in the mobile terminal 100 is in an activated state, and the second display 250 provided in the case 200 is in an inactivated state.

In one example, in a case where the electronic apparatus 300 is switched from the closed state to the open state, the first display 151 is in an active state, and for example, the home screen 510 is output.

In the case where the electronic device 300 is switched from the closed state to the open state, a current for operation is supplied to the second display 250, but the second display 250 remains in an inactive state until an input is applied. The input here is an input that wakes up the second display 250, for example, a touch input (e.g., a tap or a double tap, hereinafter referred to as the "tap function") applied to the second display 250.

When a touch input is applied to the second display 250, a touch signal corresponding to the touch input is transmitted to the first wireless communication unit 116 through the second wireless communication unit 283 (fig. 4). The controller 180 of the mobile terminal then determines whether the touch signal received through the first wireless communication unit corresponds to a predetermined type of touch. When, as a result of the determination, the touch signal corresponds to a predetermined type of touch (e.g., a tap or a double tap), a control signal for switching the second display 250 to an activated state is generated and transmitted to the second display 250 through the first and second wireless communication units 116 and 283.
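The wake-up decision just described can be sketched as a single predicate. This is a simplified illustration with assumed names; the actual determination runs in the controller 180 on the touch signal forwarded over the wireless units.

```python
PREDETERMINED_TOUCHES = {"tap", "double_tap"}


def on_touch_from_second_display(touch_type, second_display_active):
    """Hypothetical sketch of the controller 180's wake-up check: a touch
    signal arrives from the second display via units 283 -> 116, and only a
    predetermined touch on a deactivated display triggers activation."""
    if not second_display_active and touch_type in PREDETERMINED_TOUCHES:
        return "switch_to_active"   # control signal sent back via units 116 -> 283
    return None                     # ignore other touches in this state
```

Note that this check is only meaningful because the touch panel 252 stays active even while the display itself is inactive.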

On the other hand, in another example, in a case where the electronic device 300 is switched from the closed state to the open state, both the first display 151 and the second display 250 are in the activated state. In this case, for example, a first home screen is output on the first display 151, and a second home screen different from the first home screen is output on the second display 250, or a given screen is output on the second display 250.

On the other hand, an icon 520 indicating that a menu associated with control of the second display 250 is hidden is displayed on one area (e.g., an edge area) of the home screen 510 output on the first display 151. The position of the icon 520 may be moved by a drag touch input, and the icon 520 may be restricted from always being displayed on the second display 250.

The controller of the mobile terminal 100 displays the hidden menu on the first display 151 based on a touch input applied to the displayed icon 520. A symbol indicating the direction in which to drag the touch input for displaying the hidden menu is displayed on the icon 520.

For example, as shown in fig. 5b, in a state where the home screen 510 is output on the first display 151 and the execution screen 530 for the Web application is output on the second display 250, the hidden menu is displayed when a touch input applied to the icon 520 on the home screen 510 is dragged from the edge area of the first display 151 toward the center.

The displayed menu provides various functions for controlling the first display 151 and the second display 250 in conjunction with each other based on an input to the first display 151. For example, as shown in fig. 5b, a screen switching menu 521, a screen export menu 522, a screen import menu 523, a main screen power saving menu 524, and a dual screen close menu 525 are provided. However, the menus are not limited to the illustrated example, and other menus may be displayed.

Fig. 5c to 5f specifically illustrate controlling various functions of the first display 151 and the second display 250 in conjunction with each other based on a touch input to a menu displayed on the first display 151.

The operation in fig. 5c corresponds to an exchange between the screen displayed on the first display 151 and the screen displayed on the second display 250.

Referring to fig. 5c, first screen information (e.g., a home screen 510) is output on the first display 151, and second screen information (e.g., an execution screen 530 for a Web application) is output on the second display 250.

In this way, in a state where different screen information is output on the first display 151 and the second display 250, respectively, when a touch input is applied to the screen switching menu 521 among the menus displayed on the first display 151, the home screen 510 output on the first display 151 is moved to the second display 250. Along with this, the execution screen 530 being output on the second display 250 is moved to the first display 151.

For this, the controller of the mobile terminal 100 moves the task corresponding to the home screen 510 being output on the first display 151 to a memory stack allocated for the second display 250. In addition, the controller of the mobile terminal 100 moves the task corresponding to the execution screen 530 being output on the second display 250 to the memory stack for the first display 151.

In this way, after the first display 151 and the second display 250 have exchanged screens, when the hidden menu is displayed again and the screen switching menu 521 is selected, an operation of returning to the original screen state occurs.

To this end, the controller of the mobile terminal 100 moves the most recently added task in the memory stack allocated for the second display 250 back to the memory stack for the first display 151. At the same time, the controller of the mobile terminal 100 moves the most recently added task in the memory stack for the first display 151 to the memory stack allocated for the second display 250.
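The screen-switching operation described above is, at its core, a swap of the top tasks of two per-display stacks. The following is a minimal sketch with assumed names (`switch_screens`, the task labels); it is not the actual task-management implementation, only a model of the stack manipulation the disclosure describes.

```python
def switch_screens(first_stack, second_stack):
    """Sketch of the screen switching menu 521: pop the task on top of each
    display's memory stack and push it onto the other display's stack."""
    task_from_first = first_stack.pop()     # e.g., the home screen 510
    task_from_second = second_stack.pop()   # e.g., the Web execution screen 530
    first_stack.append(task_from_second)
    second_stack.append(task_from_first)


first_stack = ["home_510"]    # memory stack for the first display 151
second_stack = ["web_530"]    # memory stack allocated for the second display 250
switch_screens(first_stack, second_stack)
after_switch = (first_stack[-1], second_stack[-1])
switch_screens(first_stack, second_stack)   # selecting the menu again restores the originals
```

Running the swap twice returns both displays to their original screens, matching the round-trip behavior described for reselecting the menu.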

On the other hand, although not shown, in a case where the screen export menu 522 is selected from the first display 151, only the screen output on the first display 151 is moved to the second display 250.

For this, the controller of the mobile terminal 100 moves the task corresponding to the screen output on the first display 151 to the memory stack allocated for the second display 250. The screen corresponding to the next task in the memory stack for the first display 151 is then output on the first display 151; if there is no next task, the home screen is output.

In a case where the second display 250 is in an activated state and the screen export menu 522 has been executed at least once, the screen import menu 523 on the first display 151 is activated.

When the screen import menu 523 is selected in a state where this condition is satisfied, the screen displayed on the second display 250 (that is, the screen that had been output on the first display 151) is displayed back on the first display 151. Then, the screen that was output on the second display 250 before that screen appeared on the second display 250 is restored on the second display 250.

For this, the controller of the mobile terminal 100 moves the task corresponding to the screen output on the second display 250 to the memory stack for the first display 151.

The operation in fig. 5d corresponds to a function of switching only the screen displayed on the first display 151 to a power saving mode.

When a touch input is applied to the home screen power saving menu 524 among the menus displayed on the first display 151, a control signal corresponding to the touch signal for that menu is generated, and a power saving mode for the first display 151 is enabled.

At this time, only the power saving mode for the first display 151 is enabled. Accordingly, an image signal corresponding to the screen information output on the second display 250 is transmitted from the mobile terminal 100 to the second display 250 through the first and second wireless communication units 116 and 283.

Because the power saving mode is enabled for the first display 151, as shown in fig. 5d, the screen brightness of the home screen 510 on the first display 151 is lowered. On the other hand, the screen brightness of the execution screen 530 on the second display 250 is maintained at its previous level.

In fig. 5e and 5f, the following operations correspond to a function of controlling power on and off of the second display 250 using a touch input to the first display 151.

First, referring to fig. 5e, when the dual screen close menu 525 displayed on the first display 151 is selected, the controller 180 of the mobile terminal transmits a control signal for switching the second display 250 to an inactive state to the second display 250 through the first and second wireless communication units 116 and 283. Thus, as shown in the lower part of fig. 5e, the second display 250 switches to an inactive state.

Then, the icon (hereinafter referred to as the "first icon") 520 indicating that a hidden menu exists, displayed on the first display 151, is switched to an icon (hereinafter referred to as the "second icon") 520' indicating a locked state.

In this manner, the second display 250 is switched to an inactive state based on an input to the first display 151. Accordingly, the supply of the current for operation supplied from the mobile terminal 100 to the second display 250 through the wiring provided on the housing 200 is interrupted.

However, because the memory stack for the screen information output on the second display 250 is allocated within the mobile terminal 100, the mobile terminal 100 still recognizes the task corresponding to the screen information that was being output on the second display 250.

In this manner, after the dual screen close menu 525 is selected, as shown in fig. 5f, when a touch input applied to the second icon 520' is dragged in a given direction (e.g., from an edge area of the first display 151 toward the center thereof), only a dual screen open menu 525' is displayed.

When a touch input is applied to the dual screen open menu 525', the controller 180 of the mobile terminal transmits a control signal for switching the second display 250 to an activated state to the second display 250 through the first and second wireless communication units 116 and 283. Then, the current for operation is supplied from the mobile terminal 100 back to the second display 250.

At this time, the screen information 530 that was output just before the second display 250 entered the inactive state is output back onto the second display 250. To this end, the controller 180 of the mobile terminal performs control such that the state of the tasks in the memory stack allocated to the second display 250 is maintained. In another example, unlike fig. 5f, a home screen may be displayed on the second display 250 switched to an activated state.
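The close/open behavior just described, in which power is cut but the second display's memory stack is deliberately preserved so the last screen reappears on reopening, can be modeled in a short sketch. Class and attribute names here are assumptions introduced for illustration.

```python
class DualScreenController:
    """Sketch of the dual screen close menu 525 and open menu 525': closing
    deactivates the second display and interrupts its operating current, but
    the memory stack allocated to it is kept intact in the terminal."""

    def __init__(self):
        self.second_stack = ["web_530"]   # stack allocated to the second display 250
        self.second_active = True
        self.current_supplied = True

    def close_dual_screen(self):
        # Deactivate and cut the operating current; the stack is not cleared.
        self.second_active = False
        self.current_supplied = False

    def open_dual_screen(self):
        # Restore current and reactivate; the preserved stack lets the screen
        # shown just before closing reappear on the second display.
        self.second_active = True
        self.current_supplied = True
        return self.second_stack[-1]


ctrl = DualScreenController()
ctrl.close_dual_screen()
restored = ctrl.open_dual_screen()   # the screen shown just before closing
```

Preserving the stack across the power cut is what distinguishes this from a cold start, where a home screen would be shown instead.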

In this manner, when the second display 250 is turned back on, the second icon 520' displayed on the first display 151 is switched back to the first icon 520 indicating that a hidden menu exists.

On the other hand, using a predetermined touch gesture instead of the icon 520, a first screen displayed on the first display 151 may be transmitted to the second display 250, or a second screen displayed on the second display 250 may be transmitted to the first display 151. The predetermined touch gesture here may be a multi-finger touch gesture.

For example, as shown in fig. 5g, in a state where the first screen information 540 is displayed on the first display 151 and the second screen information 530 is displayed on the second display 250, when the three-finger touch gesture applied to the second display 250 is dragged toward the first display 151, the second screen information 530 displayed on the second display 250 is transmitted to the first display 151. In other words, the task in the memory stack allocated to the second display 250 is moved to the memory stack of the first display 151.

Accordingly, an application corresponding to the first screen information 540 displayed on the first display 151 is located in the background, and the second screen information 530 is displayed on the first display 151. Then, a screen or a home screen of the application that is being executed in the background is displayed on the second display 250.

Similarly, as shown in fig. 5h, in a state where the first screen information 540 is displayed on the first display 151 and the second screen information 530 is displayed on the second display 250, when the three-finger touch gesture applied to the first display 151 is dragged toward the second display 250, the first screen information 540 displayed on the first display 151 is transmitted to the second display 250. In this case, the task in the storage stack of the first display 151 is moved to the storage stack allocated to the second display 250.

Accordingly, the application corresponding to the second screen information 530 displayed on the second display 250 is placed in the background, and the first screen information 540 is displayed on the second display 250. Then, a screen of an application being executed in the background, or the home screen, is displayed on the first display 151.
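The three-finger drag gestures of figs. 5g and 5h both reduce to moving the top task between the two per-display stacks, with a fallback for the display left behind. The following sketch uses assumed names and labels; it is a model of the described behavior, not the actual gesture handler.

```python
def three_finger_drag(stacks, source, target):
    """Sketch of the multi-finger gesture: move the top task from the source
    display's memory stack to the target display's stack. The display left
    behind falls back to its next task, or the home screen if none remains."""
    stacks[target].append(stacks[source].pop())
    if not stacks[source]:
        stacks[source].append("home_screen")
    return stacks[source][-1], stacks[target][-1]


# Fig. 5h scenario: drag from the first display toward the second display.
stacks = {"first": ["info_540"], "second": ["info_530"]}
on_first, on_second = three_finger_drag(stacks, "first", "second")
```

The task that was on the target display stays in that display's stack (now underneath), which matches the description of its application continuing in the background.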

In this way, the electronic device 300 according to the present disclosure uses the first display 151 provided in the combined mobile terminal 100 and the second display 250 provided in the housing, both controlled by the mobile terminal 100, to provide a plurality of independent display functions and, as necessary, an extended display function. Hereinafter, all of these functions are collectively referred to as the "extended display combining function".

In particular, according to the present disclosure, it is possible to input a message and simultaneously search for information to be shared using the plurality of independent display functions. Then, the search result may be shared immediately using the extended display area.

To this end, according to the present disclosure, a separate storage stack is allocated to the second display 250, but the control of the storage stack allocated to the second display 250 is performed by the mobile terminal.

In a state where first screen information including an input area is displayed on the first display 151 and second screen information is displayed on the second display 250, in response to receiving a touch input in the input area, the controller 180 of the mobile terminal according to the present disclosure controls the first wireless communication unit, the second wireless communication unit, the first display 151, and the second display 250, thereby capturing the second screen information displayed on the second display 250 and inserting the captured second screen information into the input area of the first screen information.

In this specification, the first wireless communication unit is provided on the mobile terminal side, and the second wireless communication unit is provided on the housing side. The first wireless communication unit and the second wireless communication unit are electrically coupled to the circuit board and the power supply unit through the wiring unit.

Referring to fig. 6, a method in which a screen currently being viewed is shared in an easier and faster manner using an extended display combining function will be described in detail below.

First, when it is detected that the electronic device according to the present disclosure is switched from the closed state to the open state, both the first display 151 and the second display 250 are activated, or only the first display 151 is activated. In the latter case, when a touch signal corresponding to a touch input applied to the second display 250 is transmitted to the controller of the mobile terminal, a control signal for switching the second display 250 to an activated state is generated. The generated control signal is transmitted to the second display 250 through the first wireless communication unit 116 on the mobile terminal 100 side and the second wireless communication unit 283 on the housing 200 side. Then, the current for operation is supplied to the second display 250 under the control of the controller of the mobile terminal 100.

In this manner, when both the first display 151 and the second display 250 are activated, different screen information respectively corresponding to the execution of different applications is respectively output on the first display 151 and the second display 250.

Referring to fig. 6, first screen information including an input area is displayed on the first display 151, and second screen information different from the first screen information is displayed on the second display 250. In this state, the controller of the mobile terminal detects that a predetermined touch input is received in the input area displayed on the first display 151 (S10).

The first screen information here is limited to an execution screen of an application that displays, stores, or transmits information input through the input area. For example, the first screen information is a message screen or an SNS execution screen on which a keyboard region is output.

Here, the second screen information is not particularly limited. For example, the second screen information may be a main screen, and may include all execution screens for a specific application (e.g., a gallery application, an SNS application, a camera application, a Web application, a map application, a moving picture application, a finance application, etc.) different from the first screen information.

On the other hand, in one example, the first screen information may be limited to an execution screen for an application that includes an input area and through which image attachment is possible. In addition, the second screen information may be limited to screens that can be captured, and for example, screens that cannot be captured for security or identification purposes (e.g., security screens, ticket purchasing screens, etc.) may be excluded from the second screen information.

In addition, the input area here may be disposed at a lower end of the first screen information, and may include at least a keyboard area. The input area may include a tool area and a display area in addition to the keyboard area.

The keypad area is formed to include a plurality of keys and receive a key input applied to at least one of the plurality of keys. In addition, the tool area is formed to have a plurality of icons, each of which changes the configuration of the keyboard area and provides additional functions associated with key input. In addition, the display area is formed such that an input result using the keyboard area or the tool area is displayed on the display area.

The predetermined touch input applied to the input area refers to, for example, a touch input applied to an icon included in the tool area. However, there is no limitation thereto; the predetermined touch input may be a predetermined touch gesture (e.g., a touch gesture of a given pattern or shape) received in the keyboard region and/or the tool region.

In this manner, when the predetermined touch input is received in the input area displayed on the first display 151, the controller 180 of the mobile terminal generates a control signal (hereinafter referred to as the "first control signal") for capturing the second screen information output on the second display 250. The controller of the mobile terminal transmits the first control signal to the second display 250 on the case 200 side through the first and second wireless communication units 116 and 283 (S20).

According to the first control signal, when the second screen information currently being output on the second display 250 is captured, data corresponding to the captured second screen information is transmitted to the controller 180 of the mobile terminal through the second wireless communication unit 283 and the first wireless communication unit 116.

In response to receiving the data corresponding to the captured second screen information, the controller 180 of the mobile terminal generates a control signal (hereinafter referred to as the "second control signal") for inserting an image (e.g., a thumbnail image) corresponding to the captured second screen information into the input area of the first display 151. The image corresponding to the captured second screen information is inserted into the input area of the first display 151 according to the second control signal (S30).

When a predetermined type of touch (e.g., a touch on a "transfer icon") is applied, the image inserted into the input area is immediately displayed on the first screen information and is transferred to the other party (a terminal participating in the conversation) for sharing. On the other hand, the captured second screen information is stored in the memory 170 of the mobile terminal 100 or a memory 270 on the case 200 side.
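Steps S10 to S30 of fig. 6 can be condensed into one end-to-end sketch. This is a schematic simulation with assumed names and tuple-based stand-ins for signals and images; the real flow involves the first and second control signals traveling over the wireless units 116 and 283.

```python
def capture_and_insert(second_screen, input_area_contents):
    """Sketch of steps S20-S30: the first control signal makes the second
    display capture its current screen; the returned capture data triggers a
    second control signal that inserts a thumbnail into the input area."""
    # S20: first control signal -> the second display captures its screen and
    #      the capture data is returned over units 283/116.
    captured = ("capture", second_screen)
    # S30: second control signal -> a thumbnail of the capture is inserted
    #      into the input area of the first display.
    thumbnail = ("thumbnail",) + captured
    input_area_contents.append(thumbnail)
    return thumbnail


input_area = []                                     # input area on the first display
thumb = capture_and_insert("web_530", input_area)   # triggered by the touch in S10
```

Note that nothing in this flow disturbs the input area's existing keyboard state, which is the usability point the disclosure emphasizes.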

In the related art, only an already-stored image can be searched for and attached while a message conversation is in progress. In addition, since the keyboard area for message input is used to search for the already-stored image, message input is interrupted, and the image search is only possible on a small screen. Further, in order to attach an image that has not been stored, the message conversation screen must be interrupted, a different application executed, and the desired information searched for; the information must then be captured or stored, after which the further steps of selecting an application to share it with, or of switching back to the previous message conversation screen, must be performed.

However, by performing the extended display combining function according to the embodiment of the present disclosure as described above, an image displayed on a different display area is captured and inserted as message content immediately, without going through these various steps. Accordingly, with a single input, information displayed on a different screen can be shared immediately with the other conversation party while the input area of the message conversation screen is kept as it is.

Each step in fig. 6 described above will be described in more detail below with reference to fig. 7a and 7 b.

Referring to fig. 7a, a message conversation screen 710 generated by execution of a message application is displayed on the first display 151 of the electronic device 300. Then, a thumbnail list screen 730 generated by execution of a gallery application is displayed on the second display 250 disposed on the housing side of the electronic device 300.

The second display 250 is controlled by the controller 180 of the mobile terminal 100 and is supplied with current for operation from the mobile terminal 100. However, a separate storage stack has been allocated for the second display 250, and thus, as described above, the second display 250 can be used independently of the first display 151.

An input area 740 for message input is output on the message conversation screen 710 displayed on the first display 151. The input area 740 is not always output; based on user input, the input area 740 may be output on the screen or may disappear from the screen.

An icon 750 for performing capture and insertion (capture & paste) of the image displayed on the second display 250 is displayed on the input area 740.

Although not shown, when a proximity touch on the icon 750 is detected, the controller of the mobile terminal controls the first display 151 so that guide information indicating the function to be performed pops up. Accordingly, the user can recognize the function corresponding to the icon 750.

The controller of the mobile terminal 100 determines differently whether the icon 750 included in the input area 740 is activated, according to the display state of the second display 250.

Specifically, when the second display 250 is in an inactive state, the icon 750 displayed on the input area 740 of the first display 151 is displayed in an inactive state. Here, the icon 750 being displayed in an inactive state means that a dotted outline, a blurring effect, or a darkening effect is applied to the icon 750 so that it is visually distinguished from icons in an activated state. When an input is applied to the icon 750 displayed in an inactive state, no operation is performed.

In another example, the icon 750 may be displayed in an activated state even when the second display 250 is in an inactive state. In this case, in response to a touch input applied to the icon 750, the second display 250 is switched to an activated state (i.e., the screen is turned on) and displays the home screen or given screen information, after which the capture is performed. To this end, the controller 180 of the mobile terminal 100 supplies the current for operation to the second display 250 according to the touch signal corresponding to the touch input applied to the icon 750, and transmits, through the first and second wireless communication units 116 and 283, a control signal for switching the second display 250 to an activated state and a first control signal for capturing the image to be subsequently displayed.

In addition, the controller of the mobile terminal 100 differently determines whether the icon 750 included in the input region 740 is activated according to the type of screen displayed on the second display 250 or the type of application being executed.

To this end, when the input area 740 is output on the first display 151, the controller 180 of the mobile terminal 100 determines whether the screen information displayed on the second display 250 is capturable. For example, when the input area 740 is output, if the screen information displayed on the second display 250 is non-capturable, the icon 750 included in the input area 740 is displayed in an inactive state.

In this case, however, when it is determined that the screen information displayed on the second display 250 has been switched to capturable screen information in response to an operation on the second display 250, the controller 180 may control the first display 151 such that the icon 750 included in the input area 740 is switched to an activated state and displayed.
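One possible decision rule for the icon state described above (the fig. 7a behavior, where the icon requires an active second display showing a capturable screen) can be sketched as follows. This models only that embodiment, with an assumed function name; the alternative embodiment described earlier keeps the icon active even while the second display is off.

```python
def icon_state(second_display_active, screen_capturable):
    """Sketch of how the controller 180 could decide the display state of the
    capture-and-insert icon 750 in the input area 740: it is activated only
    when the second display is active and its current screen is capturable
    (security screens and ticket purchase screens, for example, are not)."""
    if second_display_active and screen_capturable:
        return "activated"
    return "deactivated"   # shown dotted/blurred/darkened; inputs are ignored
```

Re-evaluating this rule whenever the second display's screen changes gives the dynamic activation/deactivation behavior the disclosure describes.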

When the capturable thumbnail list screen 730 is displayed on the second display 250 and a touch input is applied to the icon 750 displayed on the input area 740 of the first display 151, the thumbnail list screen 730 is captured at the point in time at which the touch input is applied.

Referring to the upper part of fig. 7b, a captured image 730' of the thumbnail list screen 730 is output on the second display 250. At this time, a tool region 780 for performing editing of the captured image is displayed at the lower end of the captured image 730' for a predetermined time. In addition, the input area 740 of the first display 151 maintains its previous display state, and particularly, the keypad area does not appear. At this time, a symbol for selection is displayed on the icon 750 displayed on the input area 740.

Subsequently, after a predetermined time (e.g., 0.5 to 1.5 seconds) elapses, the captured image 730' displayed on the second display 250 is inserted into the input area 740 of the first display 151.

In the lower part of fig. 7b, it can be seen that the captured image 730' is displayed on the display area 742 of the input area 740 of the first display 151, as a thumbnail image 730S. Along with this, the second display 250 returns from the captured image 730' to the previous screen, that is, the gallery thumbnail list screen 730. When the user touches the "transfer icon" within the display area 742, the inserted image 730S is immediately transferred and shared.

The operation within the mobile terminal 100 for capturing and pasting, at once, the screen information output on the second display 250 based on a touch input to the first display 151 is as follows.

When the input area of the first display 151 is output, the controller 180 of the mobile terminal 100 determines the display state of the second display 250 and whether the output screen information is capturable. When, as a result of the determination, the screen information output on the second display 250 is a capturable screen, the capture-and-paste icon included in the input area is displayed in an activated state. On the other hand, when, as a result of the determination, the screen information output on the second display 250 is a screen that cannot be captured (for example, a screen including unique identification information, such as a security screen or a ticket purchase screen), the first display 151 is controlled such that the capture-and-paste icon included in the input area is displayed in an inactive state.

In addition, the controller 180 of the mobile terminal 100 detects that a predetermined touch input is applied to the input area of the first display 151 (e.g., a touch applied to the capture-and-paste icon) and recognizes that the touch signal corresponding to the detected touch input is a signal generated by a touch input for capture and paste. At the point in time at which the touch input is applied (that is, at the point in time at which the touch signal is received), the controller 180 transmits a first control signal for capturing the screen information displayed on the second display 250 to the second display 250 through the first wireless communication unit 116 and then the second wireless communication unit 283.

According to the first control signal, the second display 250 performs the capturing of the screen information currently being displayed under the control of the controller 280. Then, the data associated with the captured image is transmitted to the controller 180 of the mobile terminal 100 through the second wireless communication unit 283 and then through the first wireless communication unit 116. Then, the controller 180 generates a second control signal for inserting a captured image corresponding to the received data into an input area of the first display 151 and provides the generated second control signal to the first display 151. According to the second control signal, the captured image is displayed on the input area of the first display 151. At this time, the newly input task (i.e., the original screen corresponding to the captured image) in the separately allocated storage stack is displayed on the second display 250.

In this way, according to the present disclosure, using only the extended-display combination function, screen information output on a different screen is immediately shared as message content while a message conversation is in progress. With a single touch input, the user can perform operations that would otherwise require several separate steps. Further, the display state of the keyboard region for message input and the display state of the screen information output on the other screen are maintained as before. Therefore, user convenience and usability are further improved.

Figs. 8a to 8c illustrate an embodiment obtained by adding an editing operation after the second screen information displayed on the second display 250 is captured but before it is inserted. In this case, the convenience that no additional input for insertion is required after editing is also provided.

Referring to fig. 8a, a message dialog screen 810 including an input region 840 is displayed on the first display 151 as first screen information. Screen information to be edited before sharing (e.g., an execution screen 830 of a map application), an image of which is to be shared on the message conversation screen of the first display 151, is displayed on the second display 250 as second screen information.

In this state, when an input is applied to the icon 850 for performing capture and insertion, displayed in the input region 840 of the first display 151, the above-described first control signal is transmitted to the second display 250 by the controller 180. Accordingly, the execution screen 830 of the map application displayed on the second display 250 is captured, and the captured image is displayed on the second display 250 for a predetermined time.

While the captured image is displayed on the second display 250 (i.e., within a predetermined time after the touch is applied to the icon 850), when a touch input is applied to the captured image displayed on the second display 250, the captured image is not immediately inserted into the input area 840 of the first display 151; instead, the "edit mode" is enabled.

For example, when a touch input is applied, within a predetermined time, to a "tool area" output on a lower portion of the captured image, the editing mode is enabled. When the fixed time elapses without such a touch, the tool region output in the lower part of the captured image disappears; at this point the editing mode can no longer be enabled, and the captured image is inserted into the first display 151 as it is. In this way, the captured image is inserted once a predetermined time has elapsed after the captured image including the tool region is output, which gives the user an opportunity to edit.
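The timing rule above can be sketched as a small decision function: a touch on the tool area within the window enables the edit mode, and anything else results in insertion as-is. The window length `EDIT_WINDOW_S` and the function name are illustrative assumptions; the disclosure only says "a predetermined time".

```python
EDIT_WINDOW_S = 3.0  # assumed length of the predetermined time, in seconds

def resolve_after_capture(touch_time_s, touched_tool_area):
    """Decide the outcome once the captured image (with its tool region) is shown.

    touch_time_s: seconds after capture at which the tool area was touched,
                  or None if it was never touched.
    """
    if touched_tool_area and touch_time_s is not None and touch_time_s <= EDIT_WINDOW_S:
        return "edit_mode"
    return "insert_as_is"
```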

In the edit mode, as shown in fig. 8b, an editing screen 830E for the captured image is displayed. The user may input additional information on the editing screen 830E through touch input, using the tool area 880 displayed at the lower end of the editing screen 830E. Thus, as shown in fig. 8b, an editing screen 830E into which additional information has been input so as to clearly identify a destination is generated in real time.

When a touch is applied to the edit completion icon 831 displayed on the editing screen 830E, the editing mode is disabled. Then, as shown in fig. 8c, the editing screen 830E on which editing has been completed is inserted into the display area 842 included in the input area 840 of the first display 151. Then, the screen information 803 that existed before the edit mode was enabled is output on the second display 250.

Here, the touch applied to the edit completion icon 831 generates a touch signal for disabling the edit mode. When the touch signal for disabling the editing mode is generated in this manner, the second control signal for inserting the editing screen into the first display 151 is generated by the controller 180 of the mobile terminal 100.

To this end, the controller 180 of the mobile terminal 100 first receives data associated with the captured image displayed on the second display 250 through the second wireless communication unit 283 and the first wireless communication unit 116. Then, when a touch input is applied to the tool region at the lower end of the captured image displayed on the second display 250, the controller 180 of the mobile terminal 100 receives a touch signal corresponding to the touch input and suspends insertion of the captured image. The controller 180 then transmits processing signals associated with drag inputs, corresponding to touch gestures performed on the second display 250, to the second display 250 through the first and second wireless communication units 116 and 283 until an editing end signal is received.

In addition, in response to receiving the edit end signal, the controller 180 inserts the editing screen on which editing has been completed into the input area of the first display 151. Since the image processing associated with generation of the editing screen while the captured image is edited on the second display 250 is performed by the controller 180 of the mobile terminal 100, the controller 180 already holds the editing screen when the edit end signal is received. Therefore, the touch input on the edit completion icon 831 does not amount to an additional input for inserting the editing screen.
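The edit session above, in which insertion is suspended, drag inputs are relayed until the edit-end signal arrives, and the edited screen is then inserted, can be sketched as an event loop. The event tuples and the function name are assumptions made for illustration.

```python
def run_edit_session(events):
    """Process editing events until the edit-end signal.

    events: iterable of ("drag", payload) or ("edit_end", edited_screen).
    Returns the edited screen to insert and the drag signals that were
    forwarded to the second display in the meantime.
    """
    forwarded = []
    for kind, payload in events:
        if kind == "drag":
            forwarded.append(payload)  # relayed over the wireless link
        elif kind == "edit_end":
            return {"inserted": payload, "forwarded_drags": forwarded}
    # No edit-end signal yet: insertion remains suspended.
    return {"inserted": None, "forwarded_drags": forwarded}
```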

When the editing screen 830E is attached as a thumbnail 830s to the display area 842 within the input area 840 of the first display 151, the editing screen 830E displayed on the second display 250 is restored to the screen before capture (i.e., the execution screen 830 of the map application). At this time, the editing screen 830E is stored in the memory or the like of the mobile terminal by the controller 180.

On the other hand, in another embodiment, editing may be performed before the capturing and inserting are performed. In this case, editing work on the second display 250 is performed independently of editing work on the first display 151. Accordingly, the user can perform editing on the first display 151 and editing on the second display 250 alternately or simultaneously.

In this way, according to the present disclosure, capture and insertion are performed with one touch input while a message conversation is in progress, actively using the extended-display combination function. In addition, quick editing is performed on the captured image after capture but before insertion, so that additional information is inserted in an easier and faster manner.

On the other hand, not all screen information displayed on the second display 250 can be captured for insertion into the first display 151. For this reason, the controller 180 of the mobile terminal 100 determines whether the screen information displayed on the second display 250 is a capturable screen. The point in time for this determination is the point at which an input region 940 (i.e., a keyboard region) is output on the first display 151.

When the input region 940 is output on the first display 151 and it is determined that the screen information displayed on the second display 250 is a non-capturable screen, the controller 180 performs control such that the icon 950' for performing capture and insertion is displayed in an inactive state.

On the other hand, in one example, when it is detected that the screen information displayed on the second display 250 is switched to a capturable screen while the input region 940 is displayed on the first display 151, the controller 180 switches the icon 950' displayed in the inactive state to the active state.

Whether the screen information 930 displayed on the second display 250 is a non-capturable screen is determined based on the type of the screen information 930 or the type of the corresponding application. For example, a financial application, a learning application, a ticket purchase screen, a QR screen, and the like are determined to be screens that cannot be captured. This determination is made by the controller 180 of the mobile terminal 100 when the input region 940 is output on the first display 151.

In addition, according to an embodiment, when input for capture and insertion is repeatedly performed while a non-capturable screen is displayed on the second display 250, the controller 180 of the mobile terminal outputs guide information indicating that the screen cannot be captured. Specifically, the icon 950' for performing capture and insertion is displayed in an inactive state in the input area of the first display 151. When touch inputs on the icon 950' are then detected a predetermined number of times or more in succession, as shown in fig. 9b, guide information 990 indicating that the screen cannot be captured pops up on the second display 250. After a predetermined time elapses, the pop-up guide information 990 disappears.

The operation within the mobile terminal 100 associated therewith is as follows.

The controller 180 of the mobile terminal 100 receives touch signals corresponding to touch inputs successively applied to the icon displayed in the inactive state in the input area of the first display 151. The controller 180 generates a control signal (hereinafter referred to as a "third control signal") for outputting guide information (or alarm information) indicating that the screen information displayed on the second display 250 cannot be captured. The third control signal is transmitted to the second display 250 through the first and second wireless communication units 116 and 283. According to the third control signal, alarm information indicating that the currently displayed screen information cannot be captured is output on the second display 250.
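The rule above, that repeated touches on the inactive icon eventually trigger the third control signal and the pop-up guidance, can be sketched as a small counter. The threshold value and class name are assumptions; the disclosure says only "a predetermined number of times or more".

```python
POPUP_THRESHOLD = 3  # assumed value of the "predetermined number of times"

class InactiveIconWatcher:
    """Counts consecutive touches on the inactive capture-and-insert icon."""
    def __init__(self):
        self.count = 0

    def on_touch(self):
        self.count += 1
        if self.count >= POPUP_THRESHOLD:
            self.count = 0  # reset after the guidance is shown
            return "third_control_signal"  # -> pop up guide information 990
        return None
```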

On the other hand, in another example, the guide information 990 is output on an edge region (e.g., an upper end or a lower end) of the second display 250, or is displayed within the input region of the first display 151 and then disappears, thereby minimizing occlusion of the current display state.

In addition, even in the case where the screen information displayed on the second display 250 is a non-capturable screen, message input can still be performed through the first display 151 while the screen information displayed on the second display 250 is viewed.

Figs. 10a to 10e illustrate an embodiment in which a screen viewed on the first display 151 is transferred to the second display 250, and the transferred screen is then captured and immediately inserted into a message. In this way, the user can continue to watch the currently displayed screen without interruption and, at the same time, immediately share the screen as a still image with a third party.

Referring to fig. 10a, first screen information (e.g., a moving picture reproduction screen 1010) is displayed on the first display 151, and the second display 250 is in a deactivated state (an "initial state"), or a home screen is displayed on the second display 250. The user may want to share the first screen information displayed on the first display 151 with a third party, through a message or the like, while viewing it. In this case, the first screen information is transferred to the second display 250 and used in the same way as second screen information.

To this end, as shown in fig. 10b, when the screen export menu 522 is selected from among the menus 521, 522, 523, 524, and 525 displayed on the first display 151 for collectively controlling the second display 250, the controller 180 of the mobile terminal 100 generates a control signal for transferring the moving picture reproduction screen 1010 being reproduced on the first display 151. Then, the controller 180 moves the corresponding task (e.g., moving picture reproduction) from the storage stack for the first display 151 to the separate storage stack allocated to the second display 250.
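The screen-export step, moving the reproduction task from the first display's storage stack to the stack allocated to the second display, can be sketched with plain lists acting as the two stacks. The function name and the "home_screen" sentinel are assumptions made for illustration.

```python
def export_screen(first_stack, second_stack):
    """Move the top task of the first display's stack to the second display's stack.

    Returns what the first display shows afterwards: the next background task
    if one exists, otherwise the home screen.
    """
    task = first_stack.pop()       # e.g., the moving-picture reproduction task
    second_stack.append(task)      # now owned by the second display's stack
    return first_stack[-1] if first_stack else "home_screen"
```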

In addition, the controller 180 of the mobile terminal transmits, to the second display 250, the second screen information to be displayed on the second display 250 (here, the first screen information transferred from the first display 151) and data related to execution of the application corresponding to the second screen information.

At this time, the screen information 1010 that was being reproduced on the first display 151 is reproduced on the second display 250. Then, when no other task is running in the background on the first display 151, as shown in fig. 10c, a home screen 1020 is output on the first display 151.

When the icon 1021 for a message application is selected from the home screen 1020 displayed on the first display 151 and a specific dialog screen is displayed, as shown in fig. 10d, a dialog screen 1030 including an input area 1040 is output on the first display 151. At this time, the icon 1050 for capturing and inserting the screen information 1010 reproduced on the second display 250 is displayed in an activated state in the input area 1040. Meanwhile, the screen information 1010 continues to be reproduced on the second display 250.

When a touch is applied to the icon 1050 displayed in the input area 1040, the controller 180 of the mobile terminal 100 captures the scene on the second display 250 at the point in time when the touch is applied and, as shown in fig. 10e, attaches the captured scene to the display area 1042 of the input area 1040 of the first display 151. After the capture, reproduction on the second display 250 continues from the captured scene. Accordingly, the user can carry on a message conversation through the first display 151 without any interruption while continuing to view the screen information 1010 reproduced on the second display 250.
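Capturing a still scene from ongoing playback without interrupting it can be sketched as follows. The `Playback` class and its frame list are purely illustrative assumptions standing in for the moving-picture reproduction on the second display.

```python
class Playback:
    """Stand-in for moving-picture reproduction on the second display."""
    def __init__(self, frames):
        self.frames = frames
        self.position = 0

    def tick(self):
        # Advance reproduction by one frame.
        self.position += 1

    def capture(self):
        # Still image of the scene at the moment the touch is applied;
        # reproduction then continues from this same scene.
        return self.frames[self.position]
```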

On the other hand, the embodiments of the present disclosure described above also find wide application in cases where the first display 151 and the second display 250 are not structurally independent of each other but one display is divided in software, or where one display is folded so as to be divided into a plurality of display areas.

As described above, an electronic device including a mobile terminal according to the present disclosure includes a case including a first body combined with the mobile terminal and a second body provided with a display. A wireless communication unit capable of wireless communication with the mobile terminal is provided, and the display provided in the second body transmits and receives control signals and data to and from the mobile terminal through the wireless communication unit provided in the case. In addition, according to the present disclosure, using only the extended-display combination function, screen information output on a different screen is immediately shared as message content while a message conversation is in progress. With a single touch input, the user can perform operations that would otherwise require several separate steps. Further, the display state of the keyboard region for message input and the display state of the screen information output on the other screen are maintained as before. Therefore, user convenience and usability are further improved.
