Method and device for controlling terminal display and computer readable medium

Document No.: 1956878    Publication date: 2021-12-10

Note: this technology, "Method and device for controlling terminal display and computer readable medium" (控制终端显示的方法、装置以及计算机可读介质), was designed and created by 袁檀 on 2020-07-03. Its main content is as follows: The invention discloses a method and a device for controlling terminal display and a computer readable medium, and relates to the field of communication technology. The method comprises the following steps: when it is detected that the display screen starts to execute an operation, acquiring the sensitivity of the camera to detect whether the brightness of the external environment changes within a first preset time; when it is detected that the brightness of the external environment has changed within the first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen; acquiring target image data shot by the camera; when a human face is detected in the target image data, determining the distance between the face and the display screen according to the target image data; and when it is detected that the distance between the face and the display screen has changed within a second preset time, adjusting the font size of the display screen according to the distance between the face and the display screen. This embodiment can relieve the user's eye fatigue.

1. A method for controlling a terminal display, comprising:

when it is detected that the display screen starts to execute an operation, acquiring the sensitivity of the camera to detect whether the brightness of the external environment changes within a first preset time;

when it is detected that the brightness of the external environment has changed within the first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

acquiring target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

when a human face is detected in the target image data, determining the distance between the human face and the display screen according to the target image data;

and when it is detected that the distance between the human face and the display screen has changed within a second preset time, adjusting the font size of the display screen according to the distance between the human face and the display screen.

2. The method according to claim 1, wherein the adjusting the display brightness of the display screen according to the sensitivity of the camera and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen comprises:

when it is detected that the brightness of the external environment has changed within the first preset time, determining whether the sensitivity of the camera is within a first preset range;

if the sensitivity of the camera is not within the first preset range, adjusting parameters of the camera, and adjusting the display brightness of the display screen according to the sensitivity of the camera in the external environment and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

and if the sensitivity of the camera is within the first preset range, or after the display brightness of the display screen has been adjusted, executing the step of acquiring the target image data shot by the camera.

3. The method for controlling terminal display according to claim 1 or 2, wherein the acquiring target image data shot by the camera comprises:

acquiring candidate image data shot by the camera according to a preset acquisition strategy;

determining the sharpness of each piece of candidate image data;

determining whether the sharpness of each piece of candidate image data is within a second preset range;

if the sharpness of first candidate image data is not within the second preset range but is within a third preset range, adjusting the sharpness of the first candidate image data, and then continuing to execute the step of determining whether the sharpness of each piece of candidate image data is within the second preset range;

if the sharpness of the first candidate image data is within the second preset range, determining the image brightness of the first candidate image data;

determining whether the image brightness of the first candidate image data is within a fourth preset range;

if the image brightness of the first candidate image data is not within the fourth preset range, adjusting the image brightness of the first candidate image data until it is within the fourth preset range;

and if the image brightness of the first candidate image data is within the fourth preset range, determining the first candidate image data as the target image data, and acquiring the first candidate image data.

5. The method of claim 3, wherein the determining the sharpness of each piece of candidate image data comprises:

determining the sharpness of each piece of candidate image data using an open-source computer vision library and a Laplacian variance algorithm.

5. The method for controlling terminal display according to claim 3, wherein the determining the distance between the human face and the display screen according to the target image data comprises:

identifying, according to the target image data, the interpupillary distance of the human face in the target image data through a face recognition algorithm;

and determining the distance between the human face and the display screen according to the interpupillary distance of the human face in the target image data.

6. The method for controlling terminal display according to claim 5, wherein the identifying, according to the target image data, the interpupillary distance of the human face in the target image data through a face recognition algorithm comprises:

determining the position of the human face in the target image data according to the target image data;

cropping image data of a designated area from the target image data according to the position of the human face in the target image data, wherein the designated area is a preset area centered on the human face;

and identifying the interpupillary distance of the human face in the target image data through the face recognition algorithm according to the image data of the designated area.

7. The method for controlling terminal display according to claim 1, wherein after the step of adjusting the font size of the display screen according to the distance between the human face and the display screen, the method further comprises:

if the distance between the human face and the display screen has not changed within the second preset time, or after the font size of the display screen has been adjusted, acquiring the duration of the operation and the current time;

determining whether the duration of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range;

and if the duration of the operation exceeds the fifth preset range or the current time exceeds the sixth preset range, sending a prompt message.

8. The method of controlling a terminal display according to claim 1, further comprising:

when detecting that the display screen starts to execute the operation, adjusting the background color of the display screen to a designated color or displaying an interface for the user to select the background color.

9. An apparatus for controlling terminal display, comprising:

a first acquisition submodule, configured to acquire the sensitivity of the camera when it is detected that the display screen starts to execute an operation, so as to detect whether the brightness of the external environment changes within a first preset time;

a first adjustment submodule, configured to adjust the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen when it is detected that the brightness of the external environment has changed within the first preset time;

a second acquisition submodule, configured to acquire target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

a determination submodule, configured to determine the distance between the human face and the display screen according to the target image data when a human face is detected in the target image data;

and a second adjustment submodule, configured to adjust the font size of the display screen according to the distance between the human face and the display screen when it is detected that the distance between the human face and the display screen has changed within a second preset time.

10. An apparatus for controlling a display of a terminal, comprising:

one or more processors;

a storage device for storing one or more programs,

wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.

11. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-8.

Technical Field

The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for controlling terminal display, and a computer readable medium.

Background

With the continuous development and progress of science and technology, different types of terminals (such as mobile phones) have become an indispensable part of daily life. Although a terminal can make life more convenient for the user, the user may experience eyestrain when using the terminal for a long time, particularly a user who is immersed in a mobile game or an online novel.

Disclosure of Invention

In view of this, embodiments of the present invention provide a method, an apparatus, and a computer-readable medium for controlling a terminal display, which can solve the problem of eye fatigue of a user when the user uses the terminal for a long time.

To achieve the above object, according to an aspect of an embodiment of the present invention, there is provided a method of controlling a display of a terminal.

The method for controlling the display of the terminal comprises the following steps:

when it is detected that the display screen starts to execute an operation, acquiring the sensitivity of the camera to detect whether the brightness of the external environment changes within a first preset time;

when it is detected that the brightness of the external environment has changed within the first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

acquiring target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

when a human face is detected in the target image data, determining the distance between the human face and the display screen according to the target image data;

and when it is detected that the distance between the human face and the display screen has changed within a second preset time, adjusting the font size of the display screen according to the distance between the human face and the display screen.

Optionally, the adjusting the display brightness of the display screen according to the sensitivity of the camera and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen includes:

when it is detected that the brightness of the external environment has changed within the first preset time, determining whether the sensitivity of the camera is within a first preset range;

if the sensitivity of the camera is not within the first preset range, adjusting parameters of the camera, and adjusting the display brightness of the display screen according to the sensitivity of the camera in the external environment and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

and if the sensitivity of the camera is within the first preset range, or after the display brightness of the display screen has been adjusted, executing the step of acquiring the target image data shot by the camera.

Optionally, the acquiring target image data shot by the camera includes:

acquiring candidate image data shot by the camera according to a preset acquisition strategy;

determining the sharpness of each piece of candidate image data;

determining whether the sharpness of each piece of candidate image data is within a second preset range;

if the sharpness of first candidate image data is not within the second preset range but is within a third preset range, adjusting the sharpness of the first candidate image data, and then continuing to execute the step of determining whether the sharpness of each piece of candidate image data is within the second preset range;

if the sharpness of the first candidate image data is within the second preset range, determining the image brightness of the first candidate image data;

determining whether the image brightness of the first candidate image data is within a fourth preset range;

if the image brightness of the first candidate image data is not within the fourth preset range, adjusting the image brightness of the first candidate image data until it is within the fourth preset range;

and if the image brightness of the first candidate image data is within the fourth preset range, determining the first candidate image data as the target image data, and acquiring the first candidate image data.

Optionally, the determining the sharpness of each piece of candidate image data includes:

determining the sharpness of each piece of candidate image data using an open-source computer vision library and a Laplacian variance algorithm.

Optionally, the determining the distance between the human face and the display screen according to the target image data includes:

identifying, according to the target image data, the interpupillary distance of the human face in the target image data through a face recognition algorithm;

and determining the distance between the human face and the display screen according to the interpupillary distance of the human face in the target image data.

Optionally, the identifying, according to the target image data, the interpupillary distance of the human face in the target image data through a face recognition algorithm includes:

determining the position of the human face in the target image data according to the target image data;

cropping image data of a designated area from the target image data according to the position of the human face in the target image data, wherein the designated area is a preset area centered on the human face;

and identifying the interpupillary distance of the human face in the target image data through the face recognition algorithm according to the image data of the designated area.

Optionally, after the step of adjusting the font size of the display screen according to the distance between the human face and the display screen, the method for controlling terminal display further includes:

if the distance between the human face and the display screen has not changed within the second preset time, or after the font size of the display screen has been adjusted, acquiring the duration of the operation and the current time;

determining whether the duration of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range;

and if the duration of the operation exceeds the fifth preset range or the current time exceeds the sixth preset range, sending a prompt message.

Optionally, the method further comprises:

when detecting that the display screen starts to execute the operation, adjusting the background color of the display screen to a designated color or displaying an interface for the user to select the background color.

To achieve the above object, according to another aspect of an embodiment of the present invention, there is provided an apparatus for controlling a display of a terminal.

The device for controlling the display of the terminal comprises the following components:

a first acquisition submodule, configured to acquire the sensitivity of the camera when it is detected that the display screen starts to execute an operation, so as to detect whether the brightness of the external environment changes within a first preset time;

a first adjustment submodule, configured to adjust the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen when it is detected that the brightness of the external environment has changed within the first preset time;

a second acquisition submodule, configured to acquire target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

a determination submodule, configured to determine the distance between the human face and the display screen according to the target image data when a human face is detected in the target image data;

and a second adjustment submodule, configured to adjust the font size of the display screen according to the distance between the human face and the display screen when it is detected that the distance between the human face and the display screen has changed within a second preset time.

To achieve the above object, according to another aspect of an embodiment of the present invention, there is provided an apparatus for controlling a display of a terminal.

The device for controlling the display of the terminal comprises the following components:

one or more processors;

a storage device for storing one or more programs,

wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described above.

To achieve the above object, according to still another aspect of an embodiment of the present invention, there is provided a computer-readable medium.

A computer-readable medium of an embodiment of the invention has stored thereon a computer program which, when executed by a processor, implements the method as described above.

One embodiment of the above invention has the following advantages or benefits:

in the embodiment of the present invention, the method for controlling terminal display is implemented based on OpenCV (a cross-platform computer vision library) and the functional characteristics of the camera. The method can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the human face and the display screen. In this way, the method can reduce the harm to the eyes caused by staring at the terminal display screen for a long time and can relieve the user's eye fatigue.

Further effects of the above non-conventional optional implementations will be described below in connection with the embodiments.

Drawings

The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:

Fig. 1 is a first schematic diagram of the main flow of a method for controlling terminal display according to an embodiment of the present invention;

Fig. 2 is a second schematic diagram of the main flow of a method for controlling terminal display according to an embodiment of the present invention;

Fig. 3 is a third schematic diagram of the main flow of a method for controlling terminal display according to an embodiment of the present invention;

Fig. 4 is a first schematic diagram of the main modules of an apparatus for controlling terminal display according to an embodiment of the present invention;

Fig. 5 is a second schematic diagram of the main modules of an apparatus for controlling terminal display according to an embodiment of the present invention;

Fig. 6 is an exemplary system architecture diagram to which embodiments of the present invention may be applied;

Fig. 7 is a schematic block diagram of a computer system suitable for implementing a terminal device or a server of an embodiment of the present invention.

Detailed Description

Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.

Currently, there are various ways to reduce the harm a terminal causes to the user's eyes; two of them are described below:

The first way: reducing the blue light emitted by the terminal, thereby reducing the harm of the terminal to the user's eyes. However, this approach cannot resolve the visual fatigue caused by gazing at the display screen for a long time, especially at night when the contrast between the screen and the environment is large. Moreover, the materials used to manufacture displays with reduced blue-light emission are expensive, making the cost too high for users of moderate means.

The second way: controlling the user's usage time. After the user has used the terminal for a certain time, the user is reminded that the usage time is too long and the eyes need to rest, thereby reducing the time the user spends on the terminal and the harm to the user's eyes. For users who need to look at a terminal for long periods at work, or who are engrossed in mobile games or novels, this approach obviously cannot reduce the harm caused by the terminal.

Although the above two approaches can reduce the damage of the terminal to the user's eyes to a certain extent, they still have shortcomings, so a more convenient, inexpensive and effective eye-protection method is still needed.

Based on the above analysis, the embodiment of the present invention provides a method for controlling terminal display that aims to relieve the user's visual fatigue. Using currently available terminals and IT technology, it addresses, in a relatively inexpensive way, the problem that the user's eyes are harmed when the terminal is used for a long time, relieving the visual fatigue produced by using the terminal and reducing the harm to the eyes caused by staring at the terminal. The method can take appropriate measures to make up for the shortcomings of the first approach in a dark environment, and can also apply settings on the terminal that actually reduce the damage of the display screen to the eyes, rather than merely reminding the user that the terminal has been used for too long as in the second approach. The method for controlling terminal display can therefore effectively relieve the visual fatigue caused by long-term terminal use and reduce the harm of the terminal to the user's eyes.

Fig. 1 is a schematic diagram of a main flow of a method for controlling a terminal display according to an embodiment of the present invention, and as shown in fig. 1, the method for controlling a terminal display is applied to a terminal, and the terminal at least includes: the camera comprises a camera and a display screen, wherein the shooting direction of the camera is consistent with the display direction of the display screen. The method for controlling the display of the terminal specifically comprises the following steps:

step 101: when the detection display screen starts to execute operation, acquiring the sensitivity of the camera to detect whether the brightness degree of the external environment changes within a first preset time; when detecting that the brightness level of the external environment changes within a first preset time, executing step 102; when it is detected that the brightness level of the external environment does not change within the first preset time, step 101 is repeatedly executed.

In step 101, the operation may be one or more of the following: clicking, sliding, playing video and/or displaying web page. The sensitivity (ISO) of the camera refers to the sensitivity of the camera to the intensity of light, and the sensitivity of the camera can be used for reflecting the brightness of the external environment. It can be understood that there is a corresponding relationship between the sensitivity of the camera and the brightness of the external environment, and whether the brightness of the external environment changes within the first preset time can be known through the sensitivity of the camera. Wherein, the external environment may be understood as the external environment in which the terminal is currently located. The first preset time can be understood as a detection period, and a value of the first preset time can be determined according to detection requirements.
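Purely as an illustration (the patent does not prescribe any particular API), the "did the brightness of the environment change within the first preset time" check could be sketched in Python as follows; `read_camera_iso`, the detection period and the change threshold are assumed placeholders, not part of the patent:

```python
import time

FIRST_PRESET_TIME_S = 5.0       # assumed detection period
ISO_CHANGE_THRESHOLD = 100.0    # assumed minimum ISO change that counts as a brightness change

def environment_changed(read_camera_iso) -> bool:
    """Sample the camera ISO at the start and end of the first preset time and compare.

    `read_camera_iso` is a platform-specific callable (e.g. wrapping a camera API's
    sensitivity reading); it is an assumed hook, not defined by the patent.
    """
    iso_before = read_camera_iso()
    time.sleep(FIRST_PRESET_TIME_S)
    iso_after = read_camera_iso()
    return abs(iso_after - iso_before) > ISO_CHANGE_THRESHOLD

# Example with a stubbed ISO reader simulating a bright-to-dark transition:
if __name__ == "__main__":
    readings = iter([120.0, 900.0])
    print(environment_changed(lambda: next(readings)))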

Step 102: when it is detected that the brightness of the external environment has changed within the first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

In step 102, the display brightness of the display screen may be adjusted according to a fitting formula between the sensitivity of the camera and the display brightness of the display screen; the fitting formula represents the correspondence between the sensitivity of the camera and the display brightness of the display screen, and is as follows:

y₁ = k₁ · g(x₁)

where x₁ represents the sensitivity of the camera; y₁ represents the display brightness of the display screen corresponding to each sensitivity; and k₁ is a proportionality coefficient with a fixed value.

When determining the fitting formula between the sensitivity of the camera and the display brightness of the display screen, first acquire a preset number of different sensitivities and the target display brightness of the display screen corresponding to each sensitivity; then obtain the fitting formula from these sensitivities and the corresponding target display brightness values. The target display brightness can be understood as the optimal display brightness corresponding to each sensitivity, and it can be obtained in advance through experiments.

For example: the display brightness of the display screen ranges from 0 to 100 and can be divided into five levels: dark, dim, weak light, normal and strong light, with display brightness values of (20, 40, 60, 80, 100) respectively. Correspondingly, the camera sensitivity values corresponding to these brightness levels are (2000, 1000, 200, 100, 30). By fitting the sensitivities corresponding to the different brightness levels, the correspondence between the sensitivity of the camera and the display brightness of the display screen, namely y₁ = k₁ · g(x₁), can be obtained.
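For illustration only, a minimal Python sketch of this correspondence, using the five example (sensitivity, brightness) pairs above and a simple log-ISO interpolation as a stand-in for the fitted g(x₁); the interpolation choice is an assumption, not the patent's fit:

```python
import numpy as np

# Calibration points from the example above: camera ISO -> target display brightness (0-100).
ISO_POINTS = np.array([30, 100, 200, 1000, 2000], dtype=float)
BRIGHTNESS_POINTS = np.array([100, 80, 60, 40, 20], dtype=float)

def display_brightness_for_iso(iso: float) -> float:
    """Approximate y1 = k1*g(x1) by interpolating the calibration points on a log(ISO) axis."""
    iso = float(np.clip(iso, ISO_POINTS[0], ISO_POINTS[-1]))
    return float(np.interp(np.log(iso), np.log(ISO_POINTS), BRIGHTNESS_POINTS))

if __name__ == "__main__":
    for iso in (50, 150, 500, 1500):
        print(iso, round(display_brightness_for_iso(iso), 1))
```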

It can be understood that, through step 102, the display brightness of the display screen changes with the brightness of the external environment. For example: if the external environment becomes brighter, the display brightness of the display screen is increased; if the external environment becomes darker, the display brightness is reduced. This avoids the screen being glaringly bright in a dark environment and being too dim to read in a bright environment, and thus relieves the user's eyestrain.

In addition, in order to obtain the target image data in step 103, before adjusting the display brightness of the display screen, it may be first determined whether the sensitivity of the camera is within a first preset range; if the sensitivity of the camera is not within a first preset range, adjusting parameters of the camera, and adjusting the display brightness of the display screen according to the sensitivity of the camera in an external environment and a preset corresponding relation between the sensitivity of the camera and the display brightness of the display screen; if the sensitivity of the camera is within a first preset range or after adjusting the parameters of the camera, step 103 is executed.

It should be noted that the first preset range is used to determine whether the brightness level of the external environment where the terminal is located is within an ideal range. It can be understood that, if the sensitivity of the camera is not within the first preset range, it indicates that the brightness of the external environment is not satisfactory, which may be unfavorable to both eyes of the user, and may even affect the acquisition of the target image data in step 103, and the display brightness of the display screen and the parameters of the camera may be adjusted. For example: the light supplementing rate of the camera can be adjusted, so that the target image data can be conveniently acquired.

Step 103: acquiring target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

In step 103, the target image data may specifically be image data whose sharpness is within a second preset range and whose image brightness is within a fourth preset range, and the image data may be a picture, a video or similar data. The sharpness of the target image data refers to how clearly each detail and its boundaries are rendered, and it reflects the image quality of the target image data. The image brightness of the target image data refers to how bright or dark the picture is. The second preset range is a threshold for the sharpness of the image data, the fourth preset range is a threshold for the image brightness of the image data, and the values of the second and fourth preset ranges may be determined as needed.

When acquiring the target image data shot by the camera, candidate image data shot by the camera can first be collected according to a preset acquisition strategy; the sharpness of each piece of candidate image data is then determined, and it is checked whether the sharpness of each piece of candidate image data is within the second preset range. If the sharpness of first candidate image data is not within the second preset range but is within a third preset range, the sharpness of the first candidate image data is adjusted and the sharpness check is performed again. If the sharpness of the first candidate image data is within the second preset range, its image brightness is determined and checked against the fourth preset range. If the image brightness is not within the fourth preset range, it is adjusted until it falls within that range. If the image brightness of the first candidate image data is within the fourth preset range, the first candidate image data is determined to be the target image data and is acquired.
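As a non-authoritative sketch of the filtering loop just described (the threshold values and the enhancement/brightness-correction calls are assumed choices for illustration, not the patent's implementation):

```python
import cv2

SHARPNESS_MIN = 100.0              # assumed lower bound of the "second preset range" (Laplacian variance)
RECOVERABLE_MIN = 50.0             # assumed lower bound of the "third preset range"
BRIGHTNESS_RANGE = (60.0, 200.0)   # assumed "fourth preset range" (mean gray level)

def sharpness(img_bgr) -> float:
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def brightness(img_bgr) -> float:
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    return float(gray.mean())

def select_target_image(candidates):
    """Return the first candidate whose sharpness and brightness satisfy the preset ranges."""
    for img in candidates:
        s = sharpness(img)
        if s < SHARPNESS_MIN:
            if s >= RECOVERABLE_MIN:
                img = cv2.detailEnhance(img)          # try to sharpen, then re-check
                if sharpness(img) < SHARPNESS_MIN:
                    continue
            else:
                continue                              # too blurry to recover: discard
        b = brightness(img)
        if not (BRIGHTNESS_RANGE[0] <= b <= BRIGHTNESS_RANGE[1]):
            target = 0.5 * (BRIGHTNESS_RANGE[0] + BRIGHTNESS_RANGE[1])
            img = cv2.convertScaleAbs(img, alpha=1.0, beta=target - b)   # crude brightness shift
        return img
    return None
```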

Step 104: when a human face is detected in the target image data, determining the distance between the human face and a display screen according to the target image data;

In step 104, the interpupillary distance of the human face in the target image data is first identified through a face recognition algorithm according to the target image data; the distance between the human face and the display screen is then determined according to that interpupillary distance.

When identifying the interpupillary distance of the face in the target image data, first determine the position of the face in the target image data; then, according to that position, crop the image data of a designated area from the target image data, the designated area being a preset area centered on the face; finally, identify the interpupillary distance of the face through the face recognition algorithm based on the image data of the designated area.
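Purely as an illustration of one way to obtain the interpupillary distance in pixels (the patent does not name a specific detector), using OpenCV's bundled Haar cascades; the crop margin is an assumed value:

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def interpupillary_distance_px(img_bgr, margin=0.2):
    """Detect a face, crop a designated area around it, and return the pixel distance
    between the centers of the two detected eyes (None if detection fails)."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Designated area: the face rectangle expanded by an assumed margin.
    x0, y0 = max(0, int(x - margin * w)), max(0, int(y - margin * h))
    roi = gray[y0:int(y + (1 + margin) * h), x0:int(x + (1 + margin) * w)]
    eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    # Take the leftmost and rightmost detections as the two eye centers.
    centers = sorted((ex + ew / 2.0, ey + eh / 2.0) for ex, ey, ew, eh in eyes)
    (ax, ay), (bx, by) = centers[0], centers[-1]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
```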

Then, the distance between the human face and the display screen can be determined from the interpupillary distance of the face in the target image data according to the following formula:

y₂ = k₂ · x₂ + b₂

where y₂ represents the interpupillary distance of the human face in the target image data; k₂ is a proportionality coefficient; x₂ represents the distance between the human face and the display screen; and b₂ is a fixed value representing the interpupillary distance when the distance between the face and the display screen is at its minimum.
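A minimal sketch of inverting that linear relationship; the coefficient values are assumed for illustration only and would come from calibration in practice:

```python
# Assumed calibration of y2 = k2*x2 + b2 (interpupillary distance in pixels vs. distance in cm).
K2 = -4.0     # assumed slope: the pixel pupil distance shrinks as the face moves away
B2 = 260.0    # assumed intercept b2 (pixels)

def face_screen_distance_cm(pupil_distance_px: float) -> float:
    """Invert y2 = k2*x2 + b2 to recover the face-to-screen distance x2 from the pupil distance y2."""
    return (pupil_distance_px - B2) / K2

# e.g. a pupil distance of 100 px gives (100 - 260) / -4 = 40 cm
```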

Step 105: when it is detected that the distance between the face and the display screen has changed within a second preset time, adjusting the font size of the display screen according to the distance between the face and the display screen.

In step 105, it is first determined whether the distance between the face and the display screen has changed within the second preset time. If the distance has not changed within the second preset time, or after the font size of the display screen has been adjusted, it can be determined whether the duration of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range; if the duration of the operation exceeds the fifth preset range or the current time exceeds the sixth preset range, a prompt message is sent. If the distance between the face and the display screen has changed within the second preset time, the font size of the display screen can be adjusted according to that distance using the following formula:

y₃ = k₃ · x₃ + b₃

where y₃ represents the font size of the display screen; k₃ is a proportionality coefficient; x₃ represents the distance between the human face and the display screen; and b₃ is a fixed value.

For example: the distance between the human face and the display screen ranges from 20 cm to 40 cm; when the user uses the terminal normally, the distance between the user and the display screen is 40 cm and the font size of the display screen corresponds to a size-10 font. From this, k₃ = -0.5 and b₃ = 30.
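Using the example coefficients above, the font-size mapping can be sketched as follows (the clamp to 20-40 cm simply follows the range stated in the example):

```python
K3, B3 = -0.5, 30.0   # from the worked example above: size-10 font at 40 cm

def font_size_for_distance(distance_cm: float) -> float:
    """y3 = k3*x3 + b3, clamped to the 20-40 cm range used in the example."""
    d = min(max(distance_cm, 20.0), 40.0)
    return K3 * d + B3

# font_size_for_distance(40) -> 10.0, font_size_for_distance(20) -> 20.0
```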

In the embodiment of the present invention, the method for controlling terminal display is implemented based on OpenCV and the functional characteristics of the camera. OpenCV is a cross-platform computer vision library that can run on the Linux, Windows, Android and Mac OS operating systems. The method can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the face and the display screen.

Fig. 2 is a schematic diagram of a main flow of a method for controlling a terminal display according to an embodiment of the present invention, and as shown in fig. 2, the method for controlling a terminal display is applied to a terminal, where the terminal at least includes: the camera comprises a camera and a display screen, wherein the shooting direction of the camera is consistent with the display direction of the display screen. The method for controlling the display of the terminal specifically comprises the following steps:

step 201: when the fact that the display screen starts to execute operation is detected, acquiring the light sensitivity of the camera to detect whether the brightness degree of the external environment changes within a first preset time; when detecting that the brightness level of the external environment changes within a first preset time, executing step 202; when it is detected that the brightness level of the external environment does not change within the first preset time, step 201 is repeatedly performed.

In step 201, the operation may be one or more of the following: clicking, sliding, playing video and/or displaying web page. The sensitivity (ISO) of the camera refers to the sensitivity of the camera to the intensity of light, and the sensitivity of the camera can be used for reflecting the brightness of the external environment. It can be understood that there is a corresponding relationship between the sensitivity of the camera and the brightness of the external environment, and whether the brightness of the external environment changes within the first preset time can be known through the sensitivity of the camera. Wherein, the external environment may be understood as the external environment in which the terminal is currently located. The first preset time can be understood as a detection period, and a value of the first preset time can be determined according to detection requirements.

Step 202: when it is detected that the brightness of the external environment has changed within the first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and the preset correspondence between the sensitivity of the camera and the display brightness of the display screen;

In step 202, the display brightness of the display screen may be adjusted according to a fitting formula between the sensitivity of the camera and the display brightness of the display screen; the fitting formula represents the correspondence between the sensitivity of the camera and the display brightness of the display screen, and is as follows:

y₁ = k₁ · g(x₁)

where x₁ represents the sensitivity of the camera; y₁ represents the display brightness of the display screen corresponding to each sensitivity; and k₁ is a proportionality coefficient with a fixed value.

When determining the fitting formula between the sensitivity of the camera and the display brightness of the display screen, first acquire a preset number of different sensitivities and the target display brightness of the display screen corresponding to each sensitivity; then obtain the fitting formula from these sensitivities and the corresponding target display brightness values. The target display brightness can be understood as the optimal display brightness corresponding to each sensitivity, and it can be obtained in advance through experiments.

For example: the display brightness of the display screen ranges from 0 to 100 and can be divided into five levels: dark, dim, weak light, normal and strong light, with display brightness values of (20, 40, 60, 80, 100) respectively. Correspondingly, the camera sensitivity values corresponding to these brightness levels are (2000, 1000, 200, 100, 30). By fitting the sensitivities corresponding to the different brightness levels, the correspondence between the sensitivity of the camera and the display brightness of the display screen, namely y₁ = k₁ · g(x₁), can be obtained.

It should be noted that the implementation principle of step 202 is the same as that of step 102, and the description of the same parts is omitted.

Step 203: acquiring target image data shot by the camera, wherein the target image data is image data whose sharpness and image brightness meet preset requirements;

In step 203, the target image data may specifically be image data whose sharpness is within a second preset range and whose image brightness is within a fourth preset range, and the image data may be a picture, a video or similar data. The sharpness of the target image data refers to how clearly each detail and its boundaries are rendered, and it reflects the image quality of the target image data. The image brightness of the target image data refers to how bright or dark the picture is. The second preset range is a threshold for the sharpness of the image data, the fourth preset range is a threshold for the image brightness of the image data, and the values of the second and fourth preset ranges may be determined as needed.

When acquiring the target image data shot by the camera, candidate image data shot by the camera can first be collected according to a preset acquisition strategy; the sharpness of each piece of candidate image data is then determined, and it is checked whether the sharpness of each piece of candidate image data is within the second preset range. If the sharpness of first candidate image data is not within the second preset range but is within a third preset range, the sharpness of the first candidate image data is adjusted and the sharpness check is performed again. If the sharpness of the first candidate image data is within the second preset range, its image brightness is determined and checked against the fourth preset range. If the image brightness is not within the fourth preset range, it is adjusted until it falls within that range. If the image brightness of the first candidate image data is within the fourth preset range, the first candidate image data is determined to be the target image data and is acquired.

Step 204: when a human face is detected in the target image data, determining the distance between the human face and a display screen according to the target image data;

In step 204, the interpupillary distance of the human face in the target image data is first identified through a face recognition algorithm according to the target image data; the distance between the human face and the display screen is then determined according to that interpupillary distance.

When identifying the interpupillary distance of the face in the target image data, first determine the position of the face in the target image data; then, according to that position, crop the image data of a designated area from the target image data, the designated area being a preset area centered on the face; finally, identify the interpupillary distance of the face through the face recognition algorithm based on the image data of the designated area.

Then, the distance between the human face and the display screen can be determined from the interpupillary distance of the face in the target image data according to the following formula:

y₂ = k₂ · x₂ + b₂

where y₂ represents the interpupillary distance of the human face in the target image data; k₂ is a proportionality coefficient; x₂ represents the distance between the human face and the display screen; and b₂ is a fixed value representing the interpupillary distance when the distance between the face and the display screen is at its minimum.

Step 205: determining whether the distance between the face and the display screen has changed within a second preset time; if it is detected that the distance between the face and the display screen has changed within the second preset time, executing step 206; if the distance between the face and the display screen has not changed within the second preset time, executing step 207.

Step 206: when it is detected that the distance between the face and the display screen has changed within the second preset time, adjusting the font size of the display screen according to the distance between the face and the display screen. Step 207 is then performed after step 206.

In step 206, it is first determined whether the distance between the face and the display screen has changed within the second preset time. If the distance has not changed within the second preset time, or after the font size of the display screen has been adjusted, the duration of the operation and the current time can be acquired, and it is determined whether the duration of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range; if so, a prompt message is sent. If the distance between the face and the display screen has changed within the second preset time, the font size of the display screen can be adjusted according to that distance using the following formula:

y₃ = k₃ · x₃ + b₃

where y₃ represents the font size of the display screen; k₃ is a proportionality coefficient; x₃ represents the distance between the human face and the display screen; and b₃ is a fixed value.

For example: the distance between the human face and the display screen ranges from 20 cm to 40 cm; when the user uses the terminal normally, the distance between the user and the display screen is 40 cm and the font size of the display screen corresponds to a size-10 font. From this, k₃ = -0.5 and b₃ = 30.

Step 207: if the distance between the face and the display screen has not changed within the second preset time, or after the font size of the display screen has been adjusted, acquiring the duration of the operation and the current time;

Step 208: determining whether the duration of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range; if the duration of the operation exceeds the fifth preset range or the current time exceeds the sixth preset range, executing step 209; if the duration of the operation does not exceed the fifth preset range and the current time does not exceed the sixth preset range, ending the process.

In step 208, the duration of the operation refers to how long the user has been operating the display screen, which can also be understood as the time the user has kept staring at the display screen. The fifth preset range is a threshold for the duration of the operation; the user can set its value according to personal usage habits or use the system default. The sixth preset range can be understood as the user's normal usage time, for example 6:00 to 23:00; the user can likewise set its value according to personal usage habits or use the system default.

When determining the duration of the operation, a timer starts timing as soon as it is detected that the user performs an operation such as clicking, sliding, playing a video or displaying a web page and the distance between the face and the display screen is within the detection range; the duration of the operation is then calculated from the timer. In other words, the duration of the operation is based on the user performing such operations, assisted by whether the distance between the face and the display screen is within the detection range.
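The timing and prompting logic of steps 207-209 might be sketched as follows; the threshold values and class structure are assumed placeholders, not values or code from the patent:

```python
import time
from datetime import datetime

MAX_DURATION_S = 40 * 60          # assumed "fifth preset range": 40 minutes of continuous use
ALLOWED_HOURS = range(6, 23)      # assumed "sixth preset range": 6:00-23:00

class UsageMonitor:
    def __init__(self):
        self.start = None

    def on_operation(self, face_in_range: bool):
        """Start or keep the timer while the user operates the screen with the face in detection range."""
        if face_in_range:
            if self.start is None:
                self.start = time.monotonic()
        else:
            self.start = None

    def should_prompt(self) -> bool:
        """True if the operation has lasted too long or the current time is outside normal hours."""
        too_long = self.start is not None and time.monotonic() - self.start > MAX_DURATION_S
        off_hours = datetime.now().hour not in ALLOWED_HOURS
        return too_long or off_hours
```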

Step 209: and if the operation duration exceeds a fifth preset range or the current time exceeds a sixth preset range, sending a prompt message.

In step 209, the prompt message is used to prompt the user that the operation time is too long or the normal use time has been exceeded. The prompt message may be a voice message, a text message, or a video message.

In the embodiment of the present invention, the method for controlling terminal display is implemented based on OpenCV and the functional characteristics of the camera of the mobile device. The method can adjust the display brightness of the display screen in real time according to the brightness of the external environment, adjust the font size in real time according to the distance between the face and the display screen, and send a prompt message when the duration of the operation exceeds the fifth preset range or the current time exceeds the sixth preset range.

Fig. 3 is a schematic diagram of a main flow of a method for controlling a terminal display according to an embodiment of the present invention, and as shown in fig. 3, the method for controlling a terminal display is applied to a terminal, where the terminal at least includes: the camera comprises a camera and a display screen, wherein the shooting direction of the camera is consistent with the display direction of the display screen. The method for controlling the display of the terminal specifically comprises the following steps:

step 301: when detecting that the display screen starts to execute the operation, adjusting the background color of the display screen to a designated color or displaying an interface for the user to select the background color. Then, step 302 is performed after step 301.

In step 301, the designated color may be understood as a color that facilitates eye protection, such as: the specified color may be green. In addition, in order to improve the use experience of the user, an interface for the user to select the background color can be displayed, so that the user can conveniently select the favorite background color according to the requirement.

Step 302: acquiring the light sensitivity of the camera to detect whether the brightness degree of the external environment changes within a first preset time; when detecting that the brightness level of the external environment changes within a first preset time, executing step 303; when it is detected that the brightness level of the external environment does not change within the first preset time, step 302 is repeatedly performed.

In step 302, the operation may be one or more of: clicking, sliding, playing video and/or displaying web page. The sensitivity (ISO) of the camera refers to the sensitivity of the camera to the intensity of light, and the sensitivity of the camera can be used for reflecting the brightness of the external environment. It can be understood that there is a corresponding relationship between the sensitivity of the camera and the brightness of the external environment, and whether the brightness of the external environment changes within the first preset time can be known through the sensitivity of the camera. Wherein, the external environment may be understood as the external environment in which the terminal is currently located. The first preset time can be understood as a detection period, and a value of the first preset time can be determined according to detection requirements.

Step 303: when it is detected that the brightness of the external environment has changed within the first preset time, determining whether the sensitivity of the camera is within a first preset range; if the sensitivity of the camera is not within the first preset range, executing step 304; if the sensitivity of the camera is within the first preset range, executing step 305.

In step 303, if the sensitivity of the camera is within the first preset range, the brightness of the external environment is suitable for obtaining the target image data. If the sensitivity of the camera is not within the first preset range, the external environment is too dark; in order to capture target image data that meets the preset requirements, the fill light of the camera can be adjusted to supplement the lighting.

It should be noted that the first preset range is used to determine whether the brightness level of the external environment where the terminal is located is within an ideal range. It can be understood that, if the sensitivity of the camera is not within the first preset range, it indicates that the brightness of the external environment is not satisfactory, which may be unfavorable to both eyes of the user, and may even affect the acquisition of the target image data in step 312, and the display brightness of the display screen and the parameters of the camera may be adjusted. For example: the light supplementing rate of the camera can be adjusted, so that the target image data can be conveniently acquired.

Step 304: adjusting parameters of the camera, and adjusting the display brightness of the display screen according to the sensitivity of the camera and a preset corresponding relation between the sensitivity of the camera and the display brightness of the display screen;

In step 304, the display brightness of the display screen may be adjusted according to a fitting formula between the sensitivity of the camera and the display brightness of the display screen; the fitting formula represents the correspondence between the sensitivity of the camera and the display brightness of the display screen. In this way, step 304 adjusts the display brightness of the display screen to match the brightness of the external environment, so that the user's eyes are not hurt by a sharp contrast between light and dark. In addition, through step 304, darker image data can be given fill-light processing by increasing the fill light of the camera, which avoids the problem that target image data cannot be acquired because the captured images are too dark.

In the embodiment of the present invention, the fitting formula is as follows:

y₁ = k₁ · g(x₁)

where x₁ represents the sensitivity of the camera; y₁ represents the display brightness of the display screen corresponding to each sensitivity; and k₁ is a proportionality coefficient with a fixed value.

When determining the fitting formula between the sensitivity of the camera and the display brightness of the display screen, first acquire a preset number of different sensitivities and the target display brightness of the display screen corresponding to each sensitivity; then obtain the fitting formula from these sensitivities and the corresponding target display brightness values. The target display brightness can be understood as the optimal display brightness corresponding to each sensitivity, and it can be obtained in advance through experiments.

For example, the display brightness of the display screen ranges from 0 to 100 and can be divided into five grades: dark, dim, weak light, normal and strong light, with display brightness values of (20, 40, 60, 80, 100) respectively. Correspondingly, the camera sensitivities associated with the display brightness of each grade are (2000, 1000, 200, 100, 30) respectively. By fitting the sensitivities corresponding to the different brightness grades, the corresponding relation between the sensitivity of the camera and the display brightness of the display screen, namely y1 = k1·g(x1), can be obtained.
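As a minimal sketch of such a fit (not part of the original embodiment), the five sensitivity–brightness pairs above can be fitted with a log-linear mapping; the choice of g as a logarithm of the ISO value, the function name and the clamping bounds are illustrative assumptions.

```python
import numpy as np

# Calibration pairs from the example above: (camera ISO, target display brightness).
iso = np.array([2000, 1000, 200, 100, 30], dtype=float)
brightness = np.array([20, 40, 60, 80, 100], dtype=float)

# Assumption: the mapping is linear in log(ISO); fit brightness ~ a*ln(ISO) + b.
a, b = np.polyfit(np.log(iso), brightness, 1)

def display_brightness(current_iso: float) -> float:
    """Map the current camera sensitivity to a display brightness in [0, 100]."""
    return float(np.clip(a * np.log(current_iso) + b, 0, 100))

print(display_brightness(500))  # a mid-range environment maps to a mid-range brightness
```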

It should be noted that the implementation principle of step 304 is the same as that of step 202, and the description of the same parts is omitted.

Step 305: if the sensitivity of the camera is within a first preset range or after the display brightness of the display screen is adjusted, acquiring candidate image data shot by the camera according to a preset acquisition strategy;

in step 305, the preset acquisition strategy may be understood as acquiring candidate image data shot by the camera at a preset acquisition frequency. Because neither the external environment of the user nor the distance between the face and the display screen changes greatly within a short time, and in order to reduce the workload of the terminal, candidate image data can be collected once per preset collection period, with the period chosen according to the brightness of the external environment (the light sensitivity ISO of the camera). For example, in a bright environment with ISO < 200 or a dark environment with ISO > 1000, the environment changes little, so the collection interval can be set to 1 minute; otherwise it can be set to 30 seconds.
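The interval policy in this example could be written as a small helper; the thresholds and return values simply restate the example above, and the function name is an assumption.

```python
def collection_interval_seconds(iso: float) -> int:
    """Return the candidate-image collection period, in seconds, based on camera ISO.

    A very bright (ISO < 200) or very dark (ISO > 1000) environment is assumed
    to change little, so it is sampled less often.
    """
    if iso < 200 or iso > 1000:
        return 60
    return 30
```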

Step 306: determining a sharpness of each of the candidate image data;

in step 306, the sharpness of each candidate image data may be determined using basic functions provided by OpenCV together with the Laplacian variance algorithm. In short, the candidate image data is converted into the matrix type required by OpenCV (cv::Mat), the gray values of the image are convolved with the Laplacian operator to obtain a response matrix, and the variance of the values in that matrix is taken as the sharpness of the candidate image data. If the candidate image data has sharp edges and large differences between neighboring pixels, the variance is large and the image is considered sharp; if the candidate image data is blurred and the pixel differences around edges are small, the variance is small.
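A minimal Python sketch of this sharpness measure, assuming OpenCV's Python bindings rather than the mobile C++ API; the comparison against the second preset range happens in step 307 and is not shown here.

```python
import cv2

def laplacian_sharpness(image_bgr) -> float:
    """Variance of the Laplacian response of the grayscale image.

    Larger values indicate sharper edges, i.e. a clearer candidate image.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())
```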

Step 307: judging whether the definition of each candidate image data is within a second preset range or not; if the sharpness of the first candidate image data is not within the second preset range, go to step 308; if the sharpness of the first candidate image data is within the second preset range, go to step 309;

for example, when the sharpness of the candidate image data is within the second preset range, the candidate image data may be regarded as the target image data. When the sharpness of the candidate image data is outside the second preset range but not lower than 80% of it, the sharpness of the candidate image data can be improved using an image-enhancement method provided by the mobile platform. When the sharpness of the candidate image data is lower than 80% of the second preset range, the candidate image data is discarded directly.

Step 308: if the definition of the first candidate image data is not within the second preset range but within the third preset range, adjusting the definition of the first candidate image data, and then continuing to execute step 307;

step 309: if the definition of the first candidate image data is within a second preset range, determining the image brightness of the first candidate image data;

step 310: judging whether the image brightness of the first candidate image data is within a fourth preset range;

step 311: if the image brightness of the first candidate image data is not within a fourth preset range, adjusting the image brightness of the first candidate image data until the image brightness of the first candidate image data is within the fourth preset range;

step 312: and if the image brightness of the first candidate image data is within a fourth preset range, determining the first candidate image data as target image data, and acquiring the first candidate image data.

Step 313: when a human face is detected in the target image data, identifying the interpupillary distance of the human face in the target image data through a human face identification algorithm according to the target image data;

in step 313, the position of the face in the target image data may first be determined from the target image data; then, according to the position of the face, the image data of a designated area in the target image data is cropped, the designated area being a preset region centered on the face; finally, the interpupillary distance of the face in the target image data is identified from the image data of the designated area through a face recognition algorithm.

When the position of the face in the target image data is determined, a coordinate system can be established by taking one point of the target image data as an origin, and then the coordinates of the designated area corresponding to the face are determined.

Step 314: and determining the distance between the human face and the display screen according to the interpupillary distance of the human face in the target image data.

In step 314, the distance between the human face and the display screen may be determined according to the following formula and the interpupillary distance of the human face in the target image data:

y2 = k2·x2 + b2

where y2 represents the interpupillary distance of the face in the target image data; k2 represents a proportionality coefficient; x2 represents the distance between the face and the display screen; b2 represents the interpupillary distance when the distance between the face and the display screen is at its minimum, and b2 is a fixed value.
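Since the formula expresses the measured interpupillary distance as a function of the face-to-screen distance, step 314 amounts to inverting it. A minimal sketch, assuming k2 and b2 have been calibrated beforehand:

```python
def face_to_screen_distance(pupil_px: float, k2: float, b2: float) -> float:
    """Invert y2 = k2*x2 + b2 to recover the face-to-screen distance x2."""
    return (pupil_px - b2) / k2
```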

Step 315: judging whether the distance between the face and the display screen changes within a second preset time; if the distance between the face and the display screen is changed within a second preset time, executing step 316; if the distance between the face and the display screen is not changed within the second preset time, step 317 is executed.

Step 316: and if the distance between the face and the display screen is changed within a second preset time, adjusting the font size of the display screen according to the distance between the face and the display screen. Step 317 is then performed after step 316.

In step 316, if the distance between the face and the display screen has not changed within the second preset time, or after the font size of the display screen has been adjusted, the method goes on to judge whether the duration of the operation exceeds a fifth preset range or the current time exceeds a sixth preset range, and sends a prompt message if either range is exceeded (see steps 317 to 319). If the distance between the face and the display screen has changed within the second preset time, the font size of the display screen can be adjusted according to the following formula and the distance between the face and the display screen:

y3 = k3·x3 + b3

where y3 represents the font size of the display screen; k3 represents a proportionality coefficient; x3 represents the distance between the face and the display screen; b3 is a fixed value.
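A minimal sketch of this linear font-size mapping; the values of k3 and b3 and the clamping bounds are illustrative assumptions rather than part of the embodiment.

```python
def font_size_for_distance(distance_cm: float,
                           k3: float = 0.4,
                           b3: float = 8.0,
                           min_pt: float = 10.0,
                           max_pt: float = 36.0) -> float:
    """Map the face-to-screen distance to a font size via y3 = k3*x3 + b3, clamped."""
    return max(min_pt, min(max_pt, k3 * distance_cm + b3))
```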

Step 317: if the distance between the face and the display screen is not changed within a second preset time or after the font size of the display screen is adjusted, acquiring the duration time and the current time of the operation;

step 318: judging whether the duration time of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range; if the duration of the operation exceeds a fifth preset range or the current time exceeds a sixth preset range, go to step 319; and if the duration time of the operation does not exceed the fifth preset range and the current time does not exceed the sixth preset range, ending the process.

Step 319: and if the operation duration exceeds a fifth preset range or the current time exceeds a sixth preset range, sending a prompt message.

In step 319, the prompt message is used to prompt the user that the operation time is too long or the normal use time has been exceeded. The prompt message may be a voice message, a text message, or a video message.
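Steps 317 to 319 could be condensed into a small check; the thresholds here (90 minutes of continuous use, 23:00 as the latest normal hour) are purely illustrative assumptions for the fifth and sixth preset ranges.

```python
from datetime import datetime, timedelta

def should_prompt(start_time: datetime,
                  now: datetime,
                  max_duration: timedelta = timedelta(minutes=90),
                  latest_hour: int = 23) -> bool:
    """True if the operation has lasted too long or the current time is too late."""
    return (now - start_time) > max_duration or now.hour >= latest_hour
```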

In the embodiment of the invention, the method for controlling terminal display is implemented based on OpenCV and the functional characteristics of the camera itself. OpenCV is a cross-platform computer vision library that can run on Linux, Windows, Android and Mac OS operating systems. The method can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the face and the display screen.

Fig. 4 is a schematic diagram of main modules of an apparatus for controlling terminal display according to an embodiment of the present invention, and referring to fig. 4, the apparatus 400 for controlling terminal display may specifically include:

the first obtaining submodule 401 is configured to, when it is detected that the display screen starts to perform an operation, obtain sensitivity of the camera to detect whether a brightness level of an external environment changes within a first preset time;

a first adjusting submodule 402, configured to adjust the display brightness of the display screen according to the sensitivity of the camera and a preset correspondence between the sensitivity of the camera and the display brightness of the display screen when it is detected that the brightness of the external environment changes within a first preset time;

a second obtaining submodule 403, configured to obtain target image data captured by the camera, where the target image data is image data whose definition and image brightness both meet preset requirements;

a determining sub-module 404, configured to determine, when a face is detected in the target image data, a distance between the face and a display screen according to the target image data;

and a second adjusting submodule 405, configured to, when it is detected that the distance between the face and the display screen changes within a second preset time, adjust the font size of the display screen according to the distance between the face and the display screen.

Optionally, the first adjusting submodule 402 is further configured to:

when the brightness degree of the external environment is detected to be changed within a first preset time, judging whether the sensitivity of the camera is within a first preset range;

if the sensitivity of the camera is not within a first preset range, adjusting parameters of the camera, and adjusting the display brightness of the display screen according to the sensitivity of the camera in an external environment and a preset corresponding relation between the sensitivity of the camera and the display brightness of the display screen;

if the sensitivity of the camera is within the first preset range, or after the display brightness of the display screen has been adjusted, the second obtaining sub-module 403 is triggered to perform the step of obtaining the target image data captured by the camera.

Optionally, the second obtaining sub-module 403 is further configured to:

acquiring candidate image data shot by the camera according to a preset acquisition strategy;

determining a sharpness of each of the candidate image data;

judging whether the definition of each candidate image data is within a second preset range or not;

if the definition of the first candidate image data is not in the second preset range but in the third preset range, adjusting the definition of the first candidate image data, and then continuing to execute the step of judging whether the definition of each candidate image data is in the second preset range;

if the definition of the first candidate image data is within a second preset range, determining the image brightness of the first candidate image data;

judging whether the image brightness of the first candidate image data is within a fourth preset range;

if the image brightness of the first candidate image data is not within a fourth preset range, adjusting the image brightness of the first candidate image data until the image brightness of the first candidate image data is within the fourth preset range;

and if the image brightness of the first candidate image data is within a fourth preset range, determining the first candidate image data as target image data, and acquiring the first candidate image data.

Optionally, the second obtaining sub-module 403 is further configured to:

determining a sharpness of each of the candidate image data based on an open source computer vision library and employing a Laplace variance algorithm.

Optionally, the first adjusting submodule 402 is further configured to:

adjusting the display brightness of the display screen according to the following formula and the sensitivity of the camera in the external environment:

y1 = k1·g(x1)

where x1 represents the sensitivity of the camera; y1 represents the display brightness of the display screen corresponding to each sensitivity; k1 represents a proportionality coefficient, and k1 is a fixed value.

Optionally, the determining sub-module 404 is further configured to:

identifying the interpupillary distance of the face in the target image data through a face identification algorithm according to the target image data;

and determining the distance between the human face and the display screen according to the interpupillary distance of the human face in the target image data.

Optionally, the determining sub-module 404 is further configured to:

determining the distance between the human face and the display screen according to the following formula and the interpupillary distance of the human face in the target image data:

y2 = k2·x2 + b2

where y2 represents the interpupillary distance of the face in the target image data; k2 represents a proportionality coefficient; x2 represents the distance between the face and the display screen; b2 represents the interpupillary distance when the distance between the face and the display screen is at its minimum, and b2 is a fixed value.

Optionally, the determining sub-module 404 is further configured to:

determining the position of a human face in the target image data according to the target image data;

intercepting image data of a designated area in the target image data according to the position of the face in the target image data, wherein the designated area is a preset area around the face as the center;

and identifying the interpupillary distance of the face in the target image data through a face identification algorithm according to the image data of the designated area.

Optionally, the second adjusting submodule 405 is further configured to:

adjusting the font size of the display screen according to the following formula and the distance between the face and the display screen:

y3 = k3·x3 + b3

where y3 represents the font size of the display screen; k3 represents a proportionality coefficient; x3 represents the distance between the face and the display screen; b3 is a fixed value.

Optionally, the apparatus for controlling terminal display further includes:

the third obtaining submodule is used for obtaining the duration of the operation and the current time;

the judging submodule is used for judging whether the duration time of the operation exceeds a fifth preset range or whether the current time exceeds a sixth preset range if the distance between the face and the display screen is not changed within second preset time or after the font size of the display screen is adjusted;

and the prompting submodule is used for sending a prompting message if the duration time of the operation exceeds a fifth preset range or the current time exceeds a sixth preset range.

Optionally, the apparatus for controlling terminal display further includes:

and the third adjusting submodule is used for adjusting the background color of the display screen to a designated color or displaying an interface for a user to select the background color when the display screen is detected to start to execute the operation.

In the embodiment of the invention, the apparatus for controlling terminal display is implemented based on OpenCV and the functional characteristics of the camera itself. OpenCV is a cross-platform computer vision library that can run on Linux, Windows, Android and Mac OS operating systems. The apparatus can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the face and the display screen.

On the basis of the above embodiment, the embodiment of the present invention further provides another apparatus for controlling terminal display. This apparatus can implement the method described above based on some functions provided by the open source framework OpenCV and the functional characteristics of the camera of a mobile device: the display brightness, the background color and the font size of the display screen can be adjusted adaptively according to the brightness of the external environment and the distance between the face and the display screen. Adjusting the display brightness and the background color of the display screen prevents the harm to the user's eyes caused by unsuitable screen brightness, and adjusting the font size prevents eye fatigue. A prompt message can also be sent according to the duration of the operation and the current time; the prompt may be an amusing message or a voice prompt, so that when the user has been using the terminal for a long time, a friendly reminder diverts the user's attention, relaxes the mind and body, and reduces the harm to the eyes caused by staring at the display screen for a long time. Unlike previous eye-protection methods, this apparatus for controlling terminal display automatically adjusts the display brightness and the font size according to the external environment and the measured distance between the face and the display screen, thereby weakening the harm of the terminal display screen to the eyes, relieving eye fatigue, greatly saving cost and resources, and using modern technology to effectively reduce the harm of staring at the terminal display screen for a long time.

In the embodiment of the invention, the apparatus for controlling terminal display is implemented based on some functions provided by OpenCV and the functional characteristics of the camera. OpenCV is a cross-platform computer vision library that can run on Linux, Windows, Android and Mac OS operating systems. The apparatus can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the face and the display screen.

Fig. 5 is a schematic diagram of main modules of an apparatus for controlling terminal display according to an embodiment of the present invention, and referring to fig. 5, the apparatus for controlling terminal display may specifically include: a data acquisition module 501, a comprehensive detection module 502 and a detection result processing module 503.

The data acquisition module 501 mainly acquires the brightness of the external environment and the candidate image data in real time through the camera according to a preset acquisition strategy. On the one hand, the data acquisition module 501 provides image data meeting the preset requirements for the following two modules; on the other hand, it appropriately reduces the acquisition volume to avoid wasting resources. The comprehensive detection module 502 detects the external environment, the distance between the face and the display screen, and the time: detection of the external environment mainly means judging the brightness of the external environment from the real-time light sensitivity of the camera; detection of the distance between the face and the display screen mainly means recognizing the distance between the two eyes of the face through a face recognition algorithm and then calculating the distance between the face and the display screen; detection of the time mainly means recording, through a timer, the duration for which the user has stared at the display screen of the terminal and the current time. The detection result processing module 503 makes different adjustments according to the detection results of the comprehensive detection module 502, mainly adjusting the display brightness, background color, font size and the like of the terminal, and at the same time popping up a prompt such as a small joke, so as to relieve the user's fatigue, relax the body and mind, and reduce damage to the user's eyes. The specific framework and flow of the whole apparatus are shown in fig. 5.

(1) Data acquisition module 501

The data acquisition module 501 is mainly used for data acquisition, in particular image data acquisition. Before acquiring image data, the module applies for permission to use the camera; once the permission is granted, the camera is started to acquire image data in real time. The image data are acquired according to a preset acquisition strategy, which reduces the subsequent processing of image data, saves device resources and prevents unnecessary waste.

Considering that a camera captures 25 to 30 frames per second, that the distance at which the human eye stares at the display screen changes between blinks, and that a blink generally lasts about 0.4 seconds, one frame of image data can be taken at preset intervals in order to save device resources and improve the utilization of the acquired data. For example, the preset acquisition strategy may be to keep one good frame out of every 12 frames for subsequent detection, which saves subsequent detection cost and improves the utilization of the acquired data.
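A minimal sketch of this one-in-twelve sampling over an OpenCV capture loop; the camera index and the sampling stride restate the example above and are otherwise assumptions.

```python
import cv2

def sample_frames(camera_index: int = 0, stride: int = 12):
    """Yield roughly one frame out of every `stride` captured frames."""
    cap = cv2.VideoCapture(camera_index)
    count = 0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if count % stride == 0:
                yield frame          # candidate image data for later detection
            count += 1
    finally:
        cap.release()
```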

It is understood that the second acquisition submodule 403 may be integrated in the data acquisition module 501.

(2) Integrated detection module 502

The comprehensive detection module 502 is mainly used to detect the brightness of the external environment and the distance between the face and the screen. It first detects the external environment by judging the light sensitivity (ISO) of the camera to determine how bright or dark the current external environment is; it then performs face detection and eye-distance detection on the acquired image data, and from these calculates the distance between the face and the display screen. At the same time, it records the duration for which the user has used the terminal and detects whether the current system time exceeds the sixth preset range. The display brightness and the font size of the terminal are adjusted according to the brightness of the external environment and the distance between the face and the display screen; and, based on the recorded and detected time, a warm and amusing prompt is given to the user after a period of time, so as to relieve the eye strain caused by staring at the terminal for a long time.

To improve the efficiency and accuracy of detecting the distance between the face and the display screen, the comprehensive detection module 502 implements an accurate and rapid detection scheme, as follows. It first detects the sharpness and image brightness of the candidate image data and screens out target image data whose sharpness and image brightness meet the preset requirements, and then performs face detection on the target image data. Once a face is detected in the target image data, the coordinates of the face in the target image data are recorded, and a face-tracking mode is adopted based on the current face coordinates: the face detection area is reduced so that subsequent faces can be detected quickly, which facilitates the subsequent detection of the distance between the eyes and the display screen. When no face can be detected in face-tracking mode, the normal face detection mode is used again on the whole image data, and the two modes alternate in this way, which improves both the efficiency and the speed of face detection.
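As a hedged illustration of this tracking idea (not the original implementation), a detector could first search a padded window around the last known face coordinates and fall back to a full-image scan when tracking fails; the cascade file and the padding factor are assumptions.

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray, last_box=None, pad: float = 0.5):
    """Detect a face, searching near `last_box` first and falling back to the whole frame."""
    if last_box is not None:
        x, y, w, h = last_box
        dx, dy = int(w * pad), int(h * pad)
        x0, y0 = max(0, x - dx), max(0, y - dy)
        x1, y1 = min(gray.shape[1], x + w + dx), min(gray.shape[0], y + h + dy)
        faces = face_cascade.detectMultiScale(gray[y0:y1, x0:x1], 1.1, 5)
        if len(faces):
            fx, fy, fw, fh = faces[0]
            return (x0 + fx, y0 + fy, fw, fh)        # map ROI coordinates back to the frame
    faces = face_cascade.detectMultiScale(gray, 1.1, 5)  # normal full-image detection
    return tuple(faces[0]) if len(faces) else None
```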

It is understood that the first obtaining submodule 401 and the determining submodule 404 may be integrated into the comprehensive detection module 502.

(3) Detection result processing module 503

The detection result processing module 503 mainly adjusts the display brightness, background color and font size of the terminal according to the detection results of the comprehensive detection module 502, so as to relieve the user's eye fatigue and reduce the damage caused by staring at the terminal for a long time. By recording the user's continuous usage time and detecting the current time, it gives friendly and amusing prompts, so that a tired user can shift attention and relax, further relieving eye fatigue, and is reminded not to stay up too late, thereby reducing the harm to the eyes caused by staring at the terminal for a long time.

It is understood that the first adjusting submodule 402 and the second adjusting submodule 405 may be integrated into the detection result processing module 503.

Fig. 6 illustrates an exemplary system architecture 600 to which the method of controlling a terminal display or the apparatus for controlling a terminal display of the embodiments of the present invention may be applied.

As shown in fig. 6, the system architecture 600 may include terminal devices 601, 602, 603, a network 604, and a server 605. The network 604 serves to provide a medium for communication links between the terminal devices 601, 602, 603 and the server 605. Network 604 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.

A user may use the terminal devices 601, 602, 603 to interact with the server 605 via the network 604 to receive or send messages or the like. The terminal devices 601, 602, 603 may have installed thereon various communication client applications, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).

The terminal devices 601, 602, 603 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.

The server 605 may be a server providing various services, such as a background management server (for example only) providing support for shopping websites browsed by users using the terminal devices 601, 602, 603. The backend management server may analyze and perform other processing on the received data such as the product information query request, and feed back a processing result (for example, target push information, product information — just an example) to the terminal device.

It should be noted that the method for controlling the terminal display provided by the embodiment of the present invention is generally executed by the server 605, and accordingly, the apparatus for controlling the terminal display is generally disposed in the server 605.

It should be understood that the number of terminal devices, networks, and servers in fig. 6 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.

Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use with a terminal device implementing an embodiment of the present invention. The terminal device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.

As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.

The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.

In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 701.

It should be noted that the computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, by contrast, may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic signals, optical signals, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor comprises a first obtaining submodule 401, a first adjusting submodule 402, a second obtaining submodule 403, a determining submodule 404 and a second adjusting submodule 405. The names of the modules do not limit the modules themselves in some cases, for example, the first obtaining sub-module 401 may also be described as a module that obtains the sensitivity of the camera in the external environment when detecting that the display screen starts to perform an operation, so as to detect whether the brightness of the external environment changes within a first preset time.

As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be separate and not incorporated into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to comprise: when the fact that the display screen starts to execute operation is detected, acquiring the light sensitivity of the camera to detect whether the brightness degree of the external environment changes within a first preset time; when the brightness degree of the external environment is detected to be changed within a first preset time, adjusting the display brightness of the display screen according to the sensitivity of the camera and the preset corresponding relation between the sensitivity of the camera and the display brightness of the display screen; acquiring target image data shot by the camera, wherein the target image data is image data with definition and image brightness meeting preset requirements; when a human face is detected in the target image data, determining the distance between the human face and a display screen according to the target image data; and when the fact that the distance between the face and the display screen is changed within a second preset time is detected, adjusting the font size of the display screen according to the distance between the face and the display screen.

In the embodiment of the invention, the method for controlling terminal display is implemented based on OpenCV and the functional characteristics of the camera itself. OpenCV is a cross-platform computer vision library that can run on Linux, Windows, Android and Mac OS operating systems. The method can adjust the display brightness of the display screen in real time according to the brightness of the external environment, and can also adjust the font size in real time according to the distance between the face and the display screen.

The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
