Electronic device and imaging system

Document No.: 1804636 Publication date: 2021-11-05

Description: This technology, "Electronic device and imaging system", was created by 唯野隆一 and 小森谷阳多 on 2020-02-20. Abstract: An electronic device according to an embodiment of the present disclosure includes: an imaging section that acquires imaging data; a data generator that generates emotion data from the acquired data; a data processor that correlates the imaging data with the emotion data in time series; and a controller that changes the setting of the imaging section over time based on the emotion data.

1. An electronic device, comprising:

an imaging section that acquires imaging data;

a data generator that generates emotion data based on the acquired data;

a data processor that correlates the imaging data with the emotion data in time series; and

a controller that controls the imaging section based on the emotion data.

2. The electronic device of claim 1, wherein the acquired data includes measurement data related to at least one of pulse, heart rate, electrocardiogram, electromyogram, respiration, sweating, GSR, blood pressure, blood oxygen saturation, skin surface temperature, electroencephalogram, blood flow changes, body temperature, body movement, head movement, center of gravity, pace of walking or running, eye condition, or ambient sounds.

3. The electronic device according to claim 1, wherein the controller evaluates the imaging data based on the emotion data, and controls the imaging section based on an evaluation result.

4. The electronic device according to claim 1, wherein the controller calculates intensity data of at least pleasure/discomfort or activity/inactivity based on the emotion data, and controls the imaging section based on the intensity data obtained by the calculation.

5. The electronic device of claim 1, wherein the controller controls at least one of a resolution, a frame rate, a bit length, an influence, or an effective pixel area in the imaging section based on the emotion data.

6. The electronic device according to claim 1, wherein the controller sets the imaging section to one of a standby state and an imaging state based on the emotion data.

7. The electronic device of claim 1, further comprising a storage section that stores the imaging data, wherein,

the controller determines whether the imaging data needs to be stored based on the emotion data, and causes the storage section to store the imaging data based on the determination.

8. The electronic device of claim 1, further comprising a storage section that stores the imaging data, wherein,

the controller causes the storage section to store the imaging data, and deletes, or overwrites with newly stored imaging data, a portion of the imaging data stored in the storage section whose storage period exceeds a predetermined period.

9. The electronic device according to claim 1, wherein the controller detects an amount and a direction of camera shake in the imaging data, determines a cut-out region in the imaging data from the amount and the direction of camera shake that has been detected, and generates the imaging data in which the camera shake has been corrected based on the determination.

10. The electronic device according to claim 1, further comprising a storage section that stores a delay time, wherein,

the data processor correlates imaging data acquired at a first time point with emotion data acquired at a second time point, the second time point being later than the first time point by the delay time.

11. The electronic device of claim 1, further comprising an attachment portion capable of attaching the electronic device to a body.

12. The electronic device of claim 1,

the imaging section includes a plurality of imaging elements whose optical axes point in directions different from each other, each acquiring imaging data of a respective specific direction, and

the data processor correlates, as the imaging data, the imaging data of the plurality of specific directions obtained by the plurality of imaging elements with the emotion data in time series.

13. The electronic device according to claim 1, wherein the imaging section includes a wide-angle lens or a fisheye lens.

14. The electronic device of claim 1, further comprising a storage section that stores reference data related to a subject, wherein,

the controller controls the imaging section based on a result of matching between the reference data and the imaging data.

15. The electronic device according to claim 14, wherein the setting of the imaging section is performed on at least one of a resolution, a frame rate, a bit length, an influence, or an effective pixel area in the imaging section.

16. The electronic device according to claim 1, further comprising a storage section that stores reference data relating to emotion, wherein,

the controller controls the imaging section based on the reference data and the emotion data.

17. An imaging system, comprising:

an imaging section that acquires imaging data;

a data generator that generates emotion data based on the acquired data;

a data processor that correlates the imaging data with the emotion data in time series; and

a controller that controls the imaging section based on the emotion data.

18. The imaging system of claim 17, further comprising a storage section that stores reference data related to a subject, wherein,

the controller controls the imaging section based on a result of matching between the reference data and the imaging data.

Technical Field

The present disclosure relates to an electronic apparatus and an imaging system.

Background

Digital cameras and smartphones have recently been equipped with a function called "Smile Shutter" (registered trademark), which automatically releases the shutter in response to a determination of whether a subject is smiling, so as to capture the exact timing of the smile (for example, see Patent Document 1).

Prior patent literature

Patent document

Patent Document 1: Japanese Unexamined Patent Application Publication No. 2018-151660

Disclosure of Invention

Incidentally, a user may wish to take a picture not only of a smiling moment but also of a rare moment, such as a child's natural expression or a moment during an activity. However, when the user holds up a camera, the natural expression may disappear and turn into a smile or a tense expression, making the natural expression difficult to capture. In addition, precious moments come suddenly, and the user may miss the shot while readying the camera. Accordingly, it is desirable to provide an electronic apparatus and an imaging system that enable a user to take a picture without missing the moment he or she wants to capture.

An electronic device according to an embodiment of the present disclosure includes: an imaging section that acquires imaging data; a data generator that generates emotion data based on the acquired data; a data processor that correlates the imaging data with the emotion data in time series; and a controller that changes the setting of the imaging section over time based on the emotion data.

An imaging system according to an embodiment of the present disclosure includes: an imaging section that acquires imaging data; a data generator that generates emotion data based on the acquired data; a data processor that correlates the imaging data with the emotion data in time series; and a controller that controls the imaging section based on the emotion data.

In the electronic apparatus and the imaging system according to the embodiment of the present disclosure, the imaging data and the emotion data are correlated with each other in time series, and the imaging section is controlled based on the emotion data. Thus, for example, in the case where the emotion data changes at the timing when the user reacts to a photographic subject, monitoring the emotion data makes it possible to grasp the timing at which the user may wish to take a picture. As a result, it becomes possible, for example, to obtain imaging data in response to a change in the emotion data without requiring the user to manually release the shutter.

Drawings

Fig. 1 is a diagram showing an example of functional blocks of an electronic device according to an embodiment of the present disclosure.

Fig. 2 is a diagram for describing a delay time.

Fig. 3 is a diagram showing an example of a procedure regarding adjustment of delay time and selection of a sensor.

Fig. 4 is a diagram showing an example of a display screen for adjustment of delay time and selection of a sensor.

Fig. 5 is a diagram showing an example of a control process of the imaging device.

Fig. 6 is a diagram showing an example of a control process of the imaging device.

Fig. 7 is a diagram showing an example of a control method of the imaging device.

Fig. 8 is a diagram showing an example of a control process of the imaging device.

Fig. 9 is a diagram showing a change in resolution in the control process of Fig. 8.

Fig. 10 is a diagram showing an example of a control process of the imaging device.

Fig. 11 is a diagram showing an example of a control process of the imaging device.

Fig. 12 is a diagram showing a change in resolution in the control process of Fig. 11.

Fig. 13 is a diagram showing a modification of the functional blocks of the electronic apparatus of Fig. 1.

Fig. 14 is a diagram showing a modification of the functional blocks of the electronic apparatus of Fig. 1.

Fig. 15 is a diagram showing a modification of the functional blocks of the electronic apparatus of Fig. 1.

Fig. 16 is a diagram showing a modification of the functional blocks of the electronic apparatus of Fig. 1.

Fig. 17 is a diagram showing a modification of the functional blocks of the electronic apparatus of Fig. 1.

Fig. 18 is a diagram showing an example of a state in which the electronic apparatus of Fig. 1, 13, 14, 15, or 16 is attached to the body of a user.

Fig. 19 is a diagram showing an example of a state in which the electronic apparatus of Fig. 1, 13, 14, 15, or 16 is attached to the body of a user.

Fig. 20 is a diagram showing an example of a state in which the electronic apparatus of Fig. 1, 13, 14, 15, or 16 is attached to the body of a user.

Detailed Description

In the following, specific embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. The following description covers specific examples of the present disclosure, and the present disclosure is not limited to the following embodiments. Note that the description is given in the following order.

1. Embodiment

Example with screen display and audio output functions

2. Modifications

Modification A: Example in which the functions of screen display and audio output are provided to an external device

Modification B: Example in which the storage function is provided to an external device

Modification C: Example of communicating with an external device via a network

Modification D: Example of communicating with a display device and a storage device via a network

Modification E: Example in which the imaging section is provided separately

Modification F: Example of a configuration attachable to the body

Modification G: Example including a plurality of imaging elements

Modification H: Example including a wide-angle lens or a fisheye lens

<1. Embodiment>

[Structure]

An electronic apparatus 1 according to an embodiment of the present disclosure will be described. Fig. 1 shows an example of a schematic configuration of an electronic apparatus 1 according to the present embodiment. The electronic apparatus 1 includes an imaging system that automatically acquires imaging data Dc of a scene that is meaningful to a user, without depending on an operation of a shutter key by the user. For example, the electronic apparatus 1 includes an imaging block 10, a sound block 20, a storage block 30, a system controller 40, an external input block 50, and a communication section 60. The system controller 40 corresponds to a specific example of the "controller" according to the present disclosure.

For example, the imaging block 10 includes an imaging section 11, an imaging signal processor 12, an imaging controller 13, an image processor 14, an image signal generator 15, a display driver 16, a display controller 17, and a display section 18. The image processor 14 corresponds to a specific example of the "data generator" and the "data processor" according to the present disclosure. The imaging signal processor 12 and the imaging controller 13 correspond to a specific example of "controller" according to the present disclosure. For example, the sound block 20 includes a sound input section 21, a sound analyzer 22, a sound processor 23, an audio signal generator 24, and an audio output section 25. For example, the storage block 30 includes a temporary memory 31 and a storage section 32. For example, the external input block 50 includes an operation input section 51, a date/time counter 52, and a sensor 53.

The imaging section 11 outputs imaging data Da obtained by imaging (moving image, or a plurality of still images obtained by continuous imaging) to the imaging signal processor 12. The imaging section 11 includes, for example, a CCD (charge coupled device) image sensor, a CMOS (complementary metal oxide semiconductor) image sensor, and the like.

The imaging signal processor 12 performs various types of image signal processing on the imaging data Da output from the imaging section 11, and includes, for example, a DSP (digital signal processing) circuit. For example, the imaging signal processor 12 may detect the amount and direction of camera shake in the imaging data Da, determine a cut-out region in the imaging data Da from the amount and direction of camera shake that has been detected, and generate imaging data Da' in which camera shake has been corrected based on the determination.
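Although the present disclosure does not specify the correction algorithm, the idea of deriving a cut-out region from a detected shake vector can be illustrated with a minimal sketch. The function name, the single global shake vector per frame, and the margin parameter below are assumptions for illustration, not the actual processing of the imaging signal processor 12.

```python
from typing import Tuple

def cutout_region(frame_w: int, frame_h: int,
                  shake_dx: int, shake_dy: int,
                  margin: int) -> Tuple[int, int, int, int]:
    """Return (left, top, right, bottom) of a shake-corrected cut-out.

    Illustrative model only: the cut-out window is shifted opposite to the
    detected shake so that scene content stays centered, and `margin` is
    the border reserved for the shift, bounding the correctable shake.
    """
    dx = max(-margin, min(margin, -shake_dx))  # clamp shift to the border
    dy = max(-margin, min(margin, -shake_dy))
    left, top = margin + dx, margin + dy
    return (left, top, left + frame_w - 2 * margin, top + frame_h - 2 * margin)
```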

The imaging data Da or the imaging data Da' processed by the imaging signal processor 12 is output to the image processor 14. Note that, in the following description, the term imaging data Da may also refer to the imaging data Da'. The imaging controller 13 controls the operations of the imaging section 11 and the imaging signal processor 12 according to the control of the system controller 40.

The image processor 14 controls the transfer of the imaging data Da, the imaging data Dc, evaluation data Dd to be described later, and the like according to the control of the system controller 40. That is, the image processor 14 controls transfer of the imaging data Da, the imaging data Dc, the evaluation data Dd, and the like between the imaging signal processor 12 and the image signal generator 15, the temporary memory 31, and the storage section 32. The evaluation data Dd is generated by the image processor 14 as described below. The image processor 14 inputs the imaging data Da, the evaluation data Dd, and the like to the temporary memory 31. The temporary memory 31 stores the imaging data Da, the evaluation data Dd, and the like. The storage section 32 stores the imaging data Dc as a part of the imaging data Da.

The image processor 14 also extracts imaging data Dc satisfying a predetermined storage condition from the imaging data Da. The method by which the image processor 14 extracts the imaging data Dc, as well as the evaluation data Dd, will be described in detail later.

The image processor 14 also outputs the generated evaluation data Dd to the system controller 40. The system controller 40 controls the imaging section 11 and the imaging signal processor 12 based on the evaluation data Dd input from the image processor 14 (i.e., based on the emotion data Df). Specifically, the system controller 40 outputs a control signal generated based on the evaluation data Dd input from the image processor 14 to the imaging controller 13. The imaging controller 13 controls the imaging section 11 and the imaging signal processor 12 based on a control signal input from the system controller 40. The control of the imaging section 11 and the imaging signal processor 12 by the system controller 40 will be described in detail later.

The image signal generator 15 generates an image signal based on the imaging data Da, and outputs the image signal to the display driver 16. The display driver 16 drives the display section 18 based on the image signal input from the image signal generator 15. The display controller 17 controls the operations of the image signal generator 15 and the display driver 16 according to the control of the system controller 40. The display section 18 includes, for example, a panel-type display device such as a liquid crystal panel or an organic EL (electroluminescence) panel, and displays the imaging data Da driven by the display driver 16.

The sound input section 21 includes, for example, a microphone, a microphone amplifier, and an AD converter, and outputs digital sound data Sa. The microphone amplifier amplifies the sound signal obtained by the microphone, and the AD converter performs AD conversion on the amplified sound signal. The sound data Sa obtained by the sound input section 21 is input to the sound processor 23.

The sound processor 23 controls the transfer of the sound data Sa according to the control of the system controller 40. That is, the sound processor 23 controls the transfer of the sound data Sa between the sound input section 21 and the audio signal generator 24, the temporary memory 31, and the storage section 32. The sound processor 23 inputs the sound data Sa obtained by the sound input section 21 to the temporary memory 31. The temporary memory 31 stores the sound data Sa obtained by the sound input section 21 together with the imaging data Da obtained by the imaging section 11. The sound processor 23 also outputs the sound data Sa obtained by the sound input section 21 to the audio signal generator 24.

The sound processor 23 also controls the transfer of the sound data Sa read from the temporary memory 31 according to the control of the system controller 40. That is, the sound processor 23 controls the transfer of the sound data Sa between the temporary memory 31 and the audio signal generator 24, and outputs the sound data Sa read from the temporary memory 31 to the audio signal generator 24.

The audio signal generator 24 converts the input sound data Sa into an analog audio signal according to the control of the system controller 40, and outputs the audio signal obtained by the conversion to the audio output section 25. For example, the audio output section 25 includes an amplifier circuit that amplifies the analog audio signal, and a speaker that outputs sound based on the audio signal output from the amplifier circuit.

The temporary memory 31 buffers the imaging data Da obtained by the imaging section 11, and temporarily stores the imaging data Da and the sound data Sa in, for example, a ring memory format. For example, the image processor 14 deletes, or overwrites with newly stored imaging data Da, the portion of the imaging data Da stored in the temporary memory 31 whose storage period exceeds a predetermined period (for example, a storage period Δt to be described later).
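As one way to picture the ring memory format, the following sketch models the temporary memory 31 as a fixed-capacity buffer that automatically drops the oldest samples; the capacity-in-frames model and the tuple layout are assumptions for illustration.

```python
import collections

class TemporaryMemory:
    """Sketch of ring-memory buffering: samples older than the storage
    period are evicted as new ones arrive (assumed fixed-capacity model)."""

    def __init__(self, capacity_frames: int):
        # A deque with maxlen drops the oldest entry automatically, which
        # plays the role of deleting/overwriting data past the period.
        self._buf = collections.deque(maxlen=capacity_frames)

    def push(self, timestamp: float, frame, sound) -> None:
        self._buf.append((timestamp, frame, sound))

    def window(self) -> list:
        """Everything currently retained, i.e. the storage period Δt."""
        return list(self._buf)

# For example, a 60 s storage period at 30 fps corresponds to
# TemporaryMemory(capacity_frames=1800).
```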

The storage section 32 stores the imaging data Dc satisfying a predetermined storage condition among the imaging data Da obtained by the imaging section 11. In the case where a predetermined storage condition is satisfied, the system controller 40 extracts the imaging data Dc to be stored from the imaging data Da temporarily stored in the temporary memory 31 at that point in time, and transfers the imaging data Dc to the storage section 32.

Here, the imaging data Dc to be stored in the storage section 32 is, among the imaging data Da that the system controller 40 causes to be captured constantly, imaging data of a scene meaningful to the user, such as a scene in which the user is interested, a scene at which the user's emotion changed, or a scene remaining in the user's memory.

The operation input section 51 includes an operation object such as a keyboard or a dial, for example. The operation input section 51 receives an input made by the user through the operation object, and outputs the input to the system controller 40. The date/time counter 52 counts the date and time (year, month, day, hour, minute, and second), and outputs current date and time information (hereinafter referred to as "time data Dt") to the system controller 40.

The sensor 53 measures at least one of pulse, heart rate, electrocardiogram, electromyogram, respiration, sweating, GSR, blood pressure, blood oxygen saturation, skin surface temperature, electroencephalogram, blood flow change, body temperature, motion of the body, motion of the head, center of gravity, rhythm of walking or running, eye condition, or surrounding sound, and outputs measurement data obtained by the measurement (hereinafter referred to as "acquisition data Db") to the system controller 40. The system controller 40 outputs the acquisition data Db input from the sensor 53 to the image processor 14 in association with the time data Dt input from the date/time counter 52.

The communication section 60 is able to communicate with an external device via a network. Here, the network is, for example, a network that performs communication using a communication protocol (TCP/IP) commonly used on the internet. Alternatively, the network may be a secure network that communicates using a proprietary communication protocol. The network may be, for example, the internet, an intranet, or a local area network. The connection between the network and the communication section 60 may be, for example, a wired LAN (local area network) such as Ethernet (registered trademark), a wireless LAN such as Wi-Fi, a cellular phone line, or the like.

The communication section 60 may also be capable of communicating with an external device by near field communication. For example, the near field communication in the communication section 60 is performed in accordance with ISO/IEC 14443 (an international standard for short-range RFID), ISO/IEC 18092 (an international standard for near field communication, known as NFC), ISO/IEC 15693 (an international standard for RFID), Bluetooth (registered trademark), or the like.

Next, a method of extracting the imaging data Dc by the image processor 14 will be described. First, the principle of the delay time and the method of adjusting the delay time will be described, and thereafter, the method of extracting the imaging data Dc will be described.

Fig. 2 is a diagram for describing the principle of the delay time td. Fig. 2 conceptually shows the imaging data Da stored in the temporary memory 31. The temporary memory 31 stores the imaging data Da imaged by the imaging section 11 during the storage period Δt. The evaluation data Dd in Fig. 2 is a numerical evaluation of the user's emotion generated based on the acquisition data Db. The image processor 14 generates emotion data Df based on the acquisition data Db, and generates the evaluation data Dd based on the generated emotion data Df.

The evaluation data Dd relates to the user's emotion, and therefore has a predetermined correspondence with the imaging data Da. However, the imaging data Da and the evaluation data Dd at the same time point cannot simply be correlated with each other. This is because, when the user views a scene at a time point t1, the user's emotion (emotion data Df) changes significantly at a time point t2, slightly delayed from the time point t1. There is thus a delay time td to be considered in correlating the imaging data Da with the evaluation data Dd and the emotion data Df in time series. The electronic apparatus 1 therefore correlates the imaging data Da and the emotion data Df in time series with this delay time taken into account. Specifically, the electronic apparatus 1 correlates the imaging data Da acquired at the time point t1 (first time point) with the emotion data Df acquired at the time point t2 (second time point) later than the time point t1 by the delay time td.
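A minimal sketch of this time-series correlation follows, assuming timestamped samples and a nearest-neighbour match; the present disclosure does not fix how the correspondence is stored, so the list-of-tuples form of the table Dg below is an assumption.

```python
def correlate_with_delay(frames, emotions, td):
    """Pair imaging data at a time point t1 with the emotion data at
    t2 = t1 + td.

    `frames` and `emotions` are time-sorted lists of (timestamp, value);
    the emotion sample nearest to the delayed time point is used.
    """
    table_dg = []
    for t1, frame in frames:
        t2 = t1 + td  # the emotional reaction lags the scene by td
        _, emotion = min(emotions, key=lambda e: abs(e[0] - t2))
        table_dg.append((t1, frame, emotion))
    return table_dg
```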

Fig. 3 shows an example of a process of adjusting the delay time td. First, the image processor 14 acquires imaging data Da and acquisition data Db (step S101). Thereafter, the image processor 14 generates emotion data Df based on the acquisition data Db (step S102). The image processor 14 generates emotion data Df for each type of acquired data Db. Thereafter, the image processor 14 evaluates the imaging data Da based on the emotion data Df (step S103). The image processor 14 calculates intensity data of at least pleasure/discomfort or activity/inactivity based on the emotion data Df, and sets the calculated intensity data as evaluation data Dd.
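The reduction of the emotion data Df to the evaluation data Dd can likewise be pictured with a short sketch. The two axes follow the pleasure/discomfort and activity/inactivity intensities named above, but the dictionary keys and the max-of-magnitudes reduction are assumptions for illustration.

```python
def evaluate(emotion: dict) -> float:
    """Turn emotion data Df into scalar evaluation data Dd.

    `emotion` holds signed intensities on two axes: positive `pleasure`
    means a pleasant state, positive `activity` an active one. Taking
    the larger magnitude is one plausible reduction, not the patent's.
    """
    pleasure = emotion.get("pleasure", 0.0)   # pleasure/discomfort axis
    activity = emotion.get("activity", 0.0)   # activity/inactivity axis
    return max(abs(pleasure), abs(activity))
```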

The image processor 14 outputs the resultant evaluation data Dd and imaging data Da to the image signal generator 15. The image signal generator 15 generates an image signal for displaying the evaluation data Dd and the imaging data Da input from the image processor 14, and outputs the image signal to the display driver 16. The display driver 16 drives the display section 18 based on the image signal input from the image signal generator 15. The display section 18 is driven by the display driver 16, and thereby displays the imaging data Da and the evaluation data Dd (step S104).

Fig. 4 shows an example of a display screen for adjusting the delay time td and selecting the sensor. The display section 18 is driven by the display driver 16, and thereby displays a display screen 18A, as shown in Fig. 4. The display screen 18A includes, for example, a reproduction window W1, a timeline window W2, an evaluation data window W3, a delay time display window W4, a sensor selection button BT1, and a selection button BT2.

In the reproduction window W1, for example, reproduction of the imaging data Da (a moving image, or a plurality of still images obtained by continuous imaging) is started by clicking the reproduction window W1, and is stopped by clicking the reproduction window W1 again during reproduction.

In the timeline window W2, for example, parts of the imaging data Da are arranged horizontally in chronological order, and the reproduction position or reproduction speed of the imaging data Da reproduced in the reproduction window W1 is adjusted by horizontally sliding the timeline window W2. In the timeline window W2, for example, a reproduction line Ls is displayed at the position corresponding to the reproduction time point ts of the imaging data Da reproduced in the reproduction window W1.

In the evaluation data window W3, for example, the evaluation data Dd is displayed in chronological order so as to correspond to the imaging data Da displayed simultaneously in the timeline window W2. In the evaluation data window W3, for example, the reproduction line Ls is displayed at the position corresponding to the reproduction time point ts of the imaging data Da reproduced in the reproduction window W1. In the evaluation data window W3, for example, a peak line Lp is further displayed at the position corresponding to the peak position (peak time point tp) of the evaluation data Dd.

In the delay time display window W4, for example, the difference (tp - ts) between the peak time point tp and the reproduction time point ts is displayed as the delay time. Clicking the sensor selection button BT1 displays, for example, a screen on which the user can change the sensor to be used for generating the evaluation data Dd, or mix the sensors to be used (i.e., select a plurality of types of sensors). Clicking the selection button BT2 sets the delay time displayed in the delay time display window W4 at the time of the click as the delay time td.

The image processor 14 receives the delay time td and the type of sensor determined by the user's operations on the display screen 18A (step S105). The image processor 14 stores the received delay time td and the data on the selected sensor type in the temporary memory 31. In this way, the adjustment of the delay time td and the selection of the sensor are performed.

Next, a method of extracting the imaging data Dc will be described.

Fig. 5 shows an example of a process of extracting the imaging data Dc. First, the image processor 14 acquires imaging data Da and acquisition data Db (step S201). Thereafter, the image processor 14 generates emotion data Df based on the acquisition data Db (step S202). The image processor 14 generates emotion data Df for each type of acquired data Db. Thereafter, the image processor 14 correlates the emotion data Df with the imaging data Da in consideration of the delay time td (step S203). For example, the image processor 14 stores a table Dg in which emotion data Df and imaging data Da are associated with each other in consideration of the delay time td in the temporary memory 31.

The image processor 14 evaluates the imaging data Da based on the emotion data Df (step S204). In this case, for example, the image processor 14 evaluates the imaging data Da by using the table Dg, taking the delay time td of the emotion data Df into consideration. The image processor 14 then determines, based on the evaluation data Dd obtained by the evaluation, whether each individual piece of imaging data included in the imaging data Da needs to be stored (step S205). In the case where the imaging data Da is a moving image, the image processor 14 determines whether each individual frame needs to be stored. In the case where the imaging data Da is a plurality of still images obtained by continuous imaging, the image processor 14 determines whether each individual still image needs to be stored. The image processor 14 causes the storage section 32 to store the images (imaging data Dc) determined to need storing. In this way, the imaging data Dc is extracted from the imaging data Da.
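Combining the sketches above, the per-frame storage decision of steps S204 and S205 might look as follows; the fixed threshold stands in for the predetermined storage condition, which the present disclosure leaves open.

```python
def extract_dc(table_dg, threshold: float):
    """Extract imaging data Dc: keep frames whose evaluation data Dd
    clears the (assumed) storage condition."""
    stored = []
    for t1, frame, emotion in table_dg:
        dd = evaluate(emotion)     # evaluation data Dd (sketch above)
        if dd >= threshold:        # per-frame keep/discard decision (S205)
            stored.append((t1, frame))
    return stored
```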

Next, control of the imaging section 11 and the imaging signal processor 12 by the system controller 40 will be described.

Fig. 6 shows an example of a process of controlling the imaging section 11 and the imaging signal processor 12 by the system controller 40. First, the image processor 14 acquires imaging data Da and acquisition data Db (step S301). Thereafter, the image processor 14 generates emotion data Df based on the acquisition data Db (step S302). The image processor 14 generates emotion data Df for each type of acquired data Db. Thereafter, the image processor 14 correlates the imaging data Da and the emotion data Df in consideration of the delay time td (step S303). For example, the image processor 14 stores a table Dg in which the imaging data Da and the emotion data Df are associated with each other in consideration of the delay time td in the temporary memory 31.

The image processor 14 evaluates the imaging data Da based on the emotion data Df (step S304). In this case, for example, the image processor 14 evaluates the imaging data Da by using the table Dg and in consideration of the delay time td of the emotion data Df. The image processor 14 outputs instructions for controlling the imaging section 11 and the imaging signal processor 12 to the system controller 40 based on the evaluation data Dd obtained by the evaluation.

For example, the instruction includes a change in at least one of the resolution of the imaging section 11, the frame rate of the imaging section 11, the bit length of data to be output from the imaging section 11, the active pixel region in the pixel array included in the imaging section 11, the influence on data to be output from the imaging section 11, or the state (standby state/imaging state) of the imaging section 11. In this case, the image processor 14 outputs, for example, an instruction for controlling at least one of the resolution of the imaging section 11, the frame rate of the imaging section 11, the bit length of data to be output from the imaging section 11, the active pixel region in the pixel array included in the imaging section 11, or the influence on the data to be output from the imaging section 11 to the system controller 40 based on the emotion data Df. Further, the image processor 14 outputs, for example, an instruction for setting the imaging section 11 to one of the standby state and the imaging state to the system controller 40 based on the emotion data Df.

Upon receiving an instruction for controlling the imaging section 11 and the imaging signal processor 12 from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 controls the imaging section 11 and the imaging signal processor 12 in response to the control of the system controller 40 (step S305). In this way, the imaging section 11 and the imaging signal processor 12 are controlled by the system controller 40.

Next, a countermeasure for the delay time td will be described. Fig. 7 shows an example of a method of controlling the imaging section 11 and the imaging signal processor 12.

Assume that the time when the resolution of the imaging section 11 is to be controlled is a time t1. In this case, it becomes apparent only at a time t2 that the resolution or the like should have been controlled at the time t1; the time t2 is delayed from the time t1 by the delay time td. In order to control the resolution and the like at the time t1, it is therefore preferable to predict, at a time t3 before the time t1, that the resolution and the like of the imaging section 11 should be controlled.

In the case where the influence of the delay time td on the imaging data Da is small, omitting such prediction may pose no problem. However, in the case where the influence of the delay time td on the imaging data Da is large, such prediction may be necessary. Thus, as shown in Fig. 7, for example, the image processor 14 may perform predictive evaluation of the imaging data Da acquired after the time point t3 − td based on the previously obtained evaluation data Dd (e.g., the evaluation data Dd before the time point t3) and the previously obtained imaging data Da (e.g., the imaging data Da before the time point t3). In this case, for example, it becomes possible for the image processor 14 to output instructions for controlling the imaging section 11 and the imaging signal processor 12 to the system controller 40 at and after the time point t3 − td based on the evaluation data Dd obtained in this way (hereinafter referred to as "predictive evaluation data Ddf").
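As an illustration of such prediction, a simple linear extrapolation of the evaluation data by the delay time is sketched below; the present disclosure does not commit to any particular predictor, so the two-point trend model is an assumption.

```python
def predict_ddf(history, td: float) -> float:
    """Extrapolate evaluation data Dd by the delay time td.

    `history` is a time-sorted list of (time, Dd) samples available up
    to the time point t3; the extrapolated value plays the role of the
    predictive evaluation data Ddf.
    """
    (ta, dd_a), (tb, dd_b) = history[-2], history[-1]
    slope = (dd_b - dd_a) / (tb - ta)
    return dd_b + slope * td  # predicted Dd one delay time ahead
```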

Next, a specific example of a method of controlling the imaging section 11 and the imaging signal processor 12 will be described. Fig. 8 shows an example of a process of controlling the imaging section 11 and the imaging signal processor 12. Note that, in this case, the storage section 32 stores the reference data De to be used for face recognition. The reference data De includes, for example, face data of a target of face recognition (reference data related to a photographic subject). The reference data De may include, for example, a plurality of pieces of face data.

It is to be noted that the following describes a case in which the resolution of the imaging section 11 is controlled. However, similar control can be performed even in the case of controlling at least one of the resolution of the imaging section 11, the frame rate of the imaging section 11, the bit length of data to be output from the imaging section 11, the active pixel region in the pixel array included in the imaging section 11, the influence on the data to be output from the imaging section 11, or the state (standby state/imaging state) of the imaging section 11.

First, the system controller 40 sets the resolution of the imaging section 11 to "low" (step S401). Specifically, the system controller 40 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging controller 13. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging section 11 in response to the control of the system controller 40. As a result, the imaging section 11 performs imaging with the resolution set to "low", and acquires imaging data Da with the resolution set to "low", for example, as shown in (A) in Fig. 9 (step S402).

Thereafter, the image processor 14 determines whether the imaging data Da whose resolution is "low" includes a person (step S403). In the case where the determination result indicates that no person is included, the imaging section 11 continues imaging with the resolution set to "low". In contrast, in the case where a person is included in the imaging data Da as shown in, for example, (B) in Fig. 9, the image processor 14 outputs an instruction for setting the resolution to "medium" to the system controller 40. Upon receiving the instruction to set the resolution to "medium" from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "medium" to the imaging section 11 in response to the control of the system controller 40 (step S404). As a result, the imaging section 11 performs imaging with the resolution set to "medium", and acquires imaging data Da with the resolution set to "medium", for example, as shown in (C) in Fig. 9 (step S405).

Thereafter, the image processor 14 performs face recognition based on the imaging data Da whose resolution is "medium", reading the reference data De from the storage section 32 (step S406). In this case, the image processor 14 determines whether a person matching the reference data De is present in the imaging data Da (step S407). As a result of the determination, in the case where no person matching the reference data De is present in the imaging data Da, imaging is continued with the resolution set to "medium". In contrast, in the case where a person matching the reference data De is present in the imaging data Da, the image processor 14 outputs an instruction for setting the resolution to "high" to the system controller 40. Upon receiving the instruction to set the resolution to "high" from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "high" to the imaging section 11 in response to the control of the system controller 40 (step S408). As a result, the imaging section 11 performs imaging with the resolution set to "high", and acquires imaging data Da with the resolution set to "high", for example, as shown in (D) in Fig. 9.

Thereafter, the electronic apparatus 1 executes the above steps S201 to S205 or the above steps S301 to S305 (step S409). In this way, setting of the resolution using face recognition is performed.
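The low/medium/high flow of steps S401 to S408 amounts to a small state machine, sketched below. `contains_person` and `matches_reference` are placeholders for the person detector and the face matcher against the reference data De, neither of which the present disclosure specifies.

```python
def contains_person(frame) -> bool:
    """Placeholder person detector (step S403)."""
    raise NotImplementedError

def matches_reference(frame, reference_de) -> bool:
    """Placeholder face matcher against reference data De (step S407)."""
    raise NotImplementedError

def resolution_step(state: str, frame, reference_de) -> str:
    """One pass of the Fig. 8 control loop over the current frame."""
    if state == "low":
        return "medium" if contains_person(frame) else "low"
    if state == "medium":
        return "high" if matches_reference(frame, reference_de) else "medium"
    return "high"  # held at "high" until the storage/control flow resets it
```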

Next, another method of setting the resolution using face recognition will be described. Fig. 10 shows an example of a process of controlling the imaging section 11 and the imaging signal processor 12. Note that, in this case, the storage section 32 also stores the reference data De to be used for face recognition.

It is to be noted that the following describes a case in which the resolution of the imaging section 11 is controlled. However, similar control can be performed even in the case of controlling at least one of the resolution of the imaging section 11, the frame rate of the imaging section 11, the bit length of data to be output from the imaging section 11, the active pixel region in the pixel array included in the imaging section 11, the influence on the data to be output from the imaging section 11, or the state (standby state/imaging state) of the imaging section 11.

First, the system controller 40 sets the resolution of the imaging section 11 to "low" (step S501). Specifically, the system controller 40 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging controller 13. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging section 11 in response to the control of the system controller 40. As a result, the imaging section 11 performs imaging with the resolution set to "low", and acquires imaging data Da with the resolution set to "low", for example, as shown in (A) in Fig. 9 (step S502). Further, the image processor 14 acquires the acquisition data Db from the system controller 40 (step S502).

Thereafter, the image processor 14 generates emotion data Df based on the acquired data Db (step S503). The image processor 14 generates emotion data Df for each type of acquired data Db. Thereafter, the image processor 14 evaluates the imaging data Da based on the emotion data Df (step S504). For example, the image processor 14 calculates intensity data of at least pleasure/discomfort or activity/inactivity based on the emotion data Df, and sets the calculated intensity data as the evaluation data Dd.

In the case where the resultant evaluation data Dd satisfies the predetermined criterion, the image processor 14 outputs an instruction for setting the resolution to "medium" to the system controller 40. Upon receiving the instruction to set the resolution to "medium" from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "medium" to the imaging section 11 in response to the control of the system controller 40 (step S505). As a result, the imaging section 11 performs imaging with the resolution set to "medium", and acquires imaging data Da with the resolution set to "medium", for example, as shown in (C) in Fig. 9.

Thereafter, the image processor 14 performs face recognition based on the imaging data Da whose resolution is "medium", reading the reference data De from the storage section 32 (step S506). In this case, the image processor 14 determines whether a person matching the reference data De is present in the imaging data Da (step S507). As a result of the determination, in the case where no person matching the reference data De is present in the imaging data Da, imaging is continued with the resolution set to "medium". In contrast, in the case where a person matching the reference data De is present in the imaging data Da, the image processor 14 outputs an instruction for setting the resolution to "high" to the system controller 40. Upon receiving the instruction to set the resolution to "high" from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "high" to the imaging section 11 in response to the control of the system controller 40 (step S508). As a result, the imaging section 11 performs imaging with the resolution set to "high", and acquires imaging data Da with the resolution set to "high", for example, as shown in (D) in Fig. 9.

Thereafter, the electronic device 1 executes the above steps S201 to S205 or the above steps S301 to S305 (step S509). In this way, setting of the resolution using face recognition is performed.

Next, a method of setting the resolution using emotion will be described. Fig. 11 shows an example of a process of controlling the imaging section 11 and the imaging signal processor 12. Note that, in this case, the storage section 32 stores reference data De to be used for the evaluation of emotion. For example, the reference data De includes reference data related to the user's emotion. The reference data De may include, for example, reference data related to the user's emotion for each of the various types of the acquisition data Db.

It is to be noted that the following describes a case in which the resolution of the imaging section 11 is controlled. However, similar control can be performed even in the case of controlling at least one of the resolution of the imaging section 11, the frame rate of the imaging section 11, the bit length of data to be output from the imaging section 11, the active pixel region in the pixel array included in the imaging section 11, the influence on the data to be output from the imaging section 11, or the state (standby state/imaging state) of the imaging section 11.

First, the system controller 40 sets the resolution of the imaging section 11 to "low" (step S601). Specifically, the system controller 40 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging controller 13. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "low" to the imaging section 11 in response to the control of the system controller 40. As a result, the imaging section 11 performs imaging with the resolution set to "low", and acquires imaging data Da with the resolution set to "low", for example, as shown in (A) in Fig. 12 (step S602). Further, the image processor 14 acquires the acquisition data Db from the system controller 40 (step S602).

Thereafter, the image processor 14 generates emotion data Df based on the acquired data Db (step S603). The image processor 14 generates emotion data Df for each type of acquired data Db. Thereafter, the image processor 14 evaluates the imaging data Da based on the emotion data Df (step S604). For example, the image processor 14 calculates intensity data of at least pleasure/discomfort or activity/inactivity based on the emotion data Df and the reference data De, and sets the calculated intensity data as the evaluation data Dd.

For example, assume that when the imaging data Da including a person is obtained as shown in (B) in Fig. 12, the obtained evaluation data Dd satisfies a predetermined criterion. In this case, the image processor 14 outputs an instruction for setting the resolution to "high" to the system controller 40. Upon receiving the instruction to set the resolution to "high" from the image processor 14, the system controller 40 outputs a control signal to the imaging controller 13 based on the instruction. The imaging controller 13 outputs a control signal for setting the resolution of the imaging section 11 to "high" to the imaging section 11 in response to the control of the system controller 40 (step S605). As a result, the imaging section 11 performs imaging with the resolution set to "high", and acquires imaging data Da with the resolution set to "high", for example, as shown in (C) in Fig. 12.

Thereafter, the electronic device 1 executes the above steps S201 to S205 or the above steps S301 to S305 (step S606). In this way, setting of resolution using emotion is performed.

[ Effect ]

Next, effects of the electronic apparatus 1 according to the present embodiment will be described.

Digital cameras and smartphones have recently been equipped with a function called "Smile Shutter" (registered trademark), which automatically releases the shutter in response to a determination of whether a subject is smiling, so as to capture the exact timing of the smile.

Incidentally, a user may wish to take a picture not only of a smiling moment but also of a rare moment, such as a child's natural expression or a moment during an activity. However, when the user holds up a camera, the natural expression may disappear and turn into a smile or a tense expression, making the natural expression difficult to capture. In addition, precious moments come suddenly, and the user may miss the shot while readying the camera.

In contrast, in the electronic apparatus 1 according to the present embodiment, the imaging data Da and the emotion data Df are correlated with each other in time series, and the imaging section 11 is controlled based on the emotion data Df. Thus, for example, in the case where the emotion data changes at the timing when the user reacts to a photographic subject, monitoring the emotion data Df makes it possible to grasp the timing at which the user may wish to take a picture. As a result, it becomes possible, for example, to obtain the imaging data Dc of a scene meaningful to the user in response to changes in the emotion data Df and the evaluation data Dd, without the user manually releasing the shutter. Further, since the user does not need to manually release the shutter, the user does not need to hold the camera, so that natural expressions can be photographed.

Further, the present embodiment uses the acquired data Db including measurement data related to at least one of pulse, heart rate, electrocardiogram, electromyogram, respiration, perspiration, GSR, blood pressure, blood oxygen saturation, skin surface temperature, electroencephalogram, blood flow change, body temperature, motion of the body, motion of the head, center of gravity, rhythm of walking or running, eye condition, or surrounding sound. This enables accurate generation of emotion data Df.

Further, in the present embodiment, the imaging data Da is evaluated based on the emotion data Df, and the imaging section 11 and the imaging signal processor 12 are controlled based on the result of the evaluation (evaluation data Dd). This enables the imaging data Dc of the scene meaningful to the user to be obtained in response to the emotion data Df and the evaluation data Dd without the user manually releasing the shutter.

Further, in the present embodiment, intensity data of at least pleasure/discomfort or activity/inactivity is calculated based on the emotion data Df, and the imaging section 11 is controlled based on the intensity data obtained by the calculation. This enables the imaging data Dc of a scene meaningful to the user to be obtained in response to changes in the emotion data Df and the evaluation data Dd without the user manually releasing the shutter.

In addition, in the present embodiment, at least one of the resolution, the frame rate, the bit length, the influence, or the effective pixel area in the imaging section is controlled based on the emotion data Df and the evaluation data Dd. This enables the system controller 40 to, for example, reduce the resolution, reduce the frame rate, reduce the bit length, eliminate the influence, or reduce the effective pixel area in the imaging section, except when the imaging data Dc of a scene meaningful to the user is acquired. In such a case, the power consumption of the electronic apparatus 1 when performing imaging for a long time can be reduced.

Further, in the present embodiment, the imaging section 11 is set to one of the standby state and the imaging state based on the emotion data Df. This enables the system controller 40 to set the imaging section 11 to the standby state, except when the imaging data Dc of a scene meaningful to the user is acquired. In such a case, the power consumption of the electronic apparatus 1 when performing imaging for a long time can be reduced.

Further, in the present embodiment, whether the imaging data Da needs to be stored is determined based on the emotion data Df, and the imaging data Dc of a scene meaningful to the user is stored in the storage section 32 based on the determination. This enables the imaging data Dc of a scene meaningful to the user to be obtained in response to changes in the emotion data Df and the evaluation data Dd without the user manually releasing the shutter.

Further, in the present embodiment, the portion of the imaging data Da stored in the temporary memory 31 whose storage period exceeds a predetermined period is deleted or overwritten with newly stored imaging data Da. This enables the electronic apparatus 1 to perform imaging for a long time even in the case where the temporary memory 31 has a small capacity.

Further, in the present embodiment, the imaging data Da' in which the camera shake has been corrected is generated. This enables the imaging data Dc with less camera shake to be obtained even in the case where the user carries the electronic apparatus 1.

Further, the imaging data Da acquired at the time point t1 is correlated with the emotion data Df acquired at the time point t2, which is later than the time point t1 by the delay time td. This enables the imaging data Dc to be obtained in consideration of the delay time inherent in the emotion data Df.

Further, in the present embodiment, the imaging section 11 is controlled based on the matching result between the reference data De and the imaging data Da. This allows the imaging section 11 to be controlled using face recognition so that data desired to be retained can be selected and stored as the imaging data Dc.

Further, in the present embodiment, the imaging section 11 and the imaging signal processor 12 are controlled based on the reference data De and the emotion data Df. This enables imaging data Dc of scenes that the user found particularly moving or impressive to be selected and stored.

<2. Modifications>

Next, a modification of the electronic apparatus 1 according to the above-described embodiment will be described. In the following description, the same components as those in the above-described embodiment are denoted by the same reference numerals. In addition, descriptions of the same components as in the above-described embodiment are appropriately omitted.

[ modification A ]

In the above-described embodiment, the image signal generator 15, the display driver 16, the display controller 17, the display section 18, the audio signal generator 24, and the audio output section 25 may be omitted, for example, as shown in Fig. 13. In this case, the electronic apparatus 1 may transmit the data necessary for displaying the display screen 18A shown in Fig. 4, via the communication section 60, to an electronic apparatus 2 that includes the image signal generator 15, the display driver 16, the display controller 17, the display section 18, the audio signal generator 24, and the audio output section 25. In such a case, the adjustment of the delay time td and the selection of the sensor type may be performed in the electronic apparatus 2.

[ modification B ]

In the above-described modification A, the storage section 32 may be omitted, for example, as shown in Fig. 14. In this case, the electronic apparatus 1 may store the imaging data Dc, via the communication section 60, in a storage apparatus 3 that includes the storage section 32.

[ modification C ]

In the above-described modification A, the electronic apparatus 1 may be configured to communicate with the display apparatus 2 via a network 4, for example, as shown in Fig. 15. The network 4 conforms to a communication standard that allows the communication section 60 to communicate with the display apparatus 2.

[ modification D ]

In the above-described modification B, the electronic apparatus 1 may be configured to communicate with the display apparatus 2 or the storage apparatus 3 via the network 4, for example, as shown in Fig. 16.

[ modification E ]

In the above-described embodiment and its modifications, the imaging section 11 may be provided independently of the electronic apparatus 1, for example, as shown in Fig. 17. In this case, the imaging data obtained by the imaging section 11 is input to the imaging block 10 via the communication section 60 or the like. In such a case, effects similar to those of the above-described embodiment can also be achieved.

[ modification F ]

In the above-described embodiment and modifications A to D, the electronic apparatus 1 may further include an attaching portion 1B capable of attaching the electronic apparatus 1 to the body of the user 100, for example, as shown in Fig. 18. In this case, the attaching portion 1B is coupled to, for example, the main body portion 1A of the electronic apparatus 1. Since the user 100 does not need to hold the electronic apparatus 1, this makes it easier to photograph the natural expression of the subject.

[ modification G ]

In the above-described embodiment and modifications A to D, the imaging section 11 may include imaging elements 1a, 1b, and 1c whose optical axes are oriented in mutually different directions, each acquiring imaging data of its own specific direction, for example, as shown in fig. 19. In this case, the imaging elements 1a, 1b, and 1c may, for example, be detachably attached to the cap 110 of the user 100. This enables imaging over a wide range.
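As a rough sketch of how the data processor might bundle such per-direction imaging data with the emotion data in chronological order (see also configuration (12) below), consider the following Python fragment. The element ids, tuple layouts, and nearest-sample pairing rule are illustrative assumptions; the present disclosure does not prescribe a particular pairing method.

from typing import Dict, List, Tuple

def associate_with_emotion(
    multi_frames: Dict[str, List[Tuple[float, bytes]]],
    emotions: List[Tuple[float, float]],
) -> List[Tuple[float, str, bytes, float]]:
    # multi_frames maps an imaging-element id ('1a', '1b', '1c') to
    # (timestamp, frame) pairs; emotions is a non-empty list of
    # (timestamp, intensity) samples.
    records = []
    for element_id, frames in multi_frames.items():
        for t_frame, frame in frames:
            # Pair each frame with the emotion sample nearest in time.
            t_emo, intensity = min(emotions, key=lambda e: abs(e[0] - t_frame))
            records.append((t_frame, element_id, frame, intensity))
    records.sort(key=lambda r: r[0])  # one chronological record stream
    return records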

[ modification H ]

In the above-described embodiment and modifications A to D and F, the imaging section 11 may include a lens 1C, for example, as shown in fig. 20. The lens 1C is, for example, a wide-angle lens or a fisheye lens. This enables imaging over a wide range.

Although the present disclosure has been described above with reference to the exemplary embodiment and its modifications, these should not be construed as limiting the scope of the present disclosure, and various further modifications may be made. It should also be understood that the effects described herein are merely examples; the effects of the exemplary embodiment and modifications are not limited to those described herein, and the present disclosure may have any effect other than those described herein.

Also, the present disclosure may have the following configuration.

(1) An electronic device, comprising:

an imaging section that acquires imaging data;

a data generator that generates emotion data based on the acquired data;

a data processor for correlating the imaging data with the emotion data in chronological order; and

a controller that changes a setting of the imaging section over time based on the emotion data.

(2) The electronic device according to (1), wherein the acquired data includes measurement data related to at least one of pulse, heart rate, electrocardiogram, electromyogram, respiration, sweating, GSR, blood pressure, blood oxygen saturation, skin surface temperature, electroencephalogram, blood flow change, body temperature, body movement, head movement, center of gravity, rhythm of walking or running, eye condition, or surrounding sound.

(3) The electronic apparatus according to (1) or (2), wherein the controller evaluates the imaging data in time series based on the emotion data, and controls the imaging section based on an evaluation result.

(4) The electronic apparatus according to any one of (1) to (3), wherein the controller calculates intensity data of at least pleasantness/unpleasantness or activeness/inactiveness based on the emotion data, and controls the imaging section based on the calculated intensity data.

(5) The electronic apparatus according to any one of (1) to (4), wherein the controller controls at least one of a resolution, a frame rate, a bit length, an influence, or an effective pixel area in the imaging section based on the emotion data.

(6) The electronic apparatus according to any one of (1) to (4), wherein the controller sets the imaging section to any one of a standby state and an imaging state based on the emotion data.

(7) The electronic apparatus according to any one of (1) to (6), further comprising a storage section that stores the imaging data, wherein,

the controller determines whether the imaging data needs to be stored based on the emotion data, and causes the storage section to store the imaging data based on the determination.

(8) The electronic apparatus according to any one of (1) to (6), further comprising a storage section that stores the imaging data, wherein,

the controller causes the storage section to store the imaging data, and deletes, or overwrites with newly stored imaging data, a portion of the imaging data stored in the storage section for which the storage period exceeds a predetermined period.

(9) The electronic apparatus according to any one of (1) to (8), wherein the controller detects an amount and a direction of camera shake in the imaging data, determines a cut-out region in the imaging data from the detected amount and direction of camera shake, and generates camera-shake-corrected imaging data based on the determination (an illustrative sketch of such cut-out correction follows this list).

(10) The electronic device according to any one of (1) to (9), wherein the data processor correlates imaging data acquired at a first time point with emotion data acquired at a second time point later than the first time point.

(11) The electronic apparatus according to any one of (1) to (10), further comprising an attaching portion capable of attaching the electronic apparatus to a body.

(12) The electronic apparatus according to any one of (1) to (11), wherein,

the imaging section includes a plurality of imaging elements whose optical axes are oriented in mutually different directions and which acquire imaging data of the respective specific directions, and

the data processor associates, as the imaging data, the imaging data of the plurality of specific directions obtained by the plurality of imaging elements with the emotion data in chronological order.

(13) The electronic apparatus according to any one of (1) to (12), wherein the imaging section includes a wide-angle lens or a fisheye lens.

(14) The electronic apparatus according to any one of (1) to (13), further comprising a storage section that stores reference data relating to a subject, wherein,

the controller controls the imaging section based on a result of matching between the reference data and the imaging data.

(15) The electronic apparatus according to (14), wherein the controller sets at least one of a resolution, a frame rate, a bit length, an influence, or an effective pixel area of the imaging section.

(16) The electronic device according to any one of (1) to (15), further comprising a storage section that stores reference data relating to emotion, wherein,

the controller controls the imaging section based on the reference data and the emotion data.

(17) An imaging system, comprising:

an imaging section that acquires imaging data;

a data generator that generates emotion data based on the acquired data;

a data processor for correlating the imaging data with the emotion data in chronological order; and

a controller that controls the imaging section based on the emotion data.

(18) The imaging system according to (17), further comprising a storage section that stores reference data relating to the subject, wherein,

the controller controls the imaging section based on a result of matching between the reference data and the imaging data.
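The cut-out correction of configuration (9) can be illustrated with the following Python sketch. How the shake amount and direction are detected (e.g. from a gyro sensor or inter-frame block matching) is not specified in the present disclosure, and the margin width and clamping below are assumptions for illustration only.

import numpy as np

def stabilize_frame(frame: np.ndarray, shake_dx: int, shake_dy: int,
                    margin: int = 64) -> np.ndarray:
    # frame: full sensor image (H x W [x C]); shake_dx / shake_dy: detected
    # camera shake in pixels; margin: border reserved on each side so the
    # cut-out region can move while staying inside the frame.
    h, w = frame.shape[:2]
    crop_h, crop_w = h - 2 * margin, w - 2 * margin
    # Shift the cut-out region opposite to the shake, clamped to the margin.
    dy = int(np.clip(-shake_dy, -margin, margin))
    dx = int(np.clip(-shake_dx, -margin, margin))
    top, left = margin + dy, margin + dx
    return frame[top:top + crop_h, left:left + crop_w]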

In the electronic apparatus and the imaging system according to the embodiment of the present disclosure, the imaging data and the emotion data are correlated with each other in time series, and the imaging section is controlled based on the emotion data. As a result, imaging data can be obtained in response to a change in the emotion data without the user manually releasing the shutter. The user can therefore take a picture without missing the moment he or she wants to capture. It is to be noted that the effects are not limited to those described here and may include any of the effects described in the present disclosure.
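As a compact sketch of this shutter-free behavior (cf. configurations (4) to (6)), the controller can be modeled as a small state machine that switches the imaging section between a standby state and an imaging state and raises the imaging settings only while the emotion intensity is high. The thresholds, the hysteresis, and the concrete resolution/frame-rate values below are illustrative assumptions, not values given in the present disclosure.

STANDBY, IMAGING = "standby", "imaging"

def update_state(state: str, intensity: float,
                 start_thresh: float = 0.7, stop_thresh: float = 0.3) -> str:
    # Hysteresis: a strong rise in emotion intensity starts capture; capture
    # ends only after the intensity has clearly fallen again.
    if state == STANDBY and intensity >= start_thresh:
        return IMAGING
    if state == IMAGING and intensity <= stop_thresh:
        return STANDBY
    return state

def settings_for(intensity: float) -> dict:
    # Raise resolution and frame rate only while emotion intensity is high,
    # which also limits power and storage consumption in the standby state.
    if intensity >= 0.7:
        return {"resolution": (3840, 2160), "frame_rate": 60}
    return {"resolution": (1920, 1080), "frame_rate": 30}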

This application claims the benefit of Japanese Priority Patent Application JP2019-068358 filed with the Japan Patent Office on March 29, 2019, the entire contents of which are incorporated herein by reference.

It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may be made according to design requirements and other factors insofar as they come within the scope of the appended claims or the equivalents thereof.
