Image processing apparatus, image processing method, imaging apparatus, and program

Document No.: 1804631  Publication date: 2021-11-05

Note: This technique, "Image processing apparatus, image processing method, imaging apparatus, and program", was designed and created by 藤野梨奈, 长井俊朗, and 佐藤恒夫 on 2020-03-02. Its main content is as follows.

Abstract: An object of the present invention is to provide an image processing apparatus, an image processing method, an imaging apparatus, and a program capable of performing image processing that changes with time. An image processing apparatus (101) includes: an image acquisition unit (100A) that acquires a photographed image of a processing target from a storage unit that stores photographed images having 1st date and time information indicating the photographing time; a date and time information acquisition unit (100B) that acquires the 1st date and time information from the photographed image of the processing target, and acquires 2nd date and time information indicating the date and time at which the photographed image of the processing target was acquired from the storage unit by the image acquisition unit; an elapsed date and time calculation unit (100C) that compares the 1st date and time information acquired by the date and time information acquisition unit with the 2nd date and time information and calculates the date and time elapsed from the photographing time; and an image processing unit (100D) that selects image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performs the selected image processing on the photographed image of the processing target.

1. An image processing apparatus includes:

an image acquisition unit that acquires a photographed image of a processing target from a storage unit that stores a photographed image having 1st date and time information indicating the photographing time;

a date and time information acquisition unit that acquires the 1st date and time information from the captured image of the processing object, and acquires 2nd date and time information indicating the date and time at which the captured image of the processing object was acquired from the storage unit by the image acquisition unit;

an elapsed date and time calculation unit that compares the 1st date and time information acquired by the date and time information acquisition unit with the 2nd date and time information, and calculates an elapsed date and time from the photographing time; and

an image processing unit that selects image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performs the selected image processing on the captured image of the processing target.

2. The image processing apparatus according to claim 1,

the image processing apparatus includes a display control unit that displays, on a display unit, a captured image of the processing target that has been subjected to the image processing by the image processing unit.

3. The image processing apparatus according to claim 1 or 2,

the image processing apparatus includes a print control unit that causes a print unit to print a photographed image of the processing target that has been subjected to the image processing by the image processing unit.

4. The image processing apparatus according to any one of claims 1 to 3,

the photographed image of the processing object subjected to the image processing by the image processing unit and the photographed image of the processing object before the image processing by the image processing unit are stored in the storage unit.

5. The image processing apparatus according to any one of claims 1 to 3,

the photographed image of the processing object that has been image-processed by the image processing unit is not stored in the storage unit.

6. The image processing apparatus according to any one of claims 1 to 5,

the image processing apparatus includes a relationship input unit that receives an input of a correspondence relationship between the plurality of image processes and the length of the elapsed date and time on a time axis.

7. The image processing apparatus according to any one of claims 1 to 6,

the image processing unit changes the overall color of the photographed image of the processing object in accordance with the length of the elapsed date and time.

8. The image processing apparatus according to any one of claims 1 to 6,

the image processing unit changes the color of a part of the subject in the photographic image of the processing object according to the length of the elapsed date and time.

9. The image processing apparatus according to claim 8,

the image processing unit lightens a background color of the photographic image of the processing object in accordance with the length of the elapsed date and time.

10. The image processing apparatus according to any one of claims 1 to 9,

the image processing unit changes a part of a color or a part of a hue of the photographic image of the processing object according to the length of the elapsed date and time.

11. The image processing apparatus according to any one of claims 1 to 10,

the image processing unit changes a degree of blurring of a partial region of the photographic image of the processing target in accordance with the length of the elapsed date and time.

12. The image processing apparatus according to claim 11,

the image processing unit changes the degree of blurring of the foreground or background of the person in the captured image according to the length of the elapsed date and time.

13. The image processing apparatus according to any one of claims 1 to 12,

the image processing unit changes the decoration image added to the photographed image of the processing object according to the elapsed date and time.

14. An imaging apparatus on which the image processing apparatus according to any one of claims 1 to 13 is mounted.

15. The imaging apparatus according to claim 14,

the imaging apparatus includes an imaging mode setting unit for setting one imaging mode from a plurality of imaging modes,

the image processing unit changes the plurality of image processes that vary according to the length of the elapsed date and time in accordance with the shooting mode set by the shooting mode setting unit.

16. An image processing method, comprising the steps of:

acquiring a photographed image of a processing object from a storage unit in which a photographed image having 1st date and time information indicating photographing time is stored;

acquiring the 1st date and time information from the photographed image of the processing object, and acquiring 2nd date and time information indicating the date and time at which the photographed image of the processing object was acquired from the storage unit;

comparing the 1st date and time information with the 2nd date and time information, and calculating the date and time elapsed from the photographing time; and

selecting image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performing the selected image processing on the photographed image of the processing object.

17. A program for causing a computer to execute an image processing process, the image processing process comprising the steps of:

acquiring a photographed image of a processing object from a storage unit in which a photographed image having 1st date and time information indicating photographing time is stored;

acquiring the 1st date and time information from the photographed image of the processing object, and acquiring 2nd date and time information indicating the date and time at which the photographed image of the processing object was acquired from the storage unit;

comparing the 1st date and time information with the 2nd date and time information, and calculating the date and time elapsed from the photographing time; and

selecting image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performing the selected image processing on the photographed image of the processing object.

Technical Field

The present invention relates to an image processing apparatus, an image processing method, an imaging apparatus, and a program, and relates to an image processing apparatus, an image processing method, an imaging apparatus, and a program that change image processing in accordance with the length of the date and time elapsed from the imaging time.

Background

Conventionally, there has been proposed a technique for performing predetermined processing on an image to make a person who views the image more interested.

For example, patent document 1 describes the following technique: when a plurality of image contents are continuously output, the image contents are continuously output by inserting an effect corresponding to attribute information included in each image content.

Documents of the prior art

Patent document

Patent document 1: Japanese Patent Laid-Open No. 2008-61032

Disclosure of Invention

Technical problem to be solved by the invention

Here, one change that people find interesting is change over time. For example, the surface of a leather product gradually changes with age, and each change gives a different aesthetic appearance. Similarly, the taste of wine gradually changes with the storage period, and a drinker can enjoy the change in taste.

Patent document 1 describes a technique of selecting a sepia effect (paragraph 0055) when the difference between the recording date and time and the playback date and time of the content is equal to or greater than a threshold value; however, that change is a one-time change, not a change that proceeds gradually as time elapses.

The present invention has been made in view of the above circumstances, and an object thereof is to provide an image processing apparatus, an image processing method, an imaging apparatus, and a program that can perform image processing that changes over time.

Means for solving the technical problem

An image processing apparatus according to an aspect of the present invention for achieving the above object includes: an image acquisition unit that acquires a photographed image of a processing target from a storage unit that stores a photographed image having 1st date and time information indicating a photographing time; a date and time information acquisition unit that acquires the 1st date and time information from the photographed image of the processing object, and acquires 2nd date and time information indicating the date and time at which the photographed image of the processing object was acquired from the storage unit by the image acquisition unit; an elapsed date and time calculation unit that compares the 1st date and time information acquired by the date and time information acquisition unit with the 2nd date and time information, and calculates the date and time elapsed from the photographing time; and an image processing unit that selects image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performs the selected image processing on the captured image of the processing target.

According to this aspect, the date and time elapsed from the shooting time is calculated, and image processing selected according to the calculated length of the elapsed date and time is performed from among a plurality of image processes that vary with that length. Thus, image processing that changes with time can be applied to a captured image.
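For illustration only, the selection described in this aspect can be sketched as follows. The thresholds, effect names, and function names are hypothetical (they do not appear in the patent); the sketch only shows how an elapsed time is computed from the 1st and 2nd date and time values and mapped to one of several image processes.

```python
from datetime import datetime

# Hypothetical schedule: minimum elapsed days -> name of the image process.
# The thresholds and names are illustrative, not taken from the patent.
EFFECT_SCHEDULE = [
    (0, "none"),
    (30, "warm_tint"),
    (365, "faded"),
    (3650, "sepia"),
]

def select_effect(shot_at: datetime, read_at: datetime) -> str:
    """Compare the 1st date/time (shooting) with the 2nd date/time
    (acquisition) and pick the process whose threshold has been passed."""
    elapsed_days = (read_at - shot_at).days
    chosen = EFFECT_SCHEDULE[0][1]
    for threshold, name in EFFECT_SCHEDULE:
        if elapsed_days >= threshold:
            chosen = name
    return chosen
```

In a real implementation the chosen name would dispatch to the corresponding image-processing routine in the image processing unit.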

Preferably, the image processing apparatus includes a display control unit that displays a captured image of the processing target subjected to the image processing by the image processing unit on the display unit.

Preferably, the image processing apparatus includes a print control unit that causes the print unit to print a photographic image of a processing target subjected to the image processing by the image processing unit.

Preferably, the storage unit stores a captured image of the processing target subjected to the image processing by the image processing unit and a captured image of the processing target before the image processing by the image processing unit.

Preferably, the captured image of the processing target subjected to the image processing by the image processing unit is not stored in the storage unit.

Preferably, the image processing apparatus includes a relationship input unit that receives an input of a correspondence relationship, on a time axis, between the plurality of image processes and the length of the elapsed date and time.
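The correspondence relationship received here can be modeled as a user-editable table of (elapsed-days, process-name) pairs along the time axis. A minimal sketch, with a hypothetical function name, that normalizes the entered pairs into time-axis order:

```python
def set_schedule(entries):
    """Accept (elapsed_days, process_name) pairs entered by the user and
    return them sorted along the time axis; duplicate thresholds would
    make the correspondence ambiguous, so they are rejected."""
    thresholds = [days for days, _ in entries]
    if len(set(thresholds)) != len(thresholds):
        raise ValueError("duplicate elapsed-time threshold")
    return sorted(entries)
```

Each pair then means "from this many elapsed days onward, apply this process" until the next threshold is reached.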

Preferably, the image processing unit changes the entire color of the photographed image of the processing target in accordance with the length of the elapsed date and time.

Preferably, the image processing unit changes a color of a part of the subject in the photographic image to be processed, in accordance with the length of the elapsed date and time.

Preferably, the image processing unit lightens the background color of the photographed image of the processing target in accordance with the length of the elapsed date and time.
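A minimal sketch of this lightening, assuming a simple linear blend toward white over the elapsed time; the (R, G, B) pixel format, the ten-year fade period, and the function name are assumptions for illustration, not details from the patent.

```python
def lighten(rgb, elapsed_days, full_fade_days=3650):
    """Blend one (R, G, B) pixel toward white in proportion to the
    elapsed days, saturating after full_fade_days (illustrative value)."""
    t = min(elapsed_days / full_fade_days, 1.0)
    return tuple(round(c + (255 - c) * t) for c in rgb)
```

Applied to every background pixel, this gradually fades the background; separating the background from the subject (e.g. by segmentation) is left to the implementation.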

Preferably, the image processing unit changes a part of a color or a part of a hue of the photographed image to be processed, in accordance with a length of the elapsed date and time.

Preferably, the image processing unit changes the degree of blurring of a partial region of the photographed image to be processed, in accordance with the length of the elapsed date and time.

Preferably, the image processing unit changes the degree of blurring of the foreground or background of the person in the captured image according to the length of the elapsed date and time.
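The growing blur can be sketched with an elapsed-time-dependent radius and a simple 1-D box blur; the one-pixel-per-year step, the cap, and the helper names are hypothetical, and a real implementation would blur a 2-D region (e.g. everything outside a detected person).

```python
def blur_radius(elapsed_days, step_days=365, max_radius=8):
    """Radius grows by one pixel per step_days of elapsed time, up to a
    cap (the step and cap are illustrative numbers)."""
    return min(elapsed_days // step_days, max_radius)

def box_blur_row(row, radius):
    """Apply a 1-D box blur of the given radius to a row of intensities."""
    if radius == 0:
        return list(row)
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) // (hi - lo))
    return out
```

As the elapsed date and time grows, the radius increases and the selected region becomes progressively more blurred.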

Preferably, the image processing unit changes the decoration image added to the photographed image of the processing object in accordance with the elapsed date and time.

An imaging apparatus according to another aspect of the present invention is mounted with the image processing apparatus.

Preferably, the imaging apparatus includes an imaging mode setting unit that sets one imaging mode from among the plurality of imaging modes, and the image processing unit changes the plurality of image processes that change according to the length of the elapsed date and time in accordance with the imaging mode set by the imaging mode setting unit.

An image processing method according to another aspect of the present invention includes: acquiring a photographed image of a processing object from a storage unit in which a photographed image having 1st date and time information indicating photographing time is stored; acquiring the 1st date and time information from the photographed image of the processing object, and acquiring 2nd date and time information indicating the date and time at which the photographed image of the processing object was acquired from the storage unit; comparing the 1st date and time information with the 2nd date and time information, and calculating the date and time elapsed from the photographing time; and selecting image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performing the selected image processing on the photographed image of the processing target.
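The comparison steps of this method can be sketched as follows, assuming (this is an assumption, not stated in the patent) that the 1st date and time information is stored with the image in the EXIF DateTimeOriginal format, "YYYY:MM:DD HH:MM:SS"; the function name is hypothetical.

```python
from datetime import datetime

EXIF_FMT = "%Y:%m:%d %H:%M:%S"  # format used by the EXIF DateTimeOriginal tag

def elapsed_days(date_time_original: str, acquired_at: datetime) -> float:
    """Parse the 1st date and time information, compare it with the 2nd
    (the acquisition date and time), and return the elapsed time in days."""
    shot_at = datetime.strptime(date_time_original, EXIF_FMT)
    return (acquired_at - shot_at).total_seconds() / 86400
```

The returned length would then index into the schedule of image processes as in the selection step.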

A program according to another aspect of the present invention causes a computer to execute an image processing process including: acquiring a photographed image of a processing object from a storage unit in which a photographed image having 1st date and time information indicating photographing time is stored; acquiring the 1st date and time information from the photographed image of the processing object, and acquiring 2nd date and time information indicating the date and time at which the photographed image of the processing object was acquired from the storage unit; comparing the 1st date and time information with the 2nd date and time information, and calculating the date and time elapsed from the photographing time; and selecting image processing according to the length of the elapsed date and time from among a plurality of image processes that vary according to the length of the elapsed date and time, and performing the selected image processing on the photographed image of the processing target.

Effects of the invention

According to the present invention, the date and time elapsed from the photographing time is calculated, and image processing selected according to the calculated length of the elapsed date and time is performed from among a plurality of image processes that change with that length; therefore, image processing that changes with time can be applied to a photographed image.

Drawings

Fig. 1 is a front perspective view showing an example of a digital camera with a printer.

Fig. 2 is a rear perspective view of the digital camera with printer.

Fig. 3 is a front view of an instant film.

Fig. 4 is a rear view of the instant film.

Fig. 5 is a block diagram showing a main part of an electrical configuration of the digital camera with printer.

Fig. 6 is a diagram showing an example of a functional configuration of the image processing apparatus.

Fig. 7 is a conceptual diagram illustrating image processing performed by the image processing unit that changes with time.

Fig. 8 is a flowchart showing an image processing method.

Fig. 9 is a diagram illustrating image processing in which the entire color of a captured image changes with time.

Fig. 10 is a diagram illustrating image processing in which the color of a part of the subject in the captured image changes with time.

Fig. 11 is a diagram illustrating image processing in which a part of the color or hue of a photographed image changes with time.

Fig. 12 is a diagram illustrating image processing in which the degree of blur in a partial region of a captured image changes with time.

Fig. 13 is a diagram for explaining image processing in which a decoration image added to a photographed image changes with time.

Fig. 14 is a diagram showing an example of image processing using sound of a shooting environment.

Fig. 15 is a diagram showing an example of image processing using information of a shooting location.

Fig. 16 is a diagram showing an example of the emotion setting screen.

Fig. 17 is a diagram showing an example of image processing according to emotion.

Detailed Description

Preferred embodiments of an image processing apparatus, an image processing method, an imaging apparatus, and a program according to the present invention will be described below with reference to the accompanying drawings.

Digital camera with printer

A digital camera with a printer (imaging apparatus) 10 on which an image processing apparatus 101 (see fig. 6) according to the present invention is mounted is a digital camera with a built-in printer, and has a function of printing a photographed image on the spot. The digital camera with printer 10 prints on instant film loaded from an instant film pack. The digital camera with printer 10 according to the present embodiment also has an audio recording function and can record audio in association with a photographed image.

[ appearance Structure ]

Fig. 1 is a front perspective view showing an example of a digital camera with a printer. Fig. 2 is a rear perspective view of the digital camera with printer shown in fig. 1.

As shown in fig. 1 and 2, the digital camera with printer 10 has a portable camera body 12. The camera body 12 has a vertically long rectangular parallelepiped shape that is thin in the front-rear direction, with a vertical dimension longer than its horizontal dimension.

As shown in fig. 1, the camera body 12 includes a photographing lens 14, a release button 16, a recording button 18, a strobe light emission window 20, and the like on the front side. A power button 22a, a menu button 22b, an OK button 22c, a mode switching button 22d, a microphone hole 24, a speaker hole 26, and the like are provided on one side surface of the camera body 12. The release button 16 is a button for instructing capture of a photographic image. The power button 22a is a button for turning the power of the digital camera with printer 10 on and off. The menu button 22b is a button for calling up a menu screen. The OK button 22c is a button for indicating OK. The mode switching button 22d is a button for switching between the automatic printing mode and the manual printing mode in the photographing mode. Further, the shooting mode can be changed by calling up the menu screen with the menu button 22b, displaying a shooting mode change screen, and making a selection with the OK button 22c. Appropriate imaging conditions are set in each imaging mode. The shooting modes include, for example, a portrait mode and a landscape mode. In this case, the menu button 22b and the OK button 22c constitute a shooting mode setting unit.

As shown in fig. 2, a touch panel display (display unit) 28, a film cover 30, and various operation buttons are provided on the back surface side of the camera body 12. The film cover 30 is a cover that opens and closes the film loading chamber. The operation buttons include a joystick 32a, a print button 32b, a playback button 32c, a cancel button 32d, and the like. The print button 32b is a button for instructing printing. The playback button 32c is a button for instructing switching to the playback mode. The cancel button 32d is a button for instructing cancellation of an operation.

As shown in fig. 1 and 2, a film discharge port 34 is provided on the upper surface of the camera body 12. The printed instant film is discharged from the film discharge port 34.

[ Structure of Printer portion of digital Camera with Printer ]

The digital camera with printer 10 includes a film loading chamber (not shown), a film feeding mechanism 52, a film conveying mechanism 54, a print head 56, and the like (see fig. 5) as components of a printer portion serving as a printing unit. The film loading chamber is loaded with an instant film pack having a structure in which a plurality of instant films are accommodated in a case. Fig. 3 is a front view of the instant film 42, and fig. 4 is a rear view of the instant film 42. In fig. 3 and 4, the direction indicated by the arrow F is the use direction of the instant film 42, i.e., the instant film 42 is conveyed in the direction indicated by the arrow F. Therefore, when loaded in the digital camera with printer 10, the direction indicated by the arrow F is the discharge direction of the instant film 42.

The instant film 42 is a self-developing instant film having a rectangular card shape. The back side of the instant film 42 is an exposure surface 42a, and the front side is an observation surface 42b. The exposure surface 42a is the surface on which an image is recorded by exposure, and the observation surface 42b is the surface on which the recorded image is observed.

As shown in fig. 3, the observation surface 42b of the instant film 42 includes an observation region 42h. As shown in fig. 4, the exposure surface 42a of the instant film 42 includes an exposure region 42c, a bag portion 42d, and a collecting portion 42f. After the instant film 42 is exposed, development is performed by spreading the developing solution of the bag portion 42d over the exposure region 42c. The bag portion 42d contains a developing solution pod 42e filled with the developing solution. When the instant film 42 is passed between a roller pair, the developing solution is squeezed out of the bag portion 42d and spread over the exposure region 42c. The developing solution remaining after the development process is captured by the collecting portion 42f, which contains an absorbent material 42g.

The instant film pack is loaded into the film loading chamber, not shown, provided inside the camera body 12. At the time of printing, the instant films are fed out one by one by a claw (claw member), not shown, of the film feeding mechanism 52, and are conveyed by rollers, not shown, of the film conveying mechanism 54. During conveyance, a developing roller pair, not shown, crushes the bag portion 42d of the instant film 42 to spread the developing solution. The print head 56 is a line-type exposure head that irradiates print light line by line onto the exposure surface 42a of the instant film 42 conveyed by the film conveying mechanism 54, recording a photographic image on the instant film 42 in a single pass. A frame 42i is provided around the observation region 42h, and the photographed image is displayed inside the frame 42i.

[ Electrical constitution of digital Camera with Printer ]

Fig. 5 is a block diagram showing a main part of an electrical configuration of the digital camera with printer 10.

As shown in fig. 5, the digital camera with printer 10 includes a photographing lens 14, a lens driving unit 62, an image sensor 64, an image sensor driving unit 66, an analog signal processing unit 68, a digital signal processing unit 70, a memory (storage unit) 72, a memory controller 74, a display 28, a display controller 76, a communication unit 78, and an antenna 80. The digital camera with printer 10 further includes a film feeding drive unit 82, a film conveying drive unit 84, a head drive unit 86, a flash 88, a flash light emission control unit 90, a microphone 92, a speaker 94, an audio signal processing unit 96, a clock unit 97, an operation unit 98, and a camera control unit 100.

The photographing lens 14 forms an optical image of a subject on the light receiving surface of the image sensor 64. The photographing lens 14 has a focus adjustment function, and includes a diaphragm and a shutter, not shown. The lens driving unit 62 includes a motor and a driving circuit thereof for driving the focus adjustment function of the photographing lens 14, a motor and a driving circuit thereof for driving the diaphragm, and a motor and a driving circuit thereof for driving the shutter, and operates the focus adjustment mechanism, the diaphragm, and the shutter in accordance with a command from the camera control unit 100.

The image sensor 64 is formed of a two-dimensional solid-state imaging Device such as a CCD image sensor (Charge Coupled Device) or a CMOS image sensor (Complementary Metal Oxide Semiconductor). The image sensor 64 has an image pickup area with an aspect ratio corresponding to a printable area of the instant film to be used. The image sensor driving unit 66 includes a driving circuit of the image sensor 64, and operates the image sensor 64 in accordance with a command from the camera control unit 100.

The analog signal processing unit 68 reads an analog image signal for each pixel output from the image sensor 64, performs signal processing (for example, correlated double sampling processing, amplification processing, and the like), digitizes the signal, and outputs the digitized signal.

The digital signal processing unit 70 reads the digital image signal output from the analog signal processing unit 68 and performs signal processing (for example, gradation conversion processing, white balance correction processing, γ correction processing, synchronization processing, YC conversion processing, and the like) to generate image data.

The memory 72 is a non-transitory recording medium that stores image data (photographed images) and audio data obtained by photographing, and is, for example, a memory card. The memory 72 is an example of a storage unit. The memory controller 74 reads data from and writes data to the memory 72 under the control of the camera control unit 100.

The display 28 is formed of, for example, a Liquid Crystal Display (LCD), an Organic Electro-Luminescence Display (OELD), or the like. The display 28 may instead be formed of a plasma display, a Field Emission Display (FED), electronic paper, or the like. The display controller 76 causes the display 28 to display an image under the control of the camera control unit 100.

The communication unit 78 wirelessly communicates with another digital camera with printer 10 (another device) via the antenna 80 under the control of the camera control unit 100. The communication unit 78 performs direct communication with another nearby device by short-range wireless communication such as NFC (Near Field Communication) or Bluetooth (registered trademark). The communication unit 78 can also connect to an information communication network such as the internet via a Wi-Fi (registered trademark) access point or the like, and can thereby communicate with another digital camera with printer 10 (another device) regardless of distance.

The film feeding drive section 82 includes a motor for driving a claw (claw member), not shown, of the film feeding mechanism 52 and a drive circuit thereof, and drives the motor to operate the claw under the control of the camera control section 100.

The film transport driving unit 84 includes motors and drive circuits for driving the transport roller pair and the spreading roller pair, not shown, of the film transport mechanism 54, and operates these roller pairs by driving their motors under the control of the camera control unit 100.

The head driving section 86 includes a driving circuit of the print head 56, and drives the print head 56 under the control of the camera control section 100.

The strobe 88 includes, for example, a xenon tube, an LED (Light Emitting Diode), or the like as a light source, and emits strobe light toward the subject by causing the light source to emit light. The strobe light is irradiated from the strobe light emission window 20 (refer to fig. 1) provided at the front of the camera body 12. The flash light emission control section 90 includes a drive circuit for the strobe 88, and causes the strobe 88 to emit light in accordance with an instruction from the camera control section 100.

The microphone 92 collects external audio via a microphone hole 24 (refer to fig. 2) provided in the camera body 12. The speaker 94 outputs audio to the outside through the speaker hole 26 provided in the camera body 12. The audio signal processing unit 96 performs signal processing on the audio signal input from the microphone 92, digitizes the signal, and outputs the signal. The audio signal processing unit 96 performs signal processing on the audio data supplied from the camera control unit 100 and outputs the audio data from the speaker 94. The clock section 97 holds information of date and time, and the camera control section 100 sets shooting time (date and time) with reference to the information.

The operation unit 98 includes various operation members such as a release button 16, a record button 18, a power button 22a, a menu button 22b, an OK button 22c, a joystick 32a, a print button 32b, a playback button 32c, and a cancel button 32d, and signal processing circuits thereof, and outputs signals based on operations of the operation members to the camera control unit 100.

The camera control unit 100 is a control unit that collectively controls the operation of the digital camera 10 with a printer. The camera control unit 100 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), and the like. The camera control unit 100 is a computer including this CPU and the like, and realizes the various functions described below by executing a control program.

< Functional configuration of the image processing apparatus >

The digital camera with printer 10 is equipped with an image processing apparatus 101, and the main functions of the image processing apparatus 101 are provided in the camera control unit 100.

Fig. 6 is a diagram showing an example of a functional configuration of the image processing apparatus 101 provided in the camera control unit 100. The image processing apparatus 101 includes an image acquisition unit 100A, a date and time information acquisition unit 100B, an elapsed date and time calculation unit 100C, an image processing unit 100D, a display control unit 100E, and a print control unit 100F.

The functions of each part of the camera control unit 100 (image processing apparatus 101) can be realized by various processors and recording media. The various processors include, for example, a CPU, which is a general-purpose processor that executes software (programs) to realize various functions, and a GPU (Graphics Processing Unit), which is a processor specialized for image processing. They also include programmable logic devices (PLDs) such as an FPGA (Field Programmable Gate Array), which are processors whose circuit configuration can be changed after manufacture. In addition, the various processors described above include a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.

The functions of each section may be realized by one processor, or by a plurality of processors of the same or different kinds (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU). Conversely, one processor may realize a plurality of functions. A first example of configuring a plurality of functions with one processor is a mode, typified by a computer, in which one processor is configured by a combination of one or more CPUs and software, and this processor realizes the plurality of functions. A second example, typified by a system on chip (SoC), is the use of a processor that realizes the functions of an entire system with a single IC (Integrated Circuit) chip. In this way, each function is configured, as a hardware structure, by one or more of the various processors described above. More specifically, the hardware structure of these processors is an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined. These circuits may realize the above functions by logical operations such as logical OR, logical AND, logical negation, exclusive OR, and combinations thereof.

When the processor or the circuitry executes the software (program), computer-readable code of the software is stored in a non-transitory recording medium such as a ROM, and the computer (the various processors and circuits constituting the camera control unit 100 (image processing apparatus 101), and/or combinations thereof) refers to that software. The software stored in the non-transitory recording medium includes a program for executing shooting and composition of shot images, as well as data used during execution. The code may be recorded not in a ROM but in various magneto-optical recording devices, semiconductor memories, and the like. During processing using the software, the RAM is used, for example, as a temporary storage area.

The image acquisition unit 100A acquires the photographed image to be processed from the photographed images stored in the memory 72. The memory 72 stores a plurality of photographed images photographed by the digital camera with printer 10. Each photographed image has the 1 st date and time information indicating the shooting time. Here, the 1 st date and time information is information having at least one of the year, month, day, and time at which the photographed image was photographed. The photographed image to be processed can be selected from the plurality of photographed images stored in the memory 72 in various ways. For example, a list of the photographed images stored in the memory 72 is displayed on the display 28, and the user selects the image to be processed from that list. A single photographed image or a plurality of photographed images may be selected as the processing target, and all the photographed images stored in the memory 72 may also be selected.

The date and time information acquisition unit 100B acquires the 1 st date and time information and the 2 nd date and time information of the photographed image acquired by the image acquisition unit 100A. Here, the 2 nd date and time information is information indicating the date and time at which the photographed image to be processed was acquired from the memory 72 by the image acquisition unit 100A. Since the image processing apparatus 101 executes image processing immediately after the image acquisition unit 100A acquires the photographed image, the 2 nd date and time information can be regarded as substantially the same as information indicating the date and time at which the image processing is performed.

The elapsed date and time calculation unit 100C compares the 1 st date and time information acquired by the date and time information acquisition unit 100B with the 2 nd date and time information, and calculates the date and time elapsed from the shooting time until the image acquisition unit 100A acquired the photographed image. For example, when the 1 st date and time information is December 20, 2018, 9:04 and the 2 nd date and time information is December 21, 2018, 9:06, the elapsed date and time calculation unit 100C calculates the elapsed date and time as 1 day and 2 minutes.
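The comparison above can be sketched in Python with the standard `datetime` module. This is only a minimal illustration: the function name and the timestamp format are assumptions, not the apparatus's actual implementation.

```python
from datetime import datetime

def elapsed_date_and_time(shot_at: str, acquired_at: str) -> str:
    """Compare the 1st (shooting) and 2nd (acquisition) date and time
    information and return the elapsed period as days, hours, minutes."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(acquired_at, fmt) - datetime.strptime(shot_at, fmt)
    minutes, _ = divmod(delta.seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return f"{delta.days} day(s), {hours} hour(s), {minutes} minute(s)"

# The example from the text: Dec 20, 2018 9:04 -> Dec 21, 2018 9:06
print(elapsed_date_and_time("2018-12-20 09:04", "2018-12-21 09:06"))
# -> 1 day(s), 0 hour(s), 2 minute(s)
```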

The image processing unit 100D performs image processing that changes according to the length of the elapsed date and time on the captured image selected as the processing target. In other words, the image processing unit 100D performs image processing on the captured image that changes with time. Specifically, the image processing unit 100D selects image processing according to the length of the elapsed date and time from among a plurality of image processing that vary according to the length of the elapsed date and time, and performs the selected image processing on the captured image of the processing target.

Fig. 7 is a conceptual diagram illustrating the image processing performed by the image processing unit 100D that changes with time. Fig. 7 shows an example of the relationship between 3 types of image processing (image processing A, image processing B, and image processing C) preset in the image processing unit 100D (reference numeral 105) and the elapsed date and time T (reference numeral 107). Image processing A is applied while the elapsed date and time T is in the period from Ta to Tab, image processing B in the period from Tab to Tbc, and image processing C in the period from Tbc to Tc. For example, image processing A is selected at the elapsed date and time T1, image processing B at T2, and image processing C at T3. In the case shown in fig. 7, the period Ta to Tab of image processing A, the period Tab to Tbc of image processing B, and the period Tbc to Tc of image processing C are constant, but the present invention is not limited to this. The correspondence between the plurality of image processings and the length of the elapsed date and time on the time axis can be determined arbitrarily by the user: in the case shown in fig. 7, the user can freely set the periods Ta to Tab, Tab to Tbc, and Tbc to Tc. The user inputs this correspondence via the operation unit (relationship input unit) 98. Image processing A, image processing B, and image processing C differ from one another, and the captured image changes with time as the applied processing changes among them. Further, the processing contents of the preset plurality of image processings may be changed according to the shooting mode or the shooting conditions. For example, the processing contents of image processing A, image processing B, and image processing C may be changed for each shooting mode or for each shooting condition.
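The mapping from the elapsed date and time T onto one of the preset image processings can be sketched as follows. The boundary values standing in for Tab and Tbc are arbitrary placeholders that a user could adjust via the relationship input unit.

```python
def select_image_processing(elapsed_days, boundaries=None):
    """Map the elapsed date and time onto one of several preset
    image processings (A, B, C). The default boundaries (standing in
    for Tab and Tbc in fig. 7) are illustrative placeholders."""
    if boundaries is None:
        boundaries = [(30, "image processing A"),            # up to Tab
                      (365, "image processing B"),           # up to Tbc
                      (float("inf"), "image processing C")]  # up to Tc
    for upper, processing in boundaries:
        if elapsed_days < upper:
            return processing
    return boundaries[-1][1]

print(select_image_processing(3))     # shortly after shooting
print(select_image_processing(90))    # a few months later
print(select_image_processing(1000))  # years later
```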

The photographed image subjected to the image processing by the image processing unit 100D may or may not be stored in the memory 72. When the processed photographed image is stored in the memory 72, the user can enjoy the photographed image that changes with time even after further time has elapsed. When the processed photographed image is not stored in the memory 72, the user can still enjoy the change over time by printing the processed photographed image. The unprocessed photographed image selected as the processing target may likewise be stored in the memory 72 or not.

Returning to fig. 6, the display control unit 100E causes the display 28 to display the photographed image of the processing target subjected to the image processing by the image processing unit 100D. The user can enjoy the temporal change of the photographed image by checking the photographed image after the displayed image processing.

The print control unit 100F controls printing of the photographed image subjected to the image processing. For example, the print control unit 100F causes the print unit to print the photographed image of the processing target subjected to the image processing by the image processing unit 100D automatically or in response to an instruction from the user. The user can enjoy the temporal change of the photographed image by confirming the photographed image after the printed image processing.

Next, an image processing method (image processing step) using the image processing apparatus 101 will be described. Fig. 8 is a flowchart showing an image processing method.

First, the image acquisition unit 100A acquires a captured image of the selected processing target (step S10). Then, the date and time information acquiring unit 100B acquires the 1 st date and time information and the 2 nd date and time information of the photographed image acquired by the image acquiring unit 100A (step S11). Next, the elapsed date and time calculation unit 100C calculates the elapsed date and time from the 1 st date and time information and the 2 nd date and time information (step S12). Then, the image processing unit 100D performs image processing based on the calculated length of the elapsed date and time (step S13).
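Steps S10 to S13 can be sketched as a single pipeline. `memory`, `select`, and `apply` are hypothetical stand-ins for the memory 72, the processing-selection rule, and the selected image processing; they are not part of the apparatus's actual interface.

```python
from datetime import datetime

def run_image_processing(memory, image_id, select, apply):
    """Sketch of steps S10-S13: acquire the image (S10), read the 1st
    and 2nd date and time information (S11), compute the elapsed
    period (S12), and apply the selected processing (S13)."""
    record = memory[image_id]                   # S10: acquire target image
    shot_at = record["shot_at"]                 # S11: 1st date and time info
    acquired_at = datetime.now()                # S11: 2nd date and time info
    elapsed = acquired_at - shot_at             # S12: elapsed date and time
    processing = select(elapsed.days)           # S13: choose by elapsed length
    return apply(record["pixels"], processing)  # S13: apply to the image

# Hypothetical in-memory store with one image shot in Dec 2018.
memory = {"P1": {"shot_at": datetime(2018, 12, 20, 9, 4),
                 "pixels": [(200, 200, 200)]}}
result = run_image_processing(
    memory, "P1",
    select=lambda days: "image processing A" if days < 30 else "image processing B",
    apply=lambda px, name: (name, px))
print(result)
```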

The respective structures and functions described above can be implemented by any hardware, software, or a combination of both as appropriate. For example, the present invention can be applied to a program for causing a computer to execute the above-described processing steps, a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or a computer on which such a program can be installed.

As described above, according to the present embodiment, the date and time elapsed from the shooting time is calculated, image processing is selected according to the calculated length of the elapsed date and time from among a plurality of image processings that vary with that length, and the selected image processing is performed on the photographed image. This makes it possible to perform image processing that changes with time.

< Examples of image processing that changes with time >

Next, the image processing by the image processing unit 100D that changes with time will be described. This image processing can be implemented in various ways; specific examples are described below.

[ Example 1 ]

In example 1, the image processing unit 100D changes the entire color of the photographed image of the processing target according to the length of the elapsed date and time. For example, the image processing unit 100D changes the overall color of the captured image by making the image lighter, darker, or different in color according to the length of the elapsed date and time.

Fig. 9 is a diagram illustrating image processing in which the entire color of a captured image changes with time.

The reference numeral 201 denotes a photographed image P1 when the elapsed date and time is T1, the reference numeral 203 denotes a photographed image P1 when the elapsed date and time is T2, and the reference numeral 205 denotes a photographed image P1 when the elapsed date and time is T3. In addition, the length of the elapsed date and time is T1 < T2 < T3.

As shown in fig. 9, the image processing unit 100D performs image processing so as to change the overall color of the captured image according to the length of the elapsed date and time. Specifically, the entire color (for example, sepia) of the image is gradually darkened in the order of the photographed image P1 of the symbol 201, the photographed image P1 of the symbol 203, and the photographed image P1 of the symbol 205. In this way, by performing image processing for changing the entire color of the photographed image according to the date and time elapsed, the photographed image P1 can be changed with time.
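A minimal sketch of Example 1 blends every pixel of the image toward a sepia tone with a strength that grows with the elapsed date and time. The sepia value and the one-year ramp are illustrative assumptions, not values from the embodiment.

```python
def apply_aging_tone(pixels, elapsed_days, full_effect_days=365):
    """Blend every RGB pixel toward a sepia tone; the blend strength
    grows with the elapsed date and time (parameters are illustrative)."""
    sepia = (112, 66, 20)
    t = min(elapsed_days / full_effect_days, 1.0)  # 0.0 (fresh) .. 1.0 (old)
    return [tuple(round((1 - t) * c + t * s) for c, s in zip(px, sepia))
            for px in pixels]

image = [(200, 200, 200), (50, 80, 120)]  # a tiny 2-pixel "image"
print(apply_aging_tone(image, elapsed_days=0))    # unchanged
print(apply_aging_tone(image, elapsed_days=365))  # fully sepia
```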

[ Example 2 ]

In example 2, the image processing unit 100D changes the color of a part of the subject in the photographed image to be processed, according to the length of the elapsed date and time. For example, the image processing unit 100D changes the background color of the photographed image of the processing target by making the image lighter, darker, or changing the color according to the length of the elapsed date and time.

Fig. 10 is a diagram illustrating image processing in which the color of a part of the subject in the captured image changes with time.

The reference numeral 207 denotes a photographed image P2 when the elapsed date and time is T1, the reference numeral 209 denotes a photographed image P2 when the elapsed date and time is T2, and the reference numeral 211 denotes a photographed image P2 when the elapsed date and time is T3.

As shown in fig. 10, the image processing unit 100D performs image processing for gradually lightening the color of the background 213 of the photographed image P2 according to the length of the elapsed date and time. Specifically, the background 213 is gradually lightened in color in the order of the photographed image P2 of the symbol 207, the photographed image P2 of the symbol 209, and the photographed image P2 of the symbol 211. In this way, by making the color of the background 213 lighter according to the date and time of the passage, the photographed image P2 can be changed with time.

[ Example 3 ]

In Example 3, the image processing unit 100D changes a part of the colors or a part of the hues of the photographed image to be processed according to the length of the elapsed date and time. Here, a part of the colors means the pixels of the same color in the photographed image, and a part of the hues means the pixels of the same hue. For example, the same color means colors having identical RGB signal values, and the same hue means hues belonging to the same group when the hue circle is divided into six equal parts.

Fig. 11 is a diagram illustrating image processing in which the same color or hue of a photographed image changes with time.

The symbol 215 indicates a photographed image P3 when the elapsed date and time is T1, the symbol 217 indicates a photographed image P3 when the elapsed date and time is T2, and the symbol 219 indicates a photographed image P3 when the elapsed date and time is T3. In addition, the leaves 221 of the photographic image P3 have the same color.

As shown in fig. 11, the image processing unit 100D performs image processing that changes the color of the leaves 221 according to the length of the elapsed date and time. Specifically, the color of the leaves 221 is gradually darkened in the order of the photographed image P3 of the symbol 215, the photographed image P3 of the symbol 217, and the photographed image P3 of the symbol 219. In this manner, by changing the color of the leaves 221 in accordance with the elapsed date and time, the photographed image P3 can be changed over time.
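Example 3 can be sketched with the standard `colorsys` module: classify each pixel into one of six equal divisions of the hue circle, then darken only the pixels in the same hue group as a target color (the leaves). The darkening rate and the one-year ramp are illustrative assumptions.

```python
import colorsys

def hue_group(rgb, groups=6):
    """Return which of the equal divisions of the hue circle
    an RGB color belongs to (0 .. groups-1)."""
    h, _, _ = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    return int(h * groups) % groups

def darken_same_hue(pixels, target_rgb, elapsed_days, full_effect_days=365):
    """Darken only the pixels sharing the target's hue group, more
    strongly as the elapsed date and time grows (illustrative values)."""
    target = hue_group(target_rgb)
    factor = 1.0 - 0.5 * min(elapsed_days / full_effect_days, 1.0)
    return [tuple(round(c * factor) for c in px) if hue_group(px) == target
            else px
            for px in pixels]

leaf_green = (40, 160, 40)
image = [leaf_green, (200, 40, 40)]  # a green "leaf" pixel and a red pixel
print(darken_same_hue(image, leaf_green, elapsed_days=365))
```

Only the green pixel is darkened; the red pixel, which belongs to a different hue group, is left untouched.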

[ Example 4 ]

In example 4, the image processing unit 100D changes the degree of blurring of a partial region of the photographed image to be processed, according to the length of the elapsed date and time. For example, the image processing unit 100D changes the degree of blurring of the foreground or background of the person in the captured image according to the length of the elapsed date and time. In this way, by changing the degree of blur of the foreground or background of the captured image, the depth of field of the captured image can be changed in a simulated manner. The image processing unit 100D detects a partial region where the degree of blurring is changed or a region other than the partial region where the degree of blurring is changed, and performs blurring processing. For example, the image processing unit 100D detects a region of a person (face), and performs blurring processing on a region (for example, background) other than the detected region of the person.

Fig. 12 is a diagram illustrating image processing in which the degree of blur of the background of a captured image changes with time.

Reference numeral 223 denotes a photographed image P4 when the elapsed date and time is T1, reference numeral 225 denotes a photographed image P4 when the elapsed date and time is T2, and reference numeral 227 denotes a photographed image P4 when the elapsed date and time is T3. The photographic image P4 has a person 229 as a main subject and a mountain 231 as a background.

At the elapsed date and time T1 (reference numeral 223), the photographed image P4 is unchanged by image processing, and the region of the mountain 231 is not blurred. At the elapsed date and time T2 (reference numeral 225), the person 229 remains in focus, but the mountain 231 in the background is blurred by image processing. In the figure, blurring is represented by double lines, and the spacing of the double lines represents the degree of blurring. At the elapsed date and time T3 (reference numeral 227), the person 229 again remains in focus, but image processing is performed so that the degree of blurring of the mountain 231 is greater than at the elapsed date and time T2. Here, the degree of blur indicates how strongly a region is blurred, and it can be adjusted by image processing from a slight blur to a strong blur. By increasing the degree of blurring of the mountain 231, which is the background of the photographed image P4, according to the elapsed date and time, the depth of field of the photographed image P4 can be reduced in a simulated manner, and the photographed image P4 can be changed with time.
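Example 4 can be sketched as a box blur whose radius grows with the elapsed date and time. The thresholds are illustrative, and a real implementation would blur only the detected background region rather than the whole image.

```python
def box_blur(gray, radius):
    """Simple box blur on a 2-D grayscale image (list of lists);
    radius 0 returns a copy of the image unchanged."""
    if radius == 0:
        return [row[:] for row in gray]
    h, w = len(gray), len(gray[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Average the neighborhood, clipped at the image borders.
            vals = [gray[j][i]
                    for j in range(max(0, y - radius), min(h, y + radius + 1))
                    for i in range(max(0, x - radius), min(w, x + radius + 1))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

def blur_radius_for(elapsed_days):
    """Map the elapsed date and time to a blur radius: no blur at
    first, stronger blur as time passes (thresholds are illustrative)."""
    if elapsed_days < 30:
        return 0
    return 1 if elapsed_days < 365 else 2

background = [[0, 0, 0], [0, 90, 0], [0, 0, 0]]  # tiny grayscale patch
print(box_blur(background, blur_radius_for(10)))   # unchanged, like T1
print(box_blur(background, blur_radius_for(100)))  # softened, like T2
```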

[ Example 5 ]

In example 5, the image processing unit 100D changes the decoration image added to the photographed image of the processing target in accordance with the elapsed date and time. Here, the decoration image is an image added to the photographed image by image processing, and examples thereof include a crack, a movie theater style vertical stripe, and a color patch.

Fig. 13 is a diagram for explaining image processing in which a decoration image added to a photographed image changes with time.

Reference numeral 235 denotes a photographed image P5 when the elapsed date and time is T1, reference numeral 237 denotes a photographed image P5 when the elapsed date and time is T2, and reference numeral 239 denotes a photographed image P5 when the elapsed date and time is T3.

At the elapsed date and time T1 (reference numeral 235), the crack 241 as a decorative image has not been added to the photographed image P5. At the elapsed date and time T2 (reference numeral 237), the crack 241 is added to the photographed image P5 by image processing. At the elapsed date and time T3 (symbol 239), the crack 241 is added to the photographed image P5 by image processing.

In the case shown in fig. 13, the crack 241 changes from the elapsed date and time T2 (symbol 237) to the elapsed date and time T3 (symbol 239): the crack 241 becomes longer, or the number of cracks 241 increases. In this way, by changing the crack 241 added to the photographed image P5 according to the elapsed date and time, the photographed image P5 can be changed with time.

Although a specific example of image processing that changes with time performed by the image processing unit 100D has been described above, the image processing that changes with time performed by the image processing unit 100D is not limited to the specific example.

< Others >

In the present embodiment, a mode has been described in which the image processing is changed according to the date and time elapsed from the shooting time of a photographed image, but other kinds of processing that interest the viewer of an image are also possible. Other examples will be described below.

[ processing by Sound ]

The captured image can be subjected to image processing by using sound related to the imaging environment or the captured image. The digital camera with printer 10 includes a microphone 92 and can collect sounds of the shooting environment. Therefore, the sound of the shooting environment is collected by the microphone 92, and the camera control unit 100 stores the input sound of the shooting environment in the memory 72 together with the shot image. The image processing unit 100D performs image processing on the captured image using the sound of the imaging environment stored in the memory 72.

Fig. 14 is a diagram showing an example of image processing using sound of a shooting environment, and is a diagram showing an image shot in a concert hall.

The photographed image P6 is an image photographed in a concert hall. For example, the sound of the concert hall is collected by the microphone 92 and used to apply image processing to the photographed image: in the photographed image P6, the concert hall is recognized from the sound collected by the microphone 92, and a musical note 250 is added to the photographed image P6. The image processing unit 100D performs soft image processing when the collected sound of the shooting environment is that of a classical concert, and performs edge enhancement or high-contrast image processing when it is that of a heavy metal or other intense concert. When the sound of a crowd is collected as the sound of the shooting environment, the image processing unit 100D performs image processing that adds noise to the shot image.

[ processing Using information of shooting location ]

By using information on the shooting location, image processing corresponding to the shooting location is performed on the shot image, or information related to the shot image is added. The digital camera with printer 10 is provided with a GPS (Global Positioning System) function to detect the shooting location, and performs various kinds of processing on the shot image according to that location. For example, in the case of shooting in a concert hall, the image processing unit 100D performs image processing that gives the shot image a sense of presence.

Fig. 15 is a diagram showing an example of processing using information of a shooting location, and is a diagram showing a shot image obtained by shooting a lion.

The photographed image P7 is an image of a lion photographed in front of the lion's cage at a zoo. When the digital camera with printer 10 has the GPS function, it detects that the photographed image P7 was shot in front of the lion's cage at the zoo. The sound of the lion is then recorded in the memory 72 together with the photographed image P7, or a description of the lion (audio or text) is stored together with the photographed image P7.

[ treatment with emotion ]

The user can input his or her own emotion, and image processing corresponding to that emotion is performed on the photographed image. For example, an emotion setting screen is displayed on the display 28 of the digital camera with printer 10, and the user inputs an emotion via this screen. The image processing unit 100D performs image processing on the captured image according to the input emotion.

Fig. 16 is a diagram showing an example of the emotion setting screen displayed on the display 28 of the digital camera with printer 10. The emotion setting screen shown in fig. 16 is circular, and displays 4 kinds of emotions (normal (H), happy (I), sad (J), and surprised (K)). The user inputs an emotion by operating the corresponding operation lever L on the emotion setting screen. The image processing unit 100D performs image processing on the captured image based on the input emotion.

Fig. 17 is a diagram showing an example of image processing according to emotion.

As shown in fig. 17, image processing corresponding to the emotion input by the user on the emotion setting screen (fig. 16) is performed on the photographed image P8. In fig. 17, the input-value display 252 of the emotion setting screen is also shown together with the photographed image P8.

While the present invention has been described with reference to the above examples, it is to be understood that the present invention is not limited to the above embodiments, and various modifications may be made without departing from the spirit of the present invention.

Description of the symbols

10-digital camera with printer, 12-camera body, 14-photographic lens, 16-release button, 18-record button, 20-strobe light-emitting window, 22 a-power button, 22 b-menu button, 22c-OK button, 22 d-mode switching button, 24-microphone hole, 26-speaker hole, 28-display, 30-film cover, 32 a-joystick, 32 b-print button, 32 c-replay button, 32 d-cancel button, 34-film discharge port, 42-instant film, 42 a-exposure surface, 42 b-observation surface, 42 c-exposure area, 42 d-capsule portion, 42 e-development processing liquid capsule portion, 42 f-collection portion, 42 g-absorbent material, 42 h-viewing zone, 42 i-frame, 52-film feeding mechanism, 54-film conveying mechanism, 56-print head, 62-lens driving section, 64-image sensor, 66-image sensor driving section, 68-analog signal processing section, 70-digital signal processing section, 72-memory, 74-memory controller, 76-display controller, 78-communication section, 80-antenna, 82-film feeding driving section, 84-film conveying driving section, 86-head driving section, 88-strobe, 90-strobe light emission control section, 92-microphone, 94-speaker, 96-audio signal processing section, 97-clock section, 98-operation section, 100-camera control section, 100A-image acquisition section, 100B-date and time information acquisition section, 100C-elapsed date and time calculation section, 100D-image processing section, 100E-display control section, 100F-print control section, 101-image processing apparatus.
