Image processing apparatus, image processing method, and recording medium

Document No.: 1201203  Publication date: 2020-09-01

Reading note: This technology, "Image processing apparatus, image processing method, and recording medium", was designed and created by 伊东直哉, 中川智洋, and 横山幸德 on 2020-02-10. Its main content is as follows: The invention provides an image processing apparatus, an image processing method, and a recording medium capable of intuitively searching for images from an image group by using tag information given to the images. In the image processing apparatus, image processing method, and recording medium according to the present invention, a tag information display unit displays at least a part of all the tag information added to all the images included in the image group on a display unit; a tag information specifying unit specifies, from the tag information displayed on the display unit, 1 st tag information selected in accordance with an instruction from the user as 1 st selection tag information; an image extraction unit extracts images to which the 1 st selection tag information is added from the image group as 1 st search images; an image display unit displays at least a part of all the 1 st search images on the display unit; and the tag information display unit displays at least a part of all the tag information added to all the 1 st search images on the display unit.

1. An image processing apparatus is characterized by comprising:

a tag information display unit that displays at least a part of tag information added to an image included in the image group on the display unit;

an instruction acquisition unit that acquires an instruction input by a user;

a tag information specifying unit configured to specify, as 1 st selection tag information, tag information selected in accordance with an instruction from the user from among the tag information displayed on the display unit;

an image extracting unit that extracts an image to which the 1 st selection tag information is given from the image group as a 1 st search image; and

an image display unit that displays at least a part of the 1 st search image on the display unit,

the tag information display unit displays at least a part of the tag information added to the 1 st search image on the display unit.

2. The image processing apparatus according to claim 1,

the tag information designating unit designates, as 2 nd selection tag information, tag information selected in accordance with an instruction from the user from the tag information displayed on the display unit simultaneously with the 1 st selection tag information,

the image extracting unit extracts a 2 nd search image to which the 2 nd selection tag information is given from the 1 st search image,

the image display unit displays at least a part of the 2 nd search image on the display unit,

the tag information display unit displays at least a part of the tag information added to the 2 nd search image on the display unit.

3. The image processing apparatus according to claim 1 or 2,

the tag information designating unit simultaneously designates two or more pieces of the tag information selected in accordance with an instruction from the user as the 1 st selection tag information from the tag information displayed on the display unit,

the image extracting unit extracts, as the 1 st search image, images to which all of the two or more pieces of 1 st selection tag information are assigned from the image group,

the image display unit displays at least a part of the 1 st search image on the display unit,

the tag information display unit displays at least a part of the tag information added to the 1 st search image on the display unit.

4. The image processing device according to claim 1 or 2, further comprising:

an image analysis unit that analyzes each image included in the image group; and

a tag information adding unit that adds the tag information to each image based on the analysis result of each image.

5. The image processing device according to claim 1 or 2, further comprising:

an image information display unit that displays information of one image selected in accordance with an instruction from the user on the display unit; and

an image information editing unit that edits the information of the one image displayed on the display unit in accordance with an instruction from the user.

6. The image processing apparatus according to claim 5,

the image information editing unit adds new tag information, which is input in accordance with an instruction from the user, to one image displayed on the display unit as information of the one image.

7. The image processing apparatus according to claim 1 or 2,

the tag information display unit displays the tag information on the display unit divided into a plurality of categories.

8. The image processing apparatus according to claim 7,

the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of the number of images to which the tag information is added.

9. The image processing apparatus according to claim 7,

the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of the number of times the tag information has been used as a keyword for searching for images.

10. The image processing apparatus according to claim 7,

the tag information has reliability information indicating reliability of the tag information,

the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of reliability, based on the reliability information included in the tag information.

11. The image processing device according to claim 1 or 2, further comprising:

a tag information search unit that searches for, as search tag information, tag information at least a part of which matches a keyword input in accordance with an instruction from the user, from among the tag information added to the images included in the image group,

the tag information display unit displays at least a part of the search tag information on the display unit.

12. The image processing apparatus according to claim 11,

the tag information designating unit designates, as the 1 st selection tag information, one piece of search tag information selected in accordance with an instruction from the user from among the search tag information displayed on the display unit,

the image extracting section extracts an image to which the one piece of search tag information is given from the image group as the 1 st search image,

the image display unit displays at least a part of the 1 st search image on the display unit,

the tag information display unit displays at least a part of the tag information added to the 1 st search image on the display unit.

13. The image processing apparatus according to claim 1 or 2,

the display unit is a touch panel,

the tag information display unit displays the tag information on the display unit as buttons,

the tag information designating unit switches between designating the tag information corresponding to a button as the 1 st selection tag information and releasing that designation every time the instruction acquisition unit acquires an input to that button in accordance with an instruction from the user.

14. The image processing apparatus according to claim 13,

the tag information designating unit maintains the designation as the 1 st selection tag information until the designation is released in accordance with an instruction from the user.

15. The image processing apparatus according to claim 13,

when the instruction acquisition unit acquires an input lasting a certain time or longer on a button selected, in accordance with an instruction from the user, from among the buttons displayed on the display unit, the tag information designating unit designates, as the 1 st selection tag information, all the lower-concept tag information included under the upper-concept tag information of the tag information corresponding to the pressed button.

16. The image processing apparatus according to claim 1,

when one tag information is selected from the tag information displayed on the display unit in accordance with an instruction from the user, the tag information designating unit cancels the designation of the other designated tag information as the 1 st selection tag information, and designates only the one tag information as the selection tag information.

17. The image processing device according to claim 1, further comprising:

an image analysis unit that analyzes each image included in the image group;

an evaluation value assigning unit that assigns an evaluation value to each of the images based on an analysis result of each of the images; and

a weight setting unit that, among two or more pieces of tag information simultaneously selected from the tag information displayed on the display unit in accordance with an instruction from the user, sets the weight of the tag information selected later to be larger than that of the tag information selected earlier,

the evaluation value assigning unit assigns a higher evaluation value to the image to which the tag information selected later is assigned than to the image to which the tag information selected earlier is assigned, based on the weight setting of the tag information.

18. The image processing apparatus according to claim 1 or 2,

the tag information includes tag information of a higher concept having a plurality of tag information of lower concepts,

the tag information display unit displays, on the display unit, tag information of a plurality of lower concepts included in the tag information of the upper concept when the tag information of the upper concept is selected in accordance with an instruction from the user.

19. The image processing apparatus according to claim 1 or 2,

each image included in the image group has evaluation value information of the image,

the image display unit displays the images on the display unit in descending order of the evaluation values of the images.

20. An image processing method, comprising:

a step in which a tag information display unit displays at least a part of tag information added to an image included in the image group on the display unit;

a step in which an instruction acquisition unit acquires an instruction input by a user;

a step in which a tag information specification unit specifies, from the tag information displayed on the display unit, tag information selected in accordance with an instruction from the user as 1 st selected tag information; and

a step in which an image extracting unit extracts an image to which the 1 st selection tag information is given from the image group as a 1 st search image,

displaying at least a part of the 1 st search image on the display unit,

at least a part of the tag information added to the 1 st search image is displayed on the display unit.

21. A computer-readable recording medium having recorded thereon a program for causing a computer to execute each step of the image processing method according to claim 20.

Technical Field

The present invention relates to an image processing apparatus, an image processing method, and a recording medium that search for an image from an image group based on a search condition and display the search result image on a display unit.

Background

For example, the following method is used: when searching for an image desired by a user from an image group held on a smartphone, a list of the images (thumbnail images) included in the image group is displayed in time series on a display unit, and the user searches for the desired image while scrolling back through the list in time series. Further, the following method is also known: a keyword for retrieving an image from the image group is input, and an image matching the keyword is searched for by using additional information of the image, such as tag information.

However, in the method of displaying and browsing the list of images, it is difficult to find a desired image among the large number of images included in the image group. In the method of searching for an image by inputting a keyword, a desired image can be retrieved appropriately if it is clear what keyword should be used, but it is difficult to retrieve a desired image appropriately if the keyword is not clear.

Patent documents 1 and 2 are related to the present invention.

Patent document 1 describes: image data is retrieved by performing a character string search based on additional information added to the image data. For example, by performing an AND or OR search on an Exif (Exchangeable Image File Format) character string to be searched and a memo character string to be searched, image data having additional information matching the input character string is retrieved, and image data corresponding to the search result is displayed.

Patent document 2 describes: a plurality of attributes and attribute values associated with a digital image are displayed on a label, and the user selects another set of images by selecting one of the attribute values displayed on the label, and so on. Further, when browsing the selected image group, the user can search for all images of a specific person, display those images in time-series order, and display their associated tags.

Patent document 1: Japanese Laid-Open Patent Publication No. 2009-124270

Patent document 2: Japanese Patent No. 5216591

Patent document 1 describes an AND search over a plurality of additional-information character strings, but the user needs to type in the character strings to be searched, and therefore cannot search for an image intuitively.

Patent document 2 describes browsing other image groups by selecting one of the attribute values of a digital image, but only an image group corresponding to a single attribute value can be browsed; the images cannot be narrowed down by, for example, a plurality of attribute values.

Disclosure of Invention

An object of the present invention is to provide an image processing apparatus, an image processing method, a program, and a recording medium capable of intuitively searching for an image from an image group using tag information attached to the image.

In order to achieve the above object, the present invention provides an image processing apparatus comprising: a tag information display unit that displays tag information added to the images included in an image group on a display unit;

an instruction acquisition unit that acquires an instruction input by a user;

a tag information specifying unit that specifies, as selected tag information, tag information selected in accordance with an instruction from a user from among the tag information displayed on the display unit;

an image extraction unit that extracts an image to which selection tag information is given from the image group as a search image; and

an image display unit for displaying the images included in the image group on the display unit,

the tag information display unit displays at least a part of all tag information added to all images included in the image group on the display unit,

the tag information designating unit designates 1 st tag information selected in accordance with an instruction from a user as 1 st selection tag information from the tag information displayed on the display unit,

the image extracting section extracts an image to which the 1 st selection tag information is given from the image group as a 1 st retrieval image,

the image display unit displays at least a part of all the 1 st search images on the display unit,

the tag information display unit displays at least a part of all the tag information added to all the 1 st search images on the display unit.
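The flow described above can be summarized in a minimal Python sketch (not part of the patent; the dictionary-based image records and the function names are assumptions for illustration): extracting the images carrying the selected tag, then rebuilding the tag list from only the surviving images, so that every displayed tag is guaranteed to hit at least one result.

```python
# Hypothetical sketch of the core search flow: extract the images carrying
# the 1st selection tag, then collect the tag information of those images
# so the tag list shown next reflects only the 1st search images.

def extract_images(image_group, selected_tag):
    """Images (dicts with a 'tags' set) to which selected_tag is given."""
    return [img for img in image_group if selected_tag in img["tags"]]

def collect_tags(images):
    """All tag information added to the given images."""
    tags = set()
    for img in images:
        tags |= img["tags"]
    return tags

image_group = [
    {"name": "a.jpg", "tags": {"dog", "park"}},
    {"name": "b.jpg", "tags": {"dog", "beach"}},
    {"name": "c.jpg", "tags": {"cat", "park"}},
]

first_search = extract_images(image_group, "dog")      # 1st search images
print(sorted(img["name"] for img in first_search))     # ['a.jpg', 'b.jpg']
print(sorted(collect_tags(first_search)))              # ['beach', 'dog', 'park']
```

Note that "cat" disappears from the displayed tags once "dog" is selected, which is what allows the user to keep narrowing the search without ever hitting an empty result.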

Preferably, the tag information designating unit designates, in addition to the 1 st selection tag information, 2 nd tag information selected in accordance with an instruction from the user as 2 nd selection tag information from the tag information displayed on the display unit,

the image extracting section extracts the 1 st search image to which the 2 nd selection tag information is given as a 2 nd search image from the 1 st search image,

the image display unit displays at least a part of all the 2 nd search images on the display unit,

the tag information display unit displays at least a part of all the tag information added to all the 2 nd search images on the display unit.
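The second-stage refinement can be sketched as follows (a hypothetical illustration, not the patent's implementation): the 2 nd selected tag is applied to the 1 st search images rather than to the whole image group, so each additional selection narrows the previous result set.

```python
# Sketch of the 2nd-stage refinement: the same filter is applied again,
# but to the 1st search images instead of the whole image group.

def refine(images, tag):
    """Keep only the images to which the given tag is added."""
    return [img for img in images if tag in img["tags"]]

image_group = [
    {"name": "a.jpg", "tags": {"dog", "park"}},
    {"name": "b.jpg", "tags": {"dog", "beach"}},
    {"name": "c.jpg", "tags": {"cat", "park"}},
]

first_search = refine(image_group, "dog")     # 1st search images: a, b
second_search = refine(first_search, "park")  # 2nd search images: a only
print([img["name"] for img in second_search])  # ['a.jpg']
```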

Preferably, the tag information designating unit simultaneously designates two or more tag information selected in accordance with an instruction from the user as the selected tag information from the tag information displayed on the display unit,

the image extracting unit extracts, as search images, images to which all of the two or more pieces of selected tag information are given from the image group,

the image display unit displays at least a part of all the search images on the display unit,

the tag information display unit displays at least a part of all tag information added to all search images on the display unit.
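Selecting two or more tags simultaneously amounts to an AND search, which can be sketched as follows (an illustrative assumption about the data shape, not the patent's implementation):

```python
# Sketch of simultaneous multi-tag selection: an image survives only if
# it carries every one of the selected tags (an AND search).

def and_search(image_group, selected_tags):
    wanted = set(selected_tags)
    # set inclusion: the image's tags must contain all wanted tags
    return [img for img in image_group if wanted <= img["tags"]]

image_group = [
    {"name": "a.jpg", "tags": {"dog", "park", "sunny"}},
    {"name": "b.jpg", "tags": {"dog", "beach"}},
    {"name": "c.jpg", "tags": {"dog", "park"}},
]

print([img["name"] for img in and_search(image_group, {"dog", "park"})])
# ['a.jpg', 'c.jpg']
```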

It is also preferable to provide: an image analysis unit that analyzes each image included in the image group; and

a tag information adding unit that adds tag information to each image based on the analysis result of each image.
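The analysis-and-tagging pair can be sketched as below. This is a placeholder assumption for illustration: a real image analysis unit would run an image-recognition model, whereas the hypothetical `analyze()` stub here merely derives labels from the filename.

```python
# Minimal sketch of automatic tagging from an analysis result.

def analyze(image):
    # Placeholder analysis: a real system would classify pixel content.
    return {"beach", "outdoor"} if "beach" in image["name"] else {"indoor"}

def add_tags(image_group):
    for img in image_group:
        img.setdefault("tags", set())
        img["tags"] |= analyze(img)   # add tags based on the analysis result
    return image_group

tagged = add_tags([{"name": "beach01.jpg"}, {"name": "desk.jpg"}])
print(sorted(tagged[0]["tags"]))   # ['beach', 'outdoor']
print(sorted(tagged[1]["tags"]))   # ['indoor']
```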

It is also preferable to provide: an image information display unit that displays information of one image selected in accordance with an instruction from a user on the display unit; and

an image information editing unit that edits the information of the one image displayed on the display unit in accordance with an instruction from the user.

Preferably, the image information editing unit adds new tag information, which is input in response to an instruction from the user, to the one image as information of the one image displayed on the display unit.

Preferably, the tag information display unit displays the tag information on the display unit divided into a plurality of categories.

Preferably, the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of the number of images to which the tag information is added.

Preferably, the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of the number of times the tag information has been used as a keyword for searching for images.

Further, it is preferable that the tag information has reliability information indicating reliability of the tag information,

the tag information display unit displays the tag information included in at least one of the plurality of categories in descending order of reliability, based on the reliability information included in the tag information.
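The three preferred orderings just described (by image count, by search-usage count, and by reliability) can be sketched as follows; the data shapes and counts are illustrative assumptions.

```python
# Sketch of the three in-category tag orderings.
from collections import Counter

image_group = [
    {"tags": {"dog", "park"}},
    {"tags": {"dog", "beach"}},
    {"tags": {"dog"}},
]

# (1) descending number of images to which each tag is added
counts = Counter(tag for img in image_group for tag in img["tags"])
by_count = [tag for tag, _ in counts.most_common()]
print(by_count[0])   # 'dog' (attached to all three images)

# (2) descending number of times each tag was used as a search keyword
usage = {"dog": 5, "park": 9, "beach": 2}
print(sorted(usage, key=usage.get, reverse=True))   # ['park', 'dog', 'beach']

# (3) descending reliability held as each tag's reliability information
reliability = {"dog": 0.95, "park": 0.60, "beach": 0.80}
print(sorted(reliability, key=reliability.get, reverse=True))
# ['dog', 'beach', 'park']
```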

It is also preferable to provide: a tag information search unit that searches for, as search tag information, tag information at least a part of which matches a keyword input in accordance with an instruction from a user, from among all tag information added to all images,

the tag information display unit displays at least a part of the search tag information on the display unit.
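One straightforward reading of "tag information at least a part of which matches a keyword" is a substring match, sketched below (an assumption for illustration; the patent does not fix the matching rule).

```python
# Sketch of the tag information search: return every tag containing the
# typed keyword as a (case-insensitive) substring.

def search_tags(all_tags, keyword):
    kw = keyword.lower()
    return sorted(t for t in all_tags if kw in t.lower())

all_tags = {"birthday", "birdwatching", "beach", "Birthday party"}
print(search_tags(all_tags, "bird"))   # ['birdwatching']
print(search_tags(all_tags, "birth"))  # ['Birthday party', 'birthday']
```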

Preferably, the tag information designating unit designates one piece of search tag information selected in accordance with an instruction from the user from among the search tag information displayed on the display unit,

an image extracting unit extracts an image to which one piece of search tag information is given from the image group as a search image, an image display unit displays at least a part of all the search images on a display unit,

the tag information display unit displays at least a part of all tag information added to all search images on the display unit.

Preferably, the display unit is a touch panel,

the tag information display unit displays the tag information on the display unit as buttons,

the tag information designating unit switches between designating the tag information corresponding to a button as the selected tag information and releasing that designation every time the button is pressed in accordance with an instruction from the user.

Preferably, the tag information designating unit maintains the designation of the selected tag information until the designation of the selected tag information is released in response to an instruction from the user.

Preferably, when a button selected from among the buttons displayed on the display unit is pressed for a predetermined time or longer in accordance with an instruction from the user, the tag information designating unit designates, as the selected tag information, all the lower-concept tag information included under the upper-concept tag information of the tag information corresponding to the pressed button.
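The two touch gestures just described can be sketched as pure set operations (the hierarchy mapping and function names are illustrative assumptions): a tap toggles one tag in or out of the selected set, and a long press on a lower-concept tag selects every lower-concept tag under the same upper concept.

```python
# Sketch of the touch-panel button behaviour.

hierarchy = {"animal": {"dog", "cat", "bird"}}   # upper -> lower concepts
parent = {low: up for up, lows in hierarchy.items() for low in lows}

def tap(selected, tag):
    """Toggle: designate if not selected, release the designation if it is."""
    return selected ^ {tag}

def long_press(selected, tag):
    """Select all lower-concept tags sharing this tag's upper concept."""
    return selected | hierarchy[parent[tag]]

selected = set()
selected = tap(selected, "dog")         # {'dog'}
selected = tap(selected, "dog")         # set(): designation released
selected = long_press(selected, "cat")  # all of 'dog', 'cat', 'bird'
print(sorted(selected))                 # ['bird', 'cat', 'dog']
```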

Further, preferably, when one piece of the tag information displayed on the display unit is selected in accordance with an instruction from the user, the tag information designating unit cancels the designation of the other already-designated tag information and designates only that one piece as the selected tag information.

Preferably, the apparatus further comprises: a weighting setting unit that performs weighting setting on the tag information;

an image analysis unit that analyzes each image included in the image group; and

an evaluation value assigning unit that assigns an evaluation value to each image based on the analysis result of each image,

the weight setting unit sets the weight of the tag information selected later to be larger than the weight of the tag information selected earlier, among the two or more tag information simultaneously selected from the tag information displayed on the display unit in accordance with an instruction from the user,

the evaluation value giving unit sets the evaluation value of the image to which the tag information selected later is given higher than the evaluation value of the image to which the tag information selected earlier is given, based on the weight setting of the tag information.
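The weighting rule above can be sketched as follows. The linear weighting scheme (position in the selection order as the weight) is an assumption for illustration; the patent only requires that a later-selected tag weigh more than an earlier-selected one.

```python
# Sketch: later-selected tags get larger weights, and an image's
# evaluation value is raised by the weights of the selected tags it carries.

def tag_weights(selection_order):
    """['dog', 'park']: 'park' was selected later, so it weighs more."""
    return {tag: i + 1 for i, tag in enumerate(selection_order)}

def evaluation_value(image, weights, base=0.0):
    return base + sum(weights.get(t, 0) for t in image["tags"])

weights = tag_weights(["dog", "park"])        # {'dog': 1, 'park': 2}
img_dog = {"name": "b.jpg", "tags": {"dog"}}
img_park = {"name": "c.jpg", "tags": {"park"}}
print(evaluation_value(img_park, weights) > evaluation_value(img_dog, weights))
# True: the image with the later-selected tag evaluates higher
```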

Preferably, the tag information includes tag information of a higher concept including tag information of a plurality of lower concepts,

the tag information display unit displays, on the display unit, the plurality of lower-concept tag information included in the upper-concept tag information when the upper-concept tag information is selected in accordance with an instruction from the user.

Further, it is preferable that each image included in the image group has evaluation value information of the image,

the image display unit displays the images on the display unit in order of the image evaluation value from high to low.
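This display ordering reduces to a sort on the evaluation value information (the `eval` field name is an illustrative assumption):

```python
# Sketch: list images in descending order of their evaluation values.
images = [
    {"name": "a.jpg", "eval": 0.4},
    {"name": "b.jpg", "eval": 0.9},
    {"name": "c.jpg", "eval": 0.7},
]
display_order = sorted(images, key=lambda img: img["eval"], reverse=True)
print([img["name"] for img in display_order])  # ['b.jpg', 'c.jpg', 'a.jpg']
```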

Also, the present invention provides an image processing method including: a step in which a tag information display unit displays tag information added to an image included in the image group on the display unit;

a step in which an instruction acquisition unit acquires an instruction input by a user;

a step in which the tag information specifying unit specifies, as selected tag information, tag information selected in accordance with an instruction from a user from among the tag information displayed on the display unit;

a step in which the image extraction unit extracts an image to which the selection tag information is given from the image group as a search image; and

a step in which the image display unit displays the images included in the image group on the display unit,

at least a part of all the tag information added to all the images included in the image group is displayed on the display unit,

the 1 st tag information selected in accordance with an instruction from a user is designated as the 1 st selection tag information from the tag information displayed on the display section,

an image to which the 1 st selection tag information is given is extracted from the image group as a 1 st retrieval image,

displaying at least a part of all the 1 st search images on a display unit,

at least a part of all the tag information added to all the 1 st search images is displayed on the display unit.

Here, it is preferable that 2 nd tag information selected in accordance with an instruction from the user is designated, at the same time as the 1 st selection tag information, as 2 nd selection tag information from the tag information displayed on the display unit,

extracting the 1 st search image given the 2 nd selection tag information from the 1 st search image as a 2 nd search image,

displaying at least a part of all the 2 nd search images on a display unit,

at least a part of all the tag information added to all the 2 nd search images is displayed on the display unit.

Further, it is preferable that two or more pieces of tag information selected in accordance with an instruction from the user are simultaneously designated as the selected tag information from the tag information displayed on the display unit,

images to which all of the two or more pieces of selected tag information are assigned are extracted from the image group as search images,

displaying at least a part of all the search images on a display unit,

at least a part of all the tag information added to all the search images is displayed on the display unit.

The present invention also provides a program for causing a computer to execute each step of any one of the image processing methods described above.

The present invention also provides a computer-readable recording medium having a program recorded thereon for causing a computer to execute each step of any one of the image processing methods.

Further, the present invention provides an image processing apparatus including: a tag information display unit that displays tag information added to an image included in the image group on the display unit;

an instruction acquisition unit that acquires an instruction input by a user;

a tag information specifying unit that specifies, as selected tag information, tag information selected in accordance with an instruction from a user from among the tag information displayed on the display unit;

an image extraction unit that extracts an image to which selection tag information is given from the image group as a search image; and

an image display unit for displaying the images included in the image group on the display unit,

the tag information display unit, the instruction acquisition unit, the tag information specifying unit, the image extraction unit, and the image display unit are hardware or a processor that executes a program,

the tag information display unit displays at least a part of all tag information added to all images included in the image group on the display unit,

the tag information designating unit designates 1 st tag information selected in accordance with an instruction from a user as 1 st selection tag information from the tag information displayed on the display unit,

the image extracting section extracts an image to which the 1 st selection tag information is given from the image group as a 1 st retrieval image,

the image display unit displays at least a part of all the 1 st search images on the display unit,

the tag information display unit displays at least a part of all the tag information added to all the 1 st search images on the display unit.

It is also preferable to provide: an image analysis unit that analyzes each image included in the image group; and

a tag information adding unit that adds tag information to each image based on the analysis result of each image.

The image analysis unit and the tag information adding unit are hardware or a processor that executes a program.

It is also preferable to provide: an image information display unit that displays information of one image selected in accordance with an instruction from a user on the display unit; and

an image information editing unit that edits the information of the one image displayed on the display unit in accordance with an instruction from the user.

The image information display unit and the image information editing unit are hardware or a processor that executes a program.

It is also preferable to provide: a tag information search unit that searches for, as search tag information, tag information at least a part of which matches a keyword input in accordance with an instruction from a user, from among all tag information added to all images,

the tag information retrieval unit is hardware or a processor executing a program,

the tag information display unit displays at least a part of the search tag information on the display unit.

Preferably, the apparatus further comprises: a weighting setting unit that performs weighting setting on the tag information;

an image analysis unit that analyzes each image included in the image group; and

an evaluation value assigning unit that assigns an evaluation value to each image based on the analysis result of each image,

the weight setting unit, the image analyzing unit, and the evaluation value assigning unit are hardware or a processor for executing a program,

the weight setting unit sets the weight of the tag information selected later to be larger than the weight of the tag information selected earlier, among the two or more tag information simultaneously selected from the tag information displayed on the display unit in accordance with an instruction from the user,

the evaluation value giving unit sets the evaluation value of the image to which the tag information selected later is given higher than the evaluation value of the image to which the tag information selected earlier is given, based on the weight setting of the tag information.

Effects of the invention

According to the present invention, the user can select tag information corresponding to a desired image from among the tag information displayed on the display unit, without typing in a keyword to search for the image, and thus can search intuitively.

Drawings

Fig. 1 is a block diagram showing an embodiment of the configuration of an image processing system according to the present invention.

Fig. 2 is a block diagram showing an embodiment of the structure of a client.

Fig. 3 is a flowchart showing an embodiment of the operation of the client.

Fig. 4 is a conceptual diagram illustrating an embodiment of an application home screen displayed on a display unit of a client.

Fig. 5 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying an image.

Fig. 6 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying image information.

Fig. 7 is a conceptual diagram illustrating an embodiment of an editing screen for specifying image information.

Fig. 8 is a conceptual diagram illustrating an embodiment of an input screen for new tag information.

Fig. 9 is a conceptual diagram illustrating an embodiment of a selection screen of tag information.

Fig. 10 is a conceptual diagram showing an embodiment of a state in which the 1 st tag information is selected on the tag information selection screen.

Fig. 11 is a conceptual diagram showing an embodiment of a state in which the content of tag information is updated on the basis of the 1 st tag information on the tag information selection screen.

Fig. 12 is a conceptual diagram showing an embodiment of a state in which the 2 nd tag information is selected on the tag information selection screen.

Fig. 13 is a conceptual diagram illustrating an embodiment of an image list screen.

Fig. 14 is a conceptual diagram showing an embodiment of a state in which one search image is selected on the image list screen.

Fig. 15 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying a search image.

Fig. 16 is a conceptual diagram illustrating an embodiment of a time-series image list screen.

Fig. 17 is a conceptual diagram showing an embodiment of a state in which the "search tab" button is clicked on the selection screen of the tab information.

Fig. 18 is a conceptual diagram illustrating an embodiment of a tag information search screen.

Fig. 19 is a conceptual diagram illustrating an embodiment of a selection screen of tag information.

Description of the symbols

10-image processing system, 12-server, 14-client, 16-network, 18-instruction acquisition section, 20-image holding section, 22-image analysis section, 24-evaluation value assignment section, 26-label information assignment section, 28-weight setting section, 30-label information display section, 32-label information designation section, 34-image extraction section, 36-label information retrieval section, 38-image display section, 40-image designation section, 42-image information display section, 44-image information editing section, 46-display section, 50-notification and status area, 52-menu area, 54-navigation area, 56-finger icon, 58, 66, 88- "x" button, 60-information button, 62- "trash can" button, 64- "share" button, 68- "edit" button, 70- "add tag" button, 72, 78, 90- "cancel" button, 74- "save" button, 76-input field of new tag information, 80- "retrieve tag" button, 82- "calendar" button, 84- "other" button, 86-input field of keyword, 92, 94, 96-arrow.

Detailed Description

Hereinafter, an image processing apparatus, an image processing method, a program, and a recording medium according to the present invention will be described in detail with reference to preferred embodiments shown in the drawings.

Fig. 1 is a block diagram showing an embodiment of the configuration of an image processing system according to the present invention. The image processing system 10 shown in fig. 1 includes a server 12 and a plurality of clients 14 connected to the servers 12 via a network 16.

The server 12 has the following functions: it acquires the image group owned by each of a plurality of users of the image processing system 10, holds each image included in each user's image group, and shares images only with the sharing targets designated by each user.

The server 12 is not limited to one server and may be a plurality of servers, and can be constituted by a workstation or the like having a control device, a storage device, a communication device, and the like.

The client 14 is the image processing apparatus of the present invention and has the following function: it searches the image group owned by the user for images, using the tag information assigned to the images as the search condition, and displays the images of the search result on a display unit or the like.

The client 14 is constituted by a desktop computer (personal computer), a notebook computer, or a tablet computer having a control device, an input device, a storage device, a communication device, a display, and the like, or by a mobile terminal such as a mobile phone or a smartphone.

Hereinafter, a case where the client 14 is a smartphone will be described as an example.

Fig. 2 is a block diagram showing an embodiment of the structure of a client. The client 14 shown in fig. 2 includes an instruction acquisition unit 18, an image holding unit 20, an image analysis unit 22, an evaluation value adding unit 24, a tag information adding unit 26, a weight setting unit 28, a tag information display unit 30, a tag information specifying unit 32, an image extraction unit 34, a tag information search unit 36, an image display unit 38, an image specifying unit 40, an image information display unit 42, an image information editing unit 44, and a display unit 46.

In the client 14, first, the instruction acquisition unit 18 acquires an instruction input from the user of the client 14.

The user can input various instructions by performing touch operations on the touch panel of the smartphone. The input device is not limited to the touch panel, and instructions can also be input through an input device such as a mouse or a keyboard.

As will be described in detail later, the instruction from the user includes an instruction to select tag information, an instruction to select an image, an instruction to input tag information added to an image, an instruction to input a search condition (keyword) of tag information, an instruction to display an image on the display unit 46, and the like.

Next, the image holding unit 20 is a storage device such as a semiconductor memory, and holds the image group.

The image group is captured by the user using the camera of the smartphone and held inside the smartphone. However, the image group is not particularly limited and may be, for example, an image group read into the smartphone from the outside, an online image group held in an SNS (Social Networking Service), online storage, or the like, or a combination of these image groups.

For example, an image taken with a camera of a smartphone can be set to be automatically uploaded from the smartphone to the server 12. The images uploaded to the server 12 are held in storage areas of the respective users within a storage device of the server 12. The image uploaded to the server 12 may be set to be retained in the image holding unit 20 of the smartphone or may be set to be deleted.

Further, for example, an image captured by a digital camera, an image held in a PC, a notebook computer, a tablet computer, or the like, or an image (digital image data) scanned from a photographic print or the like can be read into the smartphone. An image read from the outside into the smartphone is held in the image holding unit 20 and is processed in the same way, as a part of the image group.

Next, the image analysis unit 22 analyzes each image included in the image group.

The analysis items of the image are not particularly limited and include, for example: analysis of image content, such as detection of objects captured in the image (person detection and object detection), detection of a person's face, expression, and behavior, detection of the scene (night scene, sea, beach, sky, …), detection of the event (sporting event, wedding, graduation ceremony, …), and detection of the user's tastes (interests); analysis of image quality, such as the brightness, color, contrast, and degree of blur of the image; and analysis of the time information (capturing time) and position information (capturing position) included in the additional information of the image, such as Exif (Exchangeable Image File Format).

Next, the evaluation value assigning unit 24 assigns an evaluation value to each image based on the analysis result of each image.

The method of giving the evaluation value to the image by the evaluation value giving unit 24 is not particularly limited, and for example, a score may be given to each analysis result of the image, a total score of scores of a plurality of analysis results may be calculated, and the total score may be given as the evaluation value of the image.

For example, an image of a smiling person is given a higher score than an image of a crying or angry face. Similarly, an image with appropriate brightness is given a higher score than an image that is too bright or too dark.
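As an illustration, the scoring scheme described above can be sketched in Python; the score table, field names, and brightness thresholds here are assumptions for illustration, not values from the embodiment.

```python
# Illustrative sketch of the scoring described above: each analysis result
# contributes a score, and the total becomes the image's evaluation value.
# The score table, field names, and brightness thresholds are assumptions.

EXPRESSION_SCORES = {"smile": 3, "neutral": 1, "crying": 0, "angry": 0}

def brightness_score(brightness):
    """Appropriate brightness scores higher than too bright or too dark."""
    return 2 if 0.3 <= brightness <= 0.7 else 0

def evaluation_value(analysis):
    """Total the per-item scores into a single evaluation value."""
    total = EXPRESSION_SCORES.get(analysis.get("expression"), 0)
    total += brightness_score(analysis.get("brightness", 0.5))
    return total

good = evaluation_value({"expression": "smile", "brightness": 0.5})   # 5
bad = evaluation_value({"expression": "crying", "brightness": 0.1})   # 0
```

A smiling, well-exposed image thus outranks a too-dark image of a crying face.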

Based on the weight setting of the tag information to be described later, the evaluation value assigning unit 24 sets the evaluation value of an image to which tag information selected later in time is assigned to be higher than the evaluation value of an image to which tag information selected earlier is assigned.

Next, the label information providing unit 26 provides label information to each image based on the analysis result of each image.

The tag information is various information related to the image. Examples include the date and position of image capture, the name of a subject captured in the image, the name of the subject's action, names indicating a scene, action, or emotion of a person read from the image, and names of the user's preferences. The tag information is text information, and can therefore be retrieved by keyword.

The tag information may include tag information of a higher-level concept having a plurality of lower-level concepts. For example, the tag information may include tag information of the higher-level concept "Sweets", which has tag information of two lower-level concepts, "Mont Blanc" and "Strawberry Shortcake".
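The higher-/lower-level-concept structure can be modeled, for example, as a parent-to-children mapping; the following Python sketch mirrors the "Sweets" example, with the data layout being an assumption.

```python
# A minimal model of higher-/lower-level-concept tag information, assuming a
# simple parent-to-children mapping; the names mirror the "Sweets" example.

TAG_HIERARCHY = {
    "Sweets": ["Mont Blanc", "Strawberry Shortcake"],
}

def expand_tag(tag):
    """Selecting a higher-level-concept tag also covers its lower-level tags."""
    return [tag] + TAG_HIERARCHY.get(tag, [])

# expand_tag("Sweets") → ["Sweets", "Mont Blanc", "Strawberry Shortcake"]
# expand_tag("bicycle") → ["bicycle"]  (no lower-level concepts)
```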

Also, the tag information may have reliability information indicating the reliability of the tag information. The content of automatically assigned tag information is not always accurate, but its accuracy can be judged from the reliability information.

The tag information assigning unit 26 is not an essential component; it may, however, add further tag information to an image to which tag information has already been assigned, or assign new tag information to an image to which no tag information has been assigned.

Next, the weight setting unit 28 sets a weight to the tag information.

The method of weighting the tag information by the weight setting unit 28 is not particularly limited, but when the user searches for an image using two or more pieces of tag information, tag information selected later in time may be more important than tag information selected earlier. Accordingly, among two or more pieces of tag information simultaneously selected from the tag information displayed on the display unit 46 in accordance with an instruction from the user, the weight setting unit 28 can set the weight of the tag information selected later to be larger than the weight of the tag information selected earlier.
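One possible realization of this weighting, sketched in Python under the assumption of a simple linear scheme (weight equals selection order), together with the evaluation-value boost described for the evaluation value assigning unit 24:

```python
# Sketch of the weighting: weight grows with selection order, so the tag
# selected last weighs most, and an image carrying a later-selected tag has
# its evaluation value boosted more. The linear scheme is an assumption.

def tag_weights(selected_tags):
    """Map each selected tag to a weight that grows with selection order."""
    return {tag: i + 1 for i, tag in enumerate(selected_tags)}

def boosted_evaluation(base_value, image_tags, weights):
    """Add the weight of every selected tag the image carries."""
    return base_value + sum(w for t, w in weights.items() if t in image_tags)

weights = tag_weights(["bicycle", "outdoor"])   # "outdoor" selected later
# An image tagged "outdoor" is boosted more than one tagged only "bicycle".
```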

Next, the tag information display unit 30 displays the tag information added to the image included in the image group on the display unit 46.

The method of displaying the tag information by the tag information display unit 30 is not particularly limited, but the tag information displayed on the display unit 46 can be classified and displayed into a plurality of categories, for example. Accordingly, the tag information can be displayed in a classified manner for each type of tag information, and the user can easily find the tag information to be used as the image search condition from the tag information displayed on the display unit 46.

In this case, the tag information display unit 30 can display the tag information included in at least one of the plurality of categories in descending or ascending order of the number of images to which the tag information is assigned. This allows tag information assigned to a large number of images to be displayed preferentially. In addition, tag information represented by consecutive numbers, such as the shooting year, month, and day of the image, may be displayed in descending or ascending numerical order.
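The count-based ordering can be sketched as follows; the per-image tag lists are illustrative, and the data shape is an assumption.

```python
# Sketch of ordering tag buttons by the number of images to which each tag
# is assigned, most-used first; the per-image tag lists are illustrative.

from collections import Counter

def order_tags_by_count(images):
    """images: list of per-image tag lists. Returns tags, most-used first."""
    counts = Counter(tag for tags in images for tag in tags)
    return [tag for tag, _ in counts.most_common()]

images = [["bicycle"], ["bicycle", "ramen"], ["bicycle", "ramen", "concert"]]
# order_tags_by_count(images) → ["bicycle", "ramen", "concert"]
```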

The tag information display unit 30 may also display the tag information included in at least one of the plurality of categories in descending order of the number of times the tag information has been used as a keyword for searching for images. Furthermore, when the tag information has reliability information, the tag information included in at least one of the plurality of categories may be displayed in descending order of reliability, based on the reliability information included in the tag information.

When tag information of a higher-level concept is selected in response to an instruction from the user, the tag information display unit 30 can display the plurality of pieces of lower-level-concept tag information included in that higher-level-concept tag information on the display unit 46.

The label information display unit 30 can display the label information as a button (link) on the display unit (touch panel) 46. This allows the user to easily select the tag information.

Next, the tag information designating unit 32 designates, as the selected tag information, tag information selected in accordance with an instruction from the user, from among the tag information displayed on the display unit 46.

The tag information designating unit 32 designates, as the selection tag information, the search tag information selected in accordance with an instruction from the user, from among the search tag information displayed on the display unit 46.

Next, the image extracting unit 34 extracts, from the image group, an image to which the selection tag information designated by the tag information designating unit 32 is added as a search image.

Here, the tag information display unit 30, the tag information specification unit 32, and the image extraction unit 34 constitute an image search unit of the present invention that searches for an image matching a search condition input in response to an instruction from a user from among the image group as a search image.

Next, the tag information retrieval unit 36 retrieves, as retrieval tag information, tag information at least a part of which matches a keyword input in response to an instruction from the user, from all tag information added to all images included in the image group.
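A minimal sketch of this retrieval: collect every tag whose text contains the keyword as a substring. The flat list of tag strings is an assumed data shape.

```python
# Minimal sketch of the tag information retrieval unit 36: gather every tag
# whose text at least partially matches the keyword. The flat list of tag
# strings is an assumed data shape.

def retrieve_tags(keyword, all_tags):
    """Return, sorted, the tags containing the keyword as a substring."""
    return sorted({tag for tag in all_tags if keyword in tag})

all_tags = ["bicycle", "bicycle race", "ramen", "concert"]
# retrieve_tags("bicycle", all_tags) → ["bicycle", "bicycle race"]
```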

Next, the image display unit 38 displays the images included in the image group on the display unit 46.

The image display unit 38 displays, for example, each image included in the image group, a list of images (thumbnail images) included in the image group, a list of search images (thumbnail images), and the like.

When the image list or the search image list is displayed on the display unit 46, the image display unit 38 can display the images in chronological order in the order of the shooting time. Further, when each image included in the image group has the image evaluation value information, the image display unit 38 can display the images on the display unit 46 in the order of the image evaluation values from high to low.
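The two list orderings can be sketched as follows; the dictionary field names `shot_at` and `evaluation` are assumptions.

```python
# Sketch of the two orderings described above: chronological display by
# shooting time, and display in descending order of evaluation value.
# The field names "shot_at" and "evaluation" are assumptions.

def chronological(images, descending=True):
    """Order images by shooting time, newest first by default."""
    return sorted(images, key=lambda im: im["shot_at"], reverse=descending)

def by_evaluation(images):
    """Order images from the highest evaluation value to the lowest."""
    return sorted(images, key=lambda im: im["evaluation"], reverse=True)

images = [
    {"name": "a", "shot_at": "2018-12-01", "evaluation": 2},
    {"name": "b", "shot_at": "2018-11-15", "evaluation": 5},
]
```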

Next, the image specification unit 40 specifies one image selected in accordance with an instruction from the user from among the images displayed on the display unit 46.

The user can select one image to be viewed from the list of images displayed on the display unit 46. When one image is selected, only the selected one image is displayed on the display unit 46 by the image display unit 38.

Next, the image information display unit 42 displays information on the one image designated by the image designating unit 40, that is, the one image selected in accordance with the instruction from the user, on the display unit 46.

The information of an image includes various information related to the image; in the present embodiment, it includes the image capturing time and the tag information assigned to the image. The information is not limited to this, and may also include the file name of the image, the size of the image, the resolution, and the shooting position.

Next, the image information editing unit 44 edits the information of one image displayed on the display unit 46 by the image information display unit 42 in accordance with an instruction from the user.

The user can edit the tag information and the like added to the image. The tag information automatically given by the tag information giving unit 26 may include inaccurate tag information. The user can correct or delete tag information that is deemed inaccurate.

Further, the user can also assign new tag information to the image. In this case, the image information editing unit 44 adds new tag information, which is input in accordance with an instruction from the user, to one image as information of one image displayed on the display unit 46.
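The editing operations described above (correcting, deleting, and adding tag information on one image) can be sketched as follows; the dictionary layout is an assumption.

```python
# Hedged sketch of the image information editing unit 44: the user corrects
# an inaccurate tag, deletes an unwanted one, and adds new tag information
# to one image. The dictionary layout is an assumption.

def rename_tag(image, old, new):
    """Correct tag information deemed inaccurate."""
    image["tags"] = [new if t == old else t for t in image["tags"]]

def delete_tag(image, tag):
    """Delete tag information from the image."""
    image["tags"] = [t for t in image["tags"] if t != tag]

def add_tag(image, tag):
    """Assign new tag information, avoiding duplicates."""
    if tag not in image["tags"]:
        image["tags"].append(tag)

image = {"tags": ["outdor", "boy"]}
rename_tag(image, "outdor", "outdoor")       # fix an automatic-tag typo
delete_tag(image, "boy")
add_tag(image, "juvenile programming")       # manually assigned tag
```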

Next, the display unit 46 displays various information.

The display unit 46 displays an image, image information, an image list, a tag information list, and the like. The display unit 46 is a liquid crystal panel, an organic EL (Electro Luminescence) panel, or the like, and in the case of the present embodiment, is a touch panel.

Next, a case will be described in which the client 14 searches for a desired image from an image group using tag information assigned to the image as a search condition, with reference to a flowchart shown in fig. 3.

First, an operation in the case of selecting the 1 st tag information will be described.

In this case, first, at least a part of all the label information added to all the images included in the image group is displayed on the display unit 46 by the label information display unit 30 (step S1).

When the tag information is displayed on the display unit 46, the tag information specifying unit 32 specifies the 1 st tag information selected in accordance with the instruction from the user as the 1 st selection tag information (selection tag information) from the tag information displayed on the display unit 46 (step S2).

When the 1 st selection tag information is specified, the image extracting unit 34 then extracts the image to which the 1 st selection tag information is given as a 1 st search image (search image) from the image group (step S3).

When the 1 st search image is extracted, at least a part of all the 1 st search images is displayed on the display unit 46 by the image display unit 38 (step S4).

Next, at least a part of all the tag information added to all the 1 st search images is displayed on the display unit 46 by the tag information display unit 30 (step S5).

Next, an operation in the case where the 1 st tag information and the 2 nd tag information are simultaneously selected will be described.

In this case, the tag information designating unit 32 next designates the 2nd tag information selected in accordance with an instruction from the user, from among the tag information displayed on the display unit 46, as the 2nd selection tag information (selection tag information), simultaneously with the 1st selection tag information (step S6).

Next, the image extraction unit 34 extracts the 1 st search image to which the 2 nd selection tag information is added as a 2 nd search image (search image) from the 1 st search image (step S7).

Next, at least a part of all the 2 nd search images is displayed on the display unit 46 by the image display unit 38 (step S8).

Next, at least a part of all the tag information added to all the 2 nd search images is displayed on the display unit 46 by the tag information display unit 30 (step S9).

The same applies to the case where three or more pieces of tag information are simultaneously selected. That is, the operation in the case of simultaneously selecting two or more pieces of tag information is as follows.

First, the tag information designating unit 32 simultaneously designates two or more tag information selected in accordance with an instruction from the user as the selected tag information from the tag information displayed on the display unit 46.

Next, the image extracting unit 34 extracts, as a search image, an image to which all the selection tag information of the two or more selection tag information is added from the image group.

Next, at least a part of all the search images is displayed on the display unit 46 by the image display unit 38.

Next, at least a part of all the tag information added to all the search images is displayed on the display unit 46 by the tag information display unit 30.
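The multi-tag flow above can be sketched as an AND filter: keep only images carrying every selected tag, then gather the tags still attached to the surviving images for the updated selection screen. The data below is illustrative.

```python
# The multi-tag search above as an AND filter: an image survives only if it
# carries every selected tag; the tags on the surviving images become the
# candidates shown for further narrowing. Data below is illustrative.

def search(images, selected_tags):
    """Extract images to which ALL of the selected tag information is assigned."""
    return [im for im in images if selected_tags <= im["tags"]]

def remaining_tags(search_images):
    """Tag information assigned to the search images, for the updated screen."""
    if not search_images:
        return set()
    return set().union(*(im["tags"] for im in search_images))

images = [
    {"name": "a", "tags": {"bicycle", "outdoor"}},
    {"name": "b", "tags": {"bicycle", "ramen"}},
    {"name": "c", "tags": {"concert"}},
]
hits = search(images, {"bicycle", "outdoor"})   # only image "a" carries both
```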

Hereinafter, the image processing apparatus according to the present invention will be described by taking an example of an application (hereinafter, also simply referred to as APP) that is run on a smartphone by the client 14.

Fig. 4 is a conceptual diagram illustrating an embodiment of an APP home screen displayed on a display unit of a client. The display unit 46 of the smartphone shown in fig. 4 is constituted by a touch panel; it not only displays various information but also serves as the instruction acquisition unit 18, acquiring the various instructions that the user inputs by performing touch operations on the touch panel.

A notification and status area 50 is displayed in the upper part of the main screen, and a menu area 52 is displayed below it. A navigation area 54 is displayed in the lower part of the main screen.

In the navigation area 54, buttons of "search", "collection", "home", "share", and "various" are displayed in order from the left side to the right side.

When the "search" button is clicked (pressed), a selection screen of the tag information is displayed on the display unit 46.

Then, when the "collect" button is clicked, a composite image such as an album is automatically created using the images included in the image group.

When the "home" button is clicked, the home screen, that is, a list of images included in the image group is displayed on the display unit 46. In the image list, images are displayed in order of shooting time (descending order or ascending order), for example.

If the "share" button is clicked, the image selected according to the instruction from the user is shared with other users.

When the "various" button is clicked, various other functions not displayed in the navigation area can be selected and executed.

A list of images is displayed in the area between the menu area 52 and the navigation area 54. Before an image search, in a state where no tag information is selected as the search condition, at least a part of all images included in the image group is displayed on the image list screen in time series, from the top to the bottom of the screen, in order of image capturing time; in the present embodiment, in descending order of capturing time.

As indicated by the finger icon 56 in fig. 4, the user can click an image to be browsed from among the images displayed on the image list screen, whereby only the clicked image is displayed on the display unit 46 for browsing.

In this case, one image selected in accordance with an instruction from the user is designated as a designated image from the image list displayed on the display unit 46 by the image designating unit 40, and only the designated image is displayed on the display unit 46 by the image display unit 38.

Fig. 5 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying an image. A browsing screen of the designated image is displayed on the entire surface of the display unit 46. As shown in fig. 5, an "x" button 58 is displayed on the upper left portion of the browsing screen of the designated image, and an "information" button 60 is displayed on the lower right portion. A "trash can" button 62, a "share" button 64, and the like are displayed on the lower portion of the browsing screen of the designated image.

For example, the user can close the browsing screen of the designated image by clicking the "x" button 58 to return to the image list screen (main screen). Also, the user can click the "trash can" button 62 to delete the designated image, or click the "share" button 64 to share the designated image with other users. Further, as indicated by the finger icon 56 in fig. 5, the user can click the "information" button 60 to browse the information of the designated image.

Fig. 6 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying image information. An "x" button 66 is displayed on the upper left portion of the browsing screen of the designated image information, and an "edit" button 68 is displayed on the upper right portion. A thumbnail image of the designated image is displayed at the top of the browsing screen of the designated image information, and below it, the capturing time of the designated image, the automatically assigned tags, the manually assigned tags, and an "add tag" button 70 are displayed in order.

The automatically assigned tags are classified into categories such as calendar, person, shooting position, automatically assigned tag, and service tag, and the manually assigned tags are classified into the manually assigned tag category.

In the calendar category, tab information buttons of "2018" and "12 month" are included as the shooting year and shooting month of the designated image.

The tag information button of "taro" is included in the person category.

The shooting position category includes label information buttons of "park", "harbor district", and "platform park at seaside".

The automatically assigned tag category includes the tag information buttons of "outdoor", "boy", and "sunny".

The service tag category includes a tag information button of "scan service".

The manually assigned tag category includes a tag information button of "juvenile programming".

For example, the user can close the browsing screen of the specified image information by clicking the "x" button 66 to return to the browsing screen of the specified image. Further, as indicated by the finger icon 56, the user can edit the designated image information by clicking the "edit" button 68.

Fig. 7 is a conceptual diagram illustrating an embodiment of an editing screen for specifying image information. A "cancel" button 72 is displayed on the upper left of the editing screen for specifying image information shown in fig. 7, and a "save" button 74 is displayed on the upper right.

In the browsing screen of the designated image information, when the "edit" button 68 is clicked, an "x" button is displayed in a tag information button that can be deleted on the editing screen of the designated image information. When the user clicks an "x" button in the tag information buttons, the image information editing unit 44 deletes the tag information button. As indicated by the finger icon 56 in fig. 7, when the user clicks the "add tag" button 70, an input screen for new tag information is displayed.

When the "cancel" button 72 is clicked on the edit screen of the designated image information, the edit contents of the designated image information are discarded, and the screen returns to the browsing screen of the designated image information. When the "save" button 74 is clicked, the edited contents of the designated image information are saved.

Fig. 8 is a conceptual diagram illustrating an embodiment of an input screen for new tag information. A thumbnail image of a designated image is displayed on the upper part of the input screen of new tag information shown in fig. 8, and an input field 76 of new tag information is displayed on the lower side thereof. A message of "additional tag" for urging the input of new tag information is displayed in the input field 76 of new tag information. A "cancel" button 78 is displayed on the right side of the input field 76 of the new tag information.

When the user clicks the input field 76 of the new tag information, the history of the newly added new tag information is displayed below the input field 76 of the new tag information, and the software keyboard is displayed below the input screen of the new tag information. The user can input the name of the new tag information in the input field 76 of the new tag information using the software keyboard.

In this case, a partial-match search is executed every time the user inputs a character into the input field 76 of new tag information: tag information partially matching the characters input so far is retrieved from all the tag information assigned to all the images included in the image group and displayed on the input screen of new tag information. The user can also select new tag information from the tag information list displayed on the input screen of the new tag information.
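The per-keystroke behaviour can be sketched as follows; the tag names are illustrative.

```python
# Sketch of the per-keystroke partial-match behaviour: after each character
# typed into the input field, the tags containing the text so far are listed.
# The tag names are illustrative.

def suggestions(typed, all_tags):
    """Tags partially matching the text typed so far (empty input: no list)."""
    return [t for t in all_tags if typed and typed in t]

all_tags = ["scan service", "sunny", "scan image"]
# Simulate typing "scan" one character at a time.
history = [suggestions("scan"[:i + 1], all_tags) for i in range(4)]
# After "s" all three tags match; from "sc" on, only the two scan tags remain.
```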

When the user clicks the "cancel" button 78, the input screen of the new tag information is closed, and the screen returns to the editing screen of the designated image information.

Next, a selection screen of the tag information will be described.

As described above, when the "search" button is clicked, the selection screen of the tag information is displayed on the display unit 46.

Fig. 9 is a conceptual diagram illustrating an embodiment of a selection screen of tag information. The selection screen of the tag information is displayed superimposed on the image list screen, covering the region other than an upper partial region, with a certain transmittance so that the image list on the lower layer shows through. In a state where no tag information is selected, a part of all the tag information assigned to all the images included in the image group is displayed on the tag information selection screen, classified into a plurality of categories. Each piece of tag information is displayed as a button on the selection screen. A "search tag" button 80 is displayed in the lower part of the selection screen of the tag information.

The plurality of categories include a calendar, a person, a shooting position, an automatic tag, a manual tag, and a service tag.

The calendar category includes a tag information button for the year and month in which the image was taken.

The person category includes a tag information button of a person name captured on the image.

The shooting position category includes a position tag information button where the image is shot.

The automatic label assignment type includes a label information button automatically assigned by the label information assignment unit 26.

The manually assigned tag category includes a tag information button manually assigned by the user, that is, a tag information button assigned by the image information editing unit 44.

The service label category includes, for example, a label information button related to a service provided by the image processing system 10 according to the present invention, such as scanning and printing to create digital image data.

On the selection screen of the tag information, in the calendar category, tag information buttons of "2018", "2017", and "2016" as the shooting years of the images are displayed in order from the left side to the right side, and tag information buttons of "12 month", "11 month", "10 month", and "9 month" as the shooting months of the images are displayed in order from the left side to the right side.

In the person category, tag information buttons of "TAKASHI", "AKIRA", and "fuji flower" are displayed in order from the left side to the right side as the person names.

The tag information is not included in the shooting position category.

In the automatic label assignment category, the label information buttons of "bicycle", "ramen", and "concert" are displayed in order from the left side to the right side.

In the manual label category, label information buttons of "group", "lunch today", and "beauty" are displayed in order from the left side to the right side.

The service label category displays a label information button of "scan image".

The tag information buttons included in the categories other than the calendar category are each arranged in a horizontal row. In the calendar category, the tag information buttons for the shooting year and the shooting month of the images are each arranged in a horizontal row and displayed in descending numerical order of shooting year and shooting month. The tag information buttons included in the other categories are displayed in descending order of the number of images to which the tag information is assigned.

In addition, when a category includes more tag information buttons than can be displayed in the left-right direction, the user can slide the tag information buttons included in that category in the left-right direction to push the displayed tag information buttons off the tag information selection screen and bring the tag information buttons not yet displayed into view on the tag information selection screen.

The user can search for an image without typing in a keyword, simply by clicking the tag information corresponding to the desired image from among the tag information displayed on the display unit 46, and can thus search for images intuitively.

Here, in a state where the user has not yet selected any tag information as a search condition, the user clicks the "bicycle" tag information button as the 1st tag information from among the tag information buttons displayed on the tag information selection screen, as indicated by the finger icon 56 in fig. 10.

In this case, the tag information of "bicycle" corresponding to the tag information button of "bicycle" selected according to the instruction from the user is specified as the 1 st selection tag information by the tag information specifying unit 32. The color of the tag information button of "bicycle" is changed by the tag information display unit 30, and "bicycle" is displayed on the left side in the upper area of the image list screen not covered by the selection screen of tag information.

Next, the image extraction unit 34 extracts an image to which tag information of "bicycle" as the 1 st selected tag information is added from the image group as the 1 st search image, and the image display unit 38 displays at least a part of all the 1 st search images on the image list screen.

As shown in fig. 11, at least a part of all the tag information added to all the 1 st search images is displayed on the tag information selection screen by the tag information display unit 30. That is, the content of the tag information displayed on the selection screen of the tag information is updated.
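The narrowing step described above can be sketched as follows. This is a minimal sketch, assuming images are modeled as dicts carrying a set of tag strings, a data model the patent does not specify; the file names and tags are hypothetical.

```python
def extract_by_tag(image_group, selected_tag):
    """Extract the images to which the selected tag information is assigned."""
    return [img for img in image_group if selected_tag in img["tags"]]

def all_tags_of(images):
    """Collect all tag information assigned to the given images."""
    tags = set()
    for img in images:
        tags |= img["tags"]
    return tags

image_group = [
    {"file": "a.jpg", "tags": {"bicycle", "2018", "TAKASHI"}},
    {"file": "b.jpg", "tags": {"bicycle", "2017", "mountain"}},
    {"file": "c.jpg", "tags": {"ramen", "2016"}},
]

# Selecting "bicycle" narrows both the image list and the tag screen.
first_search_images = extract_by_tag(image_group, "bicycle")
updated_tag_screen = all_tags_of(first_search_images)
```

Tags carried only by excluded images ("ramen" here) drop off the selection screen, which is exactly the update the text describes.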

In the case of the present embodiment, the tag information buttons in the calendar category change from "2018", "2017", and "2016" to "2018" and "2017" as the shooting years of the images.

In the person category, the tag information buttons change from "TAKASHI", "AKIRA", and "fuji flower" to "TAKASHI", "AKIRA", and "shantian".

In the automatically assigned tag category, the tag information buttons change from "bicycle", "ramen", and "concert" to "bicycle", "mountain", and "guardrail".

In the manually assigned tag category, the tag information buttons change from "group", "lunch today", and "beautiful view" to "group", "beautiful view", and "bicycle travel".

That is, when only the "bicycle" tag information button is selected from among the tag information buttons displayed on the tag information selection screen, the image group is narrowed down to the search images to which the "bicycle" tag information is assigned, and the displayed tag information is narrowed down, from among all the tag information assigned to all the images included in the image group, to the tag information assigned to those search images.

Next, as indicated by the finger icon 56 in fig. 12, with the 1st "bicycle" tag information button still selected, the user clicks the "bicycle travel" tag information button as the 2nd tag information from among the tag information buttons displayed on the tag information selection screen. That is, in a state where the "bicycle" tag information button is selected, the "bicycle travel" tag information button is additionally selected, and the two pieces of tag information are selected simultaneously.

In this case, the tag information of "bicycle travel" corresponding to the "bicycle travel" tag information button selected in accordance with the instruction from the user is specified as the 2nd selection tag information by the tag information specifying unit 32, in addition to the 1st "bicycle" tag information. That is, both the "bicycle" and "bicycle travel" tag information are designated as the selection tag information at the same time. The tag information display unit 30 changes the colors of the "bicycle" and "bicycle travel" tag information buttons, and "bicycle travel" is displayed to the right of "bicycle" in the upper area of the image list screen not covered by the tag information selection screen.

Next, the image extraction unit 34 extracts, as the 2nd search images, the images to which the "bicycle travel" tag information is assigned from among the 1st search images to which the "bicycle" tag information is assigned, and as shown in fig. 13, the image display unit 38 displays at least a part of the 2nd search images on the image list screen.

Although not shown, at least a part of all the tag information added to all the 2 nd search images is displayed on the tag information selection screen by the tag information display unit 30.

That is, when the two tag information buttons of "bicycle" and "bicycle travel" are selected simultaneously, the image group is narrowed down to only the search images to which both the "bicycle" and "bicycle travel" tag information are assigned, and the displayed tag information is narrowed down, from among all the tag information assigned to all the images included in the image group, to only the tag information assigned to those search images.
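The two-tag narrowing is an AND over the selected tags. A minimal sketch, with hypothetical image records and the same set-of-tags model as before:

```python
def extract_by_all_tags(images, selected_tags):
    """AND-narrowing: keep only the images that carry every selected tag."""
    required = set(selected_tags)
    return [img for img in images if required <= img["tags"]]

image_group = [
    {"file": "a.jpg", "tags": {"bicycle", "bicycle travel"}},
    {"file": "b.jpg", "tags": {"bicycle", "mountain"}},
    {"file": "c.jpg", "tags": {"ramen"}},
]

second_search_images = extract_by_all_tags(image_group,
                                           ["bicycle", "bicycle travel"])
```

Only images carrying both tags survive, so each additional selection can only shrink (never grow) the result, which is why the text describes the operation as narrowing.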

The user can close the tag information selection screen as needed by sliding the downward arrow 92 displayed at the top of the tag information selection screen downward. When the tag information selection screen is closed, as shown in fig. 13, the area of the image list screen that was covered by the tag information selection screen is displayed. This enables the user to browse the search image list.

When more search images are included than can be displayed on the image list screen, the user can slide the search images displayed on the image list screen in the vertical direction to push a part of them off the screen and bring a part of the search images not yet displayed into view on the image list screen.

When the tag information selection screen is closed, an upward arrow 94 is displayed at the lower portion of the tag information selection screen as shown in fig. 13. The user can open the tag information selection screen as needed by sliding the upward arrow 94 upward. In the case of the present embodiment, the tag information selection screen can be switched among the fully open state shown in figs. 9 to 12, the fully closed state shown in fig. 13, and an intermediate position between them. The tag information selection screen may also be switched among three or more stages.

In addition, when the user clicks the "search" button while the image list screen is displayed, the tag information selection screen can be displayed as needed. The user can also narrow down the images further by additionally selecting another tag information button while the two tag information of "bicycle" and "bicycle travel" remain selected, which further narrows down the displayed tag information as well.

The two pieces of tag information, i.e., "bicycle" and "bicycle travel" designated as the selection tag information are maintained by the tag information designating unit 32 until the designation as the selection tag information is released in accordance with an instruction from the user.

For example, a "release" button is provided in advance on the selection screen of the tag information, and the user can release the selection of two tag information, i.e., "bicycle" and "bicycle travel" designated as the selection tag information at a time by clicking the "release" button.

Each time the user clicks a tag information button, the tag information toggles between being selected and released as selected tag information. That is, each time the instruction acquisition unit 18 acquires an input to a tag information button in accordance with an instruction from the user, the tag information specifying unit 32 switches between selection and release of that tag information as selected tag information.

In this case, each time the button is pressed in accordance with an instruction from the user, the tag information specifying unit 32 switches between designating the tag information corresponding to the button as the selected tag information and releasing that designation.
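The click-to-toggle behavior can be sketched as follows; the class and method names are hypothetical stand-ins for the tag information specifying unit, not names from the patent.

```python
class TagSelection:
    """Minimal model of the toggle: each click on a tag button flips it."""

    def __init__(self):
        self.selected = set()

    def on_tag_button_click(self, tag):
        """Switch the tag between selected and released; return new state."""
        if tag in self.selected:
            self.selected.discard(tag)
        else:
            self.selected.add(tag)
        return tag in self.selected

sel = TagSelection()
sel.on_tag_button_click("bicycle")          # selected
sel.on_tag_button_click("bicycle travel")   # selected alongside it
sel.on_tag_button_click("bicycle")          # same button again: released
```

Because the set persists between clicks, the selection is naturally maintained until explicitly released, matching the behavior described above.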

In other words, since the selection of tag information is maintained until it is released, the user can, for example, perform an image search using the selected tag information, switch to another display screen, and then switch back to the tag information selection screen, and still browse the search images resulting from the search with the selected tag information as needed while the selection is maintained.

Further, in a state where the tag information is displayed as buttons on the display unit 46, the user can long-press (long tap) one tag information button to select all the tag information included under the higher-level concept of the long-pressed button's tag information.

In this case, when one of the tag information buttons displayed on the display unit 46 is selected by being pressed for a certain period of time or longer in accordance with an instruction from the user, the tag information specifying unit 32 specifies, as the selected tag information, all of the lower-concept tag information included in the higher-concept tag information of the tag information corresponding to the pressed button. That is, when the instruction acquisition unit 18 acquires an input of a certain period of time or longer to one tag information button selected in accordance with an instruction from the user from among the tag information buttons displayed on the display unit 46, the tag information specifying unit 32 specifies, as the selected tag information, all of the lower-concept tag information included in the higher-concept tag information of the tag information corresponding to the pressed button.

For example, when the "Mont Blanc" tag information button is long-pressed, all the tag information included in "confectionery", the higher-level concept of "Mont Blanc", is selected.
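The long-press rule can be sketched with a two-level concept hierarchy. The hierarchy contents here are hypothetical: the text only names "Mont Blanc" and its higher concept, so the sibling tags are invented for illustration.

```python
# Hypothetical mapping: higher concept -> the lower-concept tags it includes.
HIERARCHY = {
    "confectionery": {"Mont Blanc", "shortcake", "pudding"},
}

def higher_concept_of(tag):
    """Find the higher concept that includes this tag, if any."""
    for parent, children in HIERARCHY.items():
        if tag in children:
            return parent
    return None

def on_long_press(tag):
    """Select every lower-concept tag under the long-pressed tag's
    higher concept; fall back to the tag alone if it has no parent."""
    parent = higher_concept_of(tag)
    if parent is None:
        return {tag}
    return set(HIERARCHY[parent])

selected = on_long_press("Mont Blanc")
```

A long press thus selects the whole sibling group at once, whereas a normal click (previous example) selects only the one tag.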

Alternatively, when one piece of tag information is selected by the tag information designating unit 32 from the tag information displayed on the display unit 46 in accordance with an instruction from the user, the designation of the other piece of tag information that has been designated may be cancelled, and only the selected one piece of tag information may be designated as the selected tag information.

Next, as indicated by the finger icon 56 in fig. 14, the user can browse by displaying only one clicked search image on the display unit 46 by clicking one search image to be browsed from among the search images displayed on the image list screen.

In this case, one search image selected in accordance with an instruction from the user is designated as a designated search image from among the search images displayed on the display unit 46 by the image designating unit 40, and only the designated search image is displayed on the display unit 46 by the image display unit 38.

Fig. 15 is a conceptual diagram illustrating an embodiment of a browsing screen for specifying a search image. A browsing screen for specifying the search image is displayed on the entire surface of the display unit 46. An "x" button 58 is displayed on the upper left portion of the browsing screen for specifying the search image shown in fig. 15, a "calendar" button 82 is displayed on the lower left portion, and an "information" button 60 is displayed on the lower right portion. A "trash can" button 62, a "share" button 64, and an "other" button 84 are displayed on the lower portion of the browsing screen in which the search image is specified.

The browsing screen for specifying the search image has the same configuration as the browsing screen for specifying the image, except that a "calendar" button 82 is displayed on the lower left portion thereof.

For example, the user can close the browsing screen designating the search image by clicking the "x" button 58 to return to the search image list screen. Also, the user can click on the "trash can" button 62 to delete the specified search image, or click on the "share" button 64 to share the specified search image with other users, or click on the "other" button 84 to perform other functions. Further, the user can browse the information specifying the search image by clicking the "information" button 60.

The user can sequentially display and browse the search images one by one.

In this case, after only the designated search image is displayed on the display unit 46, each time an instruction from the user is input, the image display unit 38 sequentially displays on the display unit 46, one at a time, the search image shot immediately before or after the designated search image, following the chronological order of the search images starting from the shooting time of the designated search image.

For example, when the user slides the designated search image to the right, only the one search image captured immediately before the shooting time of the designated search image is displayed from among all the search images. Thereafter, each time the user slides to the right, the search images are displayed one at a time, stepping backward in chronological order from the shooting time of the designated search image.

On the other hand, when the user slides the designated search image to the left, only the one search image captured immediately after the shooting time of the designated search image is displayed from among all the search images. Thereafter, each time the user slides to the left, the search images are displayed one at a time, stepping forward in chronological order from the shooting time of the designated search image.
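The swipe navigation reduces to stepping through the search images sorted by shooting time. A sketch, assuming a hypothetical "shot_at" field for the shooting time:

```python
def step_in_time(images, designated, direction):
    """Step to the neighbor in shooting order.
    direction = -1 for a swipe right (earlier image),
    direction = +1 for a swipe left (later image)."""
    ordered = sorted(images, key=lambda img: img["shot_at"])
    i = ordered.index(designated) + direction
    if 0 <= i < len(ordered):
        return ordered[i]
    return designated  # no earlier/later image: stay on the current one

a = {"file": "a.jpg", "shot_at": "2018-10-01T09:00"}
b = {"file": "b.jpg", "shot_at": "2018-10-01T12:00"}
c = {"file": "c.jpg", "shot_at": "2018-10-02T08:00"}
search_images = [c, a, b]  # display order need not be chronological

previous_image = step_in_time(search_images, b, -1)  # swipe right
next_image = step_in_time(search_images, b, +1)      # swipe left
```

Sorting inside the helper means the list can be stored in any display order; the same helper serves the time-series images described later, where the pool is the whole image group rather than the search results.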

As shown in fig. 15, by clicking the "calendar" button 82 displayed on the browsing screen of the designated search image, the user can display on the display unit 46, and browse, the images in the image group captured before and after the shooting time of the designated search image.

In this case, after only the designated search image is displayed on the display unit 46, the image display unit 38 displays on the display unit 46, in accordance with an instruction from the user, the designated search image and at least a part of the images included in the image group that were captured before and after the shooting time of the designated search image, as time-series images arranged in chronological order.

In this way, the user does not need to temporarily return to an image list screen such as the home screen in order to display the images captured before and after the shooting time of one search image; in the case of the present embodiment, with the single operation of clicking the "calendar" button 82, the images captured before and after the shooting time of the designated search image can be displayed in time series and browsed in accordance with an instruction from the user.

Here, a similar result could be obtained in response to an instruction from the user by, for example, returning to an image list screen such as the home screen so that the images captured before and after the shooting time of the designated search image are displayed in time series. With such an operation method, however, when many years' worth of image groups are stored on the smartphone, the user is forced to scroll through those image groups on the image list screen, such as the home screen, until reaching the shooting date of the desired image.

In contrast, with the image processing apparatus of the present embodiment, the images captured before and after the shooting time of the designated search image can be displayed in time series and browsed with the single operation of clicking the "calendar" button 82, which is more convenient than returning to an image list screen such as the home screen to display those images in time series as in the conventional technique described above.

Fig. 16 is a conceptual diagram illustrating an embodiment of a time-series image list screen. The time-series image list screen is displayed in the area between the menu area 52 and the navigation area 54.

Similarly, the user can click a time-series image to be browsed from among the time-series images displayed on the time-series image list screen, and browse it by displaying only the clicked time-series image on the display unit 46.

In this case, one time-series image selected in accordance with an instruction from the user is designated as a designated time-series image from the time-series images displayed on the display unit 46 by the image designating unit 40, and only the designated time-series image is displayed on the display unit 46 by the image display unit 38.

When a time-series image is designated, after only the designated time-series image is displayed on the display unit 46, each time an instruction from the user is input, the image display unit 38 displays on the display unit 46, one at a time, the image in the image group captured immediately before or after the designated time-series image, following the chronological order of the images starting from the shooting time of the designated time-series image.

The operation when the designated time-series image is slid to the right or left is the same as the operation when the designated search image is slid to the right or left.

In addition, instead of the "calendar" button 82, a specific touch operation may be provided that performs the same function as the "calendar" button 82.

The user can return to the search image list screen by clicking the "home" button on the time-series image list screen.

As described above, since the selection of the tag information is maintained until the selection is released, when the search image list screen is returned, the search image list, which is the result of the search using the selected tag information, is displayed on the search image list screen.

The search condition used to search for images from the image group before the time-series images are displayed on the display unit 46 is not particularly limited, as long as it is a condition for searching for images from the image group based on something other than the shooting time of the images.

The conditions related to the shooting time of an image include not only the shooting time itself but also various conditions related to it, such as the shooting year, shooting month, shooting week, shooting date, and time of day of the image; the conditions other than the shooting time of an image include all conditions other than those related to the shooting time of the image.

As the search condition, the analysis content of the image by the image analysis unit 22, the name of the tag information given by the tag information giving unit 26, and the like can be used, and the name of the subject captured on the image (the name of the person and the name of the object), the name of the expression of the person, the name of the action of the person, the name of the emotion of the person, the name of the scene, the name of the event, the name of the preference of the user, the name of the capturing position, and the like can be exemplified.

When the user cannot find the tag information button to be used as the search condition on the selection screen of the tag information, the user can input a keyword to search the tag information by clicking the "search tag" button 80 as indicated by the finger icon 56 in fig. 17.

When the "search tag" button 80 is clicked, a tag information search screen is displayed on the display unit 46 as shown in fig. 18.

The tag information search screen is superimposed on the image list screen except for a partial area on its upper side, and is displayed so that the search image list on the lower layer shows through at a constant transmittance. A keyword input field 86 is displayed at the top of the tag information search screen. A "search tag" message prompting the input of a keyword is displayed in the keyword input field 86. An "x" button 88 is displayed at the right end of the keyword input field 86, and a "cancel" button 90 is displayed to the right of the keyword input field 86.

Next, it is assumed that the user has input a keyword in the keyword input field 86.

In this case, the tag information search unit 36 searches, in accordance with an instruction from the user, all the tag information added to all the images included in the image group for tag information at least a part of which matches the input keyword, and retrieves it as the search tag information.

Next, at least a part of the search tag information is displayed on the display unit 46 by the tag information display unit 30.

Next, the tag information specifying unit 32 specifies one piece of search tag information selected in accordance with an instruction from the user from among the pieces of search tag information displayed on the display unit 46.

Next, the image extraction unit 34 extracts an image to which one piece of search tag information is given as a search image from the image group.

Next, at least a part of all the search images is displayed on the display unit 46 by the image display unit 38.

Next, at least a part of all the tag information added to all the search images is displayed on the display unit 46 by the tag information display unit 30.
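The keyword-matching step of this flow can be sketched as follows. The phrase "at least a part of which matches" is read here as a substring test, which is an assumption; ASCII tag names are used to keep the example self-contained.

```python
def search_tag_info(all_tags, keyword):
    """Partial match: return, sorted, every tag containing the keyword."""
    return sorted(tag for tag in all_tags if keyword in tag)

all_tags = {"bicycle", "bicycle travel", "ramen", "concert"}
candidates = search_tag_info(all_tags, "bi")
```

The returned candidates are what the tag information search screen lists; clicking one of them then feeds back into the same extract-and-display flow used when a tag button is clicked directly.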

For example, as shown in fig. 18, to search for the tag information "アイスクリーム" (ice cream), when "あ" is input in the keyword input field 86, a partial-match search is performed, and tag information that partially matches "あ" is retrieved as search tag information from all the tag information added to all the images included in the image group.

Next, as candidates for the search tag information, as shown in fig. 18, tag information buttons for, for example, "あいちゃん", "あーちゃん", "あまおう", and "アイスクリーム" (ice cream) are displayed in order from top to bottom on the tag information search screen.

When more search tag information is included than can be displayed on the tag information search screen, the user can slide the displayed search tag information in the vertical direction to move the list of candidates vertically, pushing a part of the displayed search tag information off the screen and bringing a part of the search tag information not yet displayed into view on the tag information search screen.

Further, the user can delete the keyword entered in the input field 86 all at once by clicking the "x" button 88, and can close the tag information search screen and return to the tag information selection screen by clicking the "cancel" button 90. The user can also close the tag information search screen as needed by sliding the downward arrow 96 displayed at the top of the tag information search screen downward.

Next, as indicated by the finger icon 56 in fig. 18, the user clicks, for example, the "アイスクリーム" (ice cream) tag information button from among the candidate buttons of the search tag information displayed on the tag information search screen.

In this case, as shown in fig. 19, the selection screen of the tag information is displayed on the display unit 46. The action thereafter is the same as in the case of the state after the tag information button of "bicycle" is selected in the state where no tag information is selected.

That is, the tag information "アイスクリーム" (ice cream) is designated as the selection tag information, and "アイスクリーム" is displayed.

Next, the images to which the tag information "アイスクリーム" (ice cream) is assigned are displayed as search images on the image list screen.

Further, all the tag information assigned to the search images having the tag information "アイスクリーム" (ice cream) is displayed on the tag information selection screen.

In the subsequent operation, the user can additionally select the tag information button to be used as the search condition, as in the case where the tag information button of "bicycle travel" is additionally clicked in a state where the tag information button of "bicycle" is selected.

When a user photographs a new image, adds new tag information to an image, or deletes tag information added to an image, the reliability of the tag information can be improved by machine learning using AI (Artificial Intelligence) or the like, for example, based on the new image, the new tag information, the deleted tag information, and the like.

For example, the server 12 may have at least some of the functions of the client 14, such as analyzing an image, assigning an evaluation value, assigning tag information, setting a weight, extracting an image, and retrieving tag information, so that the server 12 can execute at least some of the functions.

In the apparatus of the present invention, the hardware configuration of the processing units that execute various processes, such as the instruction acquisition unit 18, the image analysis unit 22, the evaluation value assignment unit 24, the tag information assignment unit 26, the weighting setting unit 28, the tag information display unit 30, the tag information specifying unit 32, the image extraction unit 34, the tag information search unit 36, the image display unit 38, the image specifying unit 40, the image information display unit 42, and the image information editing unit 44, may be dedicated hardware, or may be various processors or computers that execute programs.

The various processors include: a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as the various processing units; a Programmable Logic Device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed specifically to execute specific processing.

One processing unit may be configured by one of these various processors, or by a combination of two or more processors of the same or different types, for example, a combination of a plurality of FPGAs or a combination of an FPGA and a CPU. Further, a plurality of processing units may be configured by one of the various processors, or two or more of the plurality of processing units may be consolidated and configured using one processor.

For example, first, there is a mode in which, as typified by computers such as servers and clients, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units. Second, there is a mode in which, as typified by a System on Chip (SoC) or the like, a processor is used that realizes the functions of the entire system including the plurality of processing units with a single IC (Integrated Circuit) chip.

More specifically, the hardware configuration of these various processors is a circuit (circuit) in which circuit elements such as semiconductor elements are combined.

The method of the present invention is implemented, for example, by a program for causing a computer to execute the steps. A computer-readable recording medium on which the program is recorded can also be provided.

The present invention has been described above in detail, but the present invention is not limited to the above embodiments, and various improvements and modifications can be made without departing from the scope of the present invention.
