Method and apparatus for object state detection

Document No. 1643153; published 2019-12-20.

Reading note: This technology, "Method and apparatus for object state detection", was designed and created by Fu Bo, Zhang Yan, Cheng Yanming, J. K. Varley, R. E. Beach, I. C. Zegar, and R. J. Rzeszutek on 2018-05-01. Its main content comprises: A method of object state detection, for an object supported by a shelf, from shelf image data, comprising: obtaining a plurality of images of a shelf, each image including an indication of a gap between objects on the shelf; registering the images to a common reference frame; identifying a subset of the gaps having overlapping positions in the common reference frame; generating a merged gap indication from the subset; obtaining reference data comprising (i) an identifier of the object and (ii) a prescribed position of the object in the common reference frame; selecting a target object identifier from the reference data based on a comparison of the merged gap indication with the reference data; and generating and presenting a status notification for the target object identifier.

1. A method of object state detection, by an imaging controller, of an object supported by a shelf from shelf image data, the method comprising:

obtaining, at an image preprocessor of the imaging controller, a plurality of images of a shelf, each image including an indication of a gap on the shelf between the objects;

registering, by the image preprocessor, the images to a common reference frame;

identifying, by the image preprocessor, a subset of the gaps having overlapping locations in the common reference frame;

generating, by the image preprocessor, a merged gap indication from the subset;

obtaining, by a comparator of the imaging controller, reference data comprising (i) an identifier of the object and (ii) a prescribed position of the object in the common reference frame;

selecting, by the comparator, a target object identifier from the reference data based on a comparison of the merged gap indication with the reference data; and

generating and presenting, by a notifier of the imaging controller, a status notification for the target object identifier.

2. The method of claim 1, wherein the selecting comprises selecting a target object identifier having a prescribed position that overlaps a location of the merged gap indication.

3. The method of claim 2, further comprising determining a degree of coverage of the prescribed position of the target object identifier by the merged gap indication.

4. The method of claim 3, further comprising:

determining whether the degree of coverage exceeds an upper threshold; and

wherein generating the status notification comprises generating an out-of-stock notification when the degree of coverage exceeds the upper threshold.

5. The method of claim 4, further comprising:

when the degree of coverage does not exceed the upper threshold, determining whether the degree of coverage exceeds a lower threshold; and

wherein generating the status notification comprises generating a low inventory notification when the degree of coverage exceeds the lower threshold.

6. The method of claim 3, wherein determining the degree of coverage comprises:

retrieving a size and a number of facings corresponding to the target object identifier, and determining a width of the prescribed position based on the size and the number of facings; and

determining a proportion of the width that is covered by a width of the merged gap indication.

7. The method of claim 1, wherein each of the plurality of images includes a binary gap mask generated from an image of the shelf captured by a mobile automation apparatus.

8. The method of claim 1, further comprising:

obtaining depth measurements corresponding to the images;

registering the depth measurements to the common reference frame; and

validating the merged gap indication based on the depth measurements.

9. The method of claim 8, further comprising:

determining, based on the validating, that the merged gap indication is erroneous; and

in response to the determination, generating the status notification as a plug notification.

10. A method of object state detection, by an imaging controller, of an object supported by a shelf from shelf image data, the method comprising:

obtaining, by the imaging controller, input data comprising (i) an input image depicting an area of a shelf, and (ii) a plurality of object indicators containing respective object identifiers and corresponding object locations within the input image;

registering, by the imaging controller, the input image to a common reference frame;

obtaining, by the imaging controller, reference data comprising (i) a reference image depicting the area of the shelf in a full stock condition, and (ii) a plurality of reference object indicators including respective reference object identifiers and corresponding reference locations of the objects within the reference image;

registering, by the imaging controller, the reference data with the input data based on image features of the reference image and the input image;

selecting, by the imaging controller, a target reference object identifier from the reference data based on a comparison of the object indicators and the reference object indicators; and

generating and presenting, by the imaging controller, a status notification for the target reference object identifier.

11. The method of claim 10, wherein selecting the target reference object identifier comprises:

identifying a subset of the reference object identifiers having a reference location that coincides with a location of the area of the shelf in the common reference frame; and

selecting a reference object identifier from the subset that does not match any of the plurality of object indicators.

12. The method of claim 10, further comprising generating the reference data by:

acquiring the reference image of the area prior to acquiring the input image;

obtaining the plurality of reference object indicators comprising the respective reference object identifiers and corresponding reference object locations;

registering the reference image to the common reference frame; and

storing the reference image and the reference object indicators.

13. The method of claim 10, further comprising:

obtaining depth measurements corresponding to the input image;

registering the depth measurements to the common reference frame; and

validating the selected target reference object identifier based on the depth measurements.

14. The method of claim 13, further comprising:

obtaining a position of an edge of the shelf in the common reference frame;

wherein selecting the target reference object identifier comprises:

identifying a subset of the reference object identifiers having a reference location that coincides with a location of the area of the shelf in the common reference frame; and

selecting a reference object identifier from the subset that matches one of the plurality of object indicators and is located at a depth beyond the edge of the shelf that exceeds a predefined threshold.

15. The method of claim 14, wherein generating the status notification comprises generating a low inventory status notification.

16. The method of claim 10, wherein generating the status notification comprises generating a plug notification when the object identifier of a first object indicator does not match a reference object identifier of one of the reference object indicators having a reference location that matches an object location of the first object indicator.

Background

Environments in which inventories of objects are managed, such as products for purchase in a retail environment, may be complex and fluid. For example, a given environment may contain a wide variety of objects with different attributes (size, shape, price, and the like). Further, the placement and quantity of the objects in the environment may change frequently. Still further, imaging conditions, such as lighting, may vary both over time and at different locations in the environment. These factors may reduce the accuracy with which information concerning the objects may be collected within the environment.

Drawings

The accompanying figures, in which like reference numerals refer to identical or functionally-similar elements throughout the separate views and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate embodiments of the concepts that include the claimed invention and to explain various principles and advantages of those embodiments.

FIG. 1 is a schematic diagram of a mobile automation system.

FIG. 2 is a block diagram of certain internal hardware components of the server in the system of FIG. 1.

FIG. 3 is a flowchart of an object state detection method.

FIGS. 4A and 4B depict example input data for the method of FIG. 3.

FIG. 5 depicts the location of the input data of FIGS. 4A and 4B in a common reference frame.

FIG. 6 illustrates location-based merging of input data.

FIGS. 7A and 7B are examples of reference data used in the method of FIG. 3.

FIG. 8 shows a comparison of the input data of FIG. 4A with the reference data of FIG. 7A.

FIG. 9 shows a comparison of the input data of FIG. 4B with the reference data of FIG. 7B.

FIG. 10 is a flowchart of a status classification method.

Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of embodiments of the present invention.

The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

Detailed Description

Environments such as warehouses, retail locations (e.g., grocery stores), and the like typically contain a variety of products supported on shelves for selection and purchase by customers. As a result, the composition of the group of products supported by any given shelf module changes over time, as products are removed and, in some cases, replaced by customers. Products that have been partially or fully depleted typically require restocking, while products that have been incorrectly replaced (referred to as "plugs") typically require repositioning to their correct locations on the shelves. Conventionally, the detection of restocking or plug issues is performed by employees, via visual assessment of the shelves and manual barcode scanning. This form of detection is labor-intensive and therefore costly, as well as error-prone.

Attempts to automate the detection of product status issues, such as those described above, are complicated by the fluid nature of the environment in which an autonomous data capture system is required to operate. Among other difficulties, digital images of the shelves vary in quality depending on the available lighting, the presence of visual obstructions, and the like. Furthermore, variations in the widths of products present on the shelves, and in the positions of products on the shelves, reduce the accuracy of machine-generated status detection.

Examples disclosed herein are directed to a method of object state detection, for an object supported by a shelf, from shelf image data, comprising: obtaining a plurality of images of a shelf, each image including an indication of a gap between objects on the shelf; registering the images to a common reference frame; identifying a subset of the gaps having overlapping positions in the common reference frame; generating a merged gap indication from the subset; obtaining reference data comprising (i) an identifier of the object and (ii) a prescribed position of the object in the common reference frame; selecting a target object identifier from the reference data based on a comparison of the merged gap indication with the reference data; and generating and presenting a status notification for the target object identifier.

FIG. 1 depicts a mobile automation system 100 in accordance with the teachings of this disclosure. The system 100 includes a server 101 in communication, via a communication link 107 (illustrated in the present example as including wireless links), with at least one mobile automation apparatus 103 (also referred to herein simply as the apparatus 103) and at least one mobile device 105. In the illustrated example, the system 100 is deployed in a retail environment including a plurality of shelf modules 110, each shelf module 110 supporting a plurality of products 112. The shelf modules 110 are typically arranged in a plurality of aisles, each aisle comprising a plurality of modules aligned end to end. More specifically, the apparatus 103 is deployed within the retail environment and communicates with the server 101 (via the link 107) to navigate, fully or partially autonomously, the length of at least a portion of the shelves 110. The apparatus 103 is equipped with a plurality of navigation and data capture sensors 104, such as image sensors (e.g., one or more digital cameras) and depth sensors (e.g., one or more light detection and ranging (LIDAR) sensors), and is further configured to capture shelf data with the sensors. In this example, the apparatus 103 is configured to capture a series of digital images of the shelves 110, as well as a series of depth measurements, each describing the distance and direction between the apparatus 103 and one or more points on a shelf 110 (such as the shelf itself, or products disposed on the shelf).

The server 101 includes a special-purpose imaging controller, such as a processor 120, specifically designed to control the mobile automation apparatus 103 to capture data, to retrieve the captured data via a communication interface 124, and to store the captured data in a repository 132 in a memory 122. The server 101 is further configured to perform various post-processing operations on the captured data and to detect the status of the products 112 on the shelves 110. When certain status indicators are detected by the processor 120, the server 101 is also configured to transmit status notifications (e.g., notifications indicating that products are out of stock or misplaced) to the mobile device 105. The processor 120 is interconnected with a non-transitory computer-readable storage medium (such as the memory 122) having stored thereon computer-readable instructions for detecting out-of-stock and/or low stock conditions on the shelves 110, as discussed in further detail below. The memory 122 includes a combination of volatile memory (e.g., random access memory or RAM) and non-volatile memory (e.g., read-only memory or ROM, electrically erasable programmable read-only memory or EEPROM, flash memory). The processor 120 and the memory 122 each comprise one or more integrated circuits. In one embodiment, to improve reliability and processing speed in handling the high volume of sensor data collected by the mobile automation apparatus 103, a specially designed integrated circuit, such as a field-programmable gate array (FPGA), is designed to perform the out-of-stock and/or low stock detection discussed herein, either alternatively to or in addition to the imaging controller/processor 120 and memory 122. As will be appreciated by those skilled in the art, the mobile automation apparatus 103 also includes one or more controllers or processors and/or FPGAs, in communication with the controller 120, specifically configured to control the navigation and/or data capture aspects of the apparatus 103.

The server 101 also includes the communication interface 124, interconnected with the processor 120. The communication interface 124 includes suitable hardware (e.g., transmitters, receivers, network interface controllers, and the like) allowing the server 101 to communicate with other computing devices, particularly the apparatus 103 and the mobile device 105, via the link 107. The link 107 may be a direct link, or a link that traverses one or more networks, including both local-area and wide-area networks. The specific components of the communication interface 124 are selected based on the type of network or other links over which the server 101 is required to communicate. In the present example, a wireless local-area network is implemented within the retail environment via the deployment of one or more wireless access points. The link 107 therefore includes wireless links between the apparatus 103 and the mobile device 105 and the above-mentioned access points, as well as a wired link (e.g., an Ethernet-based link) between the server 101 and the access point.

The memory 122 stores a plurality of applications, each application comprising a plurality of computer readable instructions executable by the processor 120. Execution of the above-described instructions by processor 120 configures server 101 to perform various actions discussed herein. The applications stored in the memory 122 include a control application 128, which control application 128 may also be implemented as a set of logically distinct applications. In general, the processor 120 is configured to implement various functions via execution of the control application 128 or subcomponents thereof. The processor 120, as configured via execution of the control application 128, is also referred to herein as the controller 120. As will now be apparent, some or all of the functions described below as being implemented by the controller 120 may also be performed by pre-configured hardware elements (e.g., one or more ASICs), rather than the execution of the control application 128 by the processor 120.

In this example, in particular, the server 101 is configured to process, via execution of the control application 128 by the processor 120, input data including image and depth data captured by the apparatus 103, as well as attributes derived from the image and depth data (e.g., indicators of gaps between products 112 on the shelves 110, and indicators of the products 112 themselves), to generate status notifications relating to the products 112.

Turning now to FIG. 2, certain components of the application 128 will be described in greater detail before the operation of the application 128 to detect out-of-stock, low stock, and/or plug conditions is described. It will be apparent to those skilled in the art that, in other examples, the components of the application 128 may be separated into distinct applications, or combined into other sets of components. Some or all of the components shown in FIG. 2 may also be implemented as dedicated hardware components, such as one or more application-specific integrated circuits (ASICs) or FPGAs. For example, in one embodiment, to improve reliability and processing speed, at least some of the components of FIG. 2 are programmed directly into the imaging controller 120, which may be an FPGA or an ASIC having circuit and memory configurations specifically designed to optimize the high-volume image processing of sensor data received from the mobile automation apparatus 103. In such embodiments, some or all of the control application 128 discussed below is embodied in FPGA or ASIC chip logic.

Briefly, the control application 128 includes components configured to obtain input data depicting particular attributes of the shelves 110, to process the input data for comparison with reference data, and to generate product status notifications (such as out-of-stock, low stock, and plug notifications) based on the comparison.

More specifically, in this example, the control application 128 includes an image preprocessor 200 configured to obtain and process the input data depicting the shelves 110 and products 112. The control application 128 also includes a reference generator 202, configured to generate reference data relating to the shelves 110 for use by a comparator 204, which is configured to compare the reference data and the input data to identify mismatches between them. The control application 128 also includes a classifier 208, configured to classify the output of the comparator 204 (i.e., the above-mentioned mismatches). Further, the control application 128 includes a verifier 212, configured to verify the output of the classifier 208, and a notifier 216, configured to generate status notifications based on the outputs of the classifier 208 and the verifier 212.

The functionality of the control application 128 will now be described in more detail, with reference to the components shown in FIG. 2. Turning to FIG. 3, a method 300 of object state detection is shown. The method 300 will be described in connection with its execution on the system 100 as described above.

Execution of the method 300 begins at block 305, at which the controller 120, and in particular the image preprocessor 200, is configured to obtain input data including at least one shelf image and at least one indicator of an attribute derived from the shelf image. In this example, the shelf image is a digital image (e.g., an RGB image) depicting an area of a shelf 110 and the products 112 supported by that area. In some examples, the indicators include gap indicators. Turning to FIG. 4A, an input image 400 is shown, including two gap indicators 408-1 and 408-2. Each gap indicator 408 defines a bounding box indicating a location, relative to the image 400, of a gap between products 112 at which the back 412 of the shelf 110 is visible. As shown in FIG. 4A, in this example the gap indicators 408 are obtained at block 305 as overlays on the image 400. In other examples, the gap indicators are instead obtained as metadata fields included with the image 400, or as a set of distinct values (e.g., bounding box coordinates) accompanying the image 400. In still other examples, the gap indicators are obtained as a binary gap mask indicating the regions of the image 400 that have been classified as gaps (e.g., with high intensity) and the regions that have not (e.g., with low intensity).
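Where the gap indicators arrive as a binary gap mask, they can be reduced to bounding boxes via connected-component labeling. The following is a minimal sketch of that conversion, assuming a NumPy mask in which non-zero pixels mark gap regions; the function name and indicator layout are illustrative rather than taken from this disclosure.

```python
# Hedged sketch: convert a binary gap mask into bounding-box gap
# indicators via connected-component labeling. The dict layout of an
# indicator is an assumption made for illustration.
import numpy as np
from scipy import ndimage

def gap_indicators_from_mask(gap_mask: np.ndarray) -> list[dict]:
    """Return one pixel-space bounding box per connected gap region."""
    labeled, _ = ndimage.label(gap_mask > 0)
    indicators = []
    for rows, cols in ndimage.find_objects(labeled):
        indicators.append({
            "x": cols.start,              # left edge (pixels)
            "y": rows.start,              # top edge (pixels)
            "w": cols.stop - cols.start,  # width
            "h": rows.stop - rows.start,  # height
        })
    return indicators
```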

In other examples, referring to FIG. 4B, the indicators include object indicators 416 (three examples of which are shown: 416-1, 416-2, and 416-3). Each object indicator 416 includes an object identifier 420, such as a SKU number, a text string, or the like, corresponding to a product 112. In the example shown in FIG. 4B, the object identifiers 420-1, 420-2, and 420-3 are text strings identifying the respective products 112. Each object indicator 416 also includes an object location 424 within the image 400. In the illustrated example, the object locations 424-1, 424-2, and 424-3 are obtained at block 305 as bounding boxes overlaid on the image 400. In other examples, the object locations 424 and object identifiers 420 are contained in metadata fields of the image 400, or are received as distinct data (e.g., a separate file) associated with the image 400. More specifically, the control application 128 includes a product recognition engine configured to compare various image features of the image 400 to a database of product models, and to select the product models whose image features match those in the image 400. For each selected product model, the product recognition engine is configured to insert into the image 400, or otherwise associate with the image 400, an indicator 416 including the location of the matching image features and the object identifier 420 corresponding to those features.

At block 305, the image preprocessor 200 is further configured to obtain depth measurements corresponding to the image 400. The depth measurements and the image obtained at block 305 are typically captured substantially simultaneously by the apparatus 103 and stored in the repository 132. Accordingly, at block 305 the image preprocessor 200 is configured to retrieve the image 400 and the depth measurements from the repository 132. In this example, the depth measurements are registered with the image 400; that is, each depth measurement is assigned a location (e.g., pixel coordinates) within the image 400. In other examples, the image preprocessor 200 is configured to register the depth measurements to the image 400, if such registration has not already been completed.

Further, in the present example, the image preprocessor 200 is configured to segment the above-mentioned depth measurements based on the indicators shown in FIGS. 4A and 4B. That is, the image preprocessor 200 is configured to project the bounding boxes defined by the indicators 408 or 416 onto the depth measurements, and thereby assign a subset of the depth measurements to each bounding box. As will now be apparent, this segmentation assembles the sets of depth measurements corresponding to the detected gaps (for the indicators 408) and to the machine-recognized products 112 (for the indicators 416).
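As a rough illustration of this segmentation step, the sketch below groups a per-pixel depth map by the bounding box of each indicator; it assumes depth measurements already registered to the image (one depth value per pixel), a data layout this disclosure describes but does not prescribe.

```python
# Hedged sketch: assign to each gap/object indicator the subset of
# registered depth measurements falling inside its bounding box.
import numpy as np

def segment_depths(depth_map: np.ndarray, indicators: list[dict]) -> list[np.ndarray]:
    segments = []
    for ind in indicators:
        x, y, w, h = ind["x"], ind["y"], ind["w"], ind["h"]
        segments.append(depth_map[y:y + h, x:x + w].ravel())
    return segments
```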

Returning to FIG. 3, at block 310 the image preprocessor 200 is configured to register the input data to a common reference frame. The common reference frame is a previously defined coordinate system for the retail (or other) environment containing the shelves 110. Turning to FIG. 5, an origin 500 is depicted, which defines the coordinate system; accordingly, each shelf 110, as well as any other object within the retail environment, may be assigned coordinates relative to the origin 500. At block 310, the image preprocessor 200 is therefore configured to identify the region of the shelf 110 depicted by the image 400 obtained at block 305. In this example, the identification of such regions is based on navigation data generated by the apparatus 103 at the time the image 400 and the depth measurements were captured, and stored along with them in the repository 132. For example, in the current performance of block 310, the image preprocessor 200 identifies that the image 400 and the corresponding depth measurements depict an area 504 of the shelf 110.
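For illustration only, registration to the common reference frame can be approximated as a translation and scale along the aisle, derived from the apparatus's navigation pose at capture time; a real registration would also involve camera calibration and full 3-D pose, which this sketch omits as assumptions.

```python
# Hedged sketch: map a pixel-space indicator into aisle coordinates
# (metres) in the common reference frame. pose_x_m (apparatus position
# along the aisle at capture time) and metres_per_pixel are illustrative
# assumptions, not parameters defined by this disclosure.
def to_common_frame(ind: dict, pose_x_m: float, metres_per_pixel: float) -> dict:
    return {
        "x": pose_x_m + ind["x"] * metres_per_pixel,  # position along aisle
        "y": ind["y"] * metres_per_pixel,             # vertical position
        "w": ind["w"] * metres_per_pixel,
        "h": ind["h"] * metres_per_pixel,
    }
```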

At block 315, the image preprocessor 200 is configured to merge indicators that overlap in the common reference frame. More specifically, in some examples the input data obtained at block 305 includes a plurality of images depicting overlapping portions of the shelves 110. The apparatus 103 typically captures a stream of images while traveling along the shelves 110, and each area of each shelf 110 is therefore typically depicted in more than one captured image. Accordingly, at block 305 the image preprocessor 200 obtains a set of adjacent images (i.e., a set of images captured sequentially by the apparatus 103). FIG. 6 depicts the image 400 and a second image 600, the second image 600 depicting an area of the shelf 110 that overlaps the area depicted by the image 400.

At block 315, the image preprocessor 200 is configured to register the images 400 and 600 with each other (i.e., to a common set of pixel coordinates), for example by applying suitable image feature registration operations (e.g., edge and blob identification and matching) to the images 400 and 600. FIG. 6 also shows a registered image 604 resulting from the registration of the images 400 and 600, with the products 112 and shelf omitted for simplicity. Having registered the images 400 and 600, the image preprocessor 200 is configured to identify the subsets of indicators (i.e., subsets among the gap indicators 408-1, 408-2, 408-1', and 408-2' in the example of FIG. 6) that have overlapping locations in the registered image 604. As seen in FIG. 6, the gap indicators from the image 600, shown with a heavier line width, overlap the gap indicators from the image 400.
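As an illustration of such image-to-image registration, the sketch below estimates a homography between two overlapping shelf images from matched ORB features using OpenCV. The disclosure names edge and blob matching only generically, so the specific detector and the RANSAC fit here are assumptions.

```python
# Hedged sketch: register image 600 into image 400's pixel coordinates
# by matching local features and fitting a homography.
import cv2
import numpy as np

def register_pair(img_a, img_b):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    src = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Homography mapping img_b coordinates into img_a's frame.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```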

The gap indicators 408 and 408', although overlapping, do not overlap perfectly, for example due to the different physical locations of the apparatus 103 during the capture of the images 400 and 600, and due to imaging artifacts (e.g., changes in illumination or contrast) in the images 400 and 600 affecting the detection of gaps by the control application 128. For each subset of overlapping indicators (i.e., the two subsets in the illustrated example: the indicators 408-1 and 408-1', and the indicators 408-2 and 408-2'), the image preprocessor 200 is configured to select one indicator for further processing via the method 300. For example, the input data may include a confidence level associated with each indicator 408, determined during the generation of the input data (i.e., of the gap indicators or object indicators). In such examples, the image preprocessor 200 is configured to select, from each subset, the indicator 408 having the highest confidence value. In other examples, the image preprocessor 200 is configured to generate a merged indicator consisting of the overlapping region within each subset. FIG. 6 shows two such merged indicators, 608-1 and 608-2. In still other examples, the performance of block 315 is omitted for the indicators 416 (i.e., object indicators), the indicators 408 (i.e., gap indicators), or both.
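A minimal sketch of the merge at block 315 follows; it detects overlap between two axis-aligned boxes and, following the intersection strategy described above, produces a merged indicator from their common region. The box layout matches the earlier sketches and is an assumption.

```python
# Hedged sketch of block 315: merge overlapping gap indicators from
# adjacent registered images into a single indicator (their intersection).
def boxes_overlap(a: dict, b: dict) -> bool:
    return (a["x"] < b["x"] + b["w"] and b["x"] < a["x"] + a["w"] and
            a["y"] < b["y"] + b["h"] and b["y"] < a["y"] + a["h"])

def merge_pair(a: dict, b: dict) -> dict:
    """Merged indicator: the region common to both overlapping boxes."""
    x1, y1 = max(a["x"], b["x"]), max(a["y"], b["y"])
    x2 = min(a["x"] + a["w"], b["x"] + b["w"])
    y2 = min(a["y"] + a["h"], b["y"] + b["h"])
    return {"x": x1, "y": y1, "w": x2 - x1, "h": y2 - y1}
```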

Referring again to FIG. 3, at block 320 the comparator 204 is configured to obtain (e.g., from the repository 132) reference data corresponding to the location depicted by the input data in the common reference frame (i.e., the location 504 shown in FIG. 5), and to register the reference data with the input data obtained at block 305. Turning to FIGS. 7A and 7B, two examples of reference data are shown. Specifically, FIG. 7A shows a portion of a reference planogram 700 including reference indicators 704-1, 704-2, and 704-3. Each indicator 704 defines a reference location, e.g., via a bounding box within the common reference frame. Each indicator 704 also includes a product identifier, such as a text string (as shown in FIG. 7A), a SKU number, or the like. Additionally, the indicators 704 may each include a number of facings, indicating the number of adjacent products bearing the same identifier that are expected within the corresponding bounding box. Thus, the indicator 704-3 indicates that, within the specified bounding box, two adjacent facings of the product "Acme Crunch" are expected to be present on the shelf 110.

FIG. 7B depicts another example form of reference data obtained at block 320. In particular, FIG. 7B depicts reference data 710 comprising a shelf image (e.g., an RGB image) that includes a plurality of reference object indicators. Each reference object indicator includes a reference location 714-1, 714-2, 714-3, 714-4 (shown as a bounding box in this example) and an associated reference product identifier 718-1, 718-2, 718-3, 718-4. The reference data 710, which may also be referred to as a realogram, is retrieved from the repository 132. In addition, the reference data 710 includes depth measurements segmented by each bounding box 714, as described above in connection with block 305.

Prior to its retrieval, the reference data 710 may be generated or updated by the reference generator 202. In particular, the reference generator 202 is configured to perform blocks 305, 310, and 315 of the method 300 in isolation, with the retail environment in a "full stock" condition. That is, when the retail environment is fully stocked with products 112, the apparatus 103 may be configured to traverse the shelves 110 and capture image data and depth measurements. The capture of input data for reference data generation is performed at predetermined intervals. For example, the reference data capture may be performed once per day, before the retail environment opens to customers. Other suitable periods may also be used for reference data capture.

Having acquired the input image and depth data, the reference generator 202 is configured to obtain reference object indicators, as shown in FIGS. 7A and 7B, by providing the input data to the product recognition engine described above. As will now be apparent, the reference generator 202 receives the reference locations 714 and the reference object identifiers 718 from the product recognition engine. The reference generator 202 is then configured to store the image and the reference object indicators in the repository 132, in association with their locations within the common reference frame.

At block 325, in response to obtaining the reference data, the comparator 204 is configured to determine whether any mismatches exist between the reference data retrieved at block 320 and the input data obtained and processed at blocks 305 to 315. In some examples, in which the input data includes the gap indicators 408, the comparator 204 is configured to retrieve the planogram 700 as the reference data and to determine whether the gap indicators 408 overlap with any of the reference indicators 704. When the determination is negative (i.e., no gap indicator overlaps any reference indicator 704), execution of the method 300 proceeds to block 335, discussed below. In the present example execution, however, as seen in FIG. 8, the registration of the input data with the planogram 700 reveals that the gap indicators 408 (shown as projections 800-1 and 800-2 on the planogram 700) overlap with all three reference indicators 704.

FIG. 9 illustrates the registration of input data including the object indicators 416 with the realogram 710. When the input data includes the object indicators 416, the comparator 204 is configured to retrieve the realogram 710 and to determine whether any of the reference object indicators (714, 718) are not represented in the input data, and whether any of the object indicators 416 are not represented in the reference data 710. As seen in FIG. 9, two reference indicators are not represented in the input data, as shown by the shaded areas 900-1 and 900-2 of the reference indicators corresponding to the products "Juice" and "Acme Crunch", respectively. Further, one object indicator included in the input data is not represented in the reference data 710, as shown by the shaded area 900-3 corresponding to the object indicator 416-3 for the product "Acme Cola".
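The comparison for object indicators can be sketched as two set differences, reusing the boxes_overlap helper from the merge sketch above: reference indicators with no matching input indicator (candidate out-of-stock or low stock items), and input indicators with no matching reference indicator (candidate plugs). Matching on identifier plus overlapping location is an assumed criterion, shown for illustration.

```python
# Hedged sketch of block 325 for object indicators. Each indicator is a
# dict with an "id" plus the box keys used in the earlier sketches.
def find_mismatches(input_inds: list[dict], reference_inds: list[dict]):
    missing = [r for r in reference_inds
               if not any(i["id"] == r["id"] and boxes_overlap(i, r)
                          for i in input_inds)]       # e.g. areas 900-1, 900-2
    plugs = [i for i in input_inds
             if not any(r["id"] == i["id"] and boxes_overlap(i, r)
                        for r in reference_inds)]     # e.g. area 900-3
    return missing, plugs
```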

Returning to FIG. 3, following a positive determination at block 325, the comparator 204 is configured to provide the mismatched object identifiers (also referred to as target object identifiers) to the classifier 208. At block 330, the classifier 208 is configured to assign each target object identifier one of a plurality of preconfigured status types. In this example, the status types include an out-of-stock (OOS) status, a low stock (LS) status, and a plug (PL) status. As will be apparent to those skilled in the art, the OOS status indicates that the corresponding product has been depleted from the shelf 110 and requires replenishment. The LS status indicates that the corresponding product, although not fully depleted, is close to depletion. The PL status indicates that the corresponding product has been misplaced on the shelf 110; that is, according to the reference data, the product is not expected at the location at which it was detected.

When a mismatch indicates that an object identifier in the input data is not represented in the reference data, as shown in FIG. 9 in connection with the "Acme Cola" product, the classifier 208 is configured to assign the PL status type to that object identifier. When a mismatch indicates the opposite, however (i.e., an object identifier in the reference data is not represented in the input data), the classifier 208 is configured to perform the classification process shown in FIG. 10 to select one of the OOS and LS statuses for the associated object identifier.

Referring to FIG. 10, at block 1005 the classifier 208 is configured to determine a degree of coverage for each mismatch identified at block 325. Where a mismatch is revealed by a gap indicator 408, the classifier 208 is configured at block 1005 to determine the proportion of at least one reference object indicator that overlaps with the gap indicator 408. Referring again to FIG. 8, for example, as shown by the shaded region 800-1, the gap indicator 408-1 completely overlaps the reference indicator 704-2 (i.e., overlaps 100% of the region defined by the reference indicator 704-2), and also overlaps about 12% of the region of the reference indicator 704-1. Turning to FIG. 9, the shaded areas 900-1 and 900-2 indicate that the entirety of the reference location indicators 714-2 and 714-4 is not represented in the input data (i.e., the degree of coverage of the mismatches detected in association with the reference indicators 714-2 and 714-4 is 100%).

In some examples, the classifier 208 is configured to determine the degree of coverage as a number of facings that are expected from the reference data but are not represented in the input data, rather than as a percentage as discussed above. To determine the number of missing facings when the input data includes gap indicators 408, the classifier 208 is configured to determine a facing width of the associated product from the reference data. Referring to FIG. 7A, in this example each reference indicator 704 in the planogram 700 defines the total width of the area expected to contain a given product, as well as the number of facings for that product. The facing width is therefore determined by dividing the total width of the indicator by the number of facings from the planogram. In other examples, the planogram may also contain data directly defining the width of each facing (i.e., the width of the product). Returning to FIG. 10, the classifier 208 is thus configured to determine the facing width for the mismatched object indicator, and to determine how many facing widths are covered by a given gap indicator 408. As shown in FIG. 8, the degree of coverage represented by the shaded area 800-1 corresponds to one facing of the "Juice" product, and to about 12% of one facing of the "Acme Dog Food" product. Meanwhile, the shaded area 800-2 covers about 70% of one facing of the "Acme Crunch" product.

The classifier 208 is configured to adjust the number of facings covered by a mismatch, e.g., based on preconfigured thresholds. For example, if the per-facing degree of coverage determined at block 1005 is greater than 65%, the classifier 208 is configured to adjust the degree of coverage up to one full facing. Conversely, if the degree of coverage is less than 50%, the classifier 208 is configured to set the degree of coverage to zero (as such low coverage may be due to expected spaces between products 112, rather than to spaces indicating missing products).

When the input data includes the object indicators 416, the classifier 208 is configured to determine the degree of coverage in facings by counting the number of adjacent reference indicators, having the same product identifier, that are not represented in the input data. As can be seen from FIG. 9, the degree of coverage of the shaded areas 900-1 and 900-2 is one facing each in this example.
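As a worked sketch of the facing arithmetic from the preceding paragraphs: the facing width is the planogram indicator's total width divided by its number of facings, and a gap's coverage is expressed in facings with the 65% and 50% adjustments applied to the fractional remainder. The 50-65% band is not resolved by the description, so the sketch leaves it uncounted; all names are illustrative.

```python
# Hedged sketch of block 1005 for gap indicators: express a gap's width
# as a number of missing facings, using the preconfigured thresholds
# described above.
def coverage_in_facings(gap_width: float, total_width: float,
                        num_facings: int) -> int:
    facing_width = total_width / num_facings   # width of one facing
    whole = int(gap_width // facing_width)     # facings fully covered
    frac = gap_width / facing_width - whole    # fractional remainder
    if frac > 0.65:        # a mostly covered facing counts as missing
        whole += 1
    # Remainders under 50% are treated as normal spacing between
    # products; the 50-65% band is left unresolved by the description.
    return whole
```

For the shaded area 800-2 above, coverage_in_facings with a gap spanning 70% of one facing returns one missing facing, consistent with the text.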

At block 1010, having determined the degree of coverage of each mismatch identified at block 325, the classifier 208 is configured to determine whether each degree of coverage meets or exceeds an upper threshold. The upper threshold may be set as a percentage (e.g., 90% of the area of a planogram indicator 704), or as a number of facings. Typically, the upper threshold corresponds to all of the expected facings of the mismatched product. Thus, the upper threshold for the reference indicator 704-2 is one facing, and the determination at block 1010 for the shaded region 800-1 is affirmative. However, the upper threshold for the reference indicator 704-3 is two facings, and the determination at block 1010 for the shaded region 800-2 is therefore negative.

When the determination at block 1010 is positive, the classifier 208 assigns the OOS status to the relevant reference object identifier at block 1015. When the determination at block 1010 is negative, however, the classifier 208 proceeds to block 1020 to determine whether the degree of coverage meets or exceeds a lower threshold. The classifier 208 may determine the lower threshold based on the expected number of product facings (as specified by the reference data 700 or 710). More specifically, the lower threshold is set to the total number of expected facings minus the minimum number of facings required to avoid generating a low stock notification. Typically, a low stock notification is generated when only one facing of the product remains but the expected number of facings is greater than one. Thus, the lower threshold is typically one less than the total expected number of facings. If the degree of coverage meets the lower threshold, only one facing of the product remains, and the classifier 208 is configured to assign the LS status to the relevant reference object identifier at block 1025. Referring again to FIG. 8, the LS status is assigned to the mismatch indicated by the shaded area 800-2 (i.e., to the product "Acme Crunch"). When the determination at block 1020 is negative, however, the classifier 208 is configured to assign a "normal" status to the mismatch, indicating that although the product is not fully stocked on the shelf 110, it has not been sufficiently depleted to warrant a low stock notification.
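The decision flow of blocks 1010 through 1025 reduces to two threshold comparisons, sketched below under the typical settings given above (upper threshold: all expected facings; lower threshold: one less). The status strings are illustrative.

```python
# Hedged sketch of FIG. 10, blocks 1010-1025: classify a mismatch from
# its coverage (in facings) and the expected facing count.
def classify_mismatch(coverage: int, expected_facings: int) -> str:
    if coverage <= 0:
        return "NORMAL"                      # nothing meaningfully missing
    if coverage >= expected_facings:         # upper threshold met (block 1010)
        return "OOS"                         # out of stock (block 1015)
    if coverage >= expected_facings - 1:     # lower threshold met (block 1020)
        return "LS"                          # low stock (block 1025)
    return "NORMAL"                          # depleted, but below notification
```

For the "Acme Crunch" example above, classify_mismatch(1, 2) returns "LS", matching the assignment at block 1025.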

After the classification process, execution of the method proceeds to block 335. At block 335, the verifier 212 is configured to verify the output of the classifier 208, e.g., based on the depth measurements obtained at block 305. The verification at block 335 includes any one, or any combination, of several different checks. For example, the verifier 212 may be configured to obtain the shelf edge position relative to the input data, and to retrieve the known shelf depth (i.e., the distance between the shelf back 412 and the shelf edge). The shelf edge position may be detected from the depth measurements, or may be retrieved from the repository 132. Having obtained the shelf edge location and shelf depth, the verifier 212 is configured to determine whether the depth measurements of the area corresponding to any product assigned the OOS status at block 330 sufficiently exceed the shelf edge depth to confirm the OOS status. That is, the verifier 212 is configured to determine whether the depth measurements corresponding to the shaded region 800-1 are greater than the shelf edge depth by a margin substantially equal to the known shelf depth. If the determination is negative, the corresponding gap indicator 408 may not be correct (i.e., there may be a product on the shelf 110 that was detected as a gap). When the determination is negative, the verifier 212 is configured to change the status classification from OOS to PL.

In other examples, for reference object identifiers assigned the PL status, the verifier 212 is configured to retrieve, from the repository 132, segments of depth measurements corresponding to the relevant reference object identifiers. Since a product assigned the plug status is misplaced, these reference depth measurements originate from a location in the common reference frame different from the location depicted by the input data. In response to retrieving a segment of depth measurements representing a three-dimensional scan of the misplaced product, the verifier 212 is configured to compare the retrieved reference depth measurements to the segmented depth measurements corresponding to the plug (e.g., the depth measurements corresponding to the shaded region 900-3 in FIG. 9). When the reference depth measurements match the input depth measurements, the plug status is confirmed. Otherwise, the verifier 212 is configured to discard the plug status assignment, or, for example, to change the assignment to indicate a reduced confidence level.

In further examples, the verifier 212 is configured to obtain the shelf edge location as described above, and to determine whether the depth measurements segmented by any of the object indicators 416 exceed the depth of the shelf edge by more than a configurable threshold. If the threshold is exceeded, the corresponding product is placed toward the shelf back 412, away from the shelf edge. For example, referring to FIG. 9, an object indicator 416 indicates that the product "Acme Dog Food" is present, as expected per the realogram 710. However, comparing the depth measurements corresponding to the object location 424-1 to the shelf edge reveals that the object location 424-1 is farther from the shelf edge than a predetermined threshold (e.g., three centimeters). The verifier 212 is therefore configured to generate an additional low stock status assignment, beyond those generated at block 330.
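The two depth checks described above can be sketched as follows, assuming depth increases away from the camera and taking the median of each segment to suppress noise. The tolerance values are illustrative assumptions; the description gives three centimeters only for the pushed-back check.

```python
# Hedged sketch of the block 335 depth checks.
import numpy as np

def confirm_oos(gap_depths: np.ndarray, edge_depth: float,
                shelf_depth: float, tol: float = 0.05) -> bool:
    """True if the gap's depths reach roughly the shelf back 412."""
    return float(np.median(gap_depths)) >= edge_depth + shelf_depth - tol

def pushed_back(object_depths: np.ndarray, edge_depth: float,
                threshold: float = 0.03) -> bool:
    """True if a detected product sits more than `threshold` metres
    behind the shelf edge (triggering an extra low stock assignment)."""
    return float(np.median(object_depths)) > edge_depth + threshold
```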

After verification is complete at block 335, the notifier 216 is configured to generate one or more status notifications at block 340, based on the outputs of the classifier 208 and the verifier 212. Each status notification includes a product identifier, the status assigned to the product identifier, and the location of the corresponding indicator in the common reference frame. Table 1 contains a list of status notifications generated based on the above examples of input data and reference data.

Table 1: status notification

Product ID Status of state Position of
Acme crisp rice LS [X,Y,Z]
Acme dog food LS [X,Y,Z]
Fruit juice OOS [X,Y,Z]
Acme cola PL [X,Y,Z]

As seen in Table 1, the first, third, and fourth rows represent status notifications generated based on the classifications assigned at block 330, while the second row represents a status notification generated by the verifier 212 at block 335. In some examples, the status notifications are stored in the repository 132. In other examples, the status notifications are transmitted directly to a client computing device, such as the mobile device 105.
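For illustration, a status notification generated at block 340 might be represented as the following record; the structure and example values are assumptions, not a normative format from this disclosure.

```python
# Hedged sketch of a block 340 status notification record.
from dataclasses import dataclass

@dataclass
class StatusNotification:
    product_id: str                        # e.g. "Juice"
    status: str                            # "OOS", "LS", or "PL"
    location: tuple[float, float, float]   # (X, Y, Z) in the common frame

# Example corresponding to the third row of Table 1 (placeholder coordinates):
notification = StatusNotification("Juice", "OOS", (0.0, 0.0, 0.0))
```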

Specific embodiments have been described in the foregoing specification. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.

The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.

Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ... a," "has ... a," "includes ... a," or "contains ... a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein. The terms "substantially," "approximately," "about," or any other version of these terms are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment these terms are defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled," as used herein, is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.

It will be appreciated that some embodiments may comprise one or more general-purpose or special-purpose processors (or "processing devices"), such as microprocessors, digital signal processors, custom processors and Field Programmable Gate Arrays (FPGAs), and unique stored program instructions, including both software and firmware, that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the methods and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more Application Specific Integrated Circuits (ASICs), in which various functions or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of these two approaches may also be used.

Furthermore, embodiments may be implemented as a computer-readable storage medium having computer-readable code stored thereon for programming a computer (e.g., including a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage media include, but are not limited to, hard disks, CD-ROMs, optical memory devices, magnetic memory devices, ROMs (read only memories), PROMs (programmable read only memories), EPROMs (erasable programmable read only memories), EEPROMs (electrically erasable programmable read only memories), and flash memories. Moreover, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.

The Abstract of the disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. This Abstract is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Furthermore, in the foregoing detailed description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separately claimed subject matter.
