Image-based platform inspection

Published 2020-11-03. Designed and created by 蒂莫西·P·谢里尔, 约瑟夫·D·施赖纳, and 罗伯特·P·米勒 on 2019-01-31.

Abstract: A method of preparing a platform for a process is disclosed. The platform may be prepared with any necessary components, after which an imaging device may capture an image of the platform. The image can be compared with a reference image and any differences identified. The differences may be indicated in the image and displayed to the operator so that the operator can correct any errors associated with them.

1. A method, comprising:

a) causing an imaging device to capture a second image of a platform of a structure, the platform comprising a plurality of discrete platform locations in which a plurality of different components are respectively located;

b) comparing, by a processor in communication with the imaging device, the second image with a first image stored in a memory of a computer device;

c) determining, by the processor, whether there are any differences between the second image and the first image; and

d) outputting, by an output device in communication with the processor, an indication of any difference between the second image and the first image that may result in an interruption or failure of a process running on an apparatus.

2. The method of claim 1, wherein the plurality of different components is a second plurality of different components, and the first image includes a first plurality of different components.

3. The method of claim 2, wherein the indication of any differences comprises an indication of at least one platform location that has a different component in the second image than in the first image.

4. The method of claim 1, wherein determining whether there are any differences between the second image and the first image comprises determining a first difference between the second image and the first image, the first difference being associated with a first discrete platform location from the plurality of discrete platform locations, and wherein the indication of any differences comprises a representation of the second image having a highlight region associated with the first discrete platform location, the first discrete platform location being associated with the first difference between the second image and the first image.

5. The method of claim 1, wherein the indication indicates that there is no difference between the second image and the first image that could cause an interruption or failure of the process running on the apparatus.

6. The method of claim 1, wherein the apparatus is programmed to allow a user to configure the types of components to be placed in the plurality of discrete platform locations so that different processes can be run on the apparatus.

7. The method of claim 1, wherein the components comprise laboratory tools, and wherein the imaging device comprises a camera.

8. The method of claim 1, wherein the apparatus comprises an analyzer that analyzes a biological sample.

9. The method of claim 1, wherein the plurality of different components is a second plurality of different components, and the first image comprises a first plurality of different components, wherein the first image is formed by:

capturing, by the imaging device, the first image prior to capturing the second image, the platform including the plurality of discrete platform locations in which the first plurality of different components are respectively located.

10. The method of claim 9, further comprising:

receiving, by the device, a protocol to perform the process prior to capturing the first image;

performing, by the device, the process after capturing the first image; and

after performing the process, storing the first image in the memory.

11. The method of claim 1, further comprising mapping a discrete platform location of the plurality of discrete platform locations to a range of pixels in the second image.

12. A system, comprising:

an apparatus for performing a process;

an imaging device; and

a computer apparatus comprising a processor, an output device coupled to the processor, and a non-transitory computer-readable medium coupled to the processor, the computer apparatus operatively coupled to the imaging device and the apparatus, the non-transitory computer-readable medium comprising code executable by the processor to perform a method comprising:

a) causing the imaging device to capture a second image of a platform of a structure, the platform comprising a plurality of discrete platform locations in which a plurality of different components are respectively located;

b) comparing, by the processor, the second image to a first image stored in a memory of the computer apparatus;

c) determining, by the processor, whether there are any differences between the second image and the first image; and

d) causing the output device to generate an indication of any difference between the second image and the first image that may cause an interruption or failure of the process performed by the apparatus.

13. The system of claim 12, wherein the plurality of different components is a second plurality of different components, and the first image includes a first plurality of different components.

14. The system of claim 13, further comprising:

a platform of a structure, the platform comprising a plurality of discrete platform locations, wherein the indication of any differences comprises an indication of at least one platform location that has a different component in the second image than in the first image.

15. The system of claim 12, wherein determining whether there are any differences between the second image and the first image comprises determining a first difference between the second image and the first image, the first difference being associated with a first discrete platform location from the plurality of discrete platform locations, and wherein the indication of any differences comprises a representation of the second image having a highlight region associated with the first discrete platform location, the first discrete platform location being associated with the first difference between the second image and the first image.

16. The system of claim 12, wherein the indication indicates that there is no difference between the second image and the first image that could cause an interruption or failure of the process running on the apparatus.

17. The system of claim 12, wherein the apparatus is programmed to allow a user to configure the types of components to be placed in the plurality of discrete platform locations so that different processes can be run on the apparatus.

18. The system of claim 12, wherein the components comprise laboratory tools, wherein the apparatus comprises an analyzer that analyzes a biological sample, and wherein the imaging device comprises a camera.

19. The system of claim 12, wherein the plurality of different components is a second plurality of different components and the first image comprises a first plurality of different components, wherein the first image is formed by:

capturing, by the imaging device, the first image prior to capturing the second image, the platform including the plurality of discrete platform locations in which the first plurality of different components are respectively located.

20. The system of claim 19, wherein the method further comprises:

receiving, by the device, a protocol to perform the process prior to capturing the first image;

performing, by the device, the process after the first image is captured; and

after performing the process, storing the first image in the memory.

Background

A sample processing system may be used to analyze a biological sample. Once the sample processing system is programmed and the necessary materials are arranged in a particular manner, the system can automatically analyze or otherwise process the sample. For example, biological specimens in test tubes may be staged at a designated location on the platform. Likewise, pipette tips may be staged at another location on the platform, and a detection vessel, such as a microplate, may be placed at yet another location on the platform. Then, when the analysis process begins, a robotic arm may pick up one or more pipette tips and use the tips to transfer portions of the samples from some of the test tubes as needed, transporting them to the detection vessel for further processing.

While these automated systems can effectively analyze biological samples after proper setup, they still rely on the operator to properly prepare the system before the analysis process begins. If the operator places an array of test tubes in the wrong area, or forgets to prepare the test tubes, the system may fail to complete the analysis process and/or may malfunction. Such human error is common, especially for systems that are frequently reconfigured to run different experiments and for processes that involve multiple types of components that must be placed in specific locations. Thus, human preparation errors often result in failure of the automated biological sample analysis process.

Certain techniques have been used to address this human error problem. For example, a grid-like illustration and virtual representation can be used to show the operator where different components should be placed in the staging area. However, even with such tools, operators still make mistakes because they have difficulty associating the drawings with the real-world environment. For example, an operator may place the components in a similar arrangement, but inadvertently shift each component one position to the side. In addition, scanners have been introduced that can be moved between positions to check the type of component placed at each position, ensuring that each necessary component has been prepared and placed in the correct location. However, the scanning process takes too much time to effectively check the entire staging area. Accordingly, there remains a need for an improved method of preparing components and staging areas.

Moreover, for some technologies, alternative versions of some components may be acceptable. A fully automated component scanning or recognition system may flag acceptable replacement components as differing from the requirements of the protocol. This may result in the system unnecessarily aborting protocols that could otherwise operate successfully. Thus, there is room for human judgment in improved methods for preparing or inspecting components within a staging area.

Embodiments of the present invention address these and other challenges individually and collectively.

Disclosure of Invention

Some embodiments of the invention incorporate an imaging device into a processing system. The imaging device captures images of a platform or other staging area. When an experiment or other process is initially configured, the imaging device may take an image of the properly prepared platform. Thereafter, when the operator prepares the platform for another execution of the procedure, the imaging device may take another image. The second image may be compared to the first image to identify any differences between the images. The differences may indicate human error in preparing the platform, such as placing a package of components in the wrong area. The real-world image of the platform may be modified to highlight the discrepancy to the operator, and the operator may then check the highlighted location on the platform and make any necessary corrections. Highlighting the discrepancy on a real-world image makes it easier for the operator to understand, so the operator is more likely to correct the error successfully.

One embodiment of the invention is directed to a method that includes causing an imaging device to capture a second image of a platform of a structure. The platform includes a plurality of discrete platform locations, in which a plurality of different components are respectively located. The method further includes comparing the second image to a first image stored in the memory of a computer device, determining whether there are any differences between the second image and the first image, and outputting an indication of any differences between the second image and the first image that may cause an interruption or failure of a process running on an apparatus.

Another embodiment of the invention relates to a system configured to perform the above method.

These and other embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

Drawings

FIG. 1 shows a block diagram of a processing system according to an embodiment of the invention.

FIG. 2 shows a schematic diagram of a platform according to an embodiment of the invention.

FIG. 3 shows a flow diagram of a platform verification process according to an embodiment of the invention.

FIG. 4 illustrates an example reference image of a platform according to an embodiment of the invention.

FIG. 5 shows an example of a comparison between a reference image and a new image according to an embodiment of the present invention.

FIGS. 6A-6B illustrate examples of different platform inspection views that may be used by an operator according to embodiments of the present invention.

Detailed Description

Embodiments of the present invention may be used to prepare a platform or staging area for analysis or other processes. For example, after an operator places different components at different locations on the platform, the imaging device may capture an image of the prepared platform. This image can be compared with another image showing the platform correctly set up. Any difference between the current image and the previous image may be highlighted to the operator. The operator may then examine the real-world region corresponding to the highlighted region of the image and make any necessary corrections (e.g., replace the incorrect component with the correct component).

Images can be captured and analyzed quickly, facilitating a quick correction process. In addition, a real-world image is easier for the operator to understand than a platform illustration, so any errors can be corrected quickly and efficiently.

Before discussing specific embodiments of the invention, some terms may be described in detail.

The "component" may include a part, or an element. The components may include tools and building blocks for performing procedures such as biological experiments or manufacturing processes. Examples of components include supplies for biological experiments, such as sample tubes, pipette tips, biological samples, reagents, chemicals, microwell plates and any other suitable materials or laboratory tools. Sometimes the components may be combined in a tray or other suitable package. In some embodiments, certain types of components may be found in uniquely configured packages or packages with specific tags.

FIG. 1 shows a block diagram of a processing system 100 according to an embodiment of the invention. The processing system 100 includes a control computer 108 operatively coupled to a structure 140, a transport apparatus 141, a processing device 101, and an imaging apparatus 107. Each of these may have an input/output interface to allow data transport between it and any external devices. One exemplary processing system is the Biomek i7 automated workstation sold by Beckman Coulter, Inc.

Embodiments of the invention may include imaging the platform to determine whether the component has been properly positioned on the platform. For purposes of illustration, the processing system 100 will be primarily described as a sample processing system for processing and analyzing biological samples. However, embodiments may be applied to any other suitable type of process involving a platform having preloaded components.

The structure 140 may include support legs, a power source, the platform 105, and any other suitable features. The platform 105 may include a physical surface (e.g., a planar physical surface) on which components may be placed and accessed for experimentation, analysis, and processing. In some cases, the platform 105 may be a floor or a tabletop surface. The platform 105 may be subdivided into a plurality of discrete platform locations for placement of different components. These locations may be directly adjacent or may be spaced apart. Each platform location may include dividers, inserts, and/or any other support structure for separating the different platform locations and housing the components. For exemplary purposes, FIG. 1 shows a first location 105A, a second location 105B, and a third location 105C on the platform, but other locations may be included.

The transport apparatus 141 (which may represent multiple transport apparatuses) may prepare and/or transport components between the platform 105 and the processing device 101 and between different locations on the platform 105. Examples of transport equipment include conveyors, sample rails, pick-and-place grippers, independently movable laboratory transport elements (e.g., pucks), robotic arms, and other tube or component transport mechanisms. In some embodiments, the transport apparatus 141 includes a pipetting head configured to transport liquids. Such a pipetting head may transfer liquid held within pipette tips and may comprise a holder suitable for gripping or releasing other laboratory instruments, such as a microplate.

The processing device 101 may include any number of machines or instruments for performing any suitable processes. For example, the processing device 101 may include an analyzer, which may be any suitable instrument capable of analyzing a sample, such as a biological sample. Examples of analyzers include spectrophotometers, photometers, mass spectrometers, immunoassay analyzers, hematology analyzers, microbiological analyzers, and/or molecular biology analyzers. In some embodiments, the processing device 101 may comprise a sample presentation device. Such a device may include: a sample presentation unit for receiving sample tubes containing biological samples, a sample storage unit for temporarily storing sample tubes or sample retention containers, a device for aliquoting samples (e.g., an aliquotter), a device for accommodating at least one reagent cartridge containing the reagents required by the analyzer, and any other suitable features.

The imaging device 107 may be any suitable device for capturing images of the platform 105 (or the entire structure 140) and any components on the platform 105. For example, the imaging device 107 may be any suitable type of camera, such as a photographic camera, a video camera, a three-dimensional camera, an infrared camera, and the like. Some embodiments may also include three-dimensional laser scanners, infrared depth-sensing technology, or other tools for creating a three-dimensional surface map of an object and/or chamber.

The control computer 108 may control the processes running on the processing system 100, initiate configuration processes, and check whether the component setup has been properly prepared for a process. The control computer 108 may control and/or send messages to the processing device 101, the transport apparatus 141, and/or the imaging device 107. The control computer 108 may include a data processor 108A; a non-transitory computer-readable medium 108B and a data storage 108C coupled to the data processor 108A; one or more input devices 108D; and one or more output devices 108E.

Although control computer 108 is depicted in fig. 1 as a single entity, it should be understood that control computer 108 may exist in a distributed system or cloud technology-based environment. In addition, embodiments allow some or all of the control computer 108, processing apparatus 101, transport device 141, and/or imaging device 107 to be combined as part of a single device.

Output device 108E may include any suitable device that may output data. Examples of output devices 108E may include a display screen, speakers, and a data transmission device.

The input device 108D may include any suitable device capable of inputting data into the control computer 108. Examples of input devices include buttons (e.g., keyboard and mouse), touch screen, touch pad, microphone, and the like.

The data processor 108A may include any suitable data computing device or combination thereof. An exemplary data processor may include one or more microprocessors that cooperate to perform desired functions. The data processor 108A may comprise a CPU including at least one high-speed data processor capable of executing program components for performing user- and/or system-generated requests. The CPU may be a microprocessor such as AMD's Athlon, Duron, and/or Opteron; IBM's and/or Motorola's PowerPC; IBM's and Sony's Cell processor; Intel's Celeron, Itanium, Pentium, Xeon, and/or XScale; and/or the like.

The computer-readable medium 108B and the data storage 108C may be any suitable device or devices that can store electronic data. Examples of memory include one or more memory chips, disk drives, and the like. Such memory may operate using any suitable electrical, optical, and/or magnetic mode of operation.

The computer-readable medium 108B may include code executable by the data processor 108A to perform any suitable method. For example, the computer-readable medium 108B may include code executable by the processor 108A to cause the processing system 100 to perform a method comprising: causing the imaging device to capture a second image of a platform of the structure, the platform including a plurality of discrete platform locations at which a plurality of different components are respectively located; comparing the second image with the first image stored in the memory of the computer device; determining whether there are any differences between the second image and the first image; and outputting an indication of any difference between the second image and the first image that may cause an interruption or failure of a process running on the apparatus.
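For illustration only (this is not part of the patent's disclosure), a minimal Python sketch of the comparison and indication steps above might look like the following. OpenCV is an assumed implementation choice, the whole-image similarity metric is a simplification, and the file paths are hypothetical; the capture step and the per-location comparison described later are sketched separately below.

```python
import cv2

def inspect_platform(first_image_path, second_image_path, threshold=0.98):
    """Compare a newly captured platform image against the stored reference image (rough sketch)."""
    first = cv2.imread(first_image_path)    # stored first (reference) image
    second = cv2.imread(second_image_path)  # newly captured second image
    if first is None or second is None or first.shape != second.shape:
        raise ValueError("both images must exist and have the same dimensions")
    # A crude whole-image check; a per-location comparison is sketched later in this description.
    similarity = 1.0 - cv2.absdiff(first, second).mean() / 255.0
    # Indicate whether a difference large enough to matter appears to be present.
    return {"similar": similarity >= threshold, "similarity": similarity}
```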

The computer-readable medium 108B may include code executable by the data processor 108A to receive and store process steps for one or more protocols (e.g., protocols for analyzing biological samples), and process steps for controlling the structure 140, the transport device 141, and/or the processing apparatus 101 to perform one or more protocols. The computer-readable medium 108B may also include code executable by the data processor 108A for receiving results from the processing device 101 (e.g., results from an analysis of a biological sample) and forwarding the results or using the results for other analysis (e.g., diagnosing a patient). Additionally, the computer-readable medium 108B may include code executable by the data processor 108A for comparing two images of the platform, identifying a difference between the two images, and displaying an image containing the indicated difference to a user.

The data storage component 108C may be internal or external to the control computer 108. The data storage component 108C may include one or more memories, including one or more memory chips, disk drives, and the like. The data storage component 108C can also include a conventional, fault-tolerant, relational, extensible, secure database, such as a commercially available Oracle™ or Sybase™ database. In some embodiments, the data storage 108C may store protocols 108F and images 108G.

The protocols 108F in the data storage component 108C may include information about one or more protocols. A protocol may include information about one or more process steps to be completed, the components used in the process, component placement, and/or any other suitable information for completing the process. For example, a protocol may include one or more ordered steps for analyzing a biological sample. The protocol may also include a list of components to prepare before starting the process. The components may be mapped to specific locations on the platform 105, where the transport apparatus 141 may retrieve them for transfer to the processing device 101. The mapping may be encoded as instructions for operating the transport apparatus 141, and the mapping may also be represented by a virtual image shown to the user so that the user may place the components on the platform 105. Embodiments allow the processing system 100 to be used for multiple processes (e.g., multiple different biological analyses). Thus, information about multiple protocols 108F can be stored and retrieved as needed. When changing from a first process to a second process, or when restarting the first process, the components on the platform 105 may be rearranged, replaced, and/or replenished as needed.
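As a purely illustrative sketch (not the patent's data model), a stored protocol with its ordered steps, component-to-location mapping, and associated reference image might be represented as follows; all field names and values are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Protocol:
    """Illustrative container for one stored protocol 108F; names and fields are assumptions."""
    name: str
    steps: List[str]                                      # ordered process steps
    layout: Dict[str, str] = field(default_factory=dict)  # platform location -> component type
    reference_image_path: str = ""                        # image 108G associated with this protocol

# Hypothetical example: a protocol expecting BC230 component packages at P2-P4 and tips at TL4.
first_protocol = Protocol(
    name="first_protocol",
    steps=["aspirate samples", "dispense into microplate", "read microplate"],
    layout={"P2": "BC230", "P3": "BC230", "P4": "BC230", "TL4": "pipette tips"},
    reference_image_path="images/first_protocol_reference.png",
)
```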

An image may include a depiction of one or more objects. By way of example, the images may include digital pictures or photographs, video, three-dimensional pictures and video, color photographs, black-and-white photographs, high dynamic range images (e.g., multiple images of the same subject taken at different exposures and combined), and the like. The images 108G in the data store 108C may include real-world visual representations of the platform 105. In each image, the platform 105 may be shown in a state ready to begin a process, with all necessary components placed in their proper positions. Each of the images 108G may be associated with a particular protocol from the stored protocols 108F. In some embodiments, there may be a single image for a given protocol. In other embodiments, there may be multiple images for a given protocol (e.g., from different angles, with different lighting levels, or with acceptable substitute laboratory equipment in certain locations). The images 108G may be stored as various types or formats of image files, including JPEG, TIFF, GIF, BMP, PNG, and/or RAW image files, as well as AVI, WMV, MOV, MP4, and/or FLV video files.

As mentioned above, the platform 105 may be subdivided into a plurality of discrete platform locations for placement of different components. The discrete locations may have any suitable dimensions. FIG. 2 shows an example of a platform 105 having multiple positions. The platform 105 in FIG. 2 shows separate areas numbered P1 through P30, TL1 through TL5, and TR1, as well as a cleaning station, each of which may serve as a separate location for a different type of component or package. The numbering of certain individual regions in FIG. 2 is hidden by overlapping elements; those locations can be identified by the numbers that follow in sequence from the regions whose numbers are visible. Embodiments allow the platform 105 to have additional or fewer locations as desired. Although these locations may be numbered or named, in the real world they may or may not be physically labeled or marked on the platform 105.
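Claim 11 above also recites mapping a discrete platform location to a range of pixels in the image. One plausible way to express such a mapping is sketched below; the pixel coordinates are invented solely for illustration and are not taken from the patent.

```python
# Hypothetical mapping from platform location names to pixel regions (x, y, width, height)
# in the camera image; the coordinates below are illustrative placeholders only.
LOCATION_PIXEL_MAP = {
    "P1":  (40, 60, 120, 90),
    "P2":  (170, 60, 120, 90),
    "P3":  (300, 60, 120, 90),
    "TL1": (40, 400, 150, 100),
}

def crop_location(image, location):
    """Return the sub-image (range of pixels) for one discrete platform location."""
    x, y, w, h = LOCATION_PIXEL_MAP[location]
    return image[y:y + h, x:x + w]
```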

Embodiments allow some or all of the positions to be occupied by components of a predetermined type, according to a given protocol. For example, FIG. 2 shows positions P2, P3, P4, P5, P16, P23, P24, TL4, and TL5, each loaded with a package of components specified by a first protocol. Some locations may contain the same type of components. For example, positions P2, P3, and P4 all contain a type of component labeled BC230, which may represent a certain type of test tube, microplate, pipette tip, or any other suitable laboratory tool.

In some embodiments, one or more locations may not be a physical part of the structure 140 or the platform 105, but may instead be on another surface or floor adjacent to the structure 140 and/or the platform 105. These locations may be included because they are also accessible by the transport apparatus 141. For example, locations TL1 through TL5, TR1, and/or the cleaning station may be physically separate from the structure 140 and/or the platform 105.

FIG. 3 shows a high-level flowchart of a platform verification process according to an embodiment of the present invention.

At step 302, during configuration of a new protocol (e.g., referred to as a first protocol), an operator may physically configure the layout of a first set of components on the platform. For example, an operator may determine where to place a first type of component (e.g., a laboratory tool such as a pipette tip) in one or more first locations, where to place a second type of component (e.g., a laboratory tool such as a sampling tube) in one or more second locations, and so on. For example, a processing device and platform may be used to perform a plurality of processes, and the processing device, transport apparatus and/or control computer may be programmed to allow a user to configure a number of types of components at a number of discrete locations so that different processes may run on the device.

At step 304, the operator may program the first protocol based on the established positional configuration of the component. For example, the control computer (and/or processing device) may receive information from an operator regarding a particular location where different types of components are to be placed for use during the first procedure. This may be done using a menu that allows the user to drag and drop the component representation to a particular location. The operator may also program the steps of the first procedure (e.g., the steps required to perform certain types of biological sample analysis) and the manner in which the components are used in the first procedure. Embodiments allow for the step of protocol programming to be performed before or after the first image is captured at step 306.

At step 306, the control computer may cause an imaging device (e.g., a camera) to capture a first image of the platform of the structure. The image may be acquired from a fixed camera position and used later as a reference. The image may be taken when the platform is fully loaded with components and the first process performed in step 308 has not yet begun (e.g., the components have not yet been used, moved, or disturbed). Thus, the image may capture a plurality of discrete platform locations and a first plurality of different components respectively located in those discrete platform locations.
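A minimal sketch of this reference capture, assuming an OpenCV-accessible camera at a fixed position (the camera index and output path are hypothetical):

```python
import cv2

def capture_reference_image(camera_index=0, save_path="first_protocol_reference.png"):
    """Capture the first (reference) image of the fully loaded platform from a fixed camera."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("failed to capture the reference image")
    cv2.imwrite(save_path, frame)  # kept for later association with the protocol (see step 310)
    return frame
```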

At step 308, the control computer may execute a first process defined by a first protocol. For example, the transport apparatus may transport the component from the platform to one or more instruments of the processing device according to a first process, and the processing device may manipulate the component to perform the first process and return any results to the control computer. For example, the processing device may be an analyzer, and the first process may include analysis of a biological sample. Step 308 may be performed after step 306 so that the image reflects the configuration of the platform presented before the process started.

At step 310, the control computer may store (e.g., in a data store or other memory) the first image of the platform. The first image may be stored in association with the first protocol and may be used as a reference image for future executions of the first process. In some embodiments, if the first process is successfully completed, the operator and/or the control computer may decide to store the first image after step 308. If the first process is not successfully completed, the first image from the unsuccessful run may be discarded, the component layout on the platform may be reconfigured and/or the first process may be adjusted, and another first image may be captured. Once the first process is successfully completed, the first protocol may be saved in the data storage memory along with the captured first image corresponding to the successful run.
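The store-on-success behavior described above could be expressed roughly as follows; the file layout and function name are assumptions for illustration, not details from the patent.

```python
import os
import shutil

def store_reference_image(candidate_path, protocol_name, process_succeeded, image_dir="images"):
    """Keep the captured first image as the protocol's reference only if the run succeeded."""
    if not process_succeeded:
        os.remove(candidate_path)  # discard the image associated with the unsuccessful run
        return None
    os.makedirs(image_dir, exist_ok=True)
    reference_path = os.path.join(image_dir, f"{protocol_name}_reference.png")
    shutil.copy(candidate_path, reference_path)  # associate the image with the stored protocol
    return reference_path
```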

At step 312, the control computer may then receive a selection of a first protocol. For example, the output device may provide a selection window for an operator to select among various protocols, and the operator may select (e.g., via the input device) a first protocol in order to run a first process. This step may be performed after other processes are run on the processing system.

At step 314, the control computer may display, via the output device, the platform configuration specified by the selected first protocol. The display may include a virtual representation of the platform, the different platform locations, and the components to be placed at each location. For example, the display may be similar to FIG. 2. In some embodiments, the control computer may also display the stored image, as well as any other suitable type of information, to instruct the operator how to prepare the system.

At step 316, the operator, or a machine under the control of the operator, may physically place a second set of components on the platform. The operator may use the protocol specification and the displayed component diagram as a guide to place specific types of components in specific locations. If the operator places the components correctly, each location will hold the same type of component that was placed there during the initial configuration of the protocol (e.g., in step 302).

At step 318, the control computer may cause the imaging device to capture a second image of the platform of the structure. In some embodiments, the second image of the platform is taken after the operator has finished placing the second set of components and before the process is performed. Thus, the second image may capture the plurality of discrete platform locations and a second plurality of different components respectively located in those discrete platform locations. The second image may be taken using the same camera from the same position and angle and/or in the same manner as the first image. As a result, the second image and the first image may be compared to identify differences. In some cases, multiple images may be taken from different angles or perspectives.

At step 320, the control computer may retrieve the stored first image from a data store (or other memory) based on the selection of the first protocol.

At step 322, the control computer may compare the first image to the second image to identify any relevant discrepancies. For example, the control computer may identify, for each location on the platform, whether the component shown in the second image differs from the component shown in the first image. In some embodiments, the control computer may analyze pixels or objects in each image to identify differences in color, shape, component labels, or any other suitable indicators. These differences may indicate incorrect components in the second image. In some embodiments, the control computer may determine the actual type of component in each image based on image characteristics (e.g., color, shape, etc.) and determine whether the components are the same. Additional details regarding image comparison are discussed below with reference to FIG. 5.
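One way to realize such a per-location comparison (an implementation sketch, not the patent's specific algorithm) is to crop each location's pixel range from both images, using a mapping like the one sketched earlier, and score the difference, e.g., by mean absolute pixel difference:

```python
import cv2

def compare_locations(first, second, location_pixel_map):
    """Score each discrete platform location by how similar its pixels are in the two images."""
    scores = {}
    for name, (x, y, w, h) in location_pixel_map.items():
        a = cv2.cvtColor(first[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        b = cv2.cvtColor(second[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(a, b)
        scores[name] = 1.0 - float(diff.mean()) / 255.0  # 1.0 means the crops are identical
    return scores
```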

At step 324, the control computer may determine whether any differences (e.g., one or more differences) exist between the first image and the second image based on the image comparison. In addition, the control computer may determine whether any locations in the image are not visible (e.g., due to the position and angle of the camera).

In some embodiments, whether there is a difference between the images may be based on a similarity threshold. For example, if the similarity between corresponding portions of the two images is 98%, that may be sufficient for the control computer to conclude that there is no meaningful difference.
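Continuing the sketch above, the 98% figure could be applied as a per-location cutoff; the threshold value is taken from the example in the text and would be tunable in practice.

```python
SIMILARITY_THRESHOLD = 0.98  # from the 98% example above; an assumed, tunable value

def find_discrepancies(scores, threshold=SIMILARITY_THRESHOLD):
    """Return the platform locations whose similarity score falls below the threshold."""
    return [name for name, score in scores.items() if score < threshold]
```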

At step 326, the control computer may display or otherwise output an indication of any determined differences to the operator (e.g., via an output device). Such differences may cause process interruptions or failures if the processing device is allowed to operate without correction. The output may include the first image and/or the second image, with difference indicators overlaid on the images. For example, the components and locations in question may be highlighted, circled, or otherwise identified to the operator. The indicators may show the operator, for example, the platform locations that hold different components in the second image than in the first image. The output may also include the virtual representation from step 314.
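One way to produce such highlighted output is to draw a box around each flagged location on the real-world image; the sketch below uses OpenCV drawing calls as an assumed implementation choice.

```python
import cv2

def highlight_discrepancies(second, discrepancies, location_pixel_map):
    """Draw a highlight box on the second image for every location flagged as different."""
    annotated = second.copy()
    for name in discrepancies:
        x, y, w, h = location_pixel_map[name]
        cv2.rectangle(annotated, (x, y), (x + w, y + h), (0, 0, 255), 2)  # red box around the location
        cv2.putText(annotated, name, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX,
                    0.5, (0, 0, 255), 1)                                  # label with the location name
    return annotated
```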

If no discrepancy is found, the method may continue to step 330 and execute the process. For example, the output may indicate that there is no difference between the second image and the first image that could cause an interruption or failure of a process running on the apparatus, and the operator may then initiate execution of the first process.

At step 328, the operator may physically correct the configuration error of one or more components based on the displayed differences. Displaying the first and/or second image with the highlighted differences may prompt a better response from the operator, because a real-world image is easier to understand than a virtual representation. Not all differences in the image necessarily need to be corrected. For example, a component may be of the correct type yet appear different (e.g., have a different color) without affecting the process. Thus, the operator may make one or more corrections (e.g., swap the components at certain locations), but may not need to replace a component at every location where the image indicates a discrepancy.

At step 330, the processing system may perform the first process using the second set of components placed on the platform. The process is likely to be completed successfully because the image comparison enables the operator to identify and correct any preparation errors.
