Conveying apparatus, liquid ejecting apparatus, reading apparatus, image forming apparatus, control method of conveying apparatus, and recording medium
Reading note: This technology, "Conveying apparatus, liquid ejecting apparatus, reading apparatus, image forming apparatus, control method of conveying apparatus, and recording medium," was created by 加藤泰一 and 林智明 on 2018-06-12. Its main content includes: A conveying apparatus includes a head unit (210C, 210K) for processing an object (W) conveyed in a conveying direction; a first support member disposed upstream in the conveying direction with respect to the processing position to support the object; a second support member arranged downstream of the processing position in the conveying direction to support the object; a surface detector between the first support member and the second support member to detect first surface information of the object; an upstream surface detector arranged upstream in the conveying direction with respect to the surface detector to detect second surface information of the object; an edge detector disposed closer to the upstream surface detector than the surface detector in the conveying direction to detect an edge of the object; and a controller that moves the head unit based on the detection results of the surface detector and the upstream surface detector and the detection result of the edge detector.
1. A conveying apparatus, comprising:
a head unit configured to perform processing on an object to be conveyed;
a first support member that is arranged upstream in a conveying direction of the object with respect to a processing position where the conveyed object is processed by the head unit, and that is configured to support the conveyed object;
a second support member that is arranged downstream in the conveying direction with respect to the processing position and that is configured to support the conveyed object;
a surface detector disposed between the first and second support members and configured to detect first surface information of the conveyed object;
an upstream surface detector that is disposed upstream in the conveying direction with respect to the surface detector and configured to detect second surface information of the conveyed object;
an edge detector arranged at a position closer to the upstream surface detector than the surface detector in the conveying direction and configured to detect an edge in a width direction of the conveyed object; and
a controller configured to move the head unit based on respective detection results of the surface detector and the upstream surface detector and a detection result of the edge detector.
2. The conveying apparatus according to claim 1, wherein the controller comprises:
a meandering amount calculator configured to calculate a meandering amount in a width direction of the conveyed object between a position facing the upstream surface detector and a position facing the surface detector based on respective detection results of the surface detector and the upstream surface detector;
a misalignment amount calculator configured to calculate a misalignment amount of the edge detected by the edge detector from a preset reference position by comparing the position of the edge with the preset reference position; and
a head movement amount calculator configured to calculate a movement amount for moving the head unit to a corrected processing position, the movement amount reflecting the calculated meandering amount and the calculated misalignment amount of the edge in the width direction of the object being conveyed.
3. The conveying apparatus according to claim 2, wherein
the head movement amount calculator calculates a movement amount of the head unit using a misalignment amount of an edge of the object to be conveyed at the initial adjustment and a meandering amount of the object to be conveyed during the conveying operation.
4. The conveying apparatus according to claim 2, wherein
the head movement amount calculator calculates a movement amount of the head unit using a misalignment amount of an edge of the object conveyed during the conveying operation and a meandering amount of the object conveyed during the conveying operation.
5. The conveying apparatus according to claim 4, wherein
the head movement amount calculator calculates the misalignment amount of the edge of the object being conveyed during the conveying operation by taking a moving average of the misalignment amounts obtained during the conveying operation.
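Claim 5 specifies a moving average but no implementation. The following is a minimal illustrative sketch (the class name, the window length, and the micrometre unit are assumptions made for illustration, not part of the claims) of averaging edge-misalignment samples over a sliding window during the conveying operation:

```python
from collections import deque

class EdgeMisalignmentAverager:
    """Moving average of edge-misalignment samples over a sliding window."""

    def __init__(self, window: int = 8):
        # A deque with maxlen drops the oldest sample automatically.
        self.samples = deque(maxlen=window)

    def update(self, misalignment_um: float) -> float:
        """Add one edge-sensor sample and return the current moving average."""
        self.samples.append(misalignment_um)
        return sum(self.samples) / len(self.samples)
```

Averaging in this way suppresses sensor noise and short-lived edge defects (such as a nick in the sheet edge) before the misalignment amount enters the head movement calculation.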
6. The conveying apparatus according to any one of claims 2 to 5, wherein
the surface detector and the upstream surface detector each include a light emitter configured to illuminate the object with light of a particular wavelength;
the upstream surface detector includes an upstream imaging unit configured to image a pattern generated by interference of light applied to a uniform pattern formed on a surface or an inner surface of an object; and
the surface detector includes an imaging unit configured to image a pattern generated by interference of light applied to a uniform pattern formed on a surface or an inner surface of an object, wherein
the meandering amount calculator calculates a meandering amount in a width direction of the conveyed object between a position facing the upstream imaging unit and a position facing the imaging unit, based on the respective patterns imaged by the upstream imaging unit and the imaging unit.
7. The conveying apparatus according to any one of claims 1 to 6, wherein
the edge detector is aligned with the upstream surface detector in a width direction, which is a direction orthogonal to a conveying direction in which the object is conveyed.
8. The conveying apparatus according to any one of claims 1 to 7, wherein
the object to be conveyed is a long continuous sheet in the conveying direction.
9. The conveying apparatus according to any one of claims 1 to 8, further comprising:
a plurality of head units arranged parallel to each other in a direction along a conveying direction, the plurality of head units each having a configuration extending in a direction orthogonal to the conveying direction;
a plurality of first support members configured to support the conveyed object, the plurality of first support members being located upstream of respective processing positions of the plurality of head units in the conveying direction; and
a plurality of surface detectors arranged in association with the plurality of head units, respectively, along the conveying direction, wherein
the controller moves each of the plurality of head units based on at least two detection results among the detection results obtained by the plurality of surface detectors and the upstream surface detector, and based on the detection result of the edge detector.
10. A liquid ejection apparatus comprising:
the head unit included in the conveying apparatus according to any one of claims 1 to 9, wherein the head unit is a liquid ejecting unit configured to eject liquid droplets to form an image.
11. A reading device comprising:
the head unit included in the conveying apparatus according to any one of claims 1 to 9, wherein the head unit is a reading unit configured to read a test pattern formed on the object to be conveyed.
12. The conveying apparatus according to any one of claims 1 to 7, further comprising:
a plurality of head units arranged parallel to each other in a direction along a conveying direction, the plurality of head units each having a configuration extending in a direction orthogonal to the conveying direction; and
a plurality of surface detectors arranged in association with the plurality of head units, respectively, along the conveying direction, wherein
the controller moves each of the plurality of head units, starting from the most upstream head unit among the plurality of head units, based on at least two detection results among the detection results obtained by the plurality of surface detectors and the upstream surface detector, and based on the detection result of the edge detector.
13. The conveying apparatus according to claim 12, wherein
the object to be conveyed is a transfer belt configured to be conveyed in a conveying direction.
14. An image forming apparatus comprising:
the plurality of head units in the conveying apparatus according to claim 13, wherein each of the plurality of head units is an image forming unit configured to transfer a transfer pattern onto the transfer belt, and
the transfer pattern transferred onto the transfer belt is further transferred onto a recording medium.
15. A method for controlling a conveying apparatus that includes a head unit and is configured to convey an object in a conveying direction while performing processing on the conveyed object using the head unit, the method comprising:
detecting first surface information of the conveyed object by an upstream surface detector arranged upstream in a conveying direction with respect to the head unit;
detecting an edge in a width direction of the conveyed object by an edge detector;
detecting second surface information of the conveyed object by a surface detector that is disposed downstream in the conveying direction with respect to the upstream surface detector and the edge detector, at a distance from the upstream surface detector that is greater, in the conveying direction, than the distance between the upstream surface detector and the edge detector;
calculating a meandering amount in a width direction of the conveyed object between a position facing the upstream surface detector and a position facing the surface detector based on respective detection results of the surface detector and the upstream surface detector;
calculating a misalignment amount of the edge detected by the edge detector from a preset reference position by comparing the position of the edge in the width direction with the preset reference position; and
calculating a movement amount for moving the head unit to a corrected processing position, the movement amount reflecting the calculated meandering amount and the calculated misalignment amount of the edge in the width direction of the object being conveyed; and
moving the head unit by the calculated movement amount.
Technical Field
The disclosure discussed herein relates to a conveying apparatus, a liquid ejecting apparatus, a reading apparatus, an image forming apparatus, and a control method.
Background
For an apparatus that performs processing on objects while conveying the objects, it is important to correct the processing timing or the processing position against misalignment of the objects to obtain satisfactory results. For example, the related art discloses a method of adjusting the position of a print head to improve print quality (for example, patent document 1). Specifically, according to this method, positional fluctuations in the transverse direction of a print medium, such as a web passing through a continuous sheet printing system, are first detected by sensors. Subsequently, the position of the print head in the lateral direction is adjusted so as to compensate for the positional fluctuations detected by these sensors.
List of cited documents
Patent document
[ patent document 1] Japanese unexamined patent application publication No. 2015-
Disclosure of Invention
Technical problem
However, with the technique of patent document 1, it is difficult to correct, with sufficient accuracy, the processing position in the direction orthogonal to the conveying direction in which the object is conveyed.
Solution to Problem
According to one aspect of an embodiment, a conveying apparatus includes:
a head unit configured to perform processing on an object to be conveyed in a conveying direction;
a first support member that is arranged at a position upstream in a conveying direction with respect to a processing position where the conveyed object is processed by the head unit, and that is configured to support the conveyed object;
a second support member arranged downstream of the processing position in the conveying direction and configured to support the conveyed object;
a surface detector disposed between the first support member and the second support member and configured to detect first surface information of the conveyed object;
an upstream surface detector that is arranged upstream in the conveying direction with respect to the surface detector and configured to detect second surface information of the conveyed object;
an edge detector arranged at a position closer to the upstream surface detector than the surface detector in the conveying direction and configured to detect an edge in a width direction of the conveyed object; and
a controller configured to move the head unit based on respective detection results of the surface detector and the upstream surface detector and a detection result of the edge detector.
Advantageous Effects of Invention
According to an aspect of the present invention, in the conveying apparatus, the processing position at which the object being conveyed is processed can be corrected with higher accuracy in the direction orthogonal to the conveying direction in which the object is conveyed.
Drawings
Fig. 1 is a schematic diagram illustrating an example of a transmission apparatus according to an embodiment of the present invention;
fig. 2 is a schematic top view showing an example of a liquid ejection apparatus according to a first embodiment of the present invention;
fig. 3 is a schematic side view showing another example of the liquid ejection apparatus according to the first embodiment of the invention;
fig. 4A is a diagram showing an example of the outer shape of a liquid ejection head unit according to a first embodiment of the present invention;
fig. 4B is a diagram showing an example of the outer shape of the liquid ejection head unit according to the first embodiment of the present invention;
fig. 5 is a flowchart showing an example of control relating to head unit position correction according to the first embodiment of the present invention;
fig. 6 is a diagram showing an example of functional blocks of a controller according to the first embodiment of the present invention;
fig. 7 is a perspective view showing an example of a mechanical configuration of a sensor device according to an embodiment of the present invention;
FIG. 8 is a functional block diagram illustrating an example of control using a surface detector according to an embodiment of the present invention;
FIG. 9 is a configuration diagram showing an example of a correlation calculation method according to an embodiment of the present invention;
fig. 10 is a diagram illustrating an example of a peak position search method in correlation calculation according to an embodiment of the present invention;
fig. 11 is a diagram showing an example of a calculation result of the correlation calculation according to the embodiment of the present invention;
fig. 12 is a block diagram showing an example of a hardware configuration for moving a liquid ejection head unit included in a liquid ejection apparatus according to an embodiment of the present invention;
fig. 13 is a schematic top view showing an example of a mechanism for moving a liquid ejection head unit included in a liquid ejection apparatus according to an embodiment of the present invention;
fig. 14 is a schematic top view showing an example of a liquid ejection apparatus according to a comparative example;
fig. 15 is a timing chart showing an example of a method for calculating the fluctuation amount of an object to be conveyed by the liquid ejection apparatus according to a comparative example;
fig. 16 is a diagram showing an example of a method for correcting an image printing position in the case where the read sheet edge is misaligned at the time of initial adjustment according to an embodiment of the present invention;
fig. 17 is a flowchart illustrating an example of processing for correcting an image printing position in the case where the read sheet edge is misaligned at the time of initial adjustment according to an embodiment of the present invention;
fig. 18 is a diagram showing an example of a method for correcting an image printing position in the case where the sheet edge misalignment is read in real time according to an embodiment of the present invention;
fig. 19 is a flowchart illustrating an example of a method of correcting a writing position in the sheet width direction in the case where the sheet edge misalignment is read in real time according to an embodiment of the present invention;
fig. 20 is a schematic top view showing an example of a system provided with a conveying apparatus according to a second embodiment of the present invention;
fig. 21 is a block diagram showing an example of a hardware configuration of a controller of the system according to the second embodiment of the present invention;
fig. 22 is a block diagram showing an example of a hardware configuration of a data management device included in a controller of a system according to an embodiment of the present invention;
fig. 23 is a block diagram showing an example of a hardware configuration of an image output apparatus included in a controller according to an embodiment of the present invention;
fig. 24 is a schematic top view showing an example of a reading apparatus according to a third embodiment of the present invention;
fig. 25 is a schematic side view showing another example of a reading apparatus according to a third embodiment of the present invention;
fig. 26 is a diagram showing an example of the outer shape of a head unit according to a third embodiment of the present invention;
fig. 27 is a schematic functional block diagram showing a reading apparatus according to a third embodiment of the present invention;
fig. 28 is a schematic side view showing an intermediate transfer type image forming apparatus according to a fourth embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described below with reference to the drawings. In the present specification and the drawings, the same reference numerals are assigned to parts or elements having substantially the same functional configuration, and a repetitive description of the same functional configuration will be omitted.
The first embodiment: liquid ejection apparatus
Fig. 1 is a schematic external view showing an example of a conveying apparatus according to an embodiment of the present invention. As a first embodiment of the present invention, a description is given of an example in which the head unit included in the conveying apparatus is a liquid ejection head unit (droplet ejection unit) configured to eject liquid, and the conveying apparatus is a liquid ejection apparatus (droplet ejection apparatus) configured to eject liquid.
A liquid ejection apparatus as an example of the conveying apparatus according to the present embodiment is used as an image forming apparatus in which the ejected liquid is a recording liquid. The recording liquid may be, for example, an aqueous ink, an oil-based ink, or the like. As the first embodiment, the following describes the configuration of a liquid ejection apparatus serving as the conveying apparatus.
The object to be conveyed by the liquid ejection apparatus is, for example, a web W serving as a recording medium.
Further, the web W may be a so-called continuous sheet-like printing medium or the like. In other words, the web W to be conveyed is a rollable elongated continuous paper (sheet) or the like. Note that the web W serving as the object to be conveyed is not limited to an elongated sheet, and may be so-called fanfold ("zigzag") paper or the like, which is a sheet that can be folded for storage.
Further, the
Fig. 2 is a schematic top view showing an example of a liquid ejection apparatus according to the first embodiment of the present invention. In this example, the liquid ejection apparatus includes four liquid ejection head units.
In the example of fig. 2, the liquid ejection head units are arranged in a row along the conveying direction of the web W.
Specifically, it is assumed that the liquid ejection head units are arranged in the order of black (K), cyan (C), magenta (M), and yellow (Y) from upstream to downstream in the conveyance direction of the web W. That is, the liquid ejection head unit located at the most upstream position in the conveying direction is the black (K) liquid ejection head unit (hereinafter referred to as the "black liquid ejection head unit").
The
In the configuration shown in fig. 2, a plurality of line-type liquid ejection head units are arranged along the conveying direction of the web W.
Therefore, the support roller R1 functions as a first support member at an upstream position with respect to the processing position of the head unit.
Further, actuators A1, A2, A3, and A4 configured to move the head units in a direction perpendicular to the conveying direction are connected to the respective head units.
Further, the
In fig. 2, sensor devices SE1, SE2, SE3, and SE4, which are indicated by broken lines, are each located between the adjacent support rollers that sandwich the corresponding head unit. In addition, sensor devices SE1, SE2, SE3, and SE4 are located below the respective head units.
Similarly, a sensor device (upstream surface detection sensor) SE0 indicated by a broken line is arranged upstream of the head units.
The sensor devices (surface detection sensors) SE1, SE2, SE3, and SE4 and the sensor device (upstream surface detection sensor) SE0 are provided for detecting the meandering amount or positional misalignment of the web W being conveyed. The sensor devices SE1, SE2, SE3, and SE4 each function as, or as part of, a surface detector.
An edge sensor (edge detector) ES0 is arranged upstream of the head units to detect an edge in the width direction of the web W being conveyed.
Further, an upstream support roller R0 is disposed upstream of the head units to support the web W being conveyed.
In fig. 2, the distance in the conveyance direction between the upstream surface detection sensor device SE0 and the edge sensor ES0 is shorter than the distance between the surface detector SE1 and the edge sensor ES0.
In this case, the edge sensor ES0 is preferably aligned in a direction orthogonal to the conveyance direction with the upstream sensor device SE0 serving as an upstream surface detection sensor.
In fig. 2, support rollers R2, R3, and R4 are arranged between the respective adjacent head units. However, the number of support rollers disposed between the respective adjacent head units is not limited to one; two or more support rollers may be arranged between respective adjacent head units as shown in fig. 3.
Fig. 3 is a schematic side view showing another example of a liquid ejection apparatus according to an embodiment of the present invention. In fig. 1 and 2, an explanation is given of an example of a liquid ejection apparatus in which the conveyance direction X is a horizontal direction. However, in the liquid ejecting apparatus, since tension is applied to the surface of the web W on which the liquid droplets are ejected, the conveyance direction X may be slightly curved in an upward direction, thereby conveying the web W in the Xm direction.
As shown in fig. 3, at opposite ends of the set of support rollers CR1K to CR2Y of the liquid ejection apparatus, a first nip roller NR1 and a second nip roller NR2 for conveying the web W are arranged.
In addition, it is preferable that the recording medium constituting the web W has an elongated form. Specifically, it is preferable that the length of the recording medium is longer than the distance between the first nip roller NR1 and the second nip roller NR2.
Each of the liquid ejection head units ejects liquid droplets onto the web W at a corresponding landing position.
In this example, the black ink lands at the landing position PK of the black liquid ejection head unit.
Further, in the example shown in fig. 3, two or more rollers are arranged for each liquid ejection head unit. Among the plurality of rollers, one of two adjacent rollers is arranged at an upstream position with respect to a corresponding one of the liquid ejection head units, and the other is arranged at a downstream position, so that the corresponding liquid ejection head unit is sandwiched between them. In the example of fig. 3, first rollers CR1K, CR1C, CR1M, and CR1Y for conveying the web W to the respective landing positions are arranged at upstream positions with respect to the respective liquid ejection head units. In addition, second rollers CR2K, CR2C, CR2M, and CR2Y for conveying the web W from the respective landing positions are arranged at downstream positions with respect to the respective liquid ejection head units.
Specifically, in order to land the black ink accurately, a black first roller CR1K for conveying the web W to the black landing position PK is disposed upstream of the black landing position PK. Further, a black second roller CR2K for conveying the web W is disposed at a position downstream of the black landing position PK.
In a similar manner, the first roller CR1C for cyan and the second roller CR2C for cyan are arranged with respect to the cyan liquid ejection head unit; the first and second rollers CR1M, CR2M, CR1Y, and CR2Y are likewise arranged with respect to the magenta and yellow liquid ejection head units.
When the first roller and the second roller are arranged as described above, so-called "fluttering" (flapping of the web) is reduced at each landing position. Note that the first roller and the second roller may each be, for example, driven rollers used for conveying the recording medium. Alternatively, the first roller and the second roller may be rollers driven to rotate by a motor or the like.
In the example shown in fig. 3, with respect to the
Note that the first roller as an example of the first support member and the second roller as an example of the second support member need not be rotating bodies such as driven rollers or the like. That is, the first roller and the second roller may be any supporting member configured to support an object to be conveyed. For example, the first and second support members may be tubes, shafts, or the like having a circular cross-section. Alternatively, the first support member and the second support member may be curved plates or the like having a circular arc portion that is in contact with the object to be conveyed.
Further, as shown in fig. 2 and 3, the
In addition, the
The arrangement of the sensor devices may follow either fig. 2 or fig. 3; therefore, in fig. 4 and subsequent figures, the sensor devices are described using reference numerals SE0 to SE4.
More specifically, in the example shown in fig. 2 and 3, the
In the following description, each of the head unit sensor devices SE1 to SE4 and the upstream sensor device SE0 may also be simply collectively referred to as a "sensor device". Note that the configuration and arrangement position of the sensors are not limited to those shown in the present specification and the drawings.
In fig. 2 and 3, a total of five sensor devices are shown; however, the number of sensor devices for detecting the relative position is not limited to five. That is, as shown in fig. 2, the total number of sensor devices including the number of sensor devices per head unit and the number of upstream sensor devices may be larger than the number of liquid ejection head units. For example, two or more sensor devices may be arranged for each liquid ejection head unit. Similarly, two or more upstream sensor devices may be arranged.
As the sensor device, an optical sensor using light such as infrared rays, a sensor using laser light, air pressure, photoelectric, or ultrasonic waves, or the like can be used. Note that the optical sensor may be, for example, a CCD (charge coupled device) camera, a CMOS (complementary metal oxide semiconductor) camera, or the like. A configuration example of the sensor device will be described later with reference to fig. 7.
Here, it is preferable that each head unit sensor device be disposed at a position close to the landing position of the corresponding head unit. When each head unit sensor device is arranged close to the corresponding landing position, the distance between the landing position and the sensor device decreases, and the detection error decreases accordingly. Therefore, using each head unit sensor device, the image forming apparatus can accurately detect the relative position of the recording medium between the plurality of detection results in the orthogonal direction and the conveying direction.
Specifically, a position close to the landing position is between the first roller and the second roller. That is, in the illustrated example, as shown in fig. 3, it is preferable that the black sensor device SENK be disposed between the black first roller CR1K and the black second roller CR2K; the same applies to the sensor devices for the other colors.
As described above, when each head unit sensor device is disposed between the respective rollers, each head unit sensor device can detect the position or the like of the recording medium at a position close to each landing position. In addition, the speed of movement between the rollers may be relatively stable. Therefore, the image forming apparatus can be enabled to accurately detect the position of the recording medium in the orthogonal direction.
Further, it is preferable that each head unit sensor device be arranged, between the rollers, at a position closer to the first roller than to the landing position. That is, as shown in fig. 3, it is preferable that the position at which each head unit sensor device is arranged is an upstream position with respect to the corresponding landing position.
Specifically, it is preferable that the position at which the black sensor device SENK is arranged is between the black landing position PK and the position at which the black first roller CR1K is arranged, in a direction from the black landing position PK toward upstream (hereinafter referred to as the "black upstream portion INTK2"); the same applies to the cyan upstream portion INTC2, the magenta upstream portion INTM2, and the yellow upstream portion INTY2.
When each head unit sensor device is arranged at the black upstream portion INTK2, the cyan upstream portion INTC2, the magenta upstream portion INTM2, and the yellow upstream portion INTY2, it is possible to enable the image forming apparatus to accurately detect the position of the recording medium in the orthogonal direction. Further, when each head unit sensor device is disposed at such a position, each head unit sensor device is disposed at an upstream position from the corresponding landing position. The image forming apparatus can accurately detect the position of the recording medium at an upstream position with respect to the landing position by each of the head unit sensor devices, and can calculate the timing of ejection of each liquid ejection head unit, the amount of movement of each liquid ejection head unit, or a combination thereof.
That is, when the position of the web W is detected at an upstream position with respect to the landing position and then the web W is conveyed to the landing position at a downstream position, the timing of ejecting the liquid, the amount of movement of the liquid ejection head unit, or a combination thereof is calculated. Therefore, each landing position can be accurately changed in a corresponding one of the liquid ejection head units in the conveying direction, the orthogonal direction, or both directions.
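The patent does not spell out this calculation. As a hedged sketch under simple assumptions (constant conveyance speed, a single lateral offset sample; all names and units below are illustrative, not from the patent), the ejection delay and head movement could be derived as follows:

```python
def schedule_correction(distance_to_landing_mm: float,
                        conveyance_speed_mm_s: float,
                        lateral_offset_mm: float) -> tuple[float, float]:
    """Return (delay_s, head_shift_mm): when to apply the correction and
    how far to move the head so that the correction takes effect as the
    detected portion of the web reaches the downstream landing position."""
    delay_s = distance_to_landing_mm / conveyance_speed_mm_s  # web travel time
    head_shift_mm = -lateral_offset_mm  # move the head opposite to the drift
    return delay_s, head_shift_mm
```

For example, with the detection point 500 mm upstream of the landing position and a conveyance speed of 1000 mm/s, the correction would be applied 0.5 s after detection.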
Liquid ejection head
Fig. 4A and 4B are diagrams illustrating an example of the outer shape of a liquid ejection head unit according to an embodiment of the present invention. Fig. 4A is a schematic plan view illustrating an example of the four liquid ejection head units.
As shown in fig. 4A, in this example, each liquid ejection head unit is a line-type liquid ejection head unit. That is, the head units extend across the width direction of the conveyed web W.
Further, in this example, the black (K) liquid ejection head unit is described as a representative example; the other liquid ejection head units have the same configuration.
Note that, in this example, the liquid ejection head unit is constituted by four heads arranged in a staggered manner; however, the liquid ejection head unit may be constituted by a single head covering the width direction in a row, or may be constituted by a plurality of heads continuously and closely arranged in the width direction.
Head position control
Next, an outline of the position correction control of the head unit is shown with reference to fig. 5 and 6.
Fig. 5 is a flowchart showing an example of control relating to head unit position correction according to the first embodiment of the present invention. Fig. 6 is a diagram showing an example of functional blocks of a controller of an image forming apparatus according to the first embodiment of the present invention.
In fig. 5, after sheet conveyance is started in step S1, the surface pattern (speckle pattern) reflected from the web (sheet) W is imaged (detected) in step S2 by the surface detection sensor devices (sensor devices SE0 to SE4). Subsequently, from the detection result of step S2, the sheet meandering amount is calculated (step S3).
More specifically, while the web (sheet) W is conveyed, image data is acquired at a predetermined sampling period by the sensor devices SE1, SE2, SE3, and SE4 installed below the respective head units and by the sensor device SE0 arranged upstream of the head units.
Subsequently, from the detection result of step S2, the sheet meandering amount is calculated (step S3). Specifically, based on the results of correlation calculation between the sensor devices SE1, SE2, SE3, and SE4 installed corresponding to the respective head units and the upstream sensor device SE0 arranged upstream of the head units, the relative position of the web (sheet) between the sensor results is detected, and the amount of meandering of the web (sheet) between the sensor devices is calculated.
Simultaneously with the above operation, in step S4, the edge sensor ES0 detects the position of the edge of the web W being conveyed, and calculates the edge misalignment amount in step S5.
In step S6, a head unit correction amount is calculated using the sheet meandering amount calculated in step S3 and the edge misalignment amount calculated in step S5. Based on the calculated head unit correction amount, driving commands are issued to the actuators a1 to a4 (see fig. 6) in step S7.
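The combination performed in steps S3 to S6 can be sketched as follows; this is a minimal illustration in which the function name, the sample sensor readings, and the units are assumptions for the example, not the actual interface of the apparatus.

```python
# Hypothetical sketch of steps S3 to S6: combine the sheet meandering
# amount (relative shift between two surface sensors) with the edge
# misalignment amount (shift of the sheet edge from a stored reference).
def head_correction_amount(downstream_pos, upstream_pos, edge_pos, edge_reference):
    meandering = downstream_pos - upstream_pos     # step S3: sheet meandering amount
    edge_misalignment = edge_pos - edge_reference  # step S5: edge misalignment amount
    return meandering + edge_misalignment          # step S6: head unit correction amount

# Example: the sheet drifted 0.03 mm between SE0 and SE1 and its edge sits
# 0.10 mm from the reference, so the head is moved by the sum of the two.
correction = head_correction_amount(5.03, 5.00, 12.10, 12.00)
```

A driving command corresponding to `correction` would then be issued to the actuator of the head unit (step S7).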
Thereafter, in step S8, the above detection, calculation, and correction are repeated until printing ends.
A method for calculating an edge misalignment amount based on the detected edge results obtained in steps S4 and S5 of fig. 5 is shown below with reference to fig. 6.
In fig. 6, the controller 300 calculates the head unit correction amount from the outputs of the edge sensor and the surface detection sensor devices.
The edge sensor ES0 reads the position of the edge (sheet end) of the conveyed web W.
It is assumed that a reference position serving as a reference with respect to the sheet end (edge) read by the edge sensor ES0 is set and stored in advance.
In addition, simultaneously with the edge misalignment amount calculation, the meandering amount calculator 310 calculates the sheet meandering amount from the detection results of the surface detection sensor devices.
Further, the sheet edge misalignment amount is calculated by comparing the edge position read by the edge sensor ES0 with the reference position.
The head unit correction amount calculator (head movement amount calculator) 330 combines, via the adder 331, the position alignment correction based on the sheet meandering amount during sheet conveyance calculated from the surface detection sensors with the writing position correction based on the sheet edge misalignment amount, thereby correcting the image printing position.
Therefore, the head unit can be moved so as to compensate for both the sheet meandering and the sheet edge misalignment.
Therefore, using the surface detection sensor devices SE0 to SE4, the relative positions between the respective colors with respect to the sheet W can be adjusted, and the absolute position on the sheet can be adjusted by correcting each misalignment amount with respect to the reference position. Therefore, the accuracy of the writing start position in the sheet width direction can be improved. This makes it possible to improve the print quality.
In this case, the edge sensor ES0 is preferably arranged at a position aligned in the orthogonal direction with the upstream sensor device SEN2 serving as the upstream surface detection sensor. That is, when the misalignment amount detected by the edge sensor ES0 is added to the relative misalignment amount between the upstream sensor device SEN2 and the black sensor device SENK, the absolute misalignment amount at the position of the black head unit can be obtained.
Structure of detector
Fig. 7 is a perspective view showing an example of a mechanical configuration of a sensor device according to an embodiment of the present invention. This configuration is applicable both to the sensor devices SE1 to SE4 for the respective colors (first sensor device SEN1) and to the upstream sensor device SE0 (second sensor device SEN2).
The sensor device shown in fig. 7 is configured to capture an image of an uneven paper pattern (hereinafter referred to as a speckle pattern) that becomes recognizable when the surface of an object formed of paper fibers, such as a web, is irradiated with light. Capturing an image of the speckle pattern is an example of detecting surface information of the web.
Specifically, the sensor device has a semiconductor laser light source (LD) 91 and a collimating optical system (CL) 92 as an example of a light source unit (light emitting unit). Further, in order to capture an image showing a speckle pattern or the like, the sensor device has an imaging unit including an area sensor.
As a first position detection method executable using the sensor device of fig. 7, correlation calculation and the like are performed on images captured by one sensor device at different sampling times.
The
Fig. 8 is a functional block diagram illustrating an example of control using a surface detector according to an embodiment of the present invention. For example, as shown in fig. 7, the surface detector 110F10 may be implemented by the sensor apparatus SEN alone, or by the sensor apparatus SEN together with a part of hardware such as the RAM223 (see fig. 12) of the controller.
In fig. 8, an explanation is given of an example of a combination of the respective detectors of the black liquid ejection head unit 210K and the cyan liquid ejection head unit 210C.
In this example, the surface detector 110F10 is provided for each of the black liquid ejection head unit and the cyan liquid ejection head unit.
Further, the actuators a1 and a2 are configured to adjust the positions of the black liquid ejection head unit and the cyan liquid ejection head unit, respectively.
In fig. 8, the surface detector 110F10 for the black liquid ejection head unit is described as a representative example.
As shown, the surface detector 110F10 includes an image merging unit 142A.
The
The image merging unit 142A acquires an image (captured image) imaged by the area sensor of the sensor device.
The
As a second movement amount detection method, based on the respective image data stored in the storage device, the images captured by two sensor devices arranged apart in the conveying direction are compared with each other.
Further, the meandering amount calculator calculates the meandering amount of the web W from the comparison result.
The web W has scattering properties on its surface or in its interior. Therefore, when the web W is irradiated with laser light, the reflected light is diffusely reflected, and the diffuse reflection creates a pattern on the surface of the web W. This pattern is a so-called speckle pattern. Thus, when an image of the web W is captured, an image indicating the speckle pattern is obtained. Since the position of the speckle pattern is identified from the captured image, a predetermined position on the web W can be detected. Note that the speckle pattern is generated by interference of the irradiating laser light due to an uneven pattern (paper pattern) formed on the surface or inside of the web W.
Further, the light source is not limited to a device using laser light. For example, the light source may be an LED (light emitting diode), an organic EL (electroluminescence) element, or the like. Depending on the type of light source, the pattern need not be a speckle pattern. In the following description, the pattern is assumed to be a speckle pattern.
Thus, when the web W is conveyed, the speckle pattern of the web W moves together with the web W. Therefore, when the two sensor devices capture images of the same portion of the web W at different times, the same speckle pattern appears in both captured images.
That is, when the same speckle pattern is detected by the two sensor devices, the movement amount of the web W between the two detection positions can be calculated from the detection results.
More specifically, the cross-correlation calculation is performed on the image data "D1(n)" and "D2(n)" indicating the respective images captured by the two sensor devices.
Further, the head unit is moved based on the movement amount calculated in this manner.
Further, the
As mentioned above, the detection results indicating the position of the web W may be obtained by one or more detectors using a speckle pattern. As a result, the image forming apparatus can improve the accuracy of detecting the position of the web in the direction orthogonal to the conveyance direction.
Further, the correlation calculation in the first movement amount detection method, which uses one sensor of the detector in fig. 7, is performed, for example, as follows.
Correlation calculation examples
Fig. 9 is a configuration diagram showing an example of a correlation calculation method according to an embodiment of the present invention. For example, the speckle pattern is acquired by a detector, and then the correlation calculation is performed by the meandering amount calculator.
Specifically, the meandering amount calculator includes a first two-dimensional fourier transform unit FT1, a second two-dimensional fourier transform unit FT2, a correlated image data generator DMK, a peak position search unit SR, and a calculator CAL.
The first two-dimensional fourier transform unit FT1 transforms the first image data D1. Specifically, the first two-dimensional fourier transform unit FT1 includes an orthogonal direction fourier transform unit FT1a and a transfer direction fourier transform unit FT1 b.
The orthogonal direction fourier transform unit FT1a performs one-dimensional fourier transform on the first image data D1 in the orthogonal direction. Then, the transfer direction fourier transform unit FT1b performs one-dimensional fourier transform on the first image data D1 in the transfer direction based on the transform result obtained by the orthogonal direction fourier transform unit FT1a. Therefore, the orthogonal direction fourier transform unit FT1a and the transfer direction fourier transform unit FT1b perform one-dimensional fourier transform in the orthogonal direction and the transfer direction, respectively. The first two-dimensional fourier transform unit FT1 outputs the thus obtained transform result to the related image data generator DMK.
Similarly, the second two-dimensional fourier transform unit FT2 transforms the second image data D2. Specifically, the second two-dimensional fourier transform unit FT2 includes an orthogonal direction fourier transform unit FT2a, a transfer direction fourier transform unit FT2b, and a complex conjugate unit FT2 c.
The orthogonal direction fourier transform unit FT2a performs one-dimensional fourier transform on the second image data D2 in the orthogonal direction. Then, the transfer direction fourier transform unit FT2b performs one-dimensional fourier transform on the second image data D2 in the transfer direction based on the transform result obtained by the orthogonal direction fourier transform unit FT2a. Therefore, the orthogonal direction fourier transform unit FT2a and the transfer direction fourier transform unit FT2b perform one-dimensional fourier transform in the orthogonal direction and the transfer direction, respectively.
Next, the complex conjugate unit FT2c calculates the complex conjugate of the transform result obtained by the orthogonal direction fourier transform unit FT2a and the transfer direction fourier transform unit FT2b. Then, the second two-dimensional fourier transform unit FT2 outputs the complex conjugate calculated by the complex conjugate unit FT2c to the correlated image data generator DMK.
Subsequently, the related image data generator DMK generates related image data based on the transformation result of the first image data D1 output from the first two-dimensional fourier transform unit FT1 and the transformation result of the second image data D2 output from the second two-dimensional fourier transform unit FT2.
The correlation image data generator DMK includes an integrator DMKa and a two-dimensional inverse fourier transform unit DMKb.
The integrator DMKa integrates the transformation result of the first image data D1 and the transformation result of the second image data D2. Then, the integrator DMKa outputs the integration result to the two-dimensional inverse fourier transform unit DMKb.
The two-dimensional inverse fourier transform unit DMKb performs two-dimensional inverse fourier transform on the integration result obtained by the integrator DMKa. As described above, when the two-dimensional inverse fourier transform is performed, correlated image data is generated. Then, the two-dimensional inverse fourier transform unit DMKb outputs the relevant image data to the peak position search unit SR.
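Assuming NumPy's FFT routines stand in for the units FT1, FT2, FT2c, DMKa, and DMKb described above, the correlation-image generation of fig. 9 can be sketched as follows; the array size and the random test pattern are illustrative.

```python
import numpy as np

def correlation_image(d1, d2):
    """Cross-correlate two captured images via two-dimensional fourier transforms."""
    f1 = np.fft.fft2(d1)                  # first two-dimensional transform (FT1)
    f2 = np.conj(np.fft.fft2(d2))         # second transform plus complex conjugate (FT2, FT2c)
    product = f1 * f2                     # integration (DMKa)
    corr = np.fft.ifft2(product)          # two-dimensional inverse transform (DMKb)
    return np.fft.fftshift(np.abs(corr))  # peak offset from the centre = relative shift

# A speckle-like random patch, shifted by 3 pixels along the conveying axis
rng = np.random.default_rng(0)
d1 = rng.random((64, 64))
d2 = np.roll(d1, 3, axis=0)

corr = correlation_image(d1, d2)
peak_row, peak_col = np.unravel_index(np.argmax(corr), corr.shape)
row_shift = 32 - peak_row  # recovers the 3-pixel shift under this sign convention
```

The offset of the correlation peak from the centre of the correlation image gives the relative displacement of the web between the two captures, which the peak position search unit then refines.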
The peak position searching unit SR searches the generated related image data for the peak position at which the luminance (peak value) rises most steeply. The related image data holds values representing the magnitude of light intensity, i.e., luminance, arranged in a matrix.
In the related image data, the luminance values are arranged at the pixel pitch of the area sensor, i.e., at pixel-size intervals. Therefore, it is preferable to search for the peak position after performing so-called sub-pixel processing. When the sub-pixel processing is performed, the peak position can be searched for with high accuracy. Therefore, the meandering amount can be calculated with high accuracy.
For example, as shown in fig. 10, the search by the peak position search unit SR is performed.
Fig. 10 is a diagram illustrating an example of a peak position search method in correlation calculation according to an embodiment of the present invention. In fig. 10, the horizontal axis indicates a position in the conveying direction in the image represented by the related image data, and the vertical axis indicates the luminance of the image represented by the related image data.
Hereinafter, three data values in the luminance indicated by the related image data are taken as an example: the first data value q1, the second data value q2, and the third data value q3. That is, in this example, the peak position searching unit SR (see fig. 9) searches for the peak position P on a curve k connecting the first data value q1, the second data value q2, and the third data value q3.
First, the peak position search unit SR calculates each difference in luminance of an image represented by the related image data. Then, the peak position search unit SR extracts a combination of data values having the largest difference value among the calculated differences. Next, the peak position search unit SR extracts respective data value combinations adjacent to the combination of the data values having the largest difference value. In this way, the peak position search unit SR is enabled to extract three data, for example, the first data value q1, the second data value q2, and the third data value q3 shown in fig. 10.
Then, when the curve k is calculated by connecting the three extracted data, the peak position search unit SR is enabled to search for the peak position P. In this way, the peak position search unit SR can reduce the amount of calculation such as sub-pixel processing and can search for the peak position P at a higher speed. Note that the position where the combination of data values has the largest difference is the steepest position. The sub-pixel processing may be processing other than the above processing.
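The three-point search described above amounts to fitting a parabola through the extracted data values q1, q2, and q3 and taking its vertex as the peak position P; the sketch below is one common formulation of such sub-pixel interpolation, with illustrative sample values.

```python
# Parabolic sub-pixel interpolation: fit a parabola through the samples
# (-1, q1), (0, q2), (+1, q3) around the coarse maximum and return the
# vertex position as an offset from the centre sample, in pixel units.
def subpixel_peak(q1, q2, q3):
    denom = q1 - 2.0 * q2 + q3
    if denom == 0.0:  # flat triple: no sub-pixel refinement possible
        return 0.0
    return 0.5 * (q1 - q3) / denom

# Here the true peak lies a quarter pixel past the centre sample
offset = subpixel_peak(0.7, 1.0, 0.9)
```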
As described above, when the peak position searching unit SR searches for the peak position, for example, the calculation result as shown in fig. 11 is obtained.
Fig. 11 is a diagram showing an example of a calculation result of correlation calculation according to an embodiment of the present invention. Fig. 11 is a graph showing a correlation intensity distribution of the cross-correlation function. In the figure, the X-axis and the Y-axis indicate the serial number of the pixel. The peak position like the "correlation peak" shown in fig. 11 is searched by the peak position searching unit SR (see fig. 9).
Returning to fig. 9, the calculator CAL calculates the relative position, the movement amount, the movement speed, and the like of the web W. For example, when calculating the difference between the center position of the related image data and the peak position detected by the peak position search unit SR, the calculator CAL is enabled to calculate the relative position and the moving amount of the web W.
Furthermore, based on the relative position obtained, the calculator CAL is enabled to calculate the speed of movement of the web W.
As described above, the meandering amount calculator can calculate the relative position, the movement amount, the movement speed, and the like of the web W by the correlation calculation.
Another example of detection calculation
First, the
Note that fig. 9 shows a diagram representing an example of generation of a ripple in the Y direction (width direction); however, when the fluctuation is generated in the X direction (conveying direction), the peak position occurs at a position shifted also in the X direction.
In addition, the meandering amount calculator can calculate the shift amount in the conveying direction from the peak position in the same manner.
Note that the setting of the relative position detection range for detecting the speckle pattern in the detector is described in detail in japanese patent application No. 2017-027481.
Head position moving structure
Fig. 12 is a block diagram showing an example of a hardware configuration for moving a liquid ejection head unit included in a liquid ejection apparatus according to an embodiment of the present invention. The configuration shown in fig. 12 is an example in which the controller 300 moves each liquid ejection head unit.
The controller 300 has the CPU221, the RAM223, the speed detection circuit SCR, and the like.
Note that the hardware configuration is not limited to the illustrated example. That is, each illustrated device may be included in the image forming apparatus or may be an external device.
Further, each of the illustrated devices may be shared or may be separately provided. For example, the CPU221 or the like may be used to implement the meandering amount calculator.
The CPU221 is an example of an arithmetic device and a control unit. Specifically, the CPU221 acquires the detection results of the respective sensors, and performs calculation and the like to calculate the amount of change in the object to be conveyed. Further, the CPU221 controls each actuator and performs control for moving each head unit, and the like.
The speed detection circuit SCR is an electronic circuit configured to detect the moving speed of the object to be conveyed. For example, a "6 ppi" signal or the like is input to the speed detection circuit SCR. Next, the speed detection circuit SCR calculates the speed at which the object is conveyed based on the detection result from each sensor, the detection result from the encoder, and the like, and sends the calculated result to the CPU221 and the like. Note that the speed detection circuit SCR may be implemented by the CPU221.
The above-described edge sensor ES0, the sensor devices of the respective colors (surface detection sensors) SE1, SE2, SE3, and SE4, and the upstream sensor device SE0 are connected to the controller 300.
The detection result obtained by each sensor is transmitted to the controller (control device) 300 via a corresponding input/output interface (I/O) and the bus.
Further, the detection result obtained by the edge sensor ES0 is also sent to the controller 300 in the same manner.
In fig. 12, the first actuator a1 is connected to the black liquid ejection head unit.
Similarly, the second actuator a2 is connected to the cyan liquid ejection head unit.
Each actuator moves a corresponding one of the liquid ejection head units, for example, by a moving mechanism as shown in fig. 13.
Fig. 13 is a schematic top view showing an example of a moving mechanism for moving the liquid ejection head unit included in the liquid ejection apparatus according to the embodiment of the present invention. The moving mechanism is realized by hardware or the like shown in fig. 13, for example. The illustrated example is an example of a moving mechanism configured to move the cyan liquid ejection head unit 210C.
In the example of fig. 13, first, an actuator (second actuator a2) such as a linear actuator configured to move the cyan liquid ejection head unit 210C is provided.
The actuator a2 is, for example, a linear actuator or a motor. The actuator a2 may also include control circuitry, power circuitry, mechanical components, and the like.
The detection result obtained by using the set detection range is input to the controller 300.
In the example of fig. 13, the position correction amount calculated from the detection result is, for example, the fluctuation Δ. Therefore, in this example, the actuator a2 moves the cyan liquid ejection head unit by the fluctuation Δ in the width direction.
Comparative example
Fig. 14 is a top view of a liquid ejection apparatus according to a comparative example. The configuration of the comparative example shown in fig. 14 does not include the edge sensor ES0, which is different from the configuration in fig. 2 described above.
In this comparative example, the driving amounts of the respective actuators are indicated so as to move the positions of the respective liquid ejection head units based on the calculated meandering amount.
Fig. 15 is a timing chart showing an example of a method for calculating the fluctuation amount (meandering amount) of an object to be conveyed by the liquid ejection apparatus according to the comparative example shown in fig. 14. The control device of the comparative example calculates the fluctuation amount based on the detection result using two or more fourth detection ranges. Specifically, the control device outputs a calculation result representing the amount of fluctuation based on two types of acquired data output from two sensors.
First, the sensor devices SE0 and SE2 acquire surface information. The acquired data shown in the upper part is the upstream detection result output from the upstream sensor device SE0, and the acquired data shown in the lower part is the downstream detection result output from the sensor device SE2.
Subsequently, each acquired data is sent to the control device, and the detection result is stored in a storage device such as the RAM223.
The fluctuation amount of each liquid ejection head unit is calculated. An example of calculating the fluctuation amount of the most upstream black liquid ejection head unit is described below.
In this example, assume that the interval between the upstream sensor device SE0 and the black sensor device SE1, that is, the distance between the sensors, is "L2". Further, it is assumed that the moving speed detected by the speed detection circuit SCR is "V". In addition, it is assumed that the moving time taken to convey the object from the position of the upstream sensor device SE0 to the position of the black sensor device SE1 is "T2". In this case, the moving time is calculated as "T2 = L2/V".
Further, assume that the sampling interval of the sensor is "a". Further, assume that the number of samples taken between the upstream sensor device SE0 and the black sensor device SE1 is "n". In this case, the number of samples is calculated as "n = T2/a".
The illustrated calculation result, that is, the fluctuation amount, is denoted by "ΔX". For example, as shown in fig. 15, when the detection period is "0", the fluctuation amount is calculated by comparing the first detection result S1 obtained the movement time "T2" earlier with the second detection result S2 at the detection period "0". Specifically, the fluctuation amount is calculated as "ΔX = X2(0) - X1(n)". Since the sensor is positioned closer to the first roller than the landing position, the image forming apparatus calculates the fluctuation of the position of the recording medium (sheet) up to the sensor device and drives the actuator to compensate for the deviation.
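The timing relations "T2 = L2/V" and "n = T2/a" and the difference "ΔX = X2(0) - X1(n)" can be checked with a small numeric sketch; the distance, speed, sampling interval, and sensor readings below are illustrative values only, not those of the apparatus.

```python
L2 = 0.5    # distance between SE0 and SE1 in metres (illustrative)
V = 1.0     # conveying speed in metres per second (illustrative)
a = 0.005   # sampling interval in seconds (illustrative)

T2 = L2 / V        # travel time from SE0 to SE1: T2 = L2/V
n = round(T2 / a)  # number of samples in between: n = T2/a

# The point of the web that SE0 read n sampling periods ago is now under
# SE1, so the width-direction fluctuation is the difference of the two
# readings. x1_history[k] is the SE0 reading k periods before period 0.
x1_history = [0.010 + 0.0001 * k for k in range(n + 1)]  # SE0 log of a drifting web
x2_now = 0.023                                           # SE1 reading at period 0
delta_x = x2_now - x1_history[n]                         # ΔX = X2(0) - X1(n)
```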
Next, the image forming apparatus controls the first actuator a1 to move the black liquid ejection head unit by the calculated fluctuation amount.
As shown in the drawing, when the fluctuation amount is calculated based on two detection results, that is, detection results obtained by two sensor devices, the fluctuation amount can be calculated without integrating the detection results of the sensor devices. Therefore, when the fluctuation amount is calculated based on the two detection results in this manner, the accumulated detection error added by each sensor can be reduced.
Note that the calculation of the fluctuation amount may be similarly performed in other liquid ejection head units. For example, based on two detection results obtained by the black sensor device SE1 and the cyan sensor device SE2, the fluctuation amount for the cyan liquid ejection head unit can be calculated.
Similarly, the fluctuation amount of the magenta liquid ejection head unit can be calculated.
The detection result used as the upstream detection result is not limited to the detection result detected by the sensor mounted immediately upstream of the liquid ejection head unit to be moved. That is, the upstream detection result may be a detection result detected by any sensor installed upstream of the liquid ejection head unit to be moved. For example, the fluctuation amount of the yellow liquid ejection head unit may be calculated using the detection result of the upstream sensor device SE0 as the upstream detection result.
In contrast, it is preferable that the downstream detection result is a detection result obtained by a sensor installed at a position closest to the liquid ejection head unit to be moved.
Further, the fluctuation amount of the web (the amount of meandering in the width direction) may be calculated based on three or more detection results.
As described above, the liquid ejection head unit may be moved based on the fluctuation amount calculated from the two or more detection results, and liquid is ejected onto the web to form an image or the like on the recording medium.
In the comparative example, meandering in the width direction is calculated using two or more detection results. Thus, the head units are moved by the respective actuators to compensate for the amount of fluctuation of the web. That is, the landing positions of the droplets ejected from the respective nozzles of the heads of the head unit on the sheet may be aligned with each other in the width direction.
However, when the position in the width direction of the sheet at the sensor position upstream of the head units deviates from the reference position, the comparative example cannot correct the absolute misalignment of the writing position on the sheet, because only relative fluctuations between the sensors are compensated.
Head position control of the present invention
In contrast to the comparative example described above, according to the embodiment of the present invention, the sheet meandering amount compensation is combined with the compensation of the position fluctuation with respect to the absolute reference position based on the sheet edge shift amount, and the head unit is moved accordingly (writing position correction) to adjust the image printing position.
Note that the reading of the edge misalignment may be performed only at the time of initial adjustment, or may be performed in real time.
Hereinafter, a method of correcting the image printing position in the case where the sheet edge misalignment is read at the time of initial adjustment is described with reference to fig. 16 and 17, and a method of correcting the image printing position in the case where the sheet edge misalignment is read in real time is described with reference to fig. 18 and 19.
Position correction at initial adjustment
The following illustrates image printing position correction in the case where the sheet edge misalignment is read at the time of initial adjustment according to an embodiment of the present invention. Fig. 16 is a diagram illustrating an example of a method for correcting an image printing position in this case.
In fig. 16, first, in order to perform initial adjustment before printing, the sheet is conveyed to acquire sheet edge position data (Δ XE 0). The sheet edge position data is temporarily stored in a data storage device (for example, the RAM223 in fig. 12) for writing position correction described later.
The sensor devices SE0 to SE4 shown in fig. 2 constantly merge image data when conveying a sheet at the time of printing.
In this adjustment, in addition to the sheet meandering amount calculated from the acquisition data acquired via the sensor devices directly below the respective head units and via the sensor upstream of them, the sheet edge misalignment amount acquired at the time of initial adjustment is used for the writing position correction.
As an example, a method for calculating the correction amount of the image printing position in the most upstream black liquid ejection head unit is described below.
In fig. 16, based on a comparison between the detection result X0(-n) of the upstream sensor device SE0 obtained earlier and the detection result X1(0) of the black sensor device SE1, the meandering amount ΔX1(0) at the position of the black sensor device SE1 at the detection period "0" is calculated by formula (1) below.
ΔX1(0)=X1(0)-X0(-n)-----(1)
Further, the sheet edge misalignment amount Δ XE0 is calculated based on a comparison between the sheet edge position data acquired at the time of initial adjustment and the reference position.
Then, the image printing position correction amount Δ X is calculated by adding the sheet edge misalignment amount Δ XE0 and the meandering amount Δ X1 (0).
ΔX(0)=ΔX1(0)+ΔXE0=X1(0)-X0(-n)+ΔXE0-----(2)
As described above, the image printing position correction is performed using the two data of the sheet meandering amount ΔX1(0) and the sheet edge misalignment amount ΔXE0.
Similarly, the correction amounts for the other liquid ejection head units are calculated.
In this way, the amount of movement of the head unit is calculated as the correction amount of the image printing position using the sheet edge misalignment amount Δ XE0 at the time of initial adjustment and the sheet meandering amount that changes in real time.
In this case, as shown in fig. 2, the upstream sensor device SE0 that outputs the most upstream detection result is arranged at the same position as the edge sensor ES0 in the width direction, which is a direction orthogonal to the conveyance direction (orthogonal direction). Therefore, it is possible to detect the misalignment amount of the detected position of the speckle pattern at the position of the upstream sensor device SE0 with respect to the reference position.
In this way, the absolute positional misalignment can be corrected in addition to the amount of change (meandering amount) calculated by the meandering amount calculator.
The following illustrates an operation of correcting the image printing position in the case where the sheet edge misalignment is read at the time of initial adjustment. Fig. 17 is a flowchart illustrating an example of a method for correcting the image printing position in this case.
First, when an image forming apparatus as an example of a conveying apparatus is started by START, the controller determines whether the initial adjustment mode is to be executed (step S101).
Subsequently, when the initial adjustment mode is executed (yes in S101), the sheet is conveyed, and the edge sensor ES0 acquires the sheet edge position to calculate the misalignment amount.
When the detection of the edge position and the calculation of the misalignment amount are completed, the calculated sheet edge misalignment amount is stored in the data storage device (for example, the RAM223 in fig. 12).
Upon ending the initial adjustment mode, the printing operation is started.
Note that, in the case where the initial adjustment mode is not required (no in S101), the process proceeds directly to the printing operation.
When the printing operation is started and sheet conveyance is started, the meandering amount calculator 310 (see fig. 6) detects the meandering amount of the conveyed sheet based on the acquired data acquired via the plurality of sensor devices SE0 to SE4 (step S107).
In step S108, the head unit correction amount calculator 330 calculates the image printing position correction amount using the detected meandering amount and the sheet edge misalignment amount stored at the time of initial adjustment.
Then, the actuators are driven based on the calculated correction amount to move the head units (step S109).
In this adjustment, detection of the sheet meandering amount, calculation of the image printing position correction amount, and the correction operation are repeatedly performed in steps S107 to S109 until the printing ends (yes in step S110).
In this adjustment, since the sheet edge misalignment amount is calculated only at the time of initial adjustment, only the sheet meandering amount needs to be calculated during printing. Therefore, the position adjustment of the head unit can be performed with high accuracy while minimizing the load on the controller, for example, the CPU221 (see fig. 12).
Real-time position correction
The following illustrates image printing position correction in the case of reading the sheet edge misalignment in real time according to an embodiment of the present invention. Fig. 18 is a diagram showing an example of a method for correcting an image printing position in the case of reading the sheet edge misalignment in real time.
In this adjustment, during conveyance of the sheet at the time of printing, the sensor devices SE0 to SE4 constantly merge image data, and the edge sensor ES0 constantly merges sheet edge position data. Specifically, during conveyance of the sheet, i.e., during printing, the sheet meandering amount is calculated based on the acquired data acquired from the sensor devices directly below the respective head units and acquired from the sensors upstream of these sensor devices, and the sheet edge misalignment amount is calculated based on the sheet edge position data. Subsequently, image printing position correction is performed based on relative position correction of the respective colors obtained by calculating the sheet meandering amount and based on writing position correction obtained by calculating the sheet edge misalignment amount.
As an example, a method for calculating the correction amount of the image printing position in the most upstream black liquid ejection head unit is described below. The meandering amount ΔX1(0) is calculated in the same manner as in formula (1):
ΔX1(0)=X1(0)-X0(-n)-----(3)
Further, in this adjustment, the sheet edge misalignment amount ΔXE0(-n) is calculated at the same timing as the acquisition of the data X0(-n) of the upstream sensor device SE0.
Then, in a manner similar to formula (2), the image printing position correction amount ΔX is calculated by combining the meandering amount ΔX1(0) and the sheet edge misalignment amount ΔXE0(-n) calculated at the same timing.
ΔX(0)=ΔX1(0)+ΔXE0(-n)=X1(0)-X0(-n)+ΔXE0(-n)-----(4)
Therefore, the image printing position correction is performed in real time using two data, which are the sheet meandering amount and the sheet edge misalignment amount.
Similarly, the correction amounts for the other liquid ejection head units are calculated.
Further, the sheet edge misalignment amount may be calculated by taking a moving average of the newly acquired data, or by excluding acquired data containing noise using a filter. By processing the data in this manner, it is possible to avoid adverse effects of sheet edge chipping and of noise occurring when the sensor samples the data during printing. Therefore, an accurate image printing position can be obtained.
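The smoothing mentioned above can be sketched as a moving average that first discards outlying samples (such as a chipped edge or sensor noise); the window size, threshold, and sample values are illustrative assumptions, not parameters of the apparatus.

```python
# Moving average of the newest `window` edge samples, excluding samples
# that deviate from the window median by more than `threshold`.
def filtered_edge_position(samples, window=5, threshold=0.5):
    recent = samples[-window:]
    median = sorted(recent)[len(recent) // 2]
    kept = [s for s in recent if abs(s - median) <= threshold]
    return sum(kept) / len(kept)

# One spurious sample (9.0, e.g. from a chipped edge) is rejected before averaging
edge = filtered_edge_position([10.0, 10.1, 9.0, 10.2, 10.1])
```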
The following illustrates an operation of image printing position correction in the case of reading a sheet edge misalignment in real time. Fig. 19 is a flowchart illustrating an example of a method for correcting a writing position in a sheet width direction in the case of reading a sheet edge misalignment in real time according to the present invention.
First, when an image forming apparatus as an example of a conveying apparatus is started, conveyance of a sheet is started together with printing (step S201).
When conveyance of the sheet is started, the meandering amount calculator 310 (see fig. 6) detects the sheet meandering amount of the conveyed sheet based on the data acquired via the plurality of sensor devices SE0 to SE4 (step S202).
Simultaneously with S202, the edge sensor ES0 acquires the edge position of the sheet, and detects the sheet edge misalignment amount based on the acquired data (step S203).
In step S204, the image printing position correction amount is calculated from the meandering amount and the sheet edge misalignment amount.
Then, the actuator moves the head unit based on the calculated correction amount (step S205).
Until printing ends (NO in step S206), steps S202 to S205 are repeated: meandering amount detection, sheet edge misalignment amount detection, and calculation and application of the image printing position correction based on these two amounts.
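The loop over steps S202 to S206 can be sketched as follows; the sensor and actuator objects are illustrative stand-ins for hardware interfaces that the text does not specify.

```python
# Hedged sketch of the control loop in steps S201-S206. The objects passed
# in are assumed interfaces, not the embodiment's actual components.

def printing_loop(sensors, edge_sensor, actuator, printing_done):
    """Repeat detection and correction until printing ends (step S206)."""
    while not printing_done():                    # step S206: end of printing?
        meander = sensors.meandering_amount()     # step S202
        edge = edge_sensor.misalignment_amount()  # step S203
        correction = meander + edge               # step S204, cf. formula (4)
        actuator.move_head(correction)            # step S205
```

In a real apparatus the loop period would be tied to the sensor sampling cycle; here the loop simply runs until the completion callback reports that printing has ended.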
In this control, the image printing position correction is performed in real time at predetermined intervals during printing. Even if sheet edge misalignment occurs during printing, the position of the head unit can be adjusted continuously, so more accurate position adjustment is achieved and a higher quality image can be printed.
Second embodiment: image forming system
Fig. 20 is a schematic top view showing an example of a system provided with a conveying apparatus according to the second embodiment of the present invention. In the first embodiment, the conveying apparatus is described as a stand-alone image forming apparatus; however, the image forming apparatus may be one apparatus within an image forming system.
As illustrated in fig. 22, the
The first
The
The first
The web W discharged from the first
The second
The
Control block
Next, a control configuration in the
For example, in the present embodiment, the
The
The printer controller 72C controls the operation of the
The printer controller 72C includes a CPU72Cp, a print control device 72Cc, and a
The CPU72Cp controls the operation of the
The print control device 72Cc transmits and receives data indicating commands, statuses, and the like to and from the
The data lines 70LD-C, 70LD-M, 70LD-Y, and 70LD-K, i.e., a plurality of data lines, are connected to the
The
Fig. 22 is a block diagram showing an example of a hardware configuration of a data management apparatus included in a controller according to an embodiment of the present invention. For example, the plurality of data management apparatuses have the same configuration. An example in which the plurality of data management apparatuses have the same configuration is shown below, taking the data management apparatus 72EC as a typical example.
The data management apparatus 72EC has a logic circuit 72ECl and a storage device 72ECm.
For example, the data management device 72EC may execute the function of, for example, a controller (control unit) 300 shown in fig. 5 to operate the actuators AC configured to move the positions of the
The logic circuit 72ECl stores the image data input from the
In addition, the logic circuit 72ECl reads the cyan image data Ic from the storage device 72ECm based on a control signal input from the printer controller 72C. Next, the logic circuit 72ECl transmits the read cyan image data Ic to the
Note that the storage device 72ECm preferably has a capacity capable of storing about three pages of image data. With a capacity of about three pages, the storage device 72ECm can continue to receive newly input image data while earlier pages are being read out.
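The store-and-forward role described for the storage device 72ECm can be sketched as a small bounded page buffer; the three-page capacity follows the text above, while the class and its methods are illustrative assumptions, not the 72ECm hardware interface.

```python
from collections import deque

# Hedged sketch: a buffer holding about three pages of image data, so a new
# page can be accepted while an earlier page is output to the head units.
class PageBuffer:
    def __init__(self, capacity: int = 3):
        self.pages = deque()     # pages in arrival order
        self.capacity = capacity

    def store(self, page: bytes) -> bool:
        """Store a page; refuse (back-pressure) when the buffer is full."""
        if len(self.pages) >= self.capacity:
            return False
        self.pages.append(page)
        return True

    def read(self) -> bytes:
        """Read out the oldest stored page for output."""
        return self.pages.popleft()
```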
Fig. 23 is a block diagram showing an example of a hardware configuration of an image output apparatus included in a controller according to an embodiment of the present invention. As shown in fig. 23, the image output device 72Ei includes an output control device 72Eic.
The output control device 72Eic outputs image data of each color to the liquid ejection head units of each color. That is, the output control device 72Eic controls the liquid ejection head units of the respective colors based on the input image data.
The output control device 72Eic controls the plurality of liquid ejection head units simultaneously or individually. That is, upon receiving a timing input, the output control device 72Eic changes the timing at which each liquid ejection head unit ejects liquid. Note that the output control device 72Eic may control any liquid ejection head unit based on a control signal input from the printer controller 72C (fig. 21), or based on an operation by a user or the like.
Note that the
The
The conveyance control device 72Ec (fig. 21) controls a motor or the like configured to convey the web W. For example, the conveyance control device 72Ec controls a motor connected to each roller to convey the web W.
Third embodiment: reading apparatus
Fig. 24 is a schematic top view showing a configuration example of a reading apparatus according to a third embodiment of the present invention. In the first and second embodiments described above, examples are given in which each head unit included in the conveying apparatus is a liquid ejection head unit configured to eject liquid, and the conveying apparatus is a liquid ejection apparatus; however, the head unit may be a reading unit (scanner). In this case, the conveying apparatus functions as a reading apparatus (inspection apparatus) configured to perform reading.
The
The head unit is configured to include one or more reading heads mounted in a direction orthogonal to the conveyance direction X (orthogonal direction). For example, as shown in fig. 24, the reading apparatus includes a head unit HD1 and a head unit HD2.
As shown in fig. 24, the head unit HD1 and the head unit HD2 include one or more reading heads CIS1 and CIS2, respectively. In fig. 24, one CIS head is provided in each head unit HD; however, the head unit HD1 may be provided with a reading head CIS3, for example, at a position forming a staggered arrangement with the reading head CIS1 and the reading head CIS2.
The head unit HD1 and the head unit HD2 each constitute a reading unit, a so-called scanner. Accordingly, the head unit HD1 and the head unit HD2 each read an image or the like formed on the surface of the web W and output image data representing the read image or the like. When the conveying
Note that, in fig. 24, an illustration is given of an example in which the support rollers R1 and R2 are not arranged between the head units of the conveying
Fig. 25 is a schematic side view showing another example of the reading apparatus of fig. 24. As in the first embodiment shown in fig. 3, two pairs of pinch rollers NR1, NR2,
Further, the conveying
The controller CT1 and the actuator controller CT2 are information processing devices. Specifically, the controller CT1 and the actuator controller CT2 have a hardware configuration including a CPU, an electronic circuit, or an arithmetic device such as a combination of these devices, a control device, a storage device, an interface, and the like. The controller CT1 and the actuator controller CT2 may each be a plurality of devices.
Note that the mounting positions of the sensor devices SE1 and SE2 are preferably arranged in the same manner as in fig. 3.
That is, in fig. 24 as well, the distance in the conveyance direction between the upstream surface detection sensor device SE0 and the edge sensor ES0 is shorter than the distance between the surface detection sensor device SE1 and the edge sensor ES0.
Example of head unit processing positions
Fig. 26 is a schematic view showing a head unit process position according to an embodiment of the present invention. For example, the head CIS1 of the head unit HD1 and the head CIS2 of the head unit HD2 are mounted so as to have the positional relationship shown in fig. 26. Further, each of the heads CIS1 and CIS2 has a plurality of CIS elements arranged in a line, and has a plurality of reading regions Rs associated with the respective CIS elements.
Specifically, the reading head CIS1 of the head unit HD1 reads the reading range SC1 in the orthogonal direction Y to generate read image data. The reading head CIS2 of the head unit HD2 reads the reading range SC2 in the orthogonal direction Y to generate read image data. As shown in fig. 26, the reading range SC1 and the reading range SC2 partially overlap. Hereinafter, the range in which the reading range SC1 and the reading range SC2 overlap is referred to as the "overlapping range SC3".
Therefore, in the overlapping range SC3, the head unit HD1 and the head unit HD2 can read the same object. That is, since the object read by the head unit HD1 in the overlapping range SC3 is conveyed from the upstream side to the downstream side, it is possible to cause the head unit HD2 to read the same object after a predetermined time. Note that since the interval between the head unit HD1 and the head unit HD2 can be obtained in advance, the conveying apparatus can determine the predetermined time from this interval and the conveyance speed.
Then, the
As described above, by arranging the respective head units at different positions and combining the image data read by each of them, the conveying apparatus can read a range wider than the reading range of a single head unit.
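One way to combine the data read by the two head units in the overlapping range could look like the following sketch; the pixel-list representation and the simple averaging blend are assumptions for illustration, not the embodiment's method.

```python
# Hedged sketch: stitch one scan line from head CIS1 with the matching line
# from head CIS2, averaging the pixels that fall in the overlapping range.

def stitch_lines(line1, line2, overlap):
    """line1/line2 are lists of pixel values; the last `overlap` pixels of
    line1 cover the same object as the first `overlap` pixels of line2."""
    if overlap == 0:
        return list(line1) + list(line2)
    head = line1[:-overlap]
    blended = [(a + b) / 2 for a, b in zip(line1[-overlap:], line2[:overlap])]
    tail = line2[overlap:]
    return head + blended + tail
```

The blend assumes the two heads are already aligned to the same object; in practice the misalignment compensation described below would be applied first.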
Functional configuration example
Fig. 27 is a functional block diagram showing a functional configuration example of a conveying apparatus according to an embodiment of the present invention. As shown in fig. 27, the conveying apparatus includes a controller 1F3 in addition to the configuration shown in fig. 25. In addition, it is preferable that the conveying apparatus further includes an image processor 1F5 configured to process the read image data.
The controller 1F3 executes a control process for controlling the head unit HD1 and the head unit HD2, and includes a movement controller 1F31 and a processing timing controller 1F32.
The movement controller 1F31 controls the actuators AC1 and AC2 based on the calculated misalignment amount.
The processing timing controller 1F32 controls the reading processing timings of the reading heads CIS1 and CIS2 in the head units HD1 and HD2, respectively, based on the calculated misalignment amount.
More specifically, when the misalignment amount in the conveyance direction X is "Δx" and the moving speed of the web W is "V", the conveying apparatus changes the processing timing to compensate for "Δx". In this example, the conveying apparatus shifts the processing timing of the downstream reading head by "ΔT = Δx/V".
That is, when the web W is conveyed with a delay of "Δx", the conveying apparatus delays the timing of the processing performed by the reading head CIS2 by "ΔT". In this way, the conveying apparatus can perform processing in the conveyance direction X with high accuracy.
When the misalignment amount in the orthogonal direction Y is "ΔY", the conveying apparatus moves the head unit to compensate for "ΔY", for example by driving the actuator of the head unit.
In this way, the conveying apparatus can perform image reading processing to read image data (test chart or the like) with high accuracy in a direction orthogonal to the conveying direction. In particular, when the head unit is moved during the process performed by the head unit to compensate for the misalignment amount, the transfer apparatus can be enabled to enable the head unit to perform the process with high accuracy.
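As a minimal sketch of the two compensations above, assuming scalar misalignment values and a constant web speed; the function names and units are illustrative and not from the embodiment.

```python
# Hedged sketch: timing compensation dT = dx / V along the conveyance
# direction X, and head movement by dY in the orthogonal direction Y.
# Units (mm, mm/s, s) are illustrative assumptions.

def timing_compensation(dx: float, web_speed: float) -> float:
    """Delay (s) added to the downstream reading head's processing timing."""
    if web_speed <= 0:
        raise ValueError("web speed must be positive")
    return dx / web_speed

def head_movement(dy: float) -> float:
    """Amount by which the actuator moves the head unit in direction Y."""
    return dy  # move by the misalignment amount to cancel it

# Example: the web arrives 0.5 mm late at 250 mm/s, so the downstream
# reading head is delayed by 0.002 s.
print(timing_compensation(0.5, 250.0))  # 0.002
```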
Further, according to the third embodiment, as shown in fig. 24, the upstream sensor device SE0 configured to output the most upstream surface information and the edge sensor ES0 are arranged at the same position in the width direction, which is a direction orthogonal to the conveying direction (orthogonal direction). Therefore, the misalignment amount of the detected position of the spot pattern at the position of the upstream sensor device SE0 with respect to the reference position can be detected.
Thus, it is possible to compensate for the absolute positional misalignment with respect to the amount of change (amount of meandering) calculated by the head unit HD1 directly using the upstream sensor device SE0, or calculated by the head unit HD2 indirectly using the upstream sensor device SE0.
In this embodiment, as shown in figs. 15 and 16, at the time of inspection, the test chart can be read by detecting the edge misalignment at the time of initial adjustment and then correcting the image reading position. In this case, the edge misalignment of the sheet is calculated only at the time of initial adjustment, so while the read image data is acquired during conveyance of the web W, only the sheet meandering amount needs to be calculated. A high-quality image can therefore be read while reducing the load on the controller.
Alternatively, as shown in fig. 17 and 18, the edge misalignment may be detected in real time and reflected on the correction of the reading position.
When edge misalignment is detected in real time, the sheet edge misalignment amount may be calculated by taking a moving average of the newly acquired data, or by excluding noisy samples using a filter. Calculating the data in this manner avoids the adverse effects of sheet edge chipping and of noise occurring when the sensor samples the data during reading. An accurate image reading position can therefore be obtained.
By detecting edge misalignment in real time, the position of the scanner is adjusted at predetermined intervals during image reading; therefore, a higher quality image can be read even when edge misalignment occurs while the sheet is conveyed.
In the third embodiment, an illustration is given of an example of an apparatus configured as a single unit; however, an inspection apparatus that is a reading apparatus may be provided as one of the devices of the image forming system shown in fig. 20.
For example, the reading apparatus according to the third embodiment may be arranged at a subsequent stage of the image forming apparatus shown in fig. 2 and 3 to read a test chart which is an image for inspection to adjust the landing position on the sheet.
In this case, as the image inspection, the head units HD1 and HD2 serving as the readers of the inspection apparatus capture and read a test pattern, such as a gradation pattern whose density is adjusted, as the inspection image for landing position correction.
The reading apparatus according to the third embodiment includes, in addition to the mechanism that reads color information of an image (the scanner or the like constituting the head unit), a control mechanism (a read result processor, a recording head landing position setting unit, and the like).
Further, in this example, the image forming apparatus described in the first embodiment and the reading apparatus (inspection apparatus) described in the third embodiment may be provided in one image forming system. This configuration enables the landing position to be inspected with higher accuracy, thereby achieving high-quality image formation reflecting the inspection result.
In addition, in the above examples, a direct-transfer inkjet image forming apparatus has been described in the first and second embodiments. However, the conveying apparatus of the present invention is also applicable to an intermediate transfer type image forming apparatus.
Fourth embodiment: intermediate transfer type
Fig. 28 is a schematic diagram illustrating an internal configuration of an intermediate transfer type inkjet image forming apparatus 500.
In this configuration, the head units 51C, 51M, 51Y, and 51K eject ink droplets to form an image on the outer peripheral surface of the transfer belt 520.
At the transfer unit, where the transfer belt 520 contacts the web W, the formed image is transferred onto the web W.
After the transfer of the formed image, the cleaning unit cleans the outer peripheral surface of the transfer belt 520.
In the
In this configuration, the
When ink droplets are ejected from the
According to a positional relationship similar to that between support roller 525C1, support roller 525C2, and
In the fourth embodiment, the support rollers 525C1, 525M1, 525Y1, and 525K1 are first support members disposed upstream of the processing positions of the head units of the respective colors, and the support rollers 525C2, 525M2, 525Y2, and 525K2 are second support members disposed downstream of the processing positions of those head units.
Further, in this configuration, an
In fig. 28, the distance in the conveyance direction between the upstream surface detection sensor device (sensor device 55C) and the edge sensor is also shorter than the distance between the other surface detection sensor devices (55M, 55Y, 55K) and the edge sensor.
According to the fourth embodiment, the
In this configuration, the
The
In addition, the
Further, the
According to the above configuration of the fourth embodiment, even when the
Further, when the
In the above example, the correction is performed based on the image data acquired from the sensor devices 55C, 55M, 55Y, and 55K.
In addition, in this configuration, the head unit 51C has no actuator; however, the head unit 51C may have an actuator. By moving the head unit 51C in the orthogonal direction, it is possible to control the position, in the direction orthogonal to the conveyance direction of the web W, at which the image is transferred from the transfer belt 520 to the web W.
In the above example, an image is formed on the transfer belt 520 as the object to be conveyed.
Further, in this example, the image forming apparatus described in the first embodiment and the reading apparatus (inspection apparatus) described in the third embodiment are provided in one image forming system. This configuration of the image forming system enables inspection of the landing position with higher accuracy, thereby achieving high-quality image formation reflecting the inspection result.
In the conveying apparatus (image forming apparatus, image reading apparatus), the object to be imaged and the object to be read have been described as a sheet of paper; however, the recording medium as the object to be conveyed is not limited to paper. The recording medium means a medium to which a liquid at least temporarily adheres, or to which a liquid adheres and permeates. Unless otherwise specified, specific examples of the recording medium used in the embodiments include any media to which liquid or powder adheres: recording media such as sheets, recording paper, recording sheets, films, and cloths having on their surface a uniform pattern detectable under irradiation of light, as well as electronic components such as electronic substrates and piezoelectric elements, powder layers, organ models, inspection units, and the like. The material of the object to be conveyed may be paper, thread, fiber, cloth, leather, metal, plastic, glass, wood, ceramic, or the like, provided that a liquid can at least temporarily adhere to it and that its surface has a uniform pattern detectable under irradiation of light.
The preferred embodiments of the present invention have been described above in detail; however, the present invention is not limited to the specific embodiments among the embodiments, and various modifications and alterations can be made within the scope described in the claims.
List of reference numerals
1 "reading apparatus (conveying apparatus, processing apparatus)"
15A image storage unit
15B image storage unit
16A imaging unit (upstream imaging unit)
16B imaging unit
71 host device
91 semiconductor laser source (illuminator)
110 image forming apparatus (liquid ejecting apparatus, conveying apparatus, processing apparatus)
210K Black liquid ejecting head Unit (head Unit, liquid droplet ejecting Unit)
210C cyan liquid ejecting head Unit (head Unit, liquid droplet ejecting Unit)
210M magenta liquid ejecting head Unit (head Unit, liquid droplet ejecting Unit)
210Y yellow liquid ejecting head Unit (head Unit, liquid droplet ejecting Unit)
300 controller (control unit)
310 meandering amount calculator
320 edge misalignment calculator
321 comparator
322 reference value storage unit
323 misalignment calculator
324 previous location memory cell
330 head unit correction amount calculator (head movement amount calculator)
331 adder
340 actuator drive command unit
350 ejection timing adjuster
360 sampling period setting unit
500 image forming apparatus
51C, 51M, 51Y, 51K head unit (image forming unit)
55C sensor device (upstream surface detector)
55M, 55Y, 55K sensor device (surface detector)
520 transfer belt (object to be conveyed)
CIS1, CIS2 reading head
CR1K, CR1C, CR1M, CR1Y backup roll (first support member)
CR2K, CR2C, CR2M, CR2Y backup roll (second support member)
ES0 edge sensor (edge detector)
HD1 head unit (reading head unit)
HD2 head unit (reading head unit)
R0 supporting member
R1 supporting roller (210K first supporting member)
R2 supporting roller (second supporting member of 210K, first supporting member of 210C)
R3 supporting roller (second supporting member of 210C, first supporting member of 210M)
R4 supporting roller (second supporting member of 210M, first supporting member of 210Y)
R5 supporting roller (210Y second supporting member)
SE0, SEN2 upstream sensor device, upstream surface detection sensor device (upstream surface detector)
SE1, SENK black sensor device, surface detection sensor device (surface detector)
SE2, SENS cyan sensor device, surface detection sensor device (surface detector)
SE3, SENM magenta sensor device, surface detection sensor device (surface detector)
SE4, SENSY yellow sensor device, surface detection sensor device (surface detector)
W web (recording medium, continuous paper, sheet, object to be conveyed)
This application is based on and claims priority to Japanese priority application No. 2017-117301, filed on June 14, 2017, and Japanese priority application No. 2018-110541, filed on June 8, 2018, the entire contents of which are hereby incorporated by reference.