Multi-layer flexure for supporting motion image sensor

Document No.: 348159    Publication date: 2021-12-03

This technology, "Multi-layer flexure for supporting motion image sensor", was designed and created by S. W. Miller, B. V. Johnson, and Hao Zheng on 2020-03-27. Abstract: Some embodiments may include a multi-layer flexure that may be used in an optical image stabilization voice coil motor (OIS VCM) actuator for a camera. The multi-layer flexure module may include a dynamic platform, a static platform, and multi-layer flexure arms mechanically coupling the dynamic platform to the static platform. In some examples, the multi-layer flexure may include electrical traces configured to transmit signals from the dynamic platform to the static platform. The electrical traces may be routed from the dynamic platform to the static platform via the flexure arms. In some embodiments, the multi-layer flexure may have greater stiffness in the Z-direction aligned with the optical axis of the camera, and lower stiffness in the X-direction and Y-direction corresponding to the optical image stabilization directions of the OIS VCM actuator.

1. A camera, the camera comprising:

a lens holder and one or more lens elements coupled to the lens holder, wherein the one or more lens elements define an optical axis of the camera;

an image sensor configured to capture light passing through the one or more lens elements and convert the captured light into an image signal; and

a flexure assembly, the flexure assembly comprising:

a first frame coupled to the image sensor such that:

the image sensor moves with the first frame; and

the first frame receives the image signal from the image sensor;

a second frame coupled to a fixed part of the camera;

a first layer of flexure arms configured to mechanically connect the first frame to the second frame; and

a second layer of flexure arms mounted above or below the first layer of flexure arms, the second layer of flexure arms configured to mechanically connect the first frame to the second frame;

wherein the first layer of flexure arms and the second layer of flexure arms are at least partially separated by an open space between the flexure arms.

2. The camera of claim 1, wherein the flexure assembly further comprises:

a spacing element mechanically connecting respective ones of the flexure arms of the first layer with respective ones of the flexure arms of the second layer at one or more respective points along respective spans of the flexure arms of the first and second layers.

3. The camera of claim 1, wherein the flexure assembly further comprises:

electrical traces routed from the first frame to the second frame via the first layer flexure arms and via the second layer flexure arms,

wherein the electrical traces are configured to transmit the image signals from the first frame to the second frame.

4. The camera of claim 3, further comprising:

one or more flexure stabilizer members configured to mechanically connect the flexure arms of the first or second layer with other flexure arms of the first or second layer such that the one or more flexure stabilizer members prevent interference between the flexure arms of the first or second layer.

5. The camera of claim 4, wherein a spacing element is positioned, along a respective span of a flexure arm, between a respective connection to the first frame or the second frame and a respective flexure stabilizer member of the one or more flexure stabilizer members.

6. The camera of claim 5, wherein the one or more flexure stabilizer members constrain movement of flexure arms relative to each other in a plane extending through the first layer or the second layer.

7. The camera of claim 6, wherein the spacing element constrains movement of flexure arms relative to flexure arms of the other layer at respective points in a direction perpendicular to a plane extending through the first or second layer.

8. The camera of claim 5, wherein the spacing element comprises a spacing material mechanically connecting the flexure arms of the first layer to the flexure arms of the second layer at respective points,

wherein the first frame comprises a first layer and a second layer mechanically connected to each other via the spacing material;

wherein the second frame comprises a first layer and a second layer mechanically connected to each other via the spacing material; and

wherein portions of the flexure arms of the first layer and the flexure arms of the second layer that are not mechanically connected via the spacing element at the respective points are separated by an air gap.

9. A Voice Coil Motor (VCM) actuator, the VCM actuator comprising:

one or more actuator magnets;

one or more actuator coils;

a dynamic platform configured to be coupled to an image sensor;

a static platform configured to be static relative to the dynamic platform;

a first layer of flexure arms configured to mechanically couple the dynamic platform to the static platform; and

a second layer of flexure arms mounted above or below the first layer of flexure arms and configured to mechanically connect the dynamic platform to the static platform;

wherein the first layer of flexure arms and the second layer of flexure arms are at least partially separated by an open space between the respective flexure arms of the first and second layers; and

wherein the one or more actuator magnets and the one or more actuator coils are configured to magnetically interact to move the dynamic platform relative to the static platform in a plurality of directions parallel to a plane extending through the first and second layers of flexure arms.

10. A voice coil motor actuator as recited in claim 9, further comprising:

a spacing element mechanically connecting respective ones of the flexure arms of the first layer with respective ones of the flexure arms of the second layer at one or more respective points along respective spans of the flexure arms of the first and second layers.

11. A voice coil motor actuator as claimed in claim 10, wherein the spacing element comprises an adhesive bonding material, a solder material, or a metal plating mechanically connecting the flexure arms of the first layer with the flexure arms of the second layer at the one or more respective points along the respective spans of the flexure arms of the first and second layers.

12. A voice coil motor actuator as claimed in claim 11, wherein the flexure arms of the first and second layers being mechanically connected at the one or more respective points via the spacing element causes the dynamic platform to be more rigidly connected to the static platform in a Z-direction perpendicular to a plane extending through the first and second layers than in an X-direction or a Y-direction parallel to the plane extending through the first and second layers.

13. A voice coil motor actuator as recited in claim 9, further comprising:

one or more flexure stabilizer members configured to mechanically connect the flexure arms of the first or second layer with other flexure arms of the first or second layer such that the one or more flexure stabilizer members prevent interference between the flexure arms of the first or second layer.

14. A voice coil motor actuator as recited in claim 9, further comprising:

electrical traces routed from the dynamic platform to the static platform via the first layer flexure arms and the second layer flexure arms,

wherein the electrical traces are configured to carry image signals from an image sensor coupled to the dynamic platform to circuitry connected to the static platform.

15. A voice coil motor actuator as claimed in claim 14, wherein more than one electrical trace is routed via a separate one of the flexure arms of the first or second layer.

16. A voice coil motor actuator as claimed in claim 9, wherein the first or second layer of flexure arms comprises a respective set of four or fewer flexure arms in respective quadrants of the first or second layer.

17. A mobile multifunction device, the mobile multifunction device comprising:

a camera module, the camera module comprising:

one or more lens elements defining an optical axis; and

an image sensor configured to capture light passing through the one or more lens elements and convert the captured light into an image signal;

a dynamic frame coupled with the image sensor;

a static frame configured to be static relative to the dynamic frame;

a first layer of flexure arms mechanically connecting the dynamic frame to the static frame; and

a second layer of flexure arms mounted above or below the first layer flexure arms, the second layer flexure arms mechanically connecting the dynamic frame to the static frame, wherein the first and second layer flexure arms are at least partially separated by an open space between the flexure arms of the first and second layers; and

electrical traces configured to transmit the image signal from the dynamic frame to the static frame;

a display; and

one or more processors configured to:

cause the display to present an image based at least in part on one or more of the image signals that have been transferred from the dynamic frame to the static frame via the electrical traces.

18. The multifunction device of claim 17, further comprising: a voice coil motor actuator, the voice coil motor actuator comprising:

one or more actuator magnets; and

one or more actuator coils,

wherein the one or more processors are configured to:

cause the voice coil motor actuator to move the dynamic frame relative to the static frame in a plurality of directions orthogonal to the optical axis.

19. The multifunction device of claim 17, further comprising:

a spacing element mechanically connecting respective ones of the flexure arms of the first layer with respective ones of the flexure arms of the second layer at respective points along respective spans of the flexure arms of the first and second layers.

20. The multifunction device of claim 17, further comprising:

one or more additional layers of flexure arms mounted above or below the first or second layer of flexure arms, the one or more additional layers of flexure arms configured to mechanically connect the dynamic frame to the static frame, wherein each of the one or more additional layers of flexure arms is spaced apart from the other layers of flexure arms by an open space between the layers of flexure arms.

Technical Field

The present disclosure relates generally to camera actuators and/or suspension systems, and more particularly to actuator/suspension systems for image sensors in cameras having moving image sensor arrangements.

Description of the Related Art

The advent of small mobile multipurpose devices such as smartphones and tablet devices has created a need for high-resolution, low-profile cameras that can be integrated into such devices. Some low-profile cameras may incorporate an Optical Image Stabilization (OIS) mechanism that can sense and react to external stimuli/disturbances by adjusting the position of the optical lens in the X-axis and/or Y-axis in an attempt to compensate for unwanted movement of the lens. Some low-profile cameras may also incorporate an autofocus (AF) mechanism by which the object focal distance may be adjusted to focus an object plane in front of the camera at the image plane captured by the camera's image sensor. In some such autofocus mechanisms, the optical lens moves as a single rigid body along the optical axis of the camera (referred to as the Z-axis) to refocus the camera.

Furthermore, achieving high image quality in low-profile cameras is easier if lens motion along the optical axis is accompanied by minimal parasitic motion in the other degrees of freedom, for example in the X and Y axes orthogonal to the optical (Z) axis of the camera. Thus, some low-profile cameras that include an autofocus mechanism may also incorporate an Optical Image Stabilization (OIS) mechanism that can sense and react to external stimuli/disturbances by adjusting the position of the optical lens in the X-axis and/or Y-axis in an attempt to compensate for unwanted motion of the lens.


Disclosure of Invention

In some embodiments, a camera includes a lens holder, an image sensor, and a flexure assembly that supports the image sensor relative to the lens holder. One or more lens elements are coupled to the lens holder and define an optical axis of the camera. The image sensor is configured to capture light passing through the one or more lens elements and to convert the captured light into an image signal. The flexure assembly supports the image sensor above or below the lens holder and includes a first frame coupled to the image sensor, a second frame coupled to a fixed component of the camera, a first layer of flexure arms, and a second layer of flexure arms. The first frame, being coupled to the image sensor, is configured to move with the image sensor and to receive the image signal generated by the image sensor. The first layer of flexure arms and the second layer of flexure arms are each configured to mechanically connect the first frame to the second frame and to provide respective paths for electrical traces that route image signals from the first frame to the second frame. The second frame may include one or more connectors that connect the second frame to circuitry of the camera that further processes the image signals or causes images to be displayed. The second layer of flexure arms is mounted over or under the first layer of flexure arms, with the first and second layers of flexure arms being at least partially separated by an open space between the layers of flexure arms.

Arranging the layers of flexure arms of the flexure assembly above and below one another in the camera or voice coil motor actuator may provide greater stiffness in the Z-direction, along the optical axis of the camera, than in the X- or Y-directions orthogonal to the optical axis. This arrangement may also result in an image sensor coupled to the dynamic stage of the flexure assembly maintaining a constant or near-constant Z position while still being able to move in the X and Y directions, for example in response to Optical Image Stabilization (OIS) actuation. Additionally, using multiple layers of flexure arms in the flexure assembly may provide more paths for electrical traces than a single layer of flexure arms would. This, in turn, may allow a flexure assembly that includes multiple layers of flexure arms to be more compact than a single-layer flexure assembly while still transmitting a similar or greater number of image signals.

In some embodiments, a voice coil motor actuator includes one or more actuator magnets, one or more actuator coils, a dynamic platform configured to be coupled to an image sensor, and a static platform, wherein the dynamic platform is configured to move relative to the static platform (although the static platform may also move relative to other components of a camera). The voice coil motor actuator also includes a first layer of flexure arms configured to mechanically couple the dynamic platform to the static platform. In addition, the voice coil motor actuator includes a second layer of flexure arms mounted above or below the first layer of flexure arms and configured to mechanically couple the dynamic platform to the static platform. The first layer of flexure arms and the second layer of flexure arms are at least partially separated by an open space between the respective flexure arms of the first and second layers. The one or more actuator magnets and the one or more actuator coils are configured to magnetically interact to move the dynamic platform relative to the static platform in directions parallel to a plane extending through the first and second layers of flexure arms.

In some embodiments, a mobile multifunction device includes a camera module, the camera module comprising: one or more lens elements defining an optical axis; and an image sensor configured to capture light passing through the one or more lens elements and convert the captured light into an image signal. The mobile multifunction device further comprises: a dynamic frame coupled with the image sensor; a static frame; a first layer of flexure arms mechanically connecting the dynamic frame to the static frame; and a second layer of flexure arms mounted above or below the first layer of flexure arms and mechanically connecting the dynamic frame to the static frame. The dynamic frame and the flexure arms are configured to allow the dynamic frame to move relative to the static frame. The first layer of flexure arms and the second layer of flexure arms are at least partially separated by an open space between the flexure arms of the first and second layers. The mobile multifunction device also includes electrical traces configured to transfer image signals from the dynamic frame to the static frame. Additionally, the mobile multifunction device includes a display and one or more processors configured to cause the display to present an image based at least in part on one or more of the image signals that have been transferred from the dynamic frame to the static frame via the electrical traces.

Drawings

Fig. 1A illustrates a top view of a multilayer flexure including a dynamic platform, a static platform, and multilayer flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

Fig. 1B illustrates a perspective view of a cross-section of a multi-layer flexure including a dynamic platform, a static platform, and multi-layer flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

Fig. 1C illustrates a perspective view of a cross-section of a multi-layer flexure including a dynamic platform, a static platform, and multi-layer flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

FIG. 1D illustrates electrical traces on flexure arms of a multilayer flexure according to some embodiments.

Fig. 2 illustrates an exploded view of a multilayer flexure including a dynamic platform, a static platform, and multilayer flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

Fig. 3 illustrates an exemplary embodiment of a camera having an actuator module or assembly that includes a multi-layer flexure and can be used, for example, to provide autofocus through lens assembly movement and optical image stabilization through image sensor movement, according to some embodiments.

Fig. 4 illustrates an exemplary embodiment of a camera having an actuator module or assembly that includes a multi-layer flexure and can be used, for example, to provide autofocus through lens assembly movement and optical image stabilization through image sensor movement, according to some embodiments.

Fig. 5 illustrates an exploded view of components of an exemplary embodiment of a camera having an actuator module or assembly that includes a multi-layer flexure and that may be used, for example, to provide autofocus through lens assembly movement and optical image stabilization through image sensor movement, according to some embodiments.

Fig. 6A-6C illustrate a multilayer flexure, including a dynamic platform, a static platform, and multilayer flexure arms connecting the static and dynamic platforms, with the dynamic platform displaced in the X-direction, according to some embodiments.

Fig. 7A-7C illustrate a multilayer flexure, including a dynamic platform, a static platform, and multilayer flexure arms connecting the static and dynamic platforms, with the dynamic platform displaced in the Y-direction, according to some embodiments.

Fig. 8 illustrates an exploded view of a multilayer flexure including a dynamic platform, a static platform, and flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

Fig. 9A-9H each illustrate a cross-sectional view of a respective example flexure arm, according to some embodiments. In some cases, one or more embodiments of the example flexure arms may be used in a multilayer flexure, according to some embodiments.

Fig. 10A-10L each illustrate partial top views of respective example flexure arm configurations according to some embodiments. In some cases, one or more embodiments of the example flexure arm configurations of figs. 10A-10L may be used in a multi-layer flexure, according to some embodiments.

FIG. 11 illustrates a block diagram of a portable multifunction device with a camera in accordance with some embodiments.

Figure 12 illustrates a portable multifunction device with a camera in accordance with some embodiments.

Fig. 13 illustrates an exemplary computer system that may include a camera according to some embodiments. According to some embodiments, an exemplary computer system may be configured to implement aspects of the systems and methods for camera control discussed herein.

This specification includes references to "one embodiment" or "an embodiment". The appearances of the phrase "in one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. The particular features, structures, or characteristics may be combined in any suitable manner consistent with the present disclosure.

"comprising," the term is open-ended. As used in the appended claims, the term does not exclude additional structures or steps. Consider the claims as cited below: the claims do not exclude that an apparatus comprises additional components (e.g. network interface units, graphics circuits, etc.).

"configured," various units, circuits, or other components may be described or recited as "configured to" perform a task or tasks. In such context, "configured to" is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs such task or tasks during operation. As such, the cells/circuits/components can be said to be configured to perform this task even when the specified cell/circuit/component is not currently operational (e.g., not turned on). The units/circuits/components used with the "configured to" language include hardware-e.g., circuitry, memory storing program instructions executable to perform operations, and so on. Reference to a unit/circuit/component "being configured to" perform one or more tasks is expressly intended to not refer to the sixth paragraph of 35u.s.c. § 112 for that unit/circuit/component. Further, "configured to" may include a general-purpose structure (e.g., a general-purpose circuit) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing one or more tasks to be solved. "configured to" may also include adjusting a manufacturing process (e.g., a semiconductor fabrication facility) to manufacture a device (e.g., an integrated circuit) suitable for performing or carrying out one or more tasks.

"first", "second", etc. As used herein, these terms serve as labels to the nouns preceding them, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, the buffer circuit may be described herein as performing a write operation of a "first" value and a "second" value. The terms "first" and "second" do not necessarily imply that the first value must be written before the second value.

"based on". As used herein, the term is used to describe one or more factors that affect the determination. The term does not exclude additional factors that influence the determination. That is, the determination may be based solely on these factors or at least partially on these factors. Consider the phrase "determine a based on B. In this case, B is a factor that affects the determination of a, and such phrases do not exclude that the determination of a may also be based on C. In other examples, a may be determined based on B alone.

Detailed Description

Multi-layer flexure arrangement

In some embodiments, a multi-layer flexure assembly includes a dynamic platform, a static platform, and multi-layer flexure arms connecting the static platform and the dynamic platform. The dynamic platform may include a first frame coupled with an image sensor. The static platform may include a second frame coupled with a static portion of the camera, such as a circuit board that receives image signals from the image sensor via electrical traces routed on the flexure arms that mechanically connect the dynamic platform and the static platform. The multi-layer flexure may also include spacing elements that mechanically connect the flexure arms of the upper layer to the flexure arms of the lower layer at respective points along respective spans of the flexure arms of the upper and lower layers. Spacing elements that mechanically connect flexure arms at different levels of the multi-layer flexure may effectively reduce the beam length of the flexure arms by limiting movement of the connected flexure arms relative to each other in the Z-direction. This may result in the multi-layer flexure having greater stiffness in the Z-direction than in the X- or Y-direction. For example, the dynamic stage of a multi-layer flexure supporting an image sensor may have a Z-direction stiffness (e.g., along the optical axis), limiting motion of the image sensor in the Z-direction, that is up to three times the Z-direction stiffness of a single-layer flexure. At the same time, the multi-layer flexure may have a stiffness in the X-direction and/or Y-direction similar to that of a single-layer flexure. Thus, the multi-layer flexure may provide greater Z-stiffness (and therefore greater image sensor stability along the optical axis) than a single-layer flexure, while still being flexible in the X-direction and Y-direction, such that the magnets and coils of a voice coil motor may move the image sensor in the X-direction and Y-direction.
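The stiffness anisotropy described above can be illustrated with standard beam-bending relations. The expressions below are a simplified, hedged sketch that treats each flexure arm as an idealized fixed-guided rectangular beam (an assumption made here for illustration, not a statement of the actual arm geometry):

    k_Z \approx \frac{12\,E\,I_Z}{L_{\mathrm{eff}}^{3}} = \frac{E\,w\,t^{3}}{L_{\mathrm{eff}}^{3}}, \qquad k_{X,Y} \approx \frac{12\,E\,I_{XY}}{L^{3}} = \frac{E\,t\,w^{3}}{L^{3}}

where E is the elastic modulus of the flexure material, t the layer thickness, w the in-plane arm width, L the arm span, and L_eff the effective beam length for out-of-plane bending. Because the spacing elements tie the layers together at points along the span, they shorten L_eff for Z-direction bending without shortening the in-plane span; since stiffness scales with the inverse cube of length, this shortening, together with the parallel contribution of the second layer, can plausibly raise k_Z by a factor of a few relative to a single-layer flexure (consistent with the up-to-three-times figure noted above) while leaving k_X and k_Y largely unchanged.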

Additionally, in some embodiments, the multi-layer flexure may have a smaller footprint than a single-layer flexure. For example, in some embodiments, electrical traces may be routed via flexure arms of a multi-layer flexure using more than one layer of flexure arms. Thus, because of the presence of the multi-layer flexure arms, there are more flexure arms in the X-Y footprint of the multi-layer flexure to route electrical traces than in the case of a single-layer flexure. For example, electrical traces may be routed on the flexure arms of different layers of the multi-layer flexure, thereby reducing the number of flexure arms that need to be included in each layer of the multi-layer flexure to route a given number of electrical traces.
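As a simple, hedged illustration of this routing-capacity argument (the traces-per-arm figure below is an assumption, not a value stated in this disclosure), the number of traces that fit in a given X-Y footprint scales with the layer count:

    N_{\text{traces}} = N_{\text{layers}} \times N_{\text{arms per layer}} \times N_{\text{traces per arm}}

With the four-arms-per-quadrant arrangement described below with respect to fig. 1A (16 arms per layer) and, for instance, two traces per arm, a two-layer flexure could carry 2 x 16 x 2 = 64 traces in the same footprint in which a single-layer flexure would carry 32.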

Additionally, in some embodiments, a multi-layer flexure may use a thinner metal material for each of the multiple layers than a flexure that includes only a single layer would require. This is because the Z-stiffness of the multi-layer flexure is produced by the combined stiffness of the multiple layers and is further increased by the mechanical connections between the layers via the spacing elements. The combination of layers may form a structure that is stiffer in the Z-direction than any individual layer, and thus may allow the individual layers to be constructed from thinner materials than would be required to achieve similar Z-stiffness using a single-layer flexure. Using a thinner metal material for each of the layers of the multi-layer flexure may also improve manufacturability by reducing the etching problems that make manufacturing a single-layer flexure with a thicker metal layer more difficult.
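A simple area-moment argument, offered only as a hedged sketch (it idealizes the point-wise coupling provided by the spacing elements as a rigid bond along the span), illustrates why two thinner layers can match or exceed the out-of-plane stiffness of one thicker layer. A single layer of thickness t contributes a second moment of area of

    I_{\text{single}} = \frac{w\,t^{3}}{12},

whereas two coupled layers, each of thickness t/2 and separated by a gap g (so their mid-planes are a distance d = t/2 + g apart), contribute approximately

    I_{\text{coupled}} \approx 2\left[\frac{w\,(t/2)^{3}}{12} + \frac{w\,t}{2}\left(\frac{d}{2}\right)^{2}\right].

The parallel-axis term grows with the square of the layer separation, so with, for example, 30-micron layers (an assumed thickness) and the roughly 80-micron air gap mentioned below, the coupled stack is substantially stiffer in the Z-direction than a single 60-micron layer, even though each individual layer is thinner and easier to etch. In the actual flexure the coupling exists only at discrete spacing elements, so the real gain is smaller than this bound suggests.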

In some embodiments, the multilayer flexure may further include one or more flexure stabilizers that limit movement of the flexure arms within a layer of the multilayer flexure relative to each other. For example, the flexure stabilizers may connect flexure arms in the same layer within a plane orthogonal to the optical axis, while the spacing elements may connect flexure arms of different layers in a direction parallel to the optical axis.

In some embodiments, the Z-stiffness, X-stiffness, and/or Y-stiffness of the multilayer flexure may be tuned by: adjusting the number of flexure arms included in the multi-layer flexure; adjusting the thickness of the flexure arms included in the multi-layer flexure; including more or fewer spacing elements in the multilayer flexure; adjusting the relative positions of the spacing elements included in the multi-layer flexure; and/or adjusting the number or locations of the flexure stabilizers included in the multi-layer flexure. In some embodiments, other variables may also be adjusted to tune the Z-stiffness, X-stiffness, and/or Y-stiffness of the multilayer flexure.
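As a purely illustrative aid, the kind of first-order trade study implied by these tuning variables can be sketched in a few lines of Python. The beam formulas, material constant, and dimensions below are assumptions chosen only to show how each parameter enters the stiffness estimate; they are not values from this disclosure, and a real design would rely on detailed modeling or finite-element analysis.

    # Toy parametric model of multi-layer flexure stiffness trends.
    # All numeric values are illustrative assumptions, not values from this disclosure.

    def guided_beam_stiffness(E, width, thickness, length):
        """Bending stiffnesses (N/m) of an idealized fixed-guided rectangular beam."""
        k_out_of_plane = E * width * thickness**3 / length**3  # Z-direction bending
        k_in_plane = E * thickness * width**3 / length**3      # X/Y-direction bending
        return k_out_of_plane, k_in_plane

    def flexure_stiffness(n_arms_per_layer, n_layers, n_spacers_per_arm,
                          E=190e9, width=25e-6, thickness=30e-6, span=3e-3):
        # Spacing elements tie the layers together along the span, which shortens
        # the effective beam length for out-of-plane (Z) bending only.
        z_length = span / (n_spacers_per_arm + 1)
        k_z_arm, _ = guided_beam_stiffness(E, width, thickness, z_length)
        _, k_xy_arm = guided_beam_stiffness(E, width, thickness, span)
        # Arms and layers are treated (roughly) as parallel springs suspending the platform.
        return n_arms_per_layer * n_layers * k_z_arm, n_arms_per_layer * n_layers * k_xy_arm

    k_z, k_xy = flexure_stiffness(n_arms_per_layer=16, n_layers=2, n_spacers_per_arm=1)
    print(f"approximate k_z = {k_z:.0f} N/m, k_xy = {k_xy:.0f} N/m")

In this toy model, increasing the arm thickness or the number of spacing elements raises the Z-stiffness term far faster than the in-plane terms, which mirrors the tuning behavior described above.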

Fig. 1A illustrates a top view of a multilayer flexure including a dynamic platform, a static platform, and multilayer flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

The multi-layer flexure 102 includes a static stage/frame 104, a dynamic stage/frame 106, and flexure arms 108. The multi-layer flexure 102 also includes spacing elements 110 that mechanically connect different layers of the flexure arms 108 to one another at respective points along respective spans of the flexure arms 108. In addition, the multi-layer flexure 102 includes flexure stabilizers 120 that maintain separation between flexure arms 108 in a given one of the layers of the multi-layer flexure 102.

In some embodiments, a static platform, such as static platform 104, may include a second frame surrounding flexure arms 108 and dynamic platform 106. In some embodiments, the second frame and the static platform may be the same component, or the second frame may be coupled to the static platform as two components coupled together. In some embodiments, a dynamic platform, such as dynamic platform 106, may be located within a circumference formed at least in part by flexure arms, such as flexure arm 108, and may include a first frame. In some embodiments, the dynamic platform and the first frame may be the same component, or the first frame may be coupled to the dynamic platform as two components coupled together. In some embodiments, a dynamic platform such as dynamic platform 106 may be formed from an image sensor coupled to a first frame that is located within a second frame and is mechanically connected to the second frame via flexure arms at multiple layers of a multi-layer flexure.

In some embodiments, flexure arms 108 may enable dynamic platform 106 and/or the first frame to move relative to static platform 104 and/or the second frame. It is noted that in some embodiments, the static platform 104 and/or the second frame may be coupled to a fixed structure of the camera or mobile device, or may be coupled to another component of the camera or mobile device that is capable of moving relative to other components of the camera or mobile device. Notably, when the multi-layer flexure is included in a Voice Coil Motor (VCM), the first frame may include a dynamic platform coupled to another component of the camera, such as an image sensor. Additionally, the second frame may include a static platform coupled to another component of the camera, such as a printed circuit board or structural element of the camera. In some embodiments, the first frame may include a plurality of dynamic stage layers of the multi-layer flexure coupled together via a spacer material, and the second frame may include a plurality of static stage layers coupled together via a spacer material. Further, the first frame may also include electrical traces carried on one or more of the dynamic platform layers. Additionally, the second frame may include electrical traces carried on one or more of the static platform layers, wherein the flexure arms also carry electrical traces that connect the electrical traces of the dynamic platform layer to the electrical traces of the static platform layer.

In some embodiments, a spacing element, such as spacing element 110, may be positioned along the span of the flexure arm at a location approximately midway between the connection of the flexure arm to the static or dynamic platform and a flexure stabilizer, such as one of the flexure stabilizers 120. In some embodiments in which the connections between the flexure arms and the static or dynamic platform are staggered within one layer of the multi-layer flexure, the spacing elements, such as spacing elements 110, may be staggered in a similar pattern so that the spacing elements are located at the midpoints of the respective spans of the respective flexure arms, such as flexure arms 108. In some embodiments, the portions of the flexure arms corresponding to the spacing elements, such as spacing element 110, may be slightly larger than the other portions of the flexure arms, as shown in fig. 1A-1C, or in some embodiments, the portions of the flexure arms corresponding to the spacing elements may have the same width as the flexure arms.

As discussed in more detail below with respect to fig. 3-4, in some embodiments, the motion of a dynamic platform, such as dynamic platform 106, may be controlled by a Voice Coil Motor (VCM) actuator in the X and Y directions, but may be uncontrolled in the Z direction, where it is intended that the motion in the Z direction be small or zero. Because the multi-layer flexure may have significantly greater stiffness in the Z-direction than in the X-and Y-directions, the multi-layer flexure may flex in the X-and Y-directions while remaining substantially stable (e.g., non-moving) in the Z-direction. In other embodiments, the position of the image sensor in the Z direction may be controlled via a voice coil motor actuator acting in the Z direction. In such embodiments, one or more parameters of the multilayer flexure may be adjusted to adjust the Z-stiffness of the multilayer flexure. For example, fewer spacer elements may be used, or thinner flexure arms may be used, to reduce the Z-stiffness of the multi-layer flexure.
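To make the X/Y actuation concrete, the following sketch (offered only as an illustration, not as part of this disclosure) shows the basic sensor-shift arithmetic: a measured hand-shake rotation is converted into the image-sensor displacement that keeps the image stationary, and the flexure's in-plane stiffness then sets the steady-state coil force and current. The focal length, stiffness, and motor constant are assumed values chosen purely for illustration.

    import math

    # Illustrative constants; these are assumptions, not values from this disclosure.
    FOCAL_LENGTH_M = 4.2e-3          # effective focal length of the lens stack
    K_XY_N_PER_M = 120.0             # in-plane stiffness of the multi-layer flexure
    MOTOR_CONSTANT_N_PER_A = 0.25    # Lorentz force per ampere of OIS coil current

    def ois_sensor_shift(shake_angle_rad):
        """Image-sensor shift (m) that compensates a small camera rotation."""
        # For small angles the image displaces by roughly f * tan(theta), so the
        # sensor is shifted by the same amount to keep the image stationary.
        return FOCAL_LENGTH_M * math.tan(shake_angle_rad)

    def ois_coil_current(shake_angle_rad):
        """Steady-state coil current (A) needed to hold the compensating shift."""
        shift = ois_sensor_shift(shake_angle_rad)
        spring_force = K_XY_N_PER_M * shift   # flexure restoring force at that shift
        return spring_force / MOTOR_CONSTANT_N_PER_A

    angle = math.radians(0.3)  # a 0.3-degree hand-shake rotation
    print(f"sensor shift: {ois_sensor_shift(angle) * 1e6:.1f} um, "
          f"coil current: {ois_coil_current(angle) * 1e3:.2f} mA")

With these assumed numbers the compensating shift is on the order of tens of micrometers and the holding current on the order of milliamperes, which is only meant to convey the scale of the in-plane motion the flexure must permit.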

In some embodiments, a multilayer flexure such as multilayer flexure 102 may include four or fewer flexure arms per layer in a given quadrant of the multilayer flexure. For example, the multi-layer flexure 102 shown in FIG. 1A includes four flexure arms per layer per quadrant. For example, the upper layer 114 includes four flexure arms 108 per quadrant. In addition, the lower layer 112 includes four flexure arms 108 per quadrant.

Fig. 1B and 1C illustrate perspective views of a cross-section of a multilayer flexure, according to some embodiments. The multi-layer flexure 102 shown in fig. 1B and 1C may be the same multi-layer flexure 102 shown in fig. 1A.

As shown in fig. 1B and 1C, a spacing element, such as spacing element 110, may include a spacing material 116 placed between upper layer 114 and lower layer 112 of multilayer flexure 102. However, an open space may remain between the flexure arms of the respective layers, such that there is an air gap between the flexure arms that allows the flexure arms at different layers of the multi-layer flexure 102 to move relative to each other. For example, FIG. 1C illustrates an air gap 118 between the flexure arms 108 of the upper layer 114 and the flexure arms 108 of the lower layer 112. In some embodiments, the air gap 118 may have a height of about 80 microns or other suitable height.

FIG. 1C also illustrates a cross-section of the spacing element 110 with a spacing material 116 mechanically connecting the flexure arms 108 of the upper layer 114 with the flexure arms 108 of the lower layer 112. In some embodiments, the spacer material may be or include an adhesive bonding material, a solder material, or a metal plated material that mechanically connects the flexure arms of the upper layer to the flexure arms of the lower layer, wherein the upper and lower flexure arms are mechanically connected at points of the spacer element along the span of the respective flexure arms.

In some embodiments, a flexure stabilizer member, such as flexure stabilizer 120, may be configured to mechanically connect a flexure arm 108 of the upper layer 114 with other flexure arms 108 of the upper layer 114 to prevent interference between flexure arms of the same layer. For example, the flexure stabilizers 120 may maintain a spacing between flexure arms at the flexure stabilizer locations (such as the corners of the multi-layer flexure 102) when the flexure arms of a given layer are deformed due to a displacement of the dynamic stage in the X-direction or Y-direction. In a similar manner, the flexure stabilizers of the lower layer 112 may prevent interference between the flexure arms 108 of the lower layer 112. For example, the flexure stabilizers 120 may constrain movement of the flexure arms of the upper layer 114 or lower layer 112 relative to other ones of the flexure arms of the respective upper or lower layer in a plane extending through the upper or lower layer.

In some embodiments, the spacing elements 110 may be placed at mid-points along the span of the respective flexure arms between the connection to the static platform or the connection to the dynamic platform and the respective flexure stabilizers located at the corners of the flexure arms between the static and dynamic platforms.

In some embodiments, electrical traces may be routed from a dynamic platform, such as dynamic platform 106, to a static platform, such as static platform 104, via flexure arms, such as flexure arms 108. In some embodiments, the electrical traces may be electrically isolated from the metal flexure arms by a polyimide insulation layer. In some implementations, a flexure arm can carry multiple layers of electrical signal traces that are electrically isolated from the metal flexure of the flexure arm, and from each other, by polyimide insulation layers. In some embodiments, other types of insulators and trace elements may be used.

For example, fig. 1D illustrates electrical traces on flexure arms of a multilayer flexure according to some embodiments. The dynamic platform 106 is connected to the static platform 104 by flexure arms 108 carrying electrical traces formed of copper deposits 122 insulated by a polyimide layer 124.

Fig. 2 illustrates an exploded view of a multilayer flexure including a dynamic platform, a static platform, and multi-layer flexure arms connecting the static platform and the dynamic platform, according to some embodiments. In some embodiments, the multi-layer flexure 102 shown in fig. 2 may be the same multi-layer flexure 102 shown in figs. 1A-1D.

In fig. 2, the upper layer 114 is illustrated in an exploded view above the spacer material 116, which is bonded to the lower layer 112. Note that each spacing element 110 comprises spacer material 116 at the location of that spacing element. Fig. 2 also illustrates a connector 128 on the dynamic platform 106 and a connector 126 on the static platform 104, where signals are routed between the connector 128 on the dynamic platform 106 and the connector 126 on the static platform 104 via electrical traces mounted on the flexure arms 108. Connectors 126 and 128 are shown as examples, but in some embodiments any number of connectors and/or different types of connectors may be used.

In some embodiments, an image sensor mounted to a dynamic platform, such as image sensor 308 mounted on dynamic platform 322 (shown in fig. 3), or an optical image stabilization circuit, such as flexible printed circuit board 518 (shown in fig. 5) used as a dynamic platform with respect to a static portion of multi-layer flexure 522, may send signals to or receive signals from connectors of the dynamic or static platform, such as connectors 126 or 128. For example, in some embodiments, the connector 128 on the dynamic platform 106 may couple with a connector of an image sensor, and the connector 126 on the static platform 104 may couple with other camera components that send signals to or receive signals from the image sensor. In some embodiments, electrical traces are routed between connectors 126 and 128 via the static platform 104, flexure arms 108, and dynamic platform 106. In some embodiments, electrical traces may be located on the static platform 104, flexure arms 108, and dynamic platform 106 of the upper flexure layer 114. In some embodiments, electrical traces may be located on the static platform 104, flexure arms 108, and dynamic platform 106 of the lower flexure layer 112. In some implementations, electrical traces can be routed on both the upper flex layer 114 and the lower flex layer 112. In some implementations, one or more vias can pass through the spacer material 116 to connect a portion of an electrical trace on one of the upper flex layer 114 or the lower flex layer 112 to another portion of an electrical trace on the other of the upper flex layer 114 or the lower flex layer 112. In some embodiments, such electrical traces may receive or transmit information and/or power between components coupled to the respective connector 126 or 128.

In some embodiments, a multilayer flexure, such as multilayer flexure 102, may include electrical traces on both the upper flexure layer and the lower flexure layer. For example, in some embodiments, the upper flex layer 114 and the spacer material 116 may include openings (not shown) adjacent to or aligned with the connectors 126 that allow some elements of the connectors to pass through the upper layer 114 and the spacer material 116 to engage with connectors (not shown) on the upper surface of the lower layer 112. Additionally, in some embodiments, connectors may be located on opposite sides of a multilayer flexure assembly, such as multilayer flexure 102. For example, a first set of connectors may be located on the outward-facing surface of the upper flex layer 114 and another set of connectors may be located on the outward-facing surface of the lower flex layer 112. In such implementations, one or more vias can pass between the layers to transmit signals/power between electrical traces on the upper flex layer 114 and electrical traces on the lower flex layer 112. Additionally, in some embodiments, the spacing element 110 may include spacer material 116 and vias (not shown) through the spacer material to connect electrical traces located on the respective upper and lower flex layers 114 and 112. In addition, although not shown in each figure, other multi-layer flexures as described herein, such as those of figs. 1A-1D and 2-8, may include connectors as described above.

Example voice coil motor actuator arrangements including multi-layer flexures

In some embodiments, the compact camera module including the multi-layer flexure may also include actuators that deliver functions such as auto-focus (AF) and Optical Image Stabilization (OIS). One approach to delivering a very compact actuator for OIS is to use a Voice Coil Motor (VCM) arrangement.

In some embodiments, an optical image stabilization voice coil motor (OIS VCM) actuator is designed such that the image sensor is mounted on a dynamic frame that translates in the X and Y directions. The image sensor (wire-bond, flip-chip, or BGA) can be mounted on a dynamic platform with electrical signal traces, routed using an additive copper deposition process, that connect the image sensor from the dynamic platform to a static platform. Flexure arms connect the dynamic platform to the static platform and support the electrical signal traces formed by the additive copper deposition process. An Optical Image Stabilization (OIS) coil is mounted on the dynamic platform. In some embodiments, the OIS coil interacts with a common permanent magnet that is also used as part of an Auto Focus (AF) voice coil motor. In some embodiments, an OIS permanent magnet is mounted on the static portion of the optical image stabilization actuator to provide additional Lorentz force (e.g., in the case of high in-plane flexural stiffness).
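The force generated by each OIS coil in the field of a common magnet follows the usual Lorentz-force relation. The expression below is a standard simplification that assumes a uniform flux density over the active coil segments; it is offered as general background rather than as a description of this actuator's exact coil and magnet geometry:

    F \approx N\,B\,L_{a}\,I

where N is the number of coil turns, B the flux density of the permanent magnet at the coil, L_a the active conductor length per turn within that field, and I the coil current. Because the force is proportional to the current, the controller can command in-plane forces on the dynamic platform directly by modulating the current in each coil.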

Some embodiments include a camera. The camera may include a lens, an image sensor, and an auto focus voice coil motor (AF VCM) actuator. The lens may include one or more lens elements mounted in a lens holder that define an optical axis (e.g., Z-axis) of the camera. The image sensor may be configured to capture light passing through the lens element. Further, the image sensor may be configured to convert the captured light into image signals that are routed through electrical traces mounted on the flexure arm to other components of the camera, such as other circuitry that further processes the image signals or causes the captured images to be stored or displayed.

In some embodiments, a camera actuator includes an actuator base, an auto-focus voice coil motor, and an optical image stabilization voice coil motor. In some embodiments, an autofocus voice coil motor includes a lens holder mounting attachment movably mounted to an actuator base, a plurality of common magnets mounted to the actuator base, and an autofocus coil fixedly mounted to the lens holder mounting attachment for generating a force for moving the lens holder in a direction of an optical axis of one or more lenses of the lens holder. In some embodiments, an optical image stabilization voice coil motor includes an image sensor carrier (e.g., a dynamic stage) movably mounted to an actuator base, and a plurality of optical image stabilization coils movably mounted to the dynamic stage within a magnetic field of a common magnet for generating forces for moving the dynamic stage in a plurality of directions orthogonal to an optical axis.

In some embodiments, shifting the image sensor allows for reduced moving mass and therefore has significant benefits in terms of power consumption compared to OIS "optics shift" designs. In some embodiments, fabrication is achieved by depositing electrical traces directly on the multilayer flexure using an additive copper deposition process, which enables smaller size packaging while meeting I/O requirements.
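The power benefit of moving the comparatively light image sensor rather than the lens group can be sketched with a first-order voice coil relation; this is a hedged, back-of-the-envelope argument that neglects the flexure spring force and friction, not a figure from this disclosure:

    P = I^{2}R = \left(\frac{F}{k_{t}}\right)^{2} R \approx \left(\frac{m\,a}{k_{t}}\right)^{2} R

where R is the coil resistance, k_t the motor (force) constant, m the moving mass, and a the acceleration demanded by stabilization. For a given stabilization profile the dissipated power therefore scales roughly with the square of the moving mass, so a sensor-shift design whose moving mass is half that of a lens-shift design would, to first order, dissipate about one quarter of the OIS coil power.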

In some embodiments, the optical image stabilization coil is mounted on a flexible printed circuit that carries power to the coil for operation of the optical image stabilization voice coil motor.

In some embodiments, the optical image stabilization coils are corner-mounted on a flexible printed circuit that is mechanically connected to the actuator base and mechanically isolated from the auto-focus voice coil motor.

Fig. 3 illustrates an exemplary embodiment of a camera having an actuator module or assembly that may be used, for example, to provide autofocus through lens assembly movement in a low profile camera and optical image stabilization through image sensor movement, according to some embodiments.

In the embodiment shown in fig. 3, camera 300 includes a lens element 302 enclosed within a lens assembly 304 in a lens carrier 306. In the embodiment shown in fig. 3, camera 300 includes an image sensor 308 for capturing a digital representation of the light transmitted through lens element 302. In the embodiment shown in fig. 3, camera 300 includes an axial motion (autofocus) voice coil motor 310 for focusing light from lens element 302 onto image sensor 308 by moving lens assembly 304 containing lens element 302 along the optical axis of lens element 302. In the embodiment shown in FIG. 3, the axial motion voice coil motor 310 includes an autofocus suspension assembly 312 for movably mounting the lens carrier 306 to an actuator base 314 such that the lens carrier is movable relative to the actuator base. In the embodiment shown in fig. 3, axial motion voice coil motor 310 comprises: a plurality of common magnets 316 mounted to the actuator base 314 via the autofocus suspension assembly 312; and a focusing coil 318 fixedly mounted to the lens carrier 306 and movably mounted to the actuator base 314 by the autofocus suspension assembly 312 such that the focusing coil is movable with the lens assembly relative to the actuator base.

In the embodiment shown in fig. 3, camera 300 also includes a lateral motion (optical image stabilization OIS) voice coil motor 320. A lateral motion (OIS) voice coil motor 320 includes a dynamic platform 322, flexure arms 324 for mechanically coupling the dynamic platform 322 to a static platform 326, and a plurality of lateral motion (OIS) coils 332 fixedly mounted to the dynamic platform 322 within the magnetic field of a common magnet 316 for generating forces for moving the dynamic platform 322 in a plurality of directions orthogonal to the optical axis of the lens element 302.

In some embodiments, dynamic platform 322, flexure arms 324, and static platform 326 are a single metal or other flexible component. In some embodiments, flexure arms 324 mechanically connect the image sensor 308, resting on dynamic platform 322, to the static platform 326 of the lateral motion (optical image stabilization) voice coil motor 320, and the flexure arms support electrical signal traces that connect the multi-layer flexure to electrical signal traces 330 of the camera 300. In some embodiments, the flexure arms 324 include metal flexures that carry electrical signal traces that are electrically isolated from the metal flexures by a polyimide insulation layer.

In some embodiments, the optical image stabilization coil 332 is mounted on a flexible printed circuit 334 that carries electrical power to the coil 332 for operation of the (optical image stabilization) lateral motion voice coil motor 320.

In some embodiments, the optical image stabilization coil 332 is corner mounted on a flexible printed circuit 334 that is mechanically connected to the actuator base 314 and mechanically isolated from the axial (autofocus) voice coil motor 310.

In some embodiments, a bearing surface end stop is mounted to the base for limiting movement of the optical image stabilization voice coil motor. For example, a bearing surface end stop (or bearing surface end stops) (not shown in fig. 3) may be mounted between the actuator base 314 and the dynamic platform 322 such that travel of the dynamic platform 322 in the Z direction away from the lens element 302 is limited. For example, at the bottom of the Z travel, the dynamic platform 322 may be stopped by a bearing surface end stop so that the dynamic platform 322 does not impact the actuator base 314.

Fig. 4 illustrates an exemplary embodiment of a camera having an actuator module or assembly that includes a multi-layer flexure and that may be used, for example, to provide autofocus through lens assembly movement in a low profile camera and optical image stabilization through image sensor movement, according to some embodiments.

In some embodiments, camera actuator 400 includes an actuator base 414, an auto-focus voice coil motor 410, and an optical image stabilization voice coil motor 420. Autofocus voice coil motor 410 includes a lens carrier 406 movably mounted to an actuator base 414 via an autofocus VCM suspension system 412, a plurality of common magnets 416 mounted to the base 414 via the suspension system 412, and an autofocus coil 418 fixedly mounted to the lens carrier 406 for generating a force in the direction of the optical axis of one or more lens elements of the lens carrier 406.

In some embodiments, optical image stabilization voice coil motor 420 includes an image sensor carrier 422 (e.g., a dynamic stage) movably mounted to actuator base 414, and a plurality of optical image stabilization coils 430 mounted to image sensor carrier 422 within magnetic field 402 of common magnet 416 for generating forces 404 for moving image sensor carrier 422 in a plurality of directions orthogonal to the optical axis.

In some embodiments, the image sensor carrier 422 further includes one or more flexure members 428 (e.g., flexure arms at multiple layers of a multi-layer flexure) for mechanically connecting the image sensor 424 resting on the image sensor carrier 422 to the frame or static stage 408 of the optical image stabilization voice coil motor 420.

In some implementations, a flexure member (e.g., flexure arm) 428 mechanically and electrically connects the image sensor 424 resting in an image sensor carrier 422 (e.g., a dynamic platform) to the frame 408 (e.g., a static platform) of the optical image stabilization voice coil motor 420, and the flexure member (e.g., flexure arm) 428 includes electrical signal traces. In some embodiments, the flexure arms or members 428 comprise metal flexures that carry electrical signal traces that are electrically isolated from the metal flexures by polyimide insulation layers. In some embodiments, the optical image stabilization coil 430 is mounted on a flexible printed circuit 426 that carries power to the coil 430 for operation of the optical image stabilization voice coil motor. In some embodiments, the optical image stabilization coil 430 is corner-mounted on a flexible printed circuit 426 that is mechanically connected to the actuator base 414 and mechanically isolated from the auto-focus voice coil motor 410.

Fig. 5 illustrates components of an exemplary embodiment of a camera having an actuator module or assembly that includes a multi-layer flexure and that may be used, for example, to provide autofocus through lens assembly movement in a low profile camera and optical image stabilization through image sensor movement, according to some embodiments. In various embodiments, camera 500 may include optics 502 (e.g., one or more lens elements mounted in a lens holder), a shield 504, a magnet holder 506, a magnet 508, a lens carrier 510, an Auto Focus (AF) coil 512, an Optical Image Stabilization (OIS) base 514, an OIS coil 516, an OIS flexible printed circuit board (FPC) 518, an image sensor 520, a multi-layer flexure 522 (e.g., according to one or more embodiments of the multi-layer flexure described herein), and/or electrical traces 524.

Exemplary motion of a dynamic platform of a multi-layer flexure relative to a static platform

Fig. 6A-6C illustrate a multilayer flexure, including a dynamic platform, a static platform, and multilayer flexure arms connecting the static and dynamic platforms, with the dynamic platform displaced in the X-direction, according to some embodiments.

In figs. 6A-6C, dynamic platform 604 is shifted to the left in the X direction relative to static platform/frame 602. For example, the dynamic platform may be displaced due to Lorentz forces generated by the OIS VCM actuator. As can be seen, the flexure arms 606 flex and/or deform to allow the dynamic platform 604 to shift relative to the static platform 602. However, the spacing elements 608 mechanically connect the flexure arms of the upper layer 612 with the flexure arms of the lower layer 614 such that the flexure arms 606 do not move in the Z-direction relative to each other at the points where the flexure arms are mechanically connected together at the spacing elements 608. As previously discussed, this may reduce the effective beam length of the flexure arms 606 and cause the multi-layer flexure 600 to have greater stiffness in the Z-direction than in the X-direction or the Y-direction. In some embodiments, a spacing element 608 is positioned along the span 618 of a flexure arm 606 between a flexure stabilizer 610 located at a corner of the flexure arm arrangement and the offset 616 to which the flexure arm is connected. In some embodiments, the spacing element 608 is located at or near the midpoint of the span 618 between the flexure stabilizer 610 and the offset 616 of the dynamic platform 604 or the offset 616 of the static platform 602.

In some embodiments, a flexure stabilizer, such as flexure stabilizer 610, constrains movement of the flexure arms 606 relative to each other at a corner of the flexure arm arrangement (or at another location) in a plane extending through the layers of the multi-layer flexure, such as a plane parallel to the upper layer 612 and/or the lower layer 614. Additionally, the spacing elements may constrain movement of the flexure arms 606 relative to the flexure arms of the upper or lower layers in a direction orthogonal to a plane extending through the upper layer 612 and/or lower layer 614.

Fig. 7A-7C illustrate a multilayer flexure, including a dynamic platform, a static platform, and multilayer flexure arms connecting the static and dynamic platforms, with the dynamic platform displaced in the Y-direction, according to some embodiments.

In fig. 7A-7C, the dynamic platform 604 is shifted upward in the Y-direction relative to the static platform/frame 602. For example, the dynamic platform may be displaced due to lorentz forces generated by the OIS VCM actuator. As can be seen, the flexure arms 606 flex and/or deform to allow the dynamic platform 604 to shift relative to the static platform 602. However, the spacing element 608 mechanically connects the flexure arms of the upper layer 612 with the flexure arms of the lower layer 614 such that the flexure arms 606 do not move in the Z-direction relative to each other at the point where the flexure arms are mechanically connected together at the spacing element 608. As previously discussed, this may reduce the effective beam length of the flexure arms 606 and cause the multi-layer flexure 600 to have greater stiffness in the Z-direction than in the X-direction or in the Y-direction. In some embodiments, the spacing element 608 is positioned along the span 618 of the flexure arm 606 between the flexure stabilizer 610 located at a corner of the flexure arm arrangement and the offset 616 to which the flexure arm is connected. In some embodiments, the spacer element 608 is located at or near the midpoint of the span 618 between the flexure stabilizer 610 and the offset 616 of the dynamic platform 604 or the offset 616 of the static platform 602.
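For context, the in-plane displacement produced by the OIS VCM can be related to the coil's Lorentz force and the lateral stiffness of the flexure suspension. The sketch below is purely illustrative: F = N·B·I·L is the textbook Lorentz relation for a coil segment in a transverse field, and every number (turns, flux density, current, effective wire length, stiffness) is an assumption rather than a value from this disclosure.

```python
# Hypothetical numbers only: relates OIS coil drive to a static in-plane displacement.
N_TURNS = 40        # coil turns in the magnet gap (assumed)
B_FIELD = 0.4       # T, flux density across the coil (assumed)
I_DRIVE = 0.05      # A, drive current (assumed)
L_WIRE = 4e-3       # m, effective wire length per turn in the field (assumed)
K_XY = 30.0         # N/m, lateral stiffness of the flexure suspension (assumed)

force = N_TURNS * B_FIELD * I_DRIVE * L_WIRE   # Lorentz force, N
displacement_um = force / K_XY * 1e6           # static deflection, micrometers
print(f"F = {force * 1e3:.2f} mN -> x = {displacement_um:.0f} um")
```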

Exemplary exploded views of multilayer flexures

Fig. 8 illustrates an exploded view of a multilayer flexure including a dynamic platform, a static platform, and flexure arms connecting the static platform and the dynamic platform, according to some embodiments.

In some embodiments, a multilayer flexure, such as multilayer flexure 102 or any of the multilayer flexures described herein, can include two or more flexure layers. For example, in some embodiments, a multilayer flexure may include "N" flexure layers, where "N" is a number greater than 1. For example, in some embodiments, a multilayer flexure may include 2, 3, 4, or more flexure layers. For example, the ellipses in fig. 8 are used to illustrate that additional sets of spacer material and flex layers can be added to provide more layers to the multilayer flexure.

Note that the exploded view in fig. 8 shows the spacer material 806 separated from the upper flex layer 802 and the lower flex layer 804, while the exploded view shown in fig. 2 shows the spacer material 116 coupled to the lower flex layer 112. It is further noted that the spacer material 806 includes the spacer material included in the flexural stabilizers 808 and the spacing elements 810. In some embodiments, each layer of the multi-layer flexure may include a dynamic platform 812, a static platform 814, and flexure arms 816 mechanically coupling the dynamic platform to the static platform. Additionally, electrical traces may be routed along the flexure arms 816 to provide electrical paths between the dynamic platform 812 and the static platform 814.

In some embodiments, some layers may include electrical traces, while other layers do not. For example, some layers may be added to increase the Z-stiffness of the multi-layer flexure, but these layers may not be used to route electrical traces. Alternatively, in some embodiments, multiple layers or all of the layers may include flexure arms with electrical traces routed through them.

In some embodiments, more than one spacer material may be used. For example, in some embodiments, an adhesive spacer material may be used to bond the static platform layers together or the dynamic platform layers together, and a solder spacer material may be used to bond the layers together at the spacing elements.

In some embodiments, a multi-layer flexure having two or more layers may include additional spacer material similar to spacer material 806 including flexural stabilizers 808 and spacing elements 810. Additional spacer material may be located under the flex layer 804, and an additional flex layer similar to the flex layer 804 may be located under the additional spacer material. In some embodiments, the pattern can be repeated to add any number of layers to the multilayer flexure. In some embodiments, multiple layers or all layers of a multilayer flexure may include electrical traces. Additionally, in some implementations, some layers of the multi-layer flexure can carry electrical traces while other layers do not. For example, for a multilayer flexure having three layers, the upper and lower outer layers may include electrical traces, while the middle layer does not include electrical traces (or only includes vias connecting the upper and lower layers). In other embodiments, electrical traces can be carried by all three layers of a three-layer flexure, or by other combinations of layers.
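The alternating stack-up described above (flex layer, spacer material, flex layer, and so on, with only some layers carrying traces) can be summarized schematically. The listing below is a hypothetical illustration; the type and field names are not from this disclosure.

```python
# Hypothetical stack-up model: alternating flex layers and spacer sheets, with
# traces carried only on the outer layers in this example.
from dataclasses import dataclass

@dataclass
class FlexLayer:
    name: str
    carries_traces: bool   # some layers may exist only to add Z stiffness

@dataclass
class SpacerSheet:
    bond: str              # e.g., "adhesive" at the platforms, "solder" at spacing elements

def build_stack(n_layers: int):
    """Top-to-bottom stack of n_layers flex layers with a spacer sheet between each pair."""
    stack = []
    for i in range(n_layers):
        outer = i in (0, n_layers - 1)
        stack.append(FlexLayer(name=f"flex_layer_{i + 1}", carries_traces=outer))
        if i < n_layers - 1:
            stack.append(SpacerSheet(bond="adhesive"))
    return stack

for element in build_stack(3):
    print(element)
```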

Exemplary flexure arm configurations

Fig. 9A-9H each illustrate a cross-sectional view of a respective example flexure arm, according to some embodiments. In some cases, one or more embodiments of an example flexure arm may be used in a multilayer flexure of a Voice Coil Motor (VCM) actuator (e.g., any of the multilayer flexures described herein, such as in fig. 1-8 or 10).

Figure 9A illustrates a cross-sectional view of a flexure arm 900a according to some embodiments. For example, a cross-sectional view of the flexure arm 900a may be taken along a plane parallel to the optical axis. The flexure arm 900a may have a width dimension (labeled "w" in fig. 9A) and a height dimension (labeled "h" in fig. 9A). In some examples, the height dimension may be greater than the width dimension. For example, in particular embodiments, the height dimension may be between about 40 microns and 80 microns and the width dimension may be between about 20 microns and 30 microns. It should be appreciated that the height dimension and/or the width dimension may be any other suitable dimension.
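For a rectangular cross-section, the per-arm ratio of out-of-plane to in-plane bending stiffness scales as (h/w)^2, so taller-than-wide proportions like those in the example above already bias the arm toward greater Z stiffness. The short sketch below simply evaluates that ratio over the example dimension ranges; it is illustrative only.

```python
# Illustrative only: Z/X-Y bending-stiffness ratio ~ (h/w)^2 for a rectangular arm.
for h_um in (40, 80):           # example height range from the text, micrometers
    for w_um in (20, 30):       # example width range from the text, micrometers
        print(f"h={h_um} um, w={w_um} um -> stiffness ratio ~ {(h_um / w_um) ** 2:.1f}")
```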

Figure 9B illustrates a cross-sectional view of a flexure arm 900b according to some embodiments. The flexure arm 900b may include an electrical trace 902b. The electrical trace 902b may be configured to transmit a signal (e.g., an image signal) from the dynamic platform to the static platform. The electrical trace 902b may be routed along at least a portion of the flexure arm 900b. In some examples, the electrical trace 902b may be located at a top portion of the flexure arm 900b. However, in other examples, the electrical trace 902b may additionally or alternatively be located at a middle portion and/or a bottom portion of the flexure arm 900b. In some cases, the electrical trace 902b may be a conductive material. For example, the electrical trace 902b may be a copper deposit on the flexure arm 900b. In some embodiments, the electrical trace 902b may be electrically insulated from the rest of the flexure arm 900b. For example, the electrical trace 902b may be at least partially coated with a dielectric material 904b (e.g., polyimide).

Figure 9C illustrates a cross-sectional view of a flexure arm 900c according to some embodiments. The flexure arm 900c may include a plurality of electrical traces 902c (e.g., electrical traces 902b described above with reference to figure 9B). The electrical traces 902c may be arranged side by side horizontally, such that a horizontal plane passes through the electrical traces 902c. The electrical traces 902c may be routed along at least a portion of the flexure arm 900c. In some examples, the electrical traces 902c may be located at a top portion of the flexure arm 900c. However, in other examples, the electrical traces 902c may additionally or alternatively be located at the middle and/or bottom portions of the flexure arm 900c. In some implementations, the electrical traces 902c can be electrically insulated from the rest of the flexure arm 900c and/or from each other. For example, the electrical traces 902c may each be at least partially coated with a dielectric material 904c (e.g., polyimide).

Figure 9D illustrates a cross-sectional view of a flexure arm 900d according to some embodiments. The flexure arm 900d may include a plurality of electrical traces 902d (e.g., electrical traces 902b described above with reference to figure 9B). The electrical traces 902d may be stacked vertically, such that a vertical plane passes through the electrical traces 902d. The electrical traces 902d may be routed along at least a portion of the flexure arm 900d. In some examples, the electrical traces 902d may be located at a top portion of the flexure arm 900d. However, in other examples, the electrical traces 902d may additionally or alternatively be located at a middle portion and/or a bottom portion of the flexure arm 900d. In some implementations, the electrical traces 902d can be electrically insulated from the rest of the flexure arm 900d and/or from each other. For example, the electrical traces 902d can each be at least partially coated with a dielectric material 904d (e.g., polyimide).

Figure 9E illustrates a cross-sectional view of a flexure arm 900e according to some embodiments. The flexure arm 900e may include a plurality of electrical traces 902e (e.g., electrical traces 902b described above with reference to figure 9B). The electrical traces 902e may be routed from the dynamic platform to the static platform along at least a portion of the flexure arm. In some cases, one or more of the electrical traces 902e may be located at a top portion of the flexure arm 900e and one or more of the electrical traces 902e may be located at a bottom portion of the flexure arm 900e. In some implementations, the electrical traces 902e can be electrically insulated from the rest of the flexure arm 900e and/or from each other. For example, the electrical traces 902e can each be at least partially coated with a dielectric material 904e (e.g., polyimide).
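One practical consequence of routing traces along the arms is the series resistance each trace adds between the dynamic and static platforms. The sketch below makes a rough estimate from R = ρL/A; the copper dimensions and routed length are hypothetical and not taken from this disclosure.

```python
# Rough, assumption-based estimate of one copper trace's DC resistance along an arm.
RHO_CU = 1.68e-8   # ohm*m, copper resistivity near room temperature

def trace_resistance(length_m, width_m, thickness_m):
    return RHO_CU * length_m / (width_m * thickness_m)

# e.g., a 4 mm routed length with a 15 um x 10 um copper trace (hypothetical)
r = trace_resistance(4e-3, 15e-6, 10e-6)
print(f"~{r * 1e3:.0f} mOhm per trace")
```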

Figure 9F illustrates a cross-sectional view of a flexure arm 900f according to some embodiments. The flexure arm 900f may be formed from multiple materials. For example, the flexure arm 900f may include a first material 902f sandwiching a second material 904f. In some examples, the first material 902f and/or the second material 904f can include or be one or more electrical traces (e.g., electrical trace 902b described above with reference to fig. 9B).

Figure 9G illustrates a cross-sectional view of a flexure arm 900g according to some embodiments. The flexure arm 900g may include a concave portion 902g.

Figure 9H illustrates a cross-sectional view of a flexure arm 900h according to some embodiments. The flexure arm 900h may include a convex portion 902h.

In various embodiments, one or more of the flex stabilizer members described herein can have a cross-section similar to or the same as one or more of the flex arms described herein (e.g., with reference to fig. 9A-9H).

Exemplary flexure arm arrangements for layers of a multi-layer flexure

In some examples, the dynamic platform and/or the static platform of the multi-layer flexure may include one or more offsets (e.g., recessed portions, protruding portions, etc.). In some cases, one or more flexure arms may be coupled to the dynamic platform and/or the static platform at an offset. For example, the dynamic platform may include two recessed-portion offsets located at opposite sides of the dynamic platform. However, in some embodiments, the dynamic platform and/or the static platform may include different offset configurations. Some non-limiting examples of offset configurations are described below with reference to fig. 10A-10L.

Fig. 10A-10L each illustrate partial top views of respective example flexure arm configurations according to some embodiments. In some cases, one or more embodiments of the example flexure arm configurations of fig. 10A-10L may be used in a multilayer flexure (e.g., any of the multilayer flexures described herein in fig. 1-9).

The exemplary flexure module configurations of fig. 10A-10L provide some non-limiting examples of design feature variations that may be used in one or more embodiments of the multilayer flexure, VCM actuator, and/or camera described herein.

With respect to flexure arms, some of the example flexure arm configurations of fig. 10A-10L indicate changes in the flexure arms, including, but not limited to, one or more of the following:

(1a) the number of flexure arms can vary. For example, a layer of a multi-layer flexure may include one or more flexure arms. In a particular example, the layers of the multi-layer flexure may include four or fewer flexure arms in an array of flexure arms.

(2a) The flexure arms may be parallel to each other. However, the flexure arms need not be parallel to each other.

(3a) The flexure arms may be parallel to the frame edges (e.g., the edges of the dynamic and/or static stages of the multi-layer flexure).

(4a) The flexure arms may be evenly spaced from one another.

(5a) The width of the flexure arms may vary along and/or between the flexure arms.

(6a) the flexure arms may include features (e.g., recesses, protrusions, apertures, etc.).

(7a) The flexure arms may be rectangular, concave and/or convex in cross-section.

(8a) The flexure arms may be solid material beams, clad beams, or switched beams.

With respect to the inflection point(s) (also referred to herein as "inflection portions" or "turning points") of the flexure arms (or flexure arm array), some of the exemplary flexure module configurations of fig. 10A-10L indicate changes in the inflection points, including, but not limited to, one or more of the following:

(1b) the flexure arms may include one or more inflection points.

(2b) The turning angle of the turning point may vary. In some examples, the turn angle may be 90 degrees. However, in other examples, the turn angle may be an angle other than 90 degrees.

(3b) The turning radius of the turning point may vary.

With respect to the flex stabilizer members, some of the example flexure module configurations of fig. 10A-10L indicate changes in the flex stabilizer members, including, but not limited to, one or more of the following:

(1c) one or more flex stabilizer members may connect the flex arms.

(2c) The flex stabilizer member may connect some or all of the flex arms.

(3c) The position of the flex stabilizer member may be any position on the flex arm. In some examples, the position of the flex stabilizer member may differ between the flexure arms.

(4c) The angle between the flex stabilizer member and the flex arms may vary. In some examples, the angle between the flex stabilizer member and the flex arm may be 90 degrees. However, in other examples, the angle may be an angle other than 90 degrees.

With respect to the offsets of the dynamic and/or static platforms, some of the example flexure module configurations of fig. 10A-10L indicate changes in the offsets, including, but not limited to, one or more of the following:

(1d) the offset may be present at the root of the flexure arm where the flexure arm is connected to the dynamic and/or static platform.

(2d) The offset may be, for example, a recess, a protrusion, or the like.

With respect to the flexure arm attachment angles to the dynamic and/or static platforms, some of the example flexure module configurations of fig. 10A-10L indicate changes in the attachment angles, including, but not limited to, one or more of the following:

(1e) the flexure arm attachment angle can vary. In some examples, the flexure arm attachment angle may be 90 degrees. However, in other examples, the attachment angle may be an angle other than 90 degrees.

(2e) Different flexure arms may have different flexure arm attachment angles.

(3e) For dynamic and/or static platforms having offsets, the flexure arms may be connected to any available edge of the offset.

With respect to flexure arm patterns (which may, in some cases, include patterns formed by flexure arms and flexure stabilizer members), some of the example flexure module configurations of fig. 10A-10L indicate changes in the flexure arm patterns, including, but not limited to, one or more of the following:

(1f) the flexure arm pattern may be symmetrical. For example, the flexure arm pattern may be symmetric along at least two axes (e.g., x-axis and y-axis) orthogonal to the optical axis.

(2f) The flexure arm pattern may be asymmetric. For example, the flexure arm pattern may be asymmetric along at least one axis (e.g., x-axis or y-axis) orthogonal to the optical axis.

With respect to the spacing element pattern, the spacing element pattern may include, but is not limited to, one or more of the following:

the spacing element pattern may be symmetric, asymmetric, etc. The spacing element may be located at or near the midpoint of the span between the offset and the flexural stabilizer. In addition, the spacing elements may follow different patterns on different layers of the multi-layer flexure. For example, for a multilayer flexure having more than two layers, some layers may have more spacing elements between them than other layers.
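The design-variation axes listed above (arm count, turns, stabilizer placement, offsets, attachment angles, pattern symmetry, and spacing-element pattern) can be thought of as parameters of a single configuration record. The sketch below is purely illustrative; the field names and default values are hypothetical and do not come from this disclosure.

```python
# Hypothetical parameter record summarizing the variation axes listed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FlexureLayerConfig:
    num_arms: int = 4                                     # (1a) arms per layer
    arms_parallel_to_frame: bool = True                   # (2a)/(3a)
    arm_width_um: float = 25.0                            # (5a) may vary along/between arms
    turn_angles_deg: List[float] = field(default_factory=lambda: [90.0])   # (1b)/(2b)
    turn_radius_um: float = 100.0                         # (3b)
    stabilizer_span_positions: List[float] = field(default_factory=lambda: [0.0])  # (3c), 0..1 along the arm
    offset_type: str = "recess"                           # (2d) "recess" or "protrusion"
    attachment_angle_deg: float = 90.0                    # (1e)
    pattern_symmetric: bool = True                        # (1f)/(2f)
    spacer_span_positions: List[float] = field(default_factory=lambda: [0.5])  # spacing elements mid-span

cfg = FlexureLayerConfig(num_arms=3, turn_angles_deg=[90.0, 45.0], pattern_symmetric=False)
print(cfg)
```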

Fig. 10A illustrates a partial top view of a flexure module configuration 1000a according to some embodiments. The flexure module configuration 1000a includes a dynamic platform configuration 1002a, a static platform configuration 1004a, a flexure arm configuration 1006a, and a flexure stabilizer member configuration 1008a.

Fig. 10B illustrates a partial top view of a flexure module configuration 1000b according to some embodiments. The flexure module configuration 1000b includes a dynamic platform configuration 1002b, a static platform configuration 1004b, a flexure arm configuration 1006b, and a flexure stabilizer member configuration 1008b.

Fig. 10C illustrates a partial top view of a flexure module configuration 1000c according to some embodiments. The flexure module configuration 1000c includes a dynamic platform configuration 1002c, a static platform configuration 1004c, a flexure arm configuration 1006c, and a flexure stabilizer member configuration 1008c.

Fig. 10D illustrates a partial top view of a flexure module configuration 1000d according to some embodiments. The flexure module configuration 1000d includes a dynamic platform configuration 1002d, a static platform configuration 1004d, and a flexure arm configuration 1006d.

Fig. 10E illustrates a partial top view of a flexure module configuration 1000e according to some embodiments. The flexure module configuration 1000e includes a dynamic platform configuration 1002e, a static platform configuration 1004e, and a flexure arm configuration 1006e.

Fig. 10F illustrates a partial top view of a flexure module configuration 1000f according to some embodiments. The flexure module configuration 1000f includes a dynamic platform configuration 1002f, a static platform configuration 1004f, a flexure arm configuration 1006f, and a flexure stabilizer member configuration 1008f.

Fig. 10G illustrates a partial top view of a flexure module configuration 1000g according to some embodiments. The flexure module configuration 1000g includes a dynamic platform configuration 1002g, a static platform configuration 1004g, and a flexure arm configuration 1006g.

Fig. 10H illustrates a partial top view of a flexure module configuration 1000h according to some embodiments. The flexure module configuration 1000h includes a dynamic platform configuration 1002h, a static platform configuration 1004h, and a flexure arm configuration 1006h.

Fig. 10I illustrates a partial top view of a flexure module configuration 1000i according to some embodiments. The flexure module configuration 1000i includes a dynamic platform configuration 1002i, a static platform configuration 1004i, a flexure arm configuration 1006i, and a flexure stabilizer member configuration 1008i.

Fig. 10J illustrates a partial top view of a flexure module configuration 1000j according to some embodiments. The flexure module configuration 1000j includes a dynamic platform configuration 1002j, a static platform configuration 1004j, and a flexure arm configuration 1006j.

Fig. 10K illustrates a partial top view of a flexure module configuration 1000k according to some embodiments. The flexure module configuration 1000k includes a dynamic platform configuration 1002k, a static platform configuration 1004k, a flexure arm configuration 1006k, and a flexure stabilizer member configuration 1008k.

Fig. 10L illustrates a partial top view of a flexure module configuration 1000L according to some embodiments. The flexure module configuration 1000L includes a dynamic platform configuration 1002L, a static platform configuration 1004L, a flexure arm configuration 1006L, and a flexure stabilizer member configuration 1008L.

Multifunction device examples

Embodiments of electronic devices, user interfaces for such devices, and related processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, but are not limited to, the iPhone®, iPod touch®, and iPad® devices from Apple Inc. (Cupertino, California). Other portable electronic devices, such as laptops, cameras, mobile phones, or tablets, may also be used. It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer with a camera. In some embodiments, the device is a gaming computer having an orientation sensor (e.g., an orientation sensor in a game controller). In other embodiments, the device is not a portable communication device, but is a camera.

In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.

The device typically supports various applications, such as one or more of the following: a mapping application, a rendering application, a word processing application, a website creation application, a disc editing application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, a fitness support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.

Various applications executable on the device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture of the device (such as a touch-sensitive surface) may support various applications with a user interface that is intuitive and transparent to the user.

Attention is now directed to embodiments of a portable device having a camera. Fig. 11 illustrates a block diagram of an exemplary portable multifunction device 1100 that can include one or more cameras (e.g., the cameras described above with reference to fig. 3-5), according to some embodiments. For convenience, the camera 1164 is sometimes referred to as an "optical sensor" and may also be named or referred to as an optical sensor system. Device 1100 can include memory 1102 (which can include one or more computer-readable storage media), memory controller 1122, one or more processing units (CPUs) 1120, peripheral interfaces 1118, RF circuitry 1108, audio circuitry 1110, speakers 1111, touch-sensitive display system 1112, microphone 1113, input/output (I/O) subsystem 1106, other input or control devices 1116, and external ports 1124. The device 1100 may include a plurality of optical sensors 1164. These components may communicate over one or more communication buses or signal lines 1103.

It should be understood that device 1100 is just one example of a portable multifunction device, and that device 1100 can have more or fewer components than shown, can combine two or more components, or can have a different configuration or arrangement of components. The various components shown in fig. 11 may be implemented in hardware, software, or a combination of software and hardware, including one or more signal processing circuits and/or application specific integrated circuits.

The memory 1102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to the memory 1102 by other components of the device 1100, such as the CPU 1120 and the peripheral interface 1118, may be controlled by a memory controller 1122.

Peripheral interface 1118 may be used to couple input and output peripherals of the device to CPU 1120 and memory 1102. The one or more processors 1120 run or execute various software programs and/or sets of instructions stored in the memory 1102 to perform various functions of the device 1100 and process data.

In some embodiments, peripheral interface 1118, CPU 1120, and memory controller 1122 may be implemented on a single chip, such as chip 1104. In some other embodiments, they may be implemented on separate chips.

The RF (radio frequency) circuit 1108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuit 1108 converts electrical signals to/from electromagnetic signals and communicates with a communication network and other communication devices via electromagnetic signals. The RF circuitry 1108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 1108 may communicate with networks, such as the internet, also known as the World Wide Web (WWW), intranets, and/or wireless networks, such as a cellular telephone network, a wireless Local Area Network (LAN), and/or a Metropolitan Area Network (MAN), as well as other devices, via wireless communications. The wireless communication may use any of a variety of communication standards, protocols, and techniques, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

Audio circuitry 1110, speaker 1111, and microphone 1113 provide an audio interface between a user and device 1100. The audio circuit 1110 receives audio data from the peripheral interface 1118, converts the audio data into an electrical signal, and transmits the electrical signal to the speaker 1111. The speaker 1111 converts electrical signals into sound waves audible to humans. The audio circuit 1110 also receives electrical signals converted by the microphone 1113 from sound waves. The audio circuit 1110 converts the electrical signals to audio data and transmits the audio data to the peripheral interface 1118 for processing. Audio data may be retrieved from and/or transmitted to memory 1102 and/or RF circuitry 1108 by peripheral interface 1118. In some embodiments, the audio circuit 1110 further includes a headphone interface (e.g., 1212 of fig. 12). The headphone interface provides an interface between the audio circuitry 1110 and a removable audio input/output peripheral such as an output-only headphone or a headphone having both an output (e.g., a monaural headphone or a binaural headphone) and an input (e.g., a microphone).

I/O subsystem 1106 couples input/output peripheral devices on device 1100, such as touch screen 1112 and other input control devices 1116 to peripheral interface 1118. The I/O subsystem 1106 may include a display controller 1156 and one or more input controllers 1160 for other input or control devices. One or more input controllers 1160 receive/transmit electrical signals from/to other input or control devices 1116. Other input control devices 1116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In some alternative embodiments, input controller 1160 may be coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. The one or more buttons (e.g., 1208 of fig. 12) may include an up/down button for volume control of the speaker 1111 and/or the microphone 1113. The one or more buttons may include a push button (e.g., 1206 of fig. 12).

The touch sensitive display 1112 provides an input interface and an output interface between the device and the user. The display controller 1156 receives electrical signals from and/or transmits electrical signals to the touch screen 1112. Touch screen 1112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some implementations, some or all of the visual output may correspond to a user interface object.

Touch screen 1112 has a touch-sensitive surface, sensor, or group of sensors that accept input from a user based on tactile sensation and/or tactile contact. Touch screen 1112 and display controller 1156 (along with any associated modules and/or sets of instructions in memory 1102) detect contact (and any movement or breaking of the contact) on touch screen 1112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on touch screen 1112. In an exemplary embodiment, the point of contact between touch screen 1112 and the user corresponds to a finger of the user.

The touch screen 1112 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch screen 1112 and display controller 1156 may detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive technologies, resistive technologies, infrared technologies, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 1112. In one exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod touch®, and iPad® from Apple Inc. (Cupertino, California).

The touch screen 1112 may have a video resolution in excess of 800 dpi. In some embodiments, the touch screen has a video resolution of about 860 dpi. The user may make contact with touch screen 1112 using any suitable object or appendage, such as a stylus, finger, or the like. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which may not be as accurate as stylus-based input due to the larger contact area of the finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the action desired by the user.

In some embodiments, in addition to a touch screen, device 1100 can include a trackpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike a touchscreen, does not display visual output. The trackpad may be a touch-sensitive surface separate from touch screen 1112 or an extension of the touch-sensitive surface formed by the touch screen.

The device 1100 also includes a power system 1162 for powering the various components. Power system 1162 may include a power management system, one or more power sources (e.g., battery, Alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a Light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in a portable device.

The device 1100 may also include one or more optical sensors or cameras 1164. FIG. 11 shows an optical sensor 1164 coupled to an optical sensor controller 1158 in the I/O subsystem 1106. The optical sensor 1164 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The optical sensor 1164 receives light projected through one or more lenses from the environment and converts the light into data representing an image. In conjunction with an imaging module 1143 (also referred to as a camera module), the optical sensor 1164 may capture still images or video. In some implementations, the optical sensor 1164 is located on the back of the device 1100 opposite the touch screen display 1112 on the front of the device, so that the touch screen display 1112 can act as a viewfinder for still image capture and/or video image capture. In some embodiments, another optical sensor is located on the front of the device so that a user can obtain an image of the user for a video conference while viewing other video conference participants on the touch screen display.

The device 1100 may also include one or more proximity sensors 1166. Fig. 11 shows proximity sensors 1166 coupled to peripheral interface 1118. Alternatively, the proximity sensor 1166 may be coupled to the input controller 1160 in the I/O subsystem 1106. In some embodiments, when the multifunction device 1100 is placed near the user's ear (e.g., when the user is making a phone call), the proximity sensor 1166 turns off and disables the touch screen 1112.

The device 1100 includes one or more orientation sensors 1168. In some embodiments, the one or more orientation sensors 1168 include one or more accelerometers (e.g., one or more linear accelerometers and/or one or more rotational accelerometers). In some embodiments, the one or more orientation sensors 1168 include one or more gyroscopes. In some embodiments, the one or more orientation sensors 1168 include one or more magnetometers. In some embodiments, the one or more orientation sensors 1168 include one or more of the following: global Positioning System (GPS), global navigation satellite system (GLONASS), and/or other global navigation system receivers. The GPS, GLONASS, and/or other global navigation system receivers may be used to obtain information about the position and orientation (e.g., portrait or landscape) of the device 1100. In some embodiments, the one or more orientation sensors 1168 include any combination of orientation/rotation sensors. Fig. 11 illustrates one or more orientation sensors 1168 coupled to peripheral interface 1118. Alternatively, one or more orientation sensors 1168 may be coupled to the input controller 1160 in the I/O subsystem 1106. In some implementations, the information is displayed in a portrait view or a landscape view on the touch screen display 1112 based on analysis of data received from the one or more orientation sensors 1168.
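As a simple illustration of the portrait/landscape decision mentioned above, the sketch below classifies orientation from the gravity components reported by an accelerometer. It is not the device's actual algorithm; the axis convention is an assumption.

```python
# Minimal, hypothetical portrait/landscape classifier from accelerometer gravity components.
def display_orientation(ax_g: float, ay_g: float) -> str:
    """ax_g, ay_g: gravity along the display's x (short) and y (long) axes, in g."""
    return "portrait" if abs(ay_g) >= abs(ax_g) else "landscape"

print(display_orientation(ax_g=0.05, ay_g=-0.99))  # device held upright -> portrait
print(display_orientation(ax_g=0.97, ay_g=0.10))   # device on its side   -> landscape
```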

In some embodiments, the software components stored in memory 1102 include an operating system 1126, a communications module (or set of instructions) 1128, a contact/motion module (or set of instructions) 1130, a graphics module (or set of instructions) 1132, a text input module (or set of instructions) 1134, a Global Positioning System (GPS) module (or set of instructions) 1135, an arbitration module 1158, and an application program (or set of instructions) 1136. Further, in some embodiments, memory 1102 stores device/global internal state 1157. Device/global internal state 1157 includes one or more of: an active application state indicating which applications (if any) are currently active; display state indicating what applications, views, or other information occupy various areas of the touch screen display 1112; sensor status, including information obtained from the various sensors of the device and the input control device 1116; and location information regarding the location and/or pose of the device.

An operating system 1126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.

The communication module 1128 facilitates communication with other devices via one or more external ports 1124, and also includes various software components for processing data received by the RF circuitry 1108 and/or the external ports 1124. An external port 1124 (e.g., Universal Serial Bus (USB), firewire, etc.) is adapted to couple directly to other devices or indirectly through a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector.

The contact/motion module 1130 may detect contact with the touch screen 1112 (in conjunction with the display controller 1156) and other touch-sensitive devices (e.g., a trackpad or a physical click wheel). Contact/motion module 1130 includes various software components for performing various operations related to the detection of contact, such as determining whether contact has occurred (e.g., detecting a finger-down event), determining whether there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining whether contact has terminated (e.g., detecting a finger-up event or a break in contact). The contact/motion module 1130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact represented by the series of contact data may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the point of contact. These operations may be applied to a single contact (e.g., single finger contact) or multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contact). In some implementations, the contact/motion module 1130 and the display controller 1156 detect contact on a touch pad.
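As a minimal illustration of deriving speed and velocity from a series of contact samples, the sketch below applies simple finite differences to timestamped contact points. It is not the contact/motion module's actual implementation; the sample format is assumed.

```python
# Hypothetical finite-difference sketch: speed and velocity of a moving contact point.
from math import hypot

def contact_velocities(samples):
    """samples: list of (t, x, y) tuples for one contact; returns (speed, vx, vy) per step."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        out.append((hypot(vx, vy), vx, vy))   # speed is the magnitude of the velocity
    return out

print(contact_velocities([(0.00, 10, 10), (0.01, 14, 13), (0.02, 22, 13)]))
```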

The contact/motion module 1130 may detect gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, gestures may be detected by detecting a particular contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then subsequently detecting a finger-up (lift-off) event.
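The tap-versus-swipe distinction described above can be sketched as a small classifier over finger-down/drag/up events. The listing below is a simplified illustration, not the module's actual logic; the event format and the movement threshold are assumptions.

```python
# Simplified, hypothetical tap/swipe classifier over (kind, x, y) touch events.
from math import hypot

TAP_RADIUS = 10.0   # points; max finger travel still treated as "substantially the same location"

def classify_gesture(events):
    """events: ordered list of (kind, x, y) with kind in {'down', 'drag', 'up'}."""
    down = next(e for e in events if e[0] == "down")
    up = next(e for e in events if e[0] == "up")
    moved = hypot(up[1] - down[1], up[2] - down[2])
    dragged = any(e[0] == "drag" for e in events)
    return "swipe" if dragged and moved > TAP_RADIUS else "tap"

print(classify_gesture([("down", 100, 200), ("up", 102, 201)]))                      # tap
print(classify_gesture([("down", 100, 200), ("drag", 170, 200), ("up", 240, 200)]))  # swipe
```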

Graphics module 1132 includes various known software components for rendering and displaying graphics on touch screen 1112 or other display, including components for changing the intensity of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations and the like.

In some embodiments, graphics module 1132 stores data representing graphics to be used. Each graphic may be assigned a corresponding code. The graphics module 1132 receives one or more codes for specifying graphics to be displayed from an application program or the like, and also receives coordinate data and other graphics attribute data together if necessary, and then generates screen image data to output to the display controller 1156.

Text input module 1134, which may be a component of graphics module 1132, provides a soft keyboard for entering text in a variety of applications, such as contacts 1137, email 1140, instant messages 1141, browser 1147, and any other application that requires text input.

The GPS module 1135 determines the location of the device and provides this information for use in various applications (e.g., to the phone 1138 for location-based dialing, to the camera 1143 as picture/video metadata, and to applications that provide location-based services such as weather desktop applets, local yellow pages desktop applets, and map/navigation desktop applets).

The applications 1136 may include the following modules (or sets of instructions), or a subset or superset thereof:

a contacts module 1137 (sometimes referred to as an address book or contact list);

phone module 1138;

video conferencing module 1139;

an email client module 1140;

an Instant Messaging (IM) module 1141;

fitness support module 1142;

a camera module 1143 for still images and/or video images;

an image management module 1144;

a browser module 1147;

a calendar module 1148;

desktop applet module 1149, which may include one or more of the following: a weather desktop applet 1149-1, a stock market desktop applet 1149-2, a calculator desktop applet 1149-3, an alarm desktop applet 1149-4, a dictionary desktop applet 1149-5, and other desktop applets acquired by the user and a user created desktop applet 1149-6;

a desktop applet creator module 1150 for forming a user-created desktop applet 1149-6;

a search module 1151;

a video and music player module 1152, which may be made up of a video player module and a music player module;

a notepad module 1153;

map module 1154; and/or

Online video module 1155.

Examples of other applications 1136 that may be stored in memory 1102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.

In conjunction with touch screen 1112, display controller 1156, contact module 1130, graphics module 1132, and text input module 1134, contacts module 1137 may be used to manage an address book or contact list (e.g., stored in application internal state 1157), including: adding names to the address book; deleting names from the address book; associating telephone numbers, email addresses, physical addresses, or other information with a name; associating an image with a name; sorting and classifying names; providing a telephone number or email address to initiate and/or facilitate communication via telephone 1138, video conference 1139, email 1140, or IM 1141; and so on.

In conjunction with RF circuitry 1108, audio circuitry 1110, speaker 1111, microphone 1113, touch screen 1112, display controller 1156, contact module 1130, graphics module 1132, and text input module 1134, phone module 1138 may be used to enter a sequence of characters corresponding to a phone number, access one or more phone numbers in address book 1137, modify an entered phone number, dial a corresponding phone number, conduct a conversation, and disconnect or hang up when the conversation is complete. As described above, wireless communication may use any of a number of communication standards, protocols, and technologies.

In conjunction with the RF circuitry 1108, the audio circuitry 1110, the speaker 1111, the microphone 1113, the touch screen 1112, the display controller 1156, the optical sensor 1164, the optical sensor controller 1158, the contact module 1130, the graphics module 1132, the text input module 1134, the contact list 1137, and the telephony module 1138, the video conference module 1139 includes executable instructions for initiating, conducting, and terminating video conferences between the user and one or more other participants according to user instructions.

In conjunction with RF circuitry 1108, touch screen 1112, display controller 1156, contact module 1130, graphics module 1132, and text input module 1134, email client module 1140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 1144, the email client module 1140 makes it very easy to create and send an email with a still image or a video image captured by the camera module 1143.

In conjunction with RF circuitry 1108, touch screen 1112, display controller 1156, contact module 1130, graphics module 1132, and text input module 1134, instant message module 1141 includes executable instructions for inputting a sequence of characters corresponding to an instant message, modifying previously input characters, sending a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving an instant message, and viewing the received instant message. In some embodiments, the sent and/or received instant messages may include graphics, photos, audio files, video files, and/or other attachments supported in MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).

In conjunction with RF circuitry 1108, touch screen 1112, display controller 1156, contact module 1130, graphics module 1132, text input module 1134, GPS module 1135, map module 1154, and music player module 1146, workout support module 1142 includes executable instructions for creating a workout (e.g., having time, distance, and/or calorie burn goals); communicating with fitness sensors (sports equipment); receiving fitness sensor data; calibrating a sensor for monitoring fitness; selecting and playing music for fitness; and displaying, storing and transmitting fitness data.

In conjunction with touch screen 1112, display controller 1156, optical sensor 1164, optical sensor controller 1158, contact module 1130, graphics module 1132, and image management module 1144, camera module 1143 includes executable instructions to: capturing still images or video (including video streams) and storing them in memory 1102, modifying characteristics of the still images or video, or deleting the still images or video from memory 1102.

In conjunction with the touch screen 1112, the display controller 1156, the contact module 1130, the graphics module 1132, the text input module 1134, and the camera module 1143, the image management module 1144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, labeling, deleting, presenting (e.g., in a digital slide or album), and storing still images and/or video images.

In conjunction with the RF circuitry 1108, the touch screen 1112, the display system controller 1156, the contact module 1130, the graphics module 1132, and the text input module 1134, the browser module 1147 includes executable instructions for browsing the internet (including searching, linking to, receiving, and displaying web pages or portions thereof, and attachments and other files linked to web pages) according to user instructions.

In conjunction with the RF circuitry 1108, the touch screen 1112, the display system controller 1156, the contact module 1130, the graphics module 1132, the text input module 1134, the email client module 1140, and the browser module 1147, the calendar module 1148 includes executable instructions for creating, displaying, modifying, and storing a calendar and calendar-associated data (e.g., calendar entries, to-do, etc.) according to user instructions.

In conjunction with the RF circuitry 1108, the touch screen 1112, the display system controller 1156, the contact module 1130, the graphics module 1132, the text input module 1134, and the browser module 1147, the desktop applet module 1149 is a mini-application that may be downloaded and used by a user (e.g., weather desktop applet 1149-1, stock market desktop applet 1149-2, calculator desktop applet 1149-3, alarm desktop applet 1149-4, and dictionary desktop applet 1149-5) or a mini-application created by the user (e.g., user-created desktop applet 1149-6). In some embodiments, the desktop applet includes an HTML (hypertext markup language) file, a CSS (cascading style sheet) file, and a JavaScript file. In some embodiments, the desktop applet includes an XML (extensible markup language) file and a JavaScript file (e.g., Yahoo! desktop applet).

In conjunction with the RF circuitry 1108, the touch screen 1112, the display system controller 1156, the contact module 1130, the graphics module 1132, the text input module 1134, and the browser module 1147, the desktop applet creator module 1150 may be used by a user to create a desktop applet (e.g., to transfer a user-specified portion of a web page into the desktop applet).

In conjunction with touch screen 1112, display system controller 1156, contact module 1130, graphics module 1132, and text input module 1134, search module 1151 includes executable instructions for searching memory 1102 for text, music, sound, images, video, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.

In conjunction with touch screen 1112, display system controller 1156, contact module 1130, graphics module 1132, audio circuitry 1110, speakers 1111, RF circuitry 1108, and browser module 1147, video and music player module 1152 includes executable instructions to allow a user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, presenting, or otherwise playing back video (e.g., on touch screen 1112 or on an external display connected through external port 1124). In some embodiments, the device 1100 may include the functionality of an MP3 player.

In conjunction with the touch screen 1112, the display controller 1156, the contact module 1130, the graphics module 1132, and the text input module 1134, the notepad module 1153 includes executable instructions for creating and managing notepads, backlogs, and the like according to user instructions.

In conjunction with RF circuitry 1108, touch screen 1112, display system controller 1156, contact module 1130, graphics module 1132, text input module 1134, GPS module 1135, and browser module 1147, map module 1154 includes executable instructions for receiving, displaying, modifying, and storing maps and data associated with maps (e.g., driving routes; data for stores and other points of interest at or near a particular location; and other location-based data) according to user instructions.

In conjunction with touch screen 1112, display system controller 1156, contact module 1130, graphics module 1132, audio circuit 1110, speaker 1111, RF circuit 1108, text input module 1134, email client module 1140, and browser module 1147, online video module 1155 includes instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play (e.g., on the touch screen or on an external display connected through external port 1124), send an email with a link to a particular online video, and otherwise manage online video in one or more file formats, such as h.264. In some embodiments, the link to a particular online video is sent using instant message module 1141 instead of email client module 1140.

Each of the modules and applications identified above corresponds to a set of executable instructions for performing one or more of the functions described above as well as the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 1102 may store a subset of the modules and data structures described above. Further, memory 1102 may store additional modules and data structures not described above.

In some embodiments, device 1100 is a device on which the operation of a predefined set of functions is performed exclusively through a touch screen and/or a trackpad. By using a touch screen and/or trackpad as the primary input control device for operation of the device 1100, the number of physical input control devices (such as push buttons, dials, etc.) on the device 1100 can be reduced.

A predefined set of functions that can be performed exclusively through the touch screen and/or trackpad includes navigating between user interfaces. In some implementations, the trackpad, when touched by a user, navigates device 1100 from any user interface that may be displayed on device 1100 to a main, home, or root menu. In such embodiments, the touch pad may be referred to as a "menu button". In some other embodiments, the menu button may be a physical push button or other physical input control device rather than a touchpad.

Fig. 12 depicts an exemplary portable multifunction device 1100 that can include one or more cameras (e.g., the cameras described above with reference to fig. 3-5), according to some embodiments. Device 1100 can have a touch screen 1112. The touch screen 1112 may display one or more graphics within the User Interface (UI) 1200. In this embodiment, as well as other embodiments described below, a user may select one or more of these graphics by making a gesture on the graphics, for example, with one or more fingers 1202 (not drawn to scale in the figure) or with one or more styluses 1203 (not drawn to scale in the figure).

Device 1100 can also include one or more physical buttons, such as a "home" button or menu button 1204. As previously described, menu button 1204 may be used to navigate to any application 1136 in a set of applications that may be executed on device 1100. Alternatively, in some embodiments, menu button 1204 is implemented as a soft key in a GUI displayed on touch screen 1112.

In one embodiment, device 1100 includes a touch screen 1112, menu buttons 1204, a push button 1206 for turning the device on and off and locking the device, a volume adjustment button 1208, a Subscriber Identity Module (SIM) card slot 1210, a headset interface 1212, and a docking/charging external port 1224. Pressing button 1206 may be used to power the device on and off by pressing and holding the button in a pressed state for a predefined time interval; locking the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or unlocking the device or initiating an unlocking process. In alternative embodiments, device 1100 can also accept voice input through microphone 1113 for activating or deactivating some functions.

It should be noted that although many of the examples herein are provided with reference to the optical sensor/camera 1164 (at the front of the device), one or more rear facing cameras or optical sensors pointing opposite the display may be used instead of or in addition to the optical sensor/camera 1164 at the front of the device.

Exemplary computer System

Fig. 13 illustrates an exemplary computer system 1300 that may include one or more cameras (e.g., the cameras described above with reference to fig. 3-5), according to some embodiments. Computer system 1300 may be configured to perform any or all of the embodiments described above. In different embodiments, computer system 1300 may be any of various types of devices, including but not limited to: personal computer systems, desktop computers, laptop computers, notebook computers, tablet computers, all-in-one computers, tablet or netbook computers, mainframe computer systems, handheld computers, workstations, network computers, cameras, set-top boxes, mobile devices, consumer devices, video game controllers, handheld video game devices, application servers, storage devices, televisions, video recording devices, peripheral devices (such as switches, modems, routers), or generally any type of computing or electronic device.

Various embodiments of a camera motion control system as described herein, including embodiments of magnetic position sensing described herein, may be executed in one or more computer systems 1300, which may interact with various other devices. It is noted that any of the components, acts, or functions described above with respect to fig. 3-5 and 11-12 may be implemented on one or more computers configured as the computer system 1300 of fig. 13, according to various embodiments. In the illustrated embodiment, computer system 1300 includes one or more processors 1310 coupled to a system memory 1320 through an input/output (I/O) interface 1330. Computer system 1300 also includes a network interface 1340 coupled to I/O interface 1330, and one or more input/output devices 1350, such as cursor control device 1360, keyboard 1370, and display 1380. In some cases, it is contemplated that embodiments may be implemented using a single instance of computer system 1300, while in other embodiments multiple such systems or multiple nodes making up computer system 1300 may be configured to host different portions or instances of an embodiment. For example, in one embodiment, some elements may be implemented by one or more nodes of computer system 1300 different from those implementing other elements.

In various embodiments, the computer system 1300 may be a single-processor system including one processor 1310 or a multi-processor system including several processors 1310 (e.g., two, four, eight, or another suitable number). Processor 1310 may be any suitable processor capable of executing instructions. For example, in various embodiments, processors 1310 may be general-purpose or embedded processors implementing any of a variety of Instruction Set Architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. In a multi-processor system, each of processors 1310 may typically, but need not necessarily, implement the same ISA.

System memory 1320 may be configured to store camera control program instructions 1322 and/or camera control data that are accessible by processor 1310. In various embodiments, system memory 1320 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), non-volatile/flash-type memory, or any other type of memory. In the illustrated embodiment, program instructions 1322 may be configured to implement a lens control application 1324 incorporating any of the functionality described above. Further, camera control data 1332 of memory 1320 may include any of the information or data structures described above. In some embodiments, program instructions and/or data may be received, transmitted, or stored on a different type of computer-accessible medium or similar medium, separate from system memory 1320 or computer system 1300. While computer system 1300 is described as implementing the functionality of the functional blocks of the previous figures, any of the functionality described herein may be implemented by such a computer system.
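
As one hypothetical illustration of the kind of logic a lens control application such as 1324 might encode, the Swift sketch below reads a sensed lens/image-sensor position (for example, obtained via magnetic position sensing), compares it with a stabilization target, and computes a corrective voice-coil command. The proportional control law, the numeric values, and all names are assumptions made for illustration only and do not describe the actual program instructions 1322.

// Position of the moving platform in the OIS X and Y directions (illustrative units: mm).
struct PositionSample {
    var x: Double
    var y: Double
}

// Currents commanded to the X and Y VCM coils (illustrative units: A).
struct ActuatorCommand {
    var coilCurrentX: Double
    var coilCurrentY: Double
}

struct LensController {
    // Proportional gain converting position error (mm) into coil current (A); assumed value.
    var gain: Double = 0.5

    // Computes a corrective command driving the sensed position toward the target.
    func command(sensed: PositionSample, target: PositionSample) -> ActuatorCommand {
        return ActuatorCommand(
            coilCurrentX: gain * (target.x - sensed.x),
            coilCurrentY: gain * (target.y - sensed.y)
        )
    }
}

// Example: the sensor sits 0.02 mm off target in X, so a small positive
// X-coil current is commanded to pull it back toward the target.
let controller = LensController()
let cmd = controller.command(
    sensed: PositionSample(x: -0.02, y: 0.0),
    target: PositionSample(x: 0.0, y: 0.0)
)
print(cmd.coilCurrentX, cmd.coilCurrentY)  // 0.01 0.0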

In one embodiment, I/O interface 1330 may be configured to coordinate I/O traffic between processor 1310, system memory 1320, and any peripheral devices in the device, including network interface 1340 or other peripheral interfaces (such as input/output devices 1350). In some embodiments, I/O interface 1330 may perform any necessary protocol, timing, or other data transformations to convert data signals from one component (e.g., system memory 1320) into a format suitable for use by another component (e.g., processor 1310). In some embodiments, I/O interface 1330 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard. In some embodiments, the functionality of I/O interface 1330 may be divided into two or more separate components, such as a north bridge and a south bridge. Further, in some embodiments, some or all of the functionality of I/O interface 1330 (such as the interface to system memory 1320) may be incorporated directly into processor 1310.

Network interface 1340 may be configured to allow data to be exchanged between computer system 1300 and other devices (e.g., carrier or proxy devices) attached to network 1385, or between nodes of computer system 1300. In various embodiments, network 1385 may include one or more networks including, but not limited to, a Local Area Network (LAN) (e.g., an Ethernet or enterprise network), a Wide Area Network (WAN) (e.g., the Internet), a wireless data network, some other electronic data network, or some combination thereof. In various embodiments, network interface 1340 may support communication via wired or wireless general-purpose data networks (such as any suitable type of Ethernet network); via telecommunications/telephony networks (such as analog voice networks or digital fiber optic communication networks); via storage area networks (such as Fibre Channel SANs); or via any other suitable type of network and/or protocol.

In some embodiments, input/output devices 1350 may include one or more display terminals, keyboards, keypads, touch pads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 1300. Multiple input/output devices 1350 may be present in computer system 1300 or may be distributed across various nodes of computer system 1300. In some embodiments, similar input/output devices may be separate from computer system 1300 and may interact with one or more nodes of computer system 1300 via a wired or wireless connection, such as through network interface 1340.

As shown in fig. 13, memory 1320 may include program instructions 1322, which may be executable by a processor to implement any of the elements or actions described above. In one embodiment, the program instructions may implement the methods described above. In other embodiments, different elements and data may be included. Note that the data may include any of the data or information described above.

Those skilled in the art will appreciate that computer system 1300 is merely illustrative and is not intended to limit the scope of embodiments. In particular, the computer systems and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, internet appliances, PDAs, wireless telephones, pagers, and the like. Computer system 1300 may also be connected to other devices not illustrated, or may otherwise operate as a standalone system. Further, the functionality provided by the illustrated components may be combined in fewer components or distributed in additional components in some embodiments. Similarly, in some embodiments, the functionality of some of the illustrated components may not be provided, and/or other additional functionality may be available.

Those skilled in the art will also recognize that while various items are illustrated as being stored in memory or on storage during use, these items, or portions thereof, may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments, some or all of these software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or a portable article to be read by a suitable drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 1300 may be transmitted to computer system 1300 via transmission media or signals (such as electrical, electromagnetic, or digital signals transmitted over a communication medium such as a network and/or a wireless link). Various embodiments may also include receiving, transmitting or storing instructions and/or data implemented in accordance with the above description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include a non-transitory computer-readable storage medium or memory medium, such as a magnetic or optical medium, e.g., a disk or DVD/CD-ROM, a volatile or non-volatile medium such as RAM (e.g., SDRAM, DDR, RDRAM, SRAM, etc.), ROM, or the like. In some embodiments, a computer-accessible medium may include transmission media or signals, such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.

In various embodiments, the methods described herein may be implemented in software, hardware, or a combination thereof. Additionally, the order of the blocks of a method may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes will become apparent to those skilled in the art having the benefit of this disclosure. The various embodiments described herein are intended to be illustrative and not restrictive. Many variations, modifications, additions, and improvements are possible. Thus, multiple examples may be provided for components described herein as a single example. The boundaries between the various components, operations and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific example configurations. Other allocations of functionality are contemplated that may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the exemplary configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of the embodiments as defined in the claims that follow.
