Smooth normals from depth maps for normal-based texture blending

Abstract (Daniel Goldman, 2018-11-15)

A technique for smoothing surface normals in texture mapping applications involves generating a smoothed normal from the perspective of each camera used to capture images for texture mapping. Along these lines, each camera used to capture images for texture mapping is located at an orientation relative to the geometric object onto which a texture mapping computer maps the texture images. The texture mapping computer places a filter window centered on a point of the geometric object. The texture mapping computer then generates an average normal over the points in the filter window as the smoothed normal at that point. The average normal thus computed for each camera is then used in the weight of a weighted average of the image values at that point.

1. A method comprising:

receiving, by processing circuitry of a computer configured to perform a texture mapping operation on data representing a geometric object in an image environment, (i) geometric object data representing the geometric object in the image environment and (ii) image data representing respective images of a texture object captured by a plurality of cameras, each of the plurality of cameras having an orientation relative to the texture object;

for each of the plurality of cameras:

obtaining, by the processing circuitry, a smooth normal corresponding to the camera at a point on a surface of the geometric object, the smooth normal being evaluated at the point by a weighted sum of pixels in a depth map onto which the geometric object is projected; and

generating, by the processing circuitry, a respective weight corresponding to the camera, the weight based on a dot product of the orientation of the camera and the smooth normal corresponding to the camera; and

generating, by the processing circuitry, a weighted average of the images of the texture object captured by the plurality of cameras to produce a texture mapped object in the image environment, the image of the texture object captured by each of the plurality of cameras being weighted by the weight corresponding to that camera.

2. The method of claim 1, wherein obtaining the smooth normal corresponding to each of the plurality of cameras at the point on the surface of the geometric object comprises:

obtaining a first average coordinate of a specified point along a first axis of the filter window;

generating a tangent to the surface at the first average coordinate to produce a first average tangent to the surface at the point on the surface of the geometric object;

obtaining a second average coordinate of a specified point along a second axis of the filter window;

generating a tangent to the surface at the second average coordinate to produce a second average tangent to the surface at the point on the surface of the geometric object; and

generating a cross product of the first average tangent and the second average tangent to produce the smoothed normal.

3. The method of claim 2, wherein obtaining the first average coordinate of the specified point along the first axis of the filter window comprises:

averaging depths of specified points on a first side of the point relative to the first axis to produce a first average point;

averaging depths of specified points on a second side of the point relative to the first axis to produce a second average point; and

generating a difference between the first average point and the second average point as the first average tangent to the surface at the point on the surface of the geometric object.

4. The method of claim 2, wherein obtaining the second average coordinate of the specified point along the second axis of the filter window comprises:

averaging depths of specified points located on a first side of the point relative to the second axis to produce a first average point;

averaging depths of specified points on a second side of the point relative to the second axis to produce a second average point; and

generating a difference between the first average point and the second average point as the second average tangent to the surface at the point on the surface of the geometric object.

5. The method of claim 1, wherein the weighting is further based on a visibility of the points, the visibility based on a number of points located on the surface of the geometric object in a bounding box.

6. The method of claim 1, further comprising performing a projection operation on the geometric object data to generate a two-dimensional depth map of the geometric object, and

wherein obtaining the smooth normal corresponding to each of the plurality of cameras at a point on the surface of the geometric object comprises: averaging normals over an area of the depth map.

7. A computer program product comprising a non-transitory storage medium, the computer program product comprising code that, when executed by processing circuitry of a server computing device configured to perform texture mapping operations on data representing geometric objects in an image environment, causes the processing circuitry to perform a method comprising:

receiving (i) geometric object data representing a geometric object in the image environment and (ii) image data representing respective images of a texture object captured by a plurality of cameras, each of the plurality of cameras having an orientation relative to the texture object;

for each of the plurality of cameras:

obtaining a smooth normal corresponding to the camera at a point on the surface of the geometric object, the smooth normal being evaluated by a weighted sum of pixels in a depth map onto which the geometric object is projected; and

generating respective weights corresponding to the camera, the weights based on a dot product of the orientation of the camera and the smoothed normal corresponding to the camera; and

generating a weighted average of the images of the texture object captured by the plurality of cameras to produce a texture mapped object in the image environment, the image of the texture object captured by each of the plurality of cameras being weighted by the weight corresponding to that camera.

8. The computer program product of claim 7, wherein obtaining the smooth normal corresponding to each of the plurality of cameras at the point on the surface of the geometric object comprises:

obtaining a first average coordinate of a specified point along a first axis of the filter window;

generating a tangent to the surface at the first average coordinate to produce a first average tangent to the surface at the point on the surface of the geometric object;

obtaining a second average coordinate of a specified point along a second axis of the filter window;

generating a tangent to the surface at the second average coordinate to produce a second average tangent to the surface at the point on the surface of the geometric object; and

generating a cross product of the first average tangent and the second average tangent to produce the smoothed normal.

9. The computer program product of claim 8, wherein obtaining the first average coordinate of the specified point along the first axis of the filter window comprises:

averaging depths of specified points on a first side of the point relative to the first axis to produce a first average point;

averaging depths of specified points on a second side of the point relative to the first axis to produce a second average point; and

generating a difference between the first average point and the second average point as the first average tangent to the surface at the point on the surface of the geometric object.

10. The computer program product of claim 8, wherein obtaining the second average coordinate of the specified point along the second axis of the filter window comprises:

averaging depths of specified points located on a first side of the point relative to the second axis to produce a first average point;

averaging depths of specified points on a second side of the point relative to the second axis to produce a second average point; and

generating a difference between the first average point and the second average point as the second average tangent to the surface at the point on the surface of the geometric object.

11. The computer program product of claim 7, wherein the weighting is further based on a visibility of the points, the visibility based on a number of points located on the surface of the geometric object in a bounding box.

12. The computer program product of claim 7 wherein the method further includes performing a projection operation on the geometric object data to generate a two-dimensional depth map of the geometric object, and

wherein obtaining the smooth normal corresponding to each of the plurality of cameras at a point on the surface of the geometric object comprises: averaging normals over an area of the depth map.

13. An electronic device configured to perform texture mapping operations on data representing geometric objects in an image environment, the electronic device comprising:

a memory; and

control circuitry coupled to the memory, the control circuitry configured to:

receive (i) geometric object data representing a geometric object in the image environment and (ii) image data representing respective images of a texture object captured by a plurality of cameras, each of the plurality of cameras having an orientation relative to the texture object;

for each of the plurality of cameras:

obtain a smooth normal corresponding to the camera at a point on the surface of the geometric object, the smooth normal being evaluated by a weighted sum of pixels in a depth map onto which the geometric object is projected; and

generate respective weights corresponding to the camera, the weights based on a dot product of the orientation of the camera and the smoothed normal corresponding to the camera; and

generate a weighted average of the images of the texture object captured by the plurality of cameras to produce a texture mapped object in the image environment, the image of the texture object captured by each of the plurality of cameras being weighted by the weight corresponding to that camera.

14. The electronic device of claim 13, wherein the control circuitry configured to obtain the smooth normal corresponding to each of the plurality of cameras at the point on the surface of the geometric object is further configured to:

obtain a first average coordinate of a specified point along a first axis of the filter window;

generate a tangent to the surface at the first average coordinate to produce a first average tangent to the surface at the point on the surface of the geometric object;

obtain a second average coordinate of a specified point along a second axis of the filter window;

generate a tangent to the surface at the second average coordinate to produce a second average tangent to the surface at the point on the surface of the geometric object; and

generate a cross product of the first average tangent and the second average tangent to produce the smoothed normal.

15. The electronic device of claim 14, wherein the control circuitry configured to obtain the first average coordinate of the specified point along the first axis of the filter window is further configured to:

average depths of specified points on a first side of the point relative to the first axis to produce a first average point;

average depths of specified points on a second side of the point relative to the first axis to produce a second average point; and

generate a difference between the first average point and the second average point as the first average tangent to the surface at the point on the surface of the geometric object.

16. The electronic device of claim 14, wherein the control circuitry configured to obtain the second average coordinate of the specified point along the second axis of the filter window is further configured to:

average depths of specified points located on a first side of the point relative to the second axis to produce a first average point;

average depths of specified points on a second side of the point relative to the second axis to produce a second average point; and

generate a difference between the first average point and the second average point as the second average tangent to the surface at the point on the surface of the geometric object.

17. The electronic device of claim 13, wherein the weighting is further based on a visibility of the points, the visibility based on a number of points located on the surface of the geometric object in a bounding box.

18. The electronic device defined in claim 13 wherein the control circuitry is further configured to perform a projection operation on the geometric object data to produce a two-dimensional depth map of the geometric object, and

wherein the control circuitry configured to obtain the smoothed normal corresponding to each of the plurality of cameras at a point on the surface of the geometric object is further configured to average normals over an area of the depth map.

Technical Field

This specification relates to texture mapping on computer-generated, three-dimensional objects.

Background

In some applications, such as games and movies in virtual reality systems, there may be objects that occlude other objects, such as objects that represent the face of a person.

Disclosure of Invention

In general aspects, a method may include receiving, by processing circuitry of a computer configured to perform a texture mapping operation on data representing a geometric object in an image environment, (i) geometric object data representing the geometric object in the image environment and (ii) image data representing respective images of the texture object captured by a plurality of cameras, each of the plurality of cameras having an orientation relative to the texture object.

The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features will be apparent from the description, the drawings, and the claims.

Drawings

FIG. 1 is an illustration of an example electronic environment in which the improved techniques described herein may be implemented.

Fig. 2 is a flow diagram illustrating an example method of implementing the improved technique as shown in fig. 1.

FIG. 3 is a diagram illustrating an example geometric object onto which a texture is to be mapped and an accompanying example depth map, in accordance with the improved technique shown in FIG. 1.

Fig. 4A and 4B are diagrams illustrating an example process of generating a smoothed normal according to the improved technique shown in fig. 1.

FIG. 5 is a diagram illustrating an example process of generating a weighted average texture image in accordance with the improved technique shown in FIG. 1.

Fig. 6 illustrates an example of a computer device and a mobile computer device that may be used with the circuit described herein.

Detailed Description

Although texture mapping using the above-described mean weights in occlusion regions in the presence of occluding objects provides soft transitions in shadow regions, the resulting image is blurred. The reason for the blurring is that the above-described mean weights are independent of the geometry onto which the texture is mapped. For example, in applications such as games and movies in virtual reality systems, the ability to accurately perform texture mapping may depend on the viewpoint of the observer and the viewpoints of one or more cameras that captured the images.

Although normal-dependent weighting directly takes into account the shape of the object in question, this weighting may introduce ripples in the texture, because surface normals estimated locally from noisy scan data inherit that noise.

Conventional methods of smoothing surface normals in texture mapping applications involve averaging the normals of all object surfaces over a local region of the entire voxel grid.

In contrast, improved techniques of smoothing surface normals in texture mapping applications involve generating a smoothed normal from the perspective of each camera used to capture images for texture mapping. Along these lines, the cameras used to capture the images for texture mapping are located at orientations relative to the geometric object onto which the texture mapping computer maps the texture images. The texture mapping computer places a filter window centered on a point of the geometric object and generates an average normal over the points in the filter window as the smoothed normal at that point.

Further, the texture mapping computer performs the computation of these average normals over only the depth map, rather than over the entire voxel grid.

Advantageously, a smoothed normal computed in this manner does not suffer from the artifacts, ripples, or blurring associated with texture mapping methods that do not use smoothed normals, or from the expensive computations used in conventional methods that do use smoothed normals. In many cases, the average depth map of an object relative to an occluding geometric object, or of another object occluded by the geometric object, indicates that the normal computed within the transition, or shadow, region near the boundary of the geometric object points outward and is nearly perpendicular to the orientation of the camera. (The normal at a point of the average depth map is equivalent to the average normal at a point on the surface of the geometric object.) Therefore, any error in the transition region caused by the averaging process is down-weighted. Furthermore, averaging the normals over the depth map is much faster than averaging over the entire voxel grid.

FIG. 1 is an illustration of an example electronic environment 100 in which the improved techniques described above may be implemented. As shown in FIG. 1, the electronic environment 100 includes a network 110, a texture mapping computer 120, and an image environment server computer 190.

The network 110 is configured and arranged to provide a network connection between the texture mapping computer 120 and the image environment server computer 190. The network 110 may implement any of the various protocols and topologies typically used for communicating over the Internet or other networks. Further, the network 110 may include various components (e.g., cables, switches/routers, gateways/bridges, etc.) used in such communications.

The texture mapping computer 120 is configured to perform texture mapping using smoothed normals of geometric objects. The texture mapping computer 120 includes a network interface 122, one or more processing units 124, and memory 126. The network interface 122 includes, for example, Ethernet adapters, Token Ring adapters, and the like, for converting electronic and/or optical signals received from the network into electronic form for use by the texture mapping computer 120. The set of processing units 124 includes one or more processing chips and/or assemblies. The memory 126 includes both volatile memory (e.g., RAM) and non-volatile memory, such as one or more ROMs, disk drives, solid state drives, and the like. The set of processing units 124 and the memory 126 together form control circuitry configured and arranged to carry out the various methods and functions described herein.

In some embodiments, one or more of the components of the texture mapping computer 120 may be, or may include, processors (e.g., processing units 124) configured to process instructions stored in the memory 126. Examples of such instructions as depicted in FIG. 1 include a geometric object data manager 130, an image data manager 140, a camera data manager 150, a normal smoothing manager 160, a weight generation manager 170, and a texture mapping image manager 180. Further, as illustrated in FIG. 1, the memory 126 is configured to store various data, described with respect to the respective managers that use such data.

The geometric object data manager 130 is configured to receive geometric object data 132 over the network 110 via the network interface 122. In some examples, the geometric object data 132 includes triangles or polygons that define a geometric object (e.g., a human head) in three dimensions. In other examples, the geometric object data includes points, such as a point cloud, that define such a geometric object.

The image data manager 140 is configured to receive image data 142 over the network 110 via the network interface 122. In some embodiments, the image data 142 represents images of the surface of an object, such as a user's face, from different perspectives.

In some embodiments, the geometric object data manager 130 and the image data manager 140 are configured to receive the geometric object data 132 and the image data 142 from the image environment server computer 190. For example, the image environment server computer 190 may be included as part of a virtual reality system that scans the shape of a user and also has cameras that take images of the user from various perspectives.

The camera data manager 150 is configured to obtain camera orientation data 152. In some embodiments, the camera orientation data 152 includes the angles at which each camera acquired the image data 142 relative to a fixed coordinate system. In some embodiments, the coordinate system is defined relative to a fixed object, such as a wall. In some embodiments, the coordinate system is defined relative to a user at a particular instant in time. In some embodiments, the camera orientation data 152 includes an azimuthal angle and a polar angle of each camera.
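For illustration, an (azimuth, polar) orientation can be converted into a unit direction vector before computing dot products with smoothed normals. The following is a minimal sketch assuming the polar angle is measured from the +z axis of the fixed coordinate system; the convention and the function name are assumptions, not specified above.

```python
import numpy as np

def orientation_to_direction(azimuth, polar):
    """Convert an (azimuth, polar) camera orientation, in radians, into a
    unit direction vector in the fixed coordinate system. Assumes the
    polar angle is measured from the +z axis (a convention assumed here)."""
    return np.array([
        np.sin(polar) * np.cos(azimuth),
        np.sin(polar) * np.sin(azimuth),
        np.cos(polar),
    ])
```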

The normal smoothing manager 160 is configured to obtain surface normals of geometric objects based on the geometric object data 132 and to generate smoothed normals 162 at points of the geometric object data 132 at which normals are defined. For example, as described above, when the geometric object data 132 includes triangles or polygons, a normal at a point of the geometric object data 132 may be generated based on the orientations of the triangles or polygons that contain that point.

The weight generation manager 170 is configured to generate weight data 172 for use in computing a weighted average of the image values at a point. The weight generation manager 170 is configured to compute a dot product of the smoothed normal 162 at a point and the camera orientation 152 of a camera. In some embodiments, the weight represented by the weight data 172 is proportional to this dot product raised to a predetermined power. In some embodiments, the predetermined power is between 1 and 10. In some embodiments, the weight is further proportional to the visibility of the point from the perspective of the camera.
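To make the weight computation concrete, here is a minimal sketch, assuming unit-length normals and camera orientations; clamping negative (back-facing) dot products to zero and the default alpha are assumptions for illustration, not stated above.

```python
import numpy as np

def camera_weight(smoothed_normal, camera_dir, visibility, alpha=2.0):
    """Weight for one camera at one surface point: the visibility times the
    dot product of the smoothed normal and the camera orientation, raised
    to a predetermined power alpha (e.g., between 1 and 10)."""
    d = float(np.dot(smoothed_normal, camera_dir))
    return visibility * max(d, 0.0) ** alpha  # assumption: back-facing -> 0
```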

The texture mapping image manager 180 is configured to produce mapped image data 182, which represents the result of a texture mapping operation that maps the image data 142 onto the geometric object represented by the geometric object data 132. In some embodiments, the texture mapping image manager 180 is configured to use the weight data 172 to generate a weighted average of the image data 142. In some embodiments, the texture mapping image manager 180 is further configured to send the mapped image data 182 to the image environment server computer 190.

The components (e.g., modules, processing units 124) of the texture mapping computer 120 may be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that may include one or more types of hardware, software, firmware, operating systems, runtime libraries, and the like. In some embodiments, the components of the texture mapping computer 120 may be configured to operate within a cluster of devices (e.g., a server farm).

In some embodiments, one or more portions of the components of the texture mapping computer 120 shown in FIG. 1 may be, or may include, hardware-based modules (e.g., digital signal processors (DSPs), field programmable gate arrays (FPGAs), memory), firmware modules, and/or software-based modules (e.g., modules of computer code, sets of computer-readable instructions that may be executed on a computer). For example, in some embodiments, one or more portions of the components of the texture mapping computer 120 may be, or may include, software modules configured for execution by at least one processor (not shown). In some embodiments, the functionality of the components may be included in different modules and/or different components than those shown in FIG. 1.

Although not shown, in some embodiments, the components of the texture mapping computer 120 (or portions thereof) may be configured to operate within, for example, a data center (e.g., a cloud computing environment), a computer system, one or more server/host devices, and the like. In some embodiments, the components of the texture mapping computer 120 (or portions thereof) may be configured to operate within a network. Thus, the components of the texture mapping computer 120 (or portions thereof) may be configured to operate within various types of network environments that may include one or more devices and/or one or more server devices.

For example, each of the geometric object data manager 130 (and/or a portion thereof), the image data manager 140 (and/or a portion thereof), the camera data manager 150 (and/or a portion thereof), the normal smoothing manager 160 (and/or a portion thereof), the weight generation manager 170 (and/or a portion thereof), and the texture mapping image manager 180 (and/or a portion thereof) may be a combination of a processor and a memory configured to execute instructions related to a process to implement one or more functions.

In some embodiments, the memory 126 may be any type of memory, such as random access memory, disk drive memory, flash memory, and the like. In some embodiments, the memory 126 may be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with the components of the texture mapping computer 120. In some embodiments, the memory 126 may be a database memory. In some embodiments, the memory 126 may be, or may include, a non-local memory. For example, the memory 126 may be, or may include, a memory shared by multiple devices (not shown). In some embodiments, the memory 126 may be associated with a server device (not shown) within the network and configured to serve the components of the texture mapping computer 120. As illustrated in FIG. 1, the memory 126 is configured to store various data, including geometric object data 132, image data 142, camera orientation data 152, smooth normal data 162, weight data 172, and mapped image data 182.

FIG. 2 is a flow diagram depicting an example method 200 of performing texture mapping using a smoothed normal. The method 200 may be performed by the software constructs described in connection with FIG. 1, which reside in the memory 126 of the texture mapping computer 120 and are executed by the set of processing units 124.

At 202, texture mapping computer 120 receives (i) geometric object data representing a geometric object in an image environment and (ii) image data representing respective images of a texture object captured by a plurality of cameras, each of the plurality of cameras having an orientation relative to the texture object.

At 204, the texture mapping computer 120 obtains, for each of the plurality of cameras, a smooth normal corresponding to the camera at a point on the surface of the geometric object, the smooth normal being evaluated by a weighted sum of pixels in the depth map onto which the geometric object is projected.

At 206, texture mapping computer 120 generates, for each of the plurality of cameras, a respective weight corresponding to the camera, the weight based on a dot product of the orientation of the camera and the smoothed normal corresponding to the camera.

At 208, the texture mapping computer 120 generates a weighted average of the images of the texture object captured by the plurality of cameras to produce the texture mapped object in the image environment, the images of the texture object captured by each of the plurality of cameras being weighted by the weight corresponding to that camera.

Fig. 3 is a diagram illustrating an example scene containing geometry, as viewed from above, projected onto a two-dimensional depth map. In the illustration, an object 310 is in front of the wall 300 and is illuminated from the front. Cameras 320(1) and 320(2) are each oriented at various angles toward the object 310. The purpose is to accurately calculate values such as color, brightness, bump height, etc. (hereinafter referred to as "image values") at a point 370 on the surface of the object 310. Although only two cameras are shown in fig. 3, any number $N$ of cameras may be used. In general, the image value $I$ at the point 370 is a weighted average of the camera images $I_i$:

$$I = \frac{\sum_{i=1}^{N} w_i I_i}{\sum_{i=1}^{N} w_i},$$

where the weight $w_i$ corresponding to the $i$-th camera is given by

$$w_i = v_i \left( \hat{n}_i \cdot \hat{c}_i \right)^{\alpha},$$

where $\hat{n}_i$ is the smoothed normal associated with the $i$-th camera at the point 370, $\hat{c}_i$ is the orientation of the $i$-th camera, $v_i$ is the visibility of the point 370 associated with the $i$-th camera, and $\alpha$ is a predetermined constant.
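As an illustration with assumed values: for two cameras with $v_1 = v_2 = 1$, $\alpha = 2$, and dot products $\hat{n}_1 \cdot \hat{c}_1 = 0.9$ (a camera facing the point nearly head-on) and $\hat{n}_2 \cdot \hat{c}_2 = 0.1$ (a grazing camera), the weights are $w_1 = 0.81$ and $w_2 = 0.01$, so $I = (0.81\,I_1 + 0.01\,I_2)/0.82$; the grazing camera, whose normal estimate is least reliable, contributes almost nothing to the blend.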

FIG. 4A is a diagram illustrating a first part of an averaging process according to an embodiment. According to this embodiment, the points in the filter window 330 are divided into two groups: those on a first side of the point 370 along the s-axis (shallow) and those on a second side of the point 370 along the s-axis (deep). The smoothed tangent along the s-axis is then

$$\mathbf{t}_s = \bar{\mathbf{p}}_{s+} - \bar{\mathbf{p}}_{s-},$$

where the difference is between the average $\bar{\mathbf{p}}_{s+}$ of the points in the filter window 330 on the first side of the point 370 and the average $\bar{\mathbf{p}}_{s-}$ of the points in the filter window 330 on the second side of the point 370.

FIG. 4B is a diagram illustrating a second part of the averaging process. According to this embodiment, the points in the filter window 330 are divided into two groups: those on a first side of the point 370 along the t-axis (shallow) and those on a second side of the point 370 along the t-axis (deep). The smoothed tangent along the t-axis is then

$$\mathbf{t}_t = \bar{\mathbf{p}}_{t+} - \bar{\mathbf{p}}_{t-},$$

where the difference is between the average of the points in the filter window 330 on the first side of the point 370 and the average of the points in the filter window 330 on the second side of the point 370.

The smoothed normal at the point 370 is then equal to the cross product of the aforementioned smoothed tangents, i.e.,

$$\mathbf{n} = \mathbf{t}_s \times \mathbf{t}_t.$$
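A minimal sketch of this smoothed-normal computation follows, assuming the filter-window samples have already been back-projected from depths to 3D positions and that the result is normalized to unit length (the normalization is an assumption); the function name and array layout are illustrative.

```python
import numpy as np

def smoothed_normal_from_window(window_xyz):
    """Smoothed normal at the center of a filter window.

    window_xyz: (H, W, 3) array of 3D positions back-projected from the
    depth-map pixels in the window; axis 0 runs along the t-axis and
    axis 1 along the s-axis. Points on either side of the center are
    averaged, the differences of the averaged points give two smoothed
    tangents, and their cross product is the smoothed normal."""
    h, w, _ = window_xyz.shape
    # First smoothed tangent: difference of the average points on the
    # two sides of the center along the s-axis (columns).
    t_s = (window_xyz[:, (w + 1) // 2:].reshape(-1, 3).mean(axis=0)
           - window_xyz[:, :w // 2].reshape(-1, 3).mean(axis=0))
    # Second smoothed tangent: same construction along the t-axis (rows).
    t_t = (window_xyz[(h + 1) // 2:].reshape(-1, 3).mean(axis=0)
           - window_xyz[:h // 2].reshape(-1, 3).mean(axis=0))
    n = np.cross(t_s, t_t)        # cross product of the smoothed tangents
    return n / np.linalg.norm(n)  # unit length (normalization assumed)
```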

Further, either of the configurations in FIG. 4A or FIG. 4B may be used to calculate the visibility $v_i$ via percentage-closer filtering. Visibility is defined as the percentage of points in the filter window 330 that are visible to the camera, i.e., not covered by a closer, occluding object. In the case illustrated in FIGS. 4A and 4B, the visibility is 6/16. Furthermore, the percentage-closer filtering may use the same texture samples as the normal estimate, resulting in an additional speedup by combining the costs of the two estimates.
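One plausible reading of this visibility estimate, as a sketch: it reuses the same window samples as the normal estimate and counts the fraction whose depth-map value is not in front of the object's own surface depth. The epsilon tolerance and the function name are assumptions.

```python
import numpy as np

def window_visibility(window_depths, surface_depths, eps=1e-3):
    """Fraction of filter-window samples at which the depth map is not in
    front of the object's own surface, i.e., samples not covered by a
    closer, occluding object (a percentage-closer-style test).

    window_depths:  (H, W) depths read from the depth map.
    surface_depths: (H, W) depths of the object's surface at the same pixels.
    The eps tolerance guards against self-occlusion noise (an assumption)."""
    visible = window_depths >= surface_depths - eps
    return float(np.count_nonzero(visible)) / visible.size
```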

Referring back to fig. 3, the depth map 360 of the object illustrates what happens when one considers the image to be generated from the camera 320(1). In fig. 3, the depth map 360 is actually a smoothed depth map, with continuous behavior in the transition (shaded) region 380. The smoothing occurs over an area oriented based on the orientation of the camera 320(1), so a similar depth map associated with the camera 320(2) would be different.

The normal to the average depth map 360 represents the average normal at points throughout the scene. Thus, in the transition region 380, the average normal 350 as indicated by the depth map 360 points outward from the camera 320(1) and is nearly perpendicular to the orientation of the camera 320(1). At such points the contribution to the weight $w_i$ is negligible, because the dot product between the smoothed normal and the camera orientation is almost zero. Thus, where the average normal is highly inaccurate, the corresponding weight $w_i$ does not produce any meaningful contribution. Because the greatest inaccuracy in the normal computation is in the transition region 380, the resulting image will be free of errors such as ripples, artifacts, and blur.

Fig. 5 is a flow diagram depicting an example process 500 for generating weights using a smoothed normal. The process 500 may be performed by the software constructs described in connection with FIG. 1, which reside in the memory 126 of the texture mapping computer 120 and are executed by the set of processing units 124.

At 502, the normal smoothing manager 160 selects a point represented by the geometric object data 132. At 504, the normal smoothing manager 160 selects the image data 142 and the camera orientation 152 corresponding to a camera, iterating over the cameras in turn.

At 506, the normal smoothing manager 160 generates a filter window (e.g., filter window 330) that is parallel to the orientation of the camera. The filter window has a specified size and/or number of points and has two orthogonal axes.

At 508, the normal smoothing manager 160 generates a visibility $v_i$ based on the ratio of the number of points in the filter window that are near the camera, i.e., the number of points on the surface of the object that are seen by the camera, to the total number of points in the filter window.

At 510, the normal smoothing manager 160 generates an average tangent along the first axis of the filter window as described above with respect to fig. 4A.

At 512, the normal smoothing manager 160 generates an average tangent along the second axis of the filter window as described above with respect to fig. 4B.

At 514, the normal smoothing manager 160 generates a smoothed normal at the point by taking the cross product of the average tangent along the first axis of the filter window and the average tangent along the second axis of the filter window. The weight generation manager 170 then generates a weight based on the smoothed normal and the visibility, as described above with respect to FIG. 3.

If there are more cameras to consider, the normal smoothing manager 160 selects the next camera orientation and repeats 502-514. If not, then at 518 the texture mapping image manager 180 performs a weighted average of the images using the generated weights to produce the image value at the point.
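Putting the sketches above together, a per-point driver mirroring process 500 might look like the following. The `cam` record with `dir`, `window_at`, and `image_at` members is hypothetical, and `smoothed_normal_from_window` and `window_visibility` refer to the earlier sketches.

```python
import numpy as np

def image_value_at_point(point, cameras, alpha=2.0):
    """Per-point driver mirroring process 500. Each entry of `cameras` is
    assumed to provide: `dir` (unit orientation vector), and callables
    `window_at(point)` -> (window_xyz, window_depths, surface_depths)
    aligned with the camera (506), and `image_at(point)` -> image value."""
    weights, values = [], []
    for cam in cameras:                                        # 504: next camera
        window_xyz, window_depths, surface_depths = cam.window_at(point)  # 506
        v = window_visibility(window_depths, surface_depths)              # 508
        n = smoothed_normal_from_window(window_xyz)            # 510-514
        w = v * max(float(np.dot(n, cam.dir)), 0.0) ** alpha   # weight (FIG. 3)
        weights.append(w)
        values.append(cam.image_at(point))
    w = np.asarray(weights)
    return w @ np.asarray(values, dtype=float) / max(w.sum(), 1e-12)  # 518
```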

Fig. 6 illustrates an example of a general purpose computer apparatus 600 and a general purpose mobile computer apparatus 650 that may be used with the techniques described herein.

As shown in fig. 6, computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit embodiments of the inventions described and/or claimed in this document.

Computing device 600 includes a processor 602, memory 604, a storage device 606, a high-speed interface 608 connecting to memory 604 and high-speed expansion ports 610, and a low-speed interface 612 connecting to a low-speed bus 614 and storage device 606. Each of the components 602, 604, 606, 608, 610, and 612 are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606, to display graphical information for a GUI on an external input/output device, such as display 616 coupled to high-speed interface 608.

Memory 604 stores information within computing device 600. In one implementation, memory 604 is a volatile memory unit or units. In another implementation, memory 604 is a non-volatile memory unit or units. Memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.

Storage device 606 can provide mass storage for computing device 600. In one implementation, storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.

High-speed controller 608 manages bandwidth-intensive operations for computing device 600, while low-speed controller 612 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one implementation, high-speed controller 608 is coupled to memory 604, display 616 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 610, which may accept various expansion cards (not shown). In the implementation, low-speed controller 612 is coupled to storage device 606 and low-speed expansion port 614. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.

As shown in the figure, computing device 600 may be implemented in a number of different forms. For example, it may be implemented as a standard server 620, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 624. In addition, it may be implemented in a personal computer such as a laptop computer 622. Alternatively, components from computing device 600 may be combined with other components (not shown) in a mobile device, such as device 650. Each of such devices may contain one or more of computing devices 600, 650, and an entire system may be made up of multiple computing devices 600, 650 communicating with each other.

Computing device 650 includes a processor 652, a memory 664, an input/output device such as a display 654, a communication interface 666, and a transceiver 668, among other components. Device 650 may also be provided with a storage device, such as a miniature hard disk or other device, to provide additional storage. Each of the components 650, 652, 664, 654, 666, and 668 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.

The processor 652 may execute instructions within the computing device 650, including instructions stored in the memory 664. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, for coordination of the other components of the device 650, such as control of user interfaces, applications run by device 650, and wireless communication by device 650.

The processor 652 may communicate with a user through a control interface 658 and a display interface 656 coupled to the display 654. The display 654 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 656 may include appropriate circuitry for driving the display 654 to present graphical and other information to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. Additionally, an external interface 662 may be provided in communication with the processor 652, so as to enable near area communication of device 650 with other devices. External interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.

Memory 664 may be implemented as one or more of a computer-readable medium or media, one or more volatile memory units, or one or more non-volatile memory units. Expansion memory 674 may also be provided and connected to device 650 through expansion interface 672, which may include, for example, a SIMM (Single In Line Memory Module) card interface.

The memory may include, for example, flash memory and/or NVRAM memory, as discussed below. In one embodiment, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above.

Device 650 may communicate wirelessly through communication interface 666, which may include digital signal processing circuitry, if necessary. Communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio frequency transceiver 668. Additionally, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to device 650, which may be used as appropriate by applications running on device 650.

The device 650 may also communicate audibly through the use of an audio codec 660, which audio codec 660 may receive verbal information from a user and convert the verbal information into usable digital information. Audio codec 660 may also generate audible sound for a user, such as through a speaker, for example, in a handset of device 650. Such sound may include sound from voice calls, may include recorded sound (e.g., voice messages, music files, etc.), and may also include sound generated by applications operating on device 650.

As shown in the figure, the computing device 650 may be implemented in a number of different forms. For example, it may be implemented as a cellular telephone 680. It may also be implemented as part of a smart phone 682, personal digital assistant, or other similar mobile device.

Various embodiments of the systems and techniques described here can be implemented in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments can include one or more computer programs that are executable and/or interpretable on a programmable system, special or general purpose, including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.

These computer programs (also known as programs, software applications or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.

To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.

The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components.

The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Various embodiments have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the description.

It will also be understood that when an element is referred to as being on, connected to, electrically connected to, coupled to, or electrically coupled to another element, it may be directly on, connected to, or coupled to the other element, or one or more intervening elements may be present.

While certain features of the described embodiments have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the embodiments. It is to be understood that they have been presented by way of example only, and not limitation, and various changes in form and details may be made. Any portions of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The embodiments described herein may include various combinations and/or subcombinations of the functions, elements and/or features of the different embodiments described.

In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.
