Support for enhancing testing of Augmented Reality (AR) applications

Document No. 1602652 · Published 2020-01-07

Reading note: This technology, "Support for enhancing testing of Augmented Reality (AR) applications", was created by 蒂莫西·普西亚基, 杰弗里·麦格林, 托马斯·索尔特, and 杰西卡·刘 on 2018-12-20. Abstract: Example methods of testing Augmented Reality (AR) applications are described. In an example embodiment, the method includes launching an AR application targeted for testing. The example method further includes controlling a physical model to simulate movement of a simulated device in a simulated real-world space. The simulated movement of the physical model generates data for testing the AR application. In some embodiments, the method may further comprise launching the AR application to run on a device simulator, and using a virtual environment.

1. A computer-implemented method, comprising:

initiating an Augmented Reality (AR) application targeted for testing on a computing device; and

controlling a physical model to simulate movement of a simulated device in a simulated real-world space, the simulated movement of the physical model generating data for testing the AR application.

2. The computer-implemented method of claim 1, wherein the initiating comprises:

initiating the AR application to run on a device simulator, the device simulator running on the computing device.

3. The computer-implemented method of claim 2, wherein the AR application is launched using a virtual environment.

4. The computer-implemented method of claim 2, wherein the device simulator is initiated using virtual device configuration information.

5. The computer-implemented method of claim 2, further comprising:

introducing errors into the data being generated to simulate a fault scenario.

6. The computer-implemented method of claim 1, wherein the controlling comprises:

controlling the physical model using a user control that drives the physical model to simulate movement of the simulated device.

7. The computer-implemented method of claim 1, wherein the generated data is associated with one or more of a position, a velocity, an acceleration, or a rotation of the physical model.

8. The computer-implemented method of claim 1, wherein the testing of the AR application further comprises:

launching, using a virtual environment, the AR application to run on a device simulator.

9. A non-transitory computer-readable storage medium having computer-executable program code stored thereon that, when executed on a computer system, causes the computer system to perform operations comprising:

initiating an Augmented Reality (AR) application targeted for testing; and

controlling a physical model to simulate movement of a simulated device in a simulated real-world space, the simulated movement of the physical model generating data for testing the AR application.

10. The computer-readable storage medium of claim 9, further comprising code for initiating the AR application to run on a device simulator.

11. The computer-readable storage medium of claim 10, wherein the AR application is launched using a virtual environment.

12. The computer-readable storage medium of claim 10, wherein the device simulator is initiated using virtual device configuration information.

13. The computer-readable storage medium of claim 10, further comprising code for:

introducing errors into the data being generated to simulate a fault scenario.

14. The computer-readable storage medium of claim 9, further comprising code for controlling the physical model using a user control that drives the physical model to simulate movement of the simulated device.

15. The computer-readable storage medium of claim 9, wherein the generated data is associated with one or more of a position, a velocity, an acceleration, or a rotation of the physical model.

16. The computer-readable storage medium of claim 9, wherein the testing of the AR application further comprises code for:

launching, using a virtual environment, the AR application to run on a device simulator.

17. A device simulator, comprising:

a physical model;

a simulated sensor; and

a user control,

the device simulator being configured to launch an Augmented Reality (AR) application targeted for testing,

the physical model being configured to simulate movement of a simulated device in a simulated real-world space, the simulated movement of the physical model generating data for testing the AR application, and

the simulated sensor being configured to receive the generated data from the physical model and share the generated data with an AR tracking system.

18. The device simulator of claim 17, wherein the AR application is launched using a virtual environment.

19. The device simulator of claim 18, wherein the device simulator is initiated using virtual device configuration information.

20. The device simulator of claim 17, wherein the user control is configured to drive the physical model to simulate movement of the simulated device.

21. The device simulator of claim 17, wherein the generated data is associated with one or more of a position, a velocity, an acceleration, or a rotation of the physical model.

22. The device simulator of claim 17, wherein the device simulator is further configured to launch the AR application using a virtual environment.

Background

The Augmented Reality (AR) experience is highly dependent on the context in which the AR is used. Therefore, developing AR applications requires access to test environments representative of the environments the application will face. This may require the developer to physically move around in the test environment, which not only slows the development process but also limits the range of testing.

Disclosure of Invention

Example methods, devices, and computer-readable storage media for testing Augmented Reality (AR) applications are described. In an example embodiment, an example method includes launching an AR application targeted for testing. The example method also includes controlling a physical model to simulate movement of a simulated device in a simulated real-world space. Simulated movement of the physical model generates data for testing the AR application. In some embodiments, the method may further include launching the AR application to run on a device simulator, and using a virtual environment.

Drawings

Fig. 1 illustrates a block diagram of an example of a computing device having a device simulator for developing/testing AR applications, according to an implementation described herein.

Fig. 2 illustrates a block diagram of an example device simulator for developing/testing an AR application, according to embodiments described herein.

Fig. 3 is a flow diagram of a method of testing an AR application using a device simulator according to an embodiment described herein.

Fig. 4 illustrates an example of a computing device and a mobile computer device that may be used to implement the techniques described herein.

Detailed Description

Augmented Reality (AR) systems do not support testing AR applications. Testing of an AR application may be defined as the process of evaluating the functionality of the AR application to ascertain whether it meets specified requirements and/or to ensure it is free of defects (e.g., evaluating the functionality, robustness, quality, etc. of the AR application). The proposed solution to this technical problem is to test the AR application using a device simulator running on the AR system. In an example embodiment, a device simulator may be configured to run on a computing device configured with an AR system. A physics model running on the device simulator may be created, and movement of the physics model controlled by the user. The physics model, when driven by the user, generates data similar to the data generated by real movement of a real device. The generated data is forwarded to the tracking framework of the AR system and may be displayed to the user so that the user can verify the performance of the AR application. Technical advantages of testing AR applications with the simulated device described above include the flexibility to test AR applications in a variety of virtual environments and with different device configurations while sitting comfortably at a desk. This makes effective use of time and resources, saves cost, and improves the quality of AR applications.

Fig. 1 illustrates a block diagram of a computing device 100 including a device simulator 150, according to at least one example embodiment. In some example embodiments, the device simulator 150 may be used to test an AR application, such as the AR application 190 described herein.

Computing device 100 includes a processor 112, a memory 114, an AR framework 120, an Application Programming Interface (API) 130, and/or a device simulator 150. Device simulator 150 receives input, e.g., from a user, via user input 162, and computing device 100 outputs graphics to the user, e.g., via display 140.

The AR framework 120 includes an AR tracking system 122 to support the AR experience by integrating virtual content with the real world as seen through the device's camera using API 130. For example, in some embodiments, the AR tracking system 122 may support the AR experience through motion tracking, environmental understanding, light estimation, and the like. Motion tracking may allow a device to understand and track its position relative to the world. Environmental understanding may allow the device to detect the size and position of all types of surfaces, such as horizontal, vertical, and angled surfaces. Light estimation may allow a device to estimate the lighting conditions of the environment.

The AR tracking system 122 tracks the location of a device (e.g., a mobile device) as the device moves in the real world (space) and builds its own understanding of the real world. The AR tracking system 122 uses sensors in the device (e.g., cameras, accelerometers, gyroscopes, Inertial Measurement Units (IMUs), Global Positioning Systems (GPS), etc.) to identify points of interest (e.g., keypoints, features, etc.) and track how these points move over time. Based on the combination of the movement of these points and the readings from the device sensors, the AR tracking system 122 may determine the location, orientation, etc. of the device as it traverses the real world.

In addition to identifying points of interest, the AR tracking system 122 may detect flat surfaces (e.g., tables, floors, etc.) and may also estimate the average illumination in the surrounding area (or environment). These capabilities of the AR tracking system 122 may be combined to enable the AR framework 120 to build its own understanding of the surrounding real world. Furthermore, this understanding of the real world enables users to place objects, annotations, or other information so that they integrate seamlessly with the real world. For example, a user may place an object, such as a napping kitten, at a corner of a coffee table, or annotate a painting with biographical information about the artist. The motion tracking functionality of the AR tracking system 122 allows the user to move around and view these objects from any angle. For example, if the user leaves the room and returns later, the objects (e.g., the napping kitten at the corner of the coffee table, or the annotations on the painting) will still be where the user placed them.

AR framework 120 may include an abstraction layer 124 (e.g., a hardware abstraction layer). Abstraction layer 124 represents an interface between an operating system (OS, not shown) of computing device 100 and device simulator 150. That is, abstraction layer 124 provides an interface that makes the OS of computing device 100 agnostic to low-level driver implementations. In an example embodiment, the abstraction layer 124 may support functionality to be implemented in the device simulator 150 without affecting or modifying higher-level systems (e.g., the AR framework 120 and/or the AR tracking system 122). That is, the abstraction layer 124 allows the API 130 provided by the computing device 100 (or the OS of the computing device 100) to assume that the OS is interacting with a real device (rather than a simulated device such as device simulator 150). The APIs 130 provided by the OS of the computing device 100 may include graphics APIs, sensor APIs, camera APIs, and the like.
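As a rough illustration of this arrangement, the sketch below shows one way such an abstraction layer could be expressed; the interface and method names are assumptions for illustration only and do not correspond to the actual HAL of any operating system.

    from abc import ABC, abstractmethod
    from typing import Tuple


    class SensorHal(ABC):
        """Interface the OS programs against. Real sensor drivers and the
        device simulator both implement it, so higher layers (the AR
        framework, the APIs) cannot tell simulated data from real data."""

        @abstractmethod
        def read_accelerometer(self) -> Tuple[float, float, float]:
            """Acceleration in m/s^2, in the device frame."""

        @abstractmethod
        def read_gyroscope(self) -> Tuple[float, float, float]:
            """Angular velocity in rad/s, in the device frame."""


    class SimulatedSensorHal(SensorHal):
        """Backed by the physics model instead of real hardware."""

        def __init__(self, physics_model):
            self.physics_model = physics_model

        def read_accelerometer(self):
            return self.physics_model.acceleration()      # hypothetical method

        def read_gyroscope(self):
            return self.physics_model.angular_velocity()  # hypothetical method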

The device simulator 150 simulates a mobile device (e.g., a mobile phone, a tablet computer, etc.) on the computing device 100 (e.g., a desktop computer). In an example embodiment, the device simulator 150 may be an application running on the computing device 100. Device simulator 150 may provide most of the functionality of a real mobile device and/or may have predefined/preconfigured configurations for different devices or device types. In some embodiments, for example, the user may also configure the device simulator 150 to simulate the location, network speed, rotation, sensors (e.g., cameras, accelerometers, gyroscopes, IMUs, GPS, etc.), etc. of the device.

In some embodiments, each instance of device simulator 150 may configure size, form factor, OS version, and other desired hardware features using a virtual device configuration, and may act as a standalone device with its own private storage. For example, device simulator 150 may store user data, SD card data, and/or cache associated with a virtual device in a directory dedicated to that virtual device. When a user starts the device simulator 150, the device simulator 150 loads user data, SD card data, and/or cache from the associated virtual device directory. In some embodiments, device simulator 150 may also include, for example, user controls 160, simulated sensors 170, and/or a physics model 180. Some or all of the components of the device simulator 150 (e.g., user controls 160, simulated sensors 170, and physics model 180) may be stored together or in combination in the memory 114 and/or implemented by machine-readable instructions executed by the processor 112.
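A minimal sketch of how a virtual device configuration and its private storage directory might fit together follows; the field names and directory layout are assumptions, not an actual configuration format.

    from dataclasses import dataclass
    from pathlib import Path


    @dataclass
    class VirtualDeviceConfig:
        name: str              # e.g., "phone_5in_api28" (illustrative)
        form_factor: str       # "phone", "tablet", ...
        os_version: str
        memory_mb: int

        def storage_dir(self, root: Path) -> Path:
            """Each virtual device keeps its user data, SD card data, and
            cache in its own directory, so each instance acts as a
            standalone device with private storage."""
            return root / self.name


    config = VirtualDeviceConfig("phone_5in_api28", "phone", "9.0", 2048)
    print(config.storage_dir(Path.home() / ".device_simulator"))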

In some embodiments, the physics model 180 may be used to simulate the movement of a real device. That is, the physics model may be an application running on the device simulator 150 that simulates inertial motion of an object through the environment. For example, a user may control the movement (e.g., behavior) of the physics model 180 in space via the user input 162 and the user controls 160. By sending instructions to user controls 160 via user input 162, the user may control the behavior of the physics model 180. In an example embodiment, the user controls 160 may be controlled by user inputs 162, which may include, for example, WASD keys, mouse controls, arrow keys, a joystick, a touchpad, a game controller, and the like. When the user engages user control 160, for example using the W key on a keyboard to move the physics model 180 forward, the physics model 180 moves forward in a manner that simulates the movement of a real device in space. For example, movement of the physics model 180 may represent a real device moving inertially through space (e.g., with a particular acceleration and velocity, without jumps, etc.). In other words, the physics model 180 may simulate the movement of a real device moving in space in such a way that the position of the physics model 180 is continuous up to the second derivative (e.g., acceleration) and the rotation of the physics model 180 is continuous up to the first derivative (e.g., angular velocity).
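A minimal one-dimensional sketch of such a physics model follows; the filter constants are made-up values chosen only to show how chasing a commanded velocity through a jerk-limited filter keeps position continuous up to the second derivative.

    class PhysicsModel1D:
        """Toy physics model: the commanded velocity (e.g., from a held 'W'
        key) is approached through a jerk-limited filter, so acceleration,
        velocity, and position never jump."""

        def __init__(self, stiffness=40.0, damping=12.0):
            self.position = 0.0
            self.velocity = 0.0
            self.acceleration = 0.0
            self.stiffness = stiffness  # pulls velocity toward the command
            self.damping = damping      # bleeds off acceleration

        def step(self, target_velocity, dt):
            # Only the jerk (rate of change of acceleration) may jump when
            # the user presses or releases a key; everything it integrates
            # into stays continuous, as a real device's motion would.
            jerk = (self.stiffness * (target_velocity - self.velocity)
                    - self.damping * self.acceleration)
            self.acceleration += jerk * dt
            self.velocity += self.acceleration * dt
            self.position += self.velocity * dt


    model = PhysicsModel1D()
    for _ in range(200):                    # user holds "W" for one second
        model.step(target_velocity=1.0, dt=0.005)
    print(round(model.velocity, 3))         # ramps toward 1 m/s, no jumps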

Because the AR tracking system 122 assumes that the device being tracked (e.g., device simulator 150) is a physical device moving through the real world, the movement of the physics model 180 is modeled to simulate real physical movement. The simulation of this physical movement may be managed and exposed to other components through the physics model 180. In some embodiments, the physics model 180 may smoothly interpolate the physical parameters describing the current state of the system and expose aspects of the control system to other components so that these components can drive the movement of the physics model 180 (e.g., instruct the physics model 180 to move smoothly to a particular position and rotation) such that the position, velocity, acceleration, rotation, angular velocity, etc. of the physics model 180 remain continuous.

For example, if the user instructs the physics model 180, through the user controls 160, to move at a speed of 1 meter/second, a real device cannot instantly be moving at 1 m/s (starting from 0 m/s); it takes some time to reach 1 m/s. In some embodiments, the physics model 180 may be considered a master controller, in that the physics model 180 calculates its own position, velocity, acceleration, rotational velocity, and the like. The physics model 180 computes this information so that it is physically realistic (e.g., smooth/continuous, with no jumps). In some embodiments, the physics model 180 may be viewed as an intermediary between the user controls 160 and the other components of the device simulator 150 (e.g., the simulated sensors 170). For example, upon receiving instructions (e.g., to simulate movement) via user input 162, the physics model 180 may generate its own data and/or interact with the simulated sensors 170 to correct the generated data (e.g., a feedback mechanism) to ensure that the data generated by the physics model 180 represents the true movement of a real device.

Simulated sensors 170 may include any type of simulated sensor, such as a virtual camera, accelerometer, IMU, and the like. In some implementations, for example, the feed from the virtual camera may back the camera API, which is built in such a way that the view matrix (e.g., position, orientation, etc.) of the camera feed is set based on the real-time pose reported by the physics model 180. This may ensure that a highly realistic scene with sufficient complexity is presented to provide features to the AR tracking system 122. Additionally, in some implementations, the simulated sensors 170 report simulated IMU data at a high frequency, based on the real-time state of the physics model 180, through the API 130 (e.g., an IMU API). User controls 160 may drive the physics model 180 in real time so that the user can easily and comfortably move the camera around the world.
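One plausible way to set the camera feed's view matrix from the physics model's pose (a sketch, assuming the pose is given as a 3x3 world rotation R and a world position t) is to invert that rigid transform:

    def view_matrix(R, t):
        """View matrix = inverse of the device pose [R | t]:
        [R | t]^-1 = [R^T | -R^T t]."""
        Rt = [[R[j][i] for j in range(3)] for i in range(3)]   # transpose
        tt = [-(Rt[i][0] * t[0] + Rt[i][1] * t[1] + Rt[i][2] * t[2])
              for i in range(3)]
        return [Rt[0] + [tt[0]],
                Rt[1] + [tt[1]],
                Rt[2] + [tt[2]],
                [0.0, 0.0, 0.0, 1.0]]


    identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    print(view_matrix(identity, [0.0, 1.6, 2.0]))  # camera 1.6 m up, 2 m back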

In some implementations, to calculate simulated IMU readings (e.g., accelerometer and gyroscope measurements), the acceleration and angular velocity in the device's frame of reference must be available at any instant. In one approach, target positions and rotations set by the user controls are interpolated over a fixed period of time using polynomial interpolation of sufficiently high order (e.g., in 3D space for position; in 4D space for rotation) that acceleration and angular velocity remain continuous. In this way, all derivatives are always available for the IMU calculation.
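As an illustration of one such interpolation (a sketch; the document does not fix the polynomial, so a rest-to-rest quintic per axis is assumed here), position, velocity, and acceleration can all be evaluated analytically at any instant, and an accelerometer reading follows by subtracting gravity:

    def quintic(p0, p1, T, t):
        """Position, velocity, and acceleration at time t along a
        rest-to-rest quintic from p0 to p1 over duration T."""
        s = min(max(t / T, 0.0), 1.0)
        d = p1 - p0
        pos = p0 + d * (10 * s**3 - 15 * s**4 + 6 * s**5)
        vel = d * (30 * s**2 - 60 * s**3 + 30 * s**4) / T
        acc = d * (60 * s - 180 * s**2 + 120 * s**3) / T**2
        return pos, vel, acc


    GRAVITY = -9.81  # m/s^2 along the world up axis

    # Simulated accelerometer sample halfway through a 2 m upward move
    # taking 1 s; accelerometers measure acceleration minus gravity.
    pos, vel, acc = quintic(0.0, 2.0, 1.0, 0.5)
    print(round(pos, 3), round(vel, 3), round(acc - GRAVITY, 3))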

In addition, for tracking to work properly, a sensor (e.g., the virtual camera) cannot simply rotate about a single point. Instead, the virtual camera may be placed at an offset from the center of rotation. For example, the position of the virtual camera may have a small offset from the position of the virtual device, so that the user can control and rotate it using the user interface of the device simulator 150. That is, for the tracking system 122 to work properly, the physics model 180 and the simulated sensors 170 may be computed according to the device requirements of the AR framework 120. In an example embodiment, this may require a certain offset between the virtual camera and the simulated sensor positions. In another example, the physics model 180 may simulate the layout (e.g., position, rotation, etc.) of the different sensors, such as the IMU, relative to the device.
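A small sketch of the offset (the offset values below are invented for illustration): the camera's world position is the device position plus the device rotation applied to a fixed device-frame offset, so even a pure rotation of the device translates the camera slightly.

    import math


    def camera_world_position(device_pos, device_yaw, offset=(0.0, 0.02, -0.05)):
        """Rotate a fixed device-frame offset by the device yaw (rotation
        about the vertical axis) and add it to the device position."""
        c, s = math.cos(device_yaw), math.sin(device_yaw)
        ox, oy, oz = offset
        return (device_pos[0] + c * ox + s * oz,
                device_pos[1] + oy,
                device_pos[2] - s * ox + c * oz)


    # Rotating the device in place by 90 degrees moves the camera a little.
    print(camera_world_position((0.0, 1.5, 0.0), math.pi / 2))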

In some embodiments, the user may engage user controls 160 to drive the physics model 180 forward to test the AR application using device simulator 150. As the physics model 180 moves forward, it may generate data related to, for example, its position, velocity, acceleration, rotation, etc. The data generated by the physics model 180 may be shared with the simulated sensors 170. The simulated sensors 170 report the generated data to the AR tracking system 122 through the abstraction layer 124. In other words, the user may control the physics model 180 (via user input 162) using the user controls 160. Upon receiving input from user controls 160, the physics model 180 may generate data (e.g., position, velocity, acceleration, rotation, etc.). Data generated by the physics model 180 is shared with the simulated sensors 170 and forwarded through the abstraction layer 124 to the AR tracking system 122. This allows the AR tracking system 122 to assume that the data (e.g., position, velocity, acceleration, rotation, etc.) is received from a real device, and allows the user to test AR applications conveniently and efficiently with the device simulator 150 while sitting comfortably at a desk.
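Putting the pieces together, the per-frame data flow just described might look like the following sketch; every class here is an illustrative stub, not an actual API.

    class UserControls:
        def poll(self):
            return {"forward": 1.0}          # pretend the "W" key is held


    class PhysicsModel:
        def __init__(self):
            self.position = [0.0, 0.0, 0.0]
            self.velocity = [0.0, 0.0, 0.0]

        def step(self, command, dt):
            # A real implementation would smooth this (see the earlier
            # jerk-limited sketch); kept trivial here.
            self.velocity[2] = command.get("forward", 0.0)
            self.position[2] += self.velocity[2] * dt


    class SimulatedSensors:
        def sample_imu(self, model):
            return {"velocity": list(model.velocity),
                    "position": list(model.position)}


    class AbstractionLayer:
        def report_imu(self, sample):
            print("-> AR tracking system:", sample)


    controls, model = UserControls(), PhysicsModel()
    sensors, hal = SimulatedSensors(), AbstractionLayer()
    for _ in range(3):                       # three 60 Hz frames
        model.step(controls.poll(), dt=1 / 60)
        hal.report_imu(sensors.sample_imu(model))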

In some embodiments, for example, device simulator 150 may be used to introduce errors/imperfections into the data generated by the physics model 180 to simulate/reproduce error or fault scenarios that may be encountered on real devices running AR applications, such as sensor noise and errors, camera limitations (e.g., lens distortion, motion blur, contrast reduction such as in low-light conditions, etc.), and miscalibration and misalignment between camera and sensor data. This helps simulate AR tracking system failure situations, such as loss of tracking (e.g., when walking along a dark corridor), drift between the tracked and physical locations (objects appear to slide), and the like.
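A sketch of such fault injection follows; the noise, bias, and drop-rate parameters are illustrative values, not ones from the original disclosure.

    import random


    def corrupt_imu(sample, noise_std=0.05, bias=0.02):
        """Add Gaussian noise and a constant bias to each axis, mimicking
        a noisy, poorly calibrated accelerometer."""
        return [v + random.gauss(0.0, noise_std) + bias for v in sample]


    def maybe_drop_frame(frame, drop_probability=0.1):
        """Occasionally lose a camera frame, as under motion blur or
        low-light conditions where tracking may be lost."""
        return None if random.random() < drop_probability else frame


    print(corrupt_imu([0.0, 9.81, 0.0]))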

The example processor 112 of FIG. 1 may take the form of a microcontroller, a Central Processing Unit (CPU), an ASIC, a Digital Signal Processor (DSP), an FPGA, a Graphics Processing Unit (GPU), etc., programmed or configured to execute machine-readable instructions stored in memory 114. The instructions, when executed, may cause the processor 112 and/or the aforementioned components to control the device simulator 150 for simulating real devices, and the physics model 180 for simulating real movements of a mobile device in the real world. In some examples, more than one processor 112 or more than one memory 114 may be included in the computing device 100.

Fig. 2 illustrates a block diagram of a device simulator 150 according to at least one example embodiment. As shown in fig. 2, the simulated sensors 170 may include various types of sensors, such as a camera 172, an accelerometer 174, an IMU 176, and so on. As described above, the simulated sensors support the physics model 180 so that the data generated by the physics model 180 can be corrected. In some example embodiments, the physics model 180 may use data from the simulated sensors 170 to correct for drift.

Fig. 3 is a flow chart 300 of a method of testing an AR application using a device simulator according to an example embodiment described herein.

At block 310, a user launches an AR application targeted for testing on a computing device. For example, a user may launch (e.g., initiate, start, etc.) an AR application 190 on computing device 100. The user may select the AR application 190 to run on the device simulator 150 so that the user may test the AR application 190 using the device simulator 150. As described above, this allows the user to test the AR application efficiently.

While launching the AR application 190, the user may select a virtual scene (e.g., a virtual environment, a simulated real-world scene, etc.) to be loaded for testing the AR application. This supports testing AR applications in various virtual indoor/outdoor environments, depending on the testing requirements. In some embodiments, the virtual scenes may be available on the computing device 100 for use by the device simulator 150. Virtual scenes provide the ability to test the AR application 190 in various virtual environments to improve the quality of the AR application and/or the user experience.

In some embodiments, the user may select a configuration of the device simulator 150 based on, for example, available virtual device configurations. The virtual device configuration allows device simulator 150 to configure the size, form factor, OS version, memory size, and other desired hardware features of device simulator 150. This provides the user with the flexibility to test AR applications using various device configurations.

At block 320, the user may control the physics model to simulate movement of the simulated device in the simulated real-world space. For example, a user may control the physics model 180 to simulate movement of the device simulator 150 in a simulated real-world space. In response to the user controls moving the physics model 180, the physics model 180 generates data that represents (or is similar to) data generated by real movement of a real device in space. Data generated by the physics model 180 is shared with the AR tracking system 122 through the abstraction layer 124 for testing the AR application.

For example, a user may use the user controls 160 to control the physics model 180, which generates data representing real movement of a real device, for example, in a virtual environment. Suppose the user launches the AR application 190 on the device simulator 150 and loads a virtual scene of a living room. The user may use the user controls 160 to simulate movement of the physics model 180 and generate data that is forwarded to the AR tracking system 122, which processes the data and displays results to the user through the API 130. The user views the output of AR application 190 through the camera API on a display (e.g., display 140) to determine whether AR application 190 is working or executing as intended.
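If the device simulator exposed a scripting interface, a developer-facing test of this flow might read like the pytest-style sketch below; every name here is hypothetical and only illustrates the launch / load scene / drive / verify sequence of Fig. 3.

    def test_anchor_stays_put(simulator, ar_app):
        simulator.load_virtual_scene("living_room")
        session = simulator.launch(ar_app)

        session.place_anchor(x=0.0, y=0.0, z=-1.0)   # put an object down
        simulator.user_controls.move(forward=2.0)    # walk past it...
        simulator.user_controls.turn(degrees=180)    # ...and look back

        pose = session.tracked_anchor_pose()
        assert abs(pose.z - (-1.0)) < 0.05           # the object did not drift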

By configuring the device simulator 150 as desired, the capabilities described above allow the user to sit comfortably at a desk and test the AR application in any virtual environment (depending on the availability of virtual environments) and with any device configuration. This functionality not only provides the ability to test AR applications, but also to test them efficiently.

Fig. 4 illustrates an example of a computing device 400 and a mobile computing device 450 that may be used with the techniques described herein.

For example, computing device 400 may be a device on which the device simulator 150 is configured to run, and/or mobile computing device 450 may be a mobile device on which an application runs. Computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, tablets, workstations, personal digital assistants, televisions, servers, blade servers, mainframes, and other appropriate computing devices. Computing device 450 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit embodiments of the inventions described and/or claimed in this document.

Computing device 400 includes a processor 402, memory 404, storage 406, a high-speed interface 408 connecting to memory 404 and high-speed expansion ports 410, and a low-speed interface 412 connecting to low-speed bus 414 and storage 406. The processor 402 may be a semiconductor-based processor. The memory 404 may be a semiconductor-based memory. Each of the components 402, 404, 406, 408, 410, and 412 are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 402 may process instructions for execution within the computing device 400, including instructions stored in the memory 404 or on the storage device 406 to display graphical information for a GUI on an external input/output device, such as display 416 coupled to high speed interface 408. In other embodiments, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).

The memory 404 stores information within the computing device 400. In one embodiment, the memory 404 is a volatile memory unit or units. In another implementation, the memory 404 is a non-volatile memory unit or units. The memory 404 may also be another form of computer-readable medium, such as a magnetic or optical disk.

Storage 406 is capable of providing mass storage for computing device 400. In one embodiment, the storage device 406 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. The computer program product may be tangibly embodied in an information carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer-or machine-readable medium, such as the memory 404, the storage device 406, or memory on processor 402.

The high speed controller 408 manages bandwidth-intensive operations for the computing device 400, while the low speed controller 412 manages lower bandwidth-intensive operations. Such allocation of functions is merely exemplary. In one embodiment, the high-speed controller 408 is coupled to memory 404, a display 416 (e.g., through a graphics processor or accelerator), and high-speed expansion ports 410, which may accept various expansion cards (not shown). In an embodiment, low-speed controller 412 is coupled to storage device 406 and low-speed expansion port 414. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.

As shown, the computing device 400 may be implemented in a number of different forms. For example, the computing device may be implemented as a standard server 420, or multiple times in a group of such servers. The computing device may also be implemented as part of a rack server system 424. Additionally, the computing device may be implemented in a personal computer such as a laptop computer 422. Alternatively, components from computing device 400 may be combined with other components in a mobile device (not shown), such as device 450. Each of such devices may contain one or more of computing devices 400, 450, and an entire system may be made up of multiple computing devices 400, 450 communicating with each other.

The computing device 450 includes a processor 452, memory 464, an input/output device such as a display 454, a communication interface 466, and a transceiver 468, among other components. The device 450 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. Each of the components 450, 452, 464, 454, 466, and 468 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.

The processor 452 may execute instructions within the computing device 450, including instructions stored in the memory 464. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, coordination of the other components of the device 450, such as control of user interfaces, applications run by device 450, and wireless communication by device 450.

The processor 452 may communicate with a user through a control interface 458 and a display interface 456 coupled to a display 454. For example, the display 454 may be a TFT LCD (thin film transistor liquid crystal display), or OLED (organic light emitting diode) display, or other suitable display technology. The display interface 456 may comprise suitable circuitry for driving the display 454 to present graphical and other information to a user. The control interface 458 may receive commands from a user and convert the commands for submission to the processor 452. In addition, an external interface 462 may be provided in communication with processor 452, so as to enable near field communication of device 450 with other devices. For example, external interface 462 may be provided for wired communication in some embodiments, or for wireless communication in other embodiments, and multiple interfaces may also be used.

Memory 464 stores information within computing device 450. The memory 464 may be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 474 may also be provided and connected to device 450 through expansion interface 472, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 474 may provide additional storage space for device 450, or may also store applications or other information for device 450. Specifically, expansion memory 474 may include instructions to carry out or supplement the processes described above, and may also include secure information. Thus, for example, expansion memory 474 may be provided as a security module for device 450 and may be programmed with instructions that allow device 450 to be used securely. In addition, secure applications may be provided via the SIMM card, along with additional information, such as placing identification information on the SIMM card in a non-intrusive manner.

As described below, the memory may include, for example, flash memory and/or NVRAM memory. In one embodiment, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer-or machine-readable medium, such as the memory 464, expansion memory 474, or memory on processor 452, that may receive information via transceiver 468 or external interface 462.

The device 450 may communicate wirelessly through a communication interface 466, which may include digital signal processing circuitry if necessary. Communication interface 466 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 468. Additionally, short-range communications may occur, for example, using Bluetooth, Wi-Fi, or other such transceivers (not shown). In addition, GPS (global positioning system) receiver module 470 may provide additional wireless data related to navigation and location to device 450, which may be used as appropriate by applications running on device 450.

Device 450 may also communicate audibly using audio codec 460, which may receive voice information from a user and convert the voice information into usable digital information. Audio codec 460 may likewise generate audible sound for a user, e.g., through a speaker in a handset of device 450. Such sound may include sound from voice telephone calls, recorded sound (e.g., voice messages, music files, etc.), and sound generated by applications running on device 450.

As shown, the computing device 450 may be implemented in a number of different forms. For example, the computing device may be embodied as a cellular telephone 440. The computing device may also be implemented as part of a smartphone 432, personal digital assistant, or other similar mobile device.

Various embodiments of the systems and techniques described here can be implemented in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include embodiments in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be used for a dedicated or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.

These computer programs (also known as programs, software applications, software modules, software components, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium," "computer-readable medium" refers to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.

To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with the user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.

The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), and the internet.

The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

A number of embodiments have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention.

In addition, the logic flows depicted in the figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.

Implementations of the various techniques described herein may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Embodiments may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device (computer-readable medium), for processing by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. Thus, a computer-readable storage medium may be configured to store instructions that, when executed, cause a processor (e.g., a processor at a host device, a processor at a client device) to perform a process.

A computer program, such as the one described above, can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be processed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.

Method steps may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method steps also may be performed by, and an apparatus may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).

Processors suitable for the processing of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.

To provide for interaction with a user, embodiments may be implemented on a computer having a display device, such as a Cathode Ray Tube (CRT), Light Emitting Diode (LED), or Liquid Crystal Display (LCD) monitor, to display information to the user and a keyboard and a pointing device, such as a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with the user; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input generated from the user can be received in any form, including acoustic, speech, or tactile input.
