Device-based mobile processing of media content

Document No.: 1786420 Publication date: 2019-12-06

Abstract: This technology, "Device-based mobile processing of media content," was created by A. Bamberg, M. Hanover, N. Litke, and M. B. Murray on 2018-02-26. Systems and methods provide for detecting movement of a computing device during playback of media content on a display of the computing device, receiving movement data from one or more sensors of the computing device, analyzing the movement data to determine a direction of the movement, calculating a rotation of a display of the media content based on the direction of the movement, and rotating the display of the media content relative to the direction of the movement to display a portion of the media content associated with the direction of the movement.

1. A method, comprising:

detecting, by a computing device, movement of the computing device during playback of media content on a display of the computing device;

receiving, by the computing device, movement data from one or more sensors of the computing device;

analyzing, by the computing device, the movement data to determine a direction of movement;

calculating a rotation of the display of the media content based on the direction of movement; and

rotating the display of the media content relative to the direction of movement to display a portion of the media content associated with the direction of movement.

2. The method of claim 1, wherein the one or more sensors comprise at least one of an accelerometer sensor, a gyroscope sensor, and a gravity sensor.

3. The method of claim 1, wherein the movement data is received from an accelerometer sensor, and wherein the movement data comprises an orientation of the computing device comprising a rotation of an axis of the computing device relative to a downward force of gravity.

4. The method of claim 3, wherein the orientation of the computing device further comprises a rotation of the computing device about its z-axis, the z-axis extending away from the display of the computing device.

5. The method of claim 1, wherein the movement data is received from an accelerometer sensor and a gyroscope sensor, and the method further comprises:

determining that the computing device remains in a planar orientation; and

calculating the rotation of the display of the media content based on the movement data from the gyroscope sensor.

6. The method of claim 1, wherein the movement data is received from an accelerometer sensor and a gyroscope sensor, and the method further comprises:

determining that the computing device remains in an approximately planar orientation; and

calculating the rotation of the display of the media content based on the movement data from both the accelerometer sensor and the gyroscope sensor.

7. The method of claim 1, wherein the movement data is received from a gyroscope sensor, and further comprising:

determining that the computing device remains in an approximately planar orientation; and

calculating the rotation of the display of the media content based on the movement data from the gyroscope sensor.

8. The method of claim 1, wherein the media content is initially captured using a circular wide-angle lens.

9. The method of claim 1, wherein displaying the portion of the media content associated with the direction of movement comprises: displaying, on the display of the computing device, a portion of the media content that was previously not visible.

10. The method of claim 1, further comprising:

detecting parallax motion from the movement data from the one or more sensors;

analyzing the movement data to determine a direction of movement; and

causing the display of the media content to slide in a direction opposite to the direction of movement.

11. The method of claim 1, further comprising:

receiving pinch gesture data from the computing device operating system;

analyzing the pinch gesture data to determine a pinch ratio and a pinch speed;

calculating a display size of the media content based on the pinch ratio and the pinch speed; and

causing the media content to be displayed based on the display size.

12. A computing device, comprising:

a processor; and

a computer-readable medium coupled with the processor, the computer-readable medium comprising instructions stored thereon that are executable by the processor to cause the computing device to perform operations comprising:

detecting movement of the computing device during playback of media content on a display of the computing device;

receiving movement data from one or more sensors of the computing device;

analyzing the movement data to determine a direction of movement;

calculating a rotation of the display of the media content based on the direction of movement; and

rotating the display of the media content relative to the direction of movement to display a portion of the media content associated with the direction of movement.

13. The computing device of claim 12, wherein the movement data is received from an accelerometer sensor, and wherein the movement data comprises an orientation of the computing device comprising a rotation of an axis of the computing device relative to a downward force of gravity, and wherein the orientation of the computing device further comprises a rotation of the computing device about its z-axis, the z-axis extending away from the display of the computing device.

14. The computing device of claim 12, wherein the movement data is received from an accelerometer sensor and a gyroscope sensor, and the operations further comprise:

determining that the computing device remains in a planar orientation; and

calculating the rotation of the display of the media content based on the movement data from the gyroscope sensor.

15. The computing device of claim 12, wherein the movement data is received from an accelerometer sensor and a gyroscope sensor, and the operations further comprise:

determining that the computing device remains in an approximately planar orientation; and

calculating the rotation of the display of the media content based on the movement data from both the accelerometer sensor and the gyroscope sensor.

16. The computing device of claim 12, wherein the movement data is received from a gyroscope sensor, and the operations further comprise:

determining that the computing device remains in an approximately planar orientation; and

calculating the rotation of the display of the media content based on the movement data from the gyroscope sensor.

17. The computing device of claim 12, wherein the media content is initially captured using a circular wide-angle lens.

18. The computing device of claim 12, wherein displaying the portion of the media content associated with the direction of movement comprises: displaying, on the display of the computing device, a portion of the media content that was previously not visible.

19. The computing device of claim 12, the operations further comprising:

receiving pinch gesture data from the computing device operating system;

analyzing the pinch gesture data to determine a pinch ratio and a pinch speed;

calculating a display size of the media content based on the pinch ratio and the pinch speed; and

causing the media content to be displayed based on the display size.

20. A non-transitory computer-readable medium comprising instructions stored thereon that are executable by at least one processor to cause a computing device to perform operations comprising:

detecting movement of the computing device during playback of media content on a display of the computing device;

receiving movement data from one or more sensors of the computing device;

analyzing the movement data to determine a direction of movement;

calculating a rotation of the display of the media content based on the direction of movement; and

rotating the display of the media content relative to the direction of movement to display a portion of the media content associated with the direction of movement.

Background

Sharing media content such as audio, images, and video between user devices (e.g., mobile devices, personal computers, etc.) may require converting the media content into a format consumable by a receiving device and displaying the media content on a mobile device such as a smart phone.

Drawings

The various drawings in the figures illustrate only example embodiments of the disclosure and are not to be considered limiting of its scope.

Fig. 1 is a block diagram illustrating a networked system according to some example embodiments.

Fig. 2 is a block diagram illustrating a networked system including details of a camera device, according to some example embodiments.

Fig. 3 illustrates example smart glasses, according to some example embodiments.

Fig. 4 illustrates an example display of a circular video format, according to some example embodiments.

Fig. 5 is a flow diagram illustrating aspects of a method according to some example embodiments.

Fig. 6 illustrates an example parallax region diagram, according to some example embodiments.

Fig. 7 is a flow diagram illustrating aspects of a method according to some example embodiments.

Fig. 8 illustrates an example display on a computing device, according to some example embodiments.

Fig. 9 is a flow diagram illustrating aspects of a method according to some example embodiments.

Fig. 10 illustrates an example of a red-line display, according to some example embodiments.

Fig. 11 illustrates an example display on a computing device, according to some example embodiments.

Fig. 12 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.

FIG. 13 depicts a diagrammatic representation of a machine in the form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to an illustrative embodiment.

Detailed Description

Systems and methods described herein relate to processing media content items to be shared between devices via a messaging system and to be displayed on those devices. For example, a user may record a video using a camera device. The camera device may be capable of capturing video in a circular video format. In one example, the camera device may be a pair of smart glasses that can capture a full 115-degree field of view. The 115-degree field of view is similar to the field of view of the human eye and gives the camera device the ability to capture circular video. A user may wish to share a circular video with one or more other users and/or view the video on a display of a computing device, such as a smartphone. Example embodiments allow a user to view a video in a circular video format on a display of a computing device, rotate the display of the video to take advantage of the circular wide-angle view of the video, and view the video in both a full-screen mode and a circular view.

Fig. 1 is a block diagram illustrating a networked system 100, the networked system 100 configured to process media content items and to send and receive messages including the processed media content, according to some example embodiments. In one example embodiment, the system is a messaging system configured to receive a plurality of messages from a plurality of users, process media content contained in the messages, and send the messages with the processed media content to one or more users. System 100 may include one or more client devices, such as client device 110. Client device 110 may also be referred to herein as a user device or a computing device. Client devices 110 may include, but are not limited to, mobile phones, desktop computers, laptop computers, personal digital assistants (PDAs), smart phones, tablet computers, ultrabooks, netbooks, notebook computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, gaming consoles, set-top boxes, vehicle-mounted computers, or any other communication device that a user may use to access networked system 100.

In some embodiments, client device 110 may include a display module (not shown) to display information (e.g., in the form of a user interface). In some embodiments, the display module or user interface is used to display media content such as videos (e.g., videos in traditional and circular video formats), images (e.g., photographs), and the like. In further embodiments, client device 110 may include one or more of a touch screen, an accelerometer, a gyroscope, a camera, a microphone, a Global Positioning System (GPS) device, and the like. Client device 110 may be a user's device that is used to create media content items such as videos, images (e.g., photos), and audio, and to send and receive messages containing these media content items to and from other users.

One or more users 106 may be humans, machines, or other devices that interact with the client device 110. In an example embodiment, the user 106 may not be part of the system 100, but may interact with the system 100 via the client device 110 or other device. For example, the user 106 may provide input (e.g., touch screen input or alphanumeric input) to the client device 110, and the input may be communicated to other entities in the system 100 (e.g., the third party server 130, the server system 102, etc.) via the network 104. In this example, in response to receiving input from user 106, other entities in system 100 may transmit information to client device 110 via network 104 for presentation to user 106. In this manner, the user 106 may interact with various entities in the system 100 using the client device 110.

The system 100 may further include a network 104. One or more portions of network 104 may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, another type of network, or a combination of two or more such networks.

Client devices 110 may access various data and applications provided by other entities in system 100 via a web client 112 (e.g., a browser, such as the Internet Explorer® browser developed by Microsoft® Corporation of Redmond, Washington) or one or more client applications 114. The client device 110 may include one or more applications 114 (also referred to as "apps"), such as, but not limited to, a web browser, a messaging application, an electronic mail (email) application, an e-commerce site application, a mapping or location application, a media content editing application, a media content viewing application, and so forth.

In one example, the client application 114 may be a messaging application that allows the user 106 to take a photograph or video (or receive media content from the camera device 108), add a caption or otherwise edit the photograph or video, and then send the photograph or video to another user. The client application 114 may further allow the user 106 to view photos or videos that the user 106 has taken via the client device 110 or the camera device 108, or to view photos and videos that another user 106 has taken via a client device 110 or camera device 108 (e.g., in a traditional video format or a circular video format). The message may be ephemeral and removed from the receiving user device after viewing or after a predetermined amount of time (e.g., 10 seconds, 24 hours, etc.). The messaging application may further allow the user 106 to create a gallery. A gallery may be a collection of media content (such as photos and videos) that may be viewed by other users who "follow" the user's gallery (e.g., subscribe to view and receive updates in the user's gallery). The gallery may also be ephemeral (e.g., lasting 24 hours, lasting for the duration of an event (e.g., during a concert, sporting event, etc.), or lasting another predetermined time).

The ephemeral message may be associated with a message duration parameter whose value determines the amount of time the ephemeral message will be displayed by the client application 114 to a receiving user of the ephemeral message. The ephemeral message may be further associated with a message recipient identifier and a message timer. The message timer may be responsible for determining the amount of time that the ephemeral message is displayed to a particular receiving user identified by the message recipient identifier. For example, an ephemeral message may only be shown to the relevant receiving user for a time period determined by the value of the message duration parameter.

In another example, the messaging application may allow the user 106 to store photos and videos and create a gallery that is not ephemeral and may be sent to other users. For example, photos and videos from a recent vacation may be combined into a gallery for sharing with friends and family.

In some embodiments, one or more applications 114 may be included in a given one of the client devices 110 and configured to provide a user interface and at least some functionality locally, where the applications 114 are configured to communicate with other entities in the system 100 (e.g., the server system 102) as needed for data and/or processing capabilities that are not available locally (e.g., access location information, verify users 106, verify payment methods, access media content stored on the server, synchronize media content between the client device 110 and the server computer, etc.). In contrast, one or more applications 114 may not be included in client device 110, and client device 110 may then use its web browser to access one or more applications hosted on other entities in system 100 (e.g., server system 102).

Media content, such as images and videos, may be captured via the client device (e.g., via a camera of the client device) and/or via a separate camera device 108. The camera device 108 may be a standalone camera or a wearable device, such as a watch with electronic functionality, a key fob, an eyewear device, and the like. In one example, the camera device 108 is an electronics-enabled eyewear device, such as so-called smart glasses (e.g., SNAP SPECTACLES). Example electronics-enabled eyewear is shown in fig. 3.

Fig. 3 illustrates a pair of smart glasses 300 according to an example embodiment. The smart glasses 300 have one or more integrated cameras (e.g., at opposite ends of the eyewear frame, shown as 302 and 304 in one example) with their respective lenses facing forward and having transparent covers.

In one example, the smart glasses 300 or other camera devices 108 may capture video in a circular video format. For example, the camera device 108 may include a circular wide-angle lens that captures a full 115-degree field of view (e.g., the camera sensor of the camera device 108 captures the full 115-degree field of view). The 115-degree field of view is similar to the field of view of the human eye and gives the camera device 108 the ability to capture circular video.

In one example for capturing video in a circular video format, the camera device 108 may include a sensor (e.g., a square or rectangular sensor) for capturing images and video. The camera device 108 may further include a housing located in front of the sensor to block the portions of the sensor that are outside a circular area (e.g., so light will only strike the circular area). The camera lens may be disposed behind the housing. Thus, the camera device 108 may only capture circular video in the circular area. Optionally, the camera device 108 may further crop each circular-format video (e.g., set to zero, on a frame-by-frame basis, the values of pixels outside a predetermined circle size) to account for any noise around the edges of the circular area. In this manner, the camera device 108 may optimize the circular format during video capture. For example, since the video format already contains a circular video, the circular video may be derived without any further modification. Furthermore, some of the benefits of video compression may be realized by cropping circular content in firmware.
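
By way of illustration only, a frame-by-frame crop of this kind might look like the following sketch, which assumes a simple interleaved RGBA frame buffer; the function name and buffer layout are hypothetical and are not taken from any particular camera firmware.

#include <stdint.h>
#include <math.h>

// Hypothetical illustration: zero out every pixel that falls outside a
// centered circle of the given radius, one frame at a time. Zeroed borders
// also tend to compress well, which relates to the compression benefit noted above.
static void cropFrameToCircle(uint8_t *rgba, int width, int height, float radius)
{
    float cx = width / 2.0f;
    float cy = height / 2.0f;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            float dx = x - cx;
            float dy = y - cy;
            if (sqrtf(dx * dx + dy * dy) > radius) {
                uint8_t *pixel = rgba + 4 * (y * width + x);
                pixel[0] = pixel[1] = pixel[2] = pixel[3] = 0; // outside the circle
            }
        }
    }
}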

Returning to fig. 1, server system 102 may provide server-side functionality to one or more client devices 110 via a network 104 (e.g., the Internet or a Wide Area Network (WAN)). The server system 102 can include an Application Programming Interface (API) server 120, a messaging application server 122, and a media content processing server 124, each of which can be communicatively coupled to each other and to one or more data storage devices 126.

According to some example embodiments, the server system 102 may be a cloud computing environment. In an example embodiment, the server system 102 and any servers associated with the server system 102 may be associated with a cloud-based application. The one or more data storage devices 126 may be storage devices that store information such as unprocessed media content, raw media content (e.g., high quality media content) from the user 106, processed media content (e.g., media content formatted for sharing with the client device 110 and viewing on the client device 110), user information, user device information, and so forth. The one or more data storage devices 126 may include cloud-based storage devices external to the server system 102 (e.g., hosted by one or more third party entities external to the server system 102). The data storage 126 may comprise a database, blob storage, or the like.

The media content processing server 124 may provide functionality to perform various processing of media content items. The media content processing server 124 may access one or more data stores 126 to retrieve stored data for processing media content and to store results of the processed media content.

Messaging application server 122 may be responsible for generating and communicating messages between users 106 of client devices 110. Messaging application server 122 may utilize any of a number of messaging networks and platforms to deliver messages to user 106. For example, messaging application server 122 may deliver messages using electronic mail (e-mail), Instant Messaging (IM), Short Message Service (SMS), text, fax, or voice (e.g., voice over IP (VoIP)) messages via wired networks (e.g., the Internet), Plain Old Telephone Service (POTS), or wireless networks (e.g., mobile, cellular, WiFi, Long Term Evolution (LTE), Bluetooth).

As described above, the user 106 may wish to share various media content items (e.g., video, audio content, images, etc.) with one or more other users. For example, user 106 may take various videos and photographs while he is on vacation using client device 110 or other devices (e.g., camera device 108). User 106 may want to share the best videos and photos of his vacation with his friends and family. The user 106 may utilize a client application 114 (such as a messaging application) on the client device 110 to select the media content items that he wants to share. The user 106 may also edit various media content items using the client application 114. For example, the user 106 may add text to the media content item, select an overlay of the media content item (a label, drawing, other artwork, etc.), may draw on the media content item, crop or change (e.g., redeye reduction, focus, color adjustment, etc.) the media content item, and so forth. An "unprocessed" media content item refers to a media content item that has not been edited using the client application 114.

The user 106 may select media content items that he would like to share with his friends and family via the client application 114. After he has selected a media content item, he may indicate that he wants to share the media content item. For example, he may select an option (e.g., a menu item, a button, etc.) on the user interface of the client application 114 to indicate that he wishes to share the media content item.

The user 106 may view the media content via the client application 114. For example, the user 106 may view media content that he has captured on the client device 110 (e.g., via a camera of the client device 110), the user 106 may view media content captured by others and sent to the user 106, and the user 106 may view media content captured by the camera device 108.

Fig. 2 is a block diagram illustrating a networked system 200 including details of camera device 108, according to some example embodiments. In some embodiments, the camera device 108 may be implemented in the smart glasses 300 of fig. 3 as described above.

As described above with respect to fig. 1, the system 200 includes the camera device 108, the client device 110, and the server system 102. The client device 110 may be a smartphone, tablet computer, tablet, laptop, access point, or any other such device capable of connecting with the camera device 108 using both the low-power wireless connection 225 and the high-speed wireless connection 237. Client device 110 is connected to server system 102 and network 104. As described above, the network 104 may include any combination of wired and wireless connections. Also as described above, the server system 102 may be one or more computing devices as part of a service or network computing system. The client device 110 and any elements of the server system 102 and the network 104 may be implemented using the details of the software architecture 1202 or the machine 1300 described in fig. 12 and 13.

The system 200 may optionally include additional peripheral elements 219 and/or a display 211 integrated with the camera device 210. Such peripheral elements 219 may include biometric sensors, additional sensors, or display elements integrated with the camera device 210. Examples of peripheral elements 219 are further discussed with respect to fig. 12 and 13. For example, the peripheral elements 219 may include any I/O components 1350, including output components 1352, motion components 1358, or any other such elements described herein.

Camera device 108 includes camera 214, video processor 212, interface 216, low power circuitry 220, and high speed circuitry 230. The camera 214 includes a digital camera element, such as a charge coupled device, a lens, or any other light collection element that can be used to collect data, as part of the camera 214.

The interface 216 refers to any source of user commands provided to the camera device 210. In one implementation, interface 216 is a physical button on the camera that, when pressed, sends a user input signal from interface 216 to low power processor 222. Pressing and immediately releasing the camera button may be processed by the low power processor 222 as a request to capture a single image. Pressing the camera button for a first period of time may be handled by the low power processor 222 as a request to capture video data while the button is pressed and to stop video capture when the button is released, wherein the video captured while the button is pressed is stored as a single video file. In some embodiments, the low power processor 222 may have a threshold time period, such as 500 milliseconds or one second, between pressing the button and releasing the button, below which the button press and release is processed as an image request, and above which the button press and release is interpreted as a video request. The low power processor 222 may make this determination at startup of the video processor 212. In other embodiments, interface 216 may be any mechanical switch or physical interface capable of accepting user input associated with a request for data from camera 214. In other embodiments, the interface 216 may have a software component or may be associated with commands received wirelessly from another source.
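
By way of illustration only, the press-duration logic described above might be sketched as follows; the 500-millisecond threshold is the example value from the text, while the type and function names are hypothetical.

#include <stdint.h>

typedef enum { REQUEST_NONE, REQUEST_IMAGE, REQUEST_VIDEO } CaptureRequest;

// Hypothetical sketch of the button handling: a press shorter than the
// threshold is treated as a single-image request, while a longer press is
// treated as a video request that ends when the button is released.
static const uint64_t kThresholdMs = 500;

static CaptureRequest classifyButtonPress(uint64_t pressTimeMs, uint64_t releaseTimeMs)
{
    uint64_t heldMs = releaseTimeMs - pressTimeMs;
    return (heldMs < kThresholdMs) ? REQUEST_IMAGE : REQUEST_VIDEO;
}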

The video processor 212 includes circuitry for receiving signals from the camera 214 and processing those signals from the camera 214 into a format suitable for storage in the memory 234. The video processor 212 is configured within the camera device 210 so that it can be powered on and started under the control of the low power circuit 220. In addition, the video processor 212 may be powered down by the low power circuitry 220. Depending on various power design elements associated with the video processor 212, the video processor 212 may consume a small amount of power even when in an off state. However, this power consumption is negligible and has a negligible effect on battery life compared to the power used when the video processor 212 is in the on state. The device elements that are in the "off" state are still configured within the device so that the low power processor 222 can power the device on and off. Devices referred to as "off" or "powered down" during operation of camera device 108 do not necessarily consume zero power due to leakage or other aspects of system design.

In one example embodiment, the video processor 212 includes a microprocessor Integrated Circuit (IC) that is customized for processing sensor data from the camera 214, and volatile memory used by the microprocessor to operate. To reduce the amount of time the video processor 212 spends powering up to process data, a non-volatile read-only memory (ROM) may be integrated on the IC with instructions for operating or booting the video processor 212. The ROM may be minimized to match the minimum size required to provide the basic functionality of collecting sensor data from the camera 214 so that there are no additional functions that would cause a delay in boot time. The ROM may be configured with Direct Memory Access (DMA) to volatile memory of a microprocessor of the video processor 212. DMA allows memory-to-memory transfer of data from the ROM to the system memory of the video processor 212 regardless of the operation of the main controller of the video processor 212. Providing DMA to the boot ROM further reduces the amount of time from when the video processor 212 is powered on until the sensor data from the camera 214 can be processed and stored. In some embodiments, minimal processing of the camera signals from camera 214 is performed by video processor 212, and additional processing may be performed by applications operating on client device 110 or server system 102.

The low power circuitry 220 includes a low power processor 222 and low power radio circuitry 224. These elements of low power circuit 220 may be implemented as separate elements or may be implemented on a single IC as part of a single system on a chip. Low power processor 222 includes logic for managing the other elements of camera device 108. As described above, for example, the low power processor 222 may accept user input signals from the interface 216. The low power processor 222 may also be configured to receive input signals or instructional communications from the client device 110 via the low power wireless connection 225. Additional details regarding such instructions are further described below. The low power wireless circuitry 224 includes circuit elements for implementing a low power wireless communication system. Bluetooth Smart, also known as bluetooth low energy, is a standard implementation of a low power wireless communication system that may be used to implement the low power wireless circuitry 224. In other embodiments, other low power communication systems may be used.

The high-speed circuitry 230 includes a high-speed processor 232, memory 234, and high-speed radio circuitry 236. The high speed processor 232 may be any processor capable of managing the high speed communications and operations of any general purpose computing system required by the camera device 210. The high-speed processor 232 includes processing resources necessary to manage high-speed data transfers over the high-speed wireless connection 237 using the high-speed wireless circuitry 236. In some embodiments, the high speed processor 232 executes an operating system such as the LINUX operating system or other such operating system, such as the operating system 1204 of FIG. 12. The high-speed processor 232, which executes the software architecture of the camera device 108, is used to manage, among other things, the transmission of data using the high-speed wireless circuitry 236. In some embodiments, the high-speed wireless circuitry 236 is configured to implement an Institute of Electrical and Electronics Engineers (IEEE)802.11 communication standard, also referred to herein as Wi-Fi. In other embodiments, other high-speed communication standards may be implemented by the high-speed wireless circuitry 236.

The memory 234 includes any storage device capable of storing camera data generated by the camera 214 and the video processor 212. Although the memory 234 is shown as being integrated with the high speed circuitry 230, in other embodiments the memory 234 may be a stand alone component. In some such embodiments, a circuit routing line may provide a connection from the video processor 212 or the low power processor 222 to the memory 234 through a chip that includes the high speed processor 232. In other embodiments, the high speed processor 232 may manage the addressing of the memory 234 so that the low power processor 222 will boot the high speed processor 232 at any time that a read or write operation involving the memory 234 is required.

As described above, media content (e.g., video content) may be captured in a circular video format.

Fig. 4 shows an example display 400 of a circular video format on a client device 110. As can be seen in this example, the circular video format 402 of the media content (e.g., video) is larger than the display 404 (e.g., screen) of the client device 110. Thus, when in full-screen mode as shown in display 404, only a portion of the entire video is displayed. Example embodiments allow a user to rotate the client device 110 to rotate the media content in the display 404 of the client device 110 to take advantage of the circular video format 402 and view or reveal more of the circular video format.

Fig. 5 is a flow diagram illustrating aspects of a method 500 for playback behavior and interaction when displaying media content (e.g., video content) in a circular video format on a client device 110, according to some example embodiments. For illustrative purposes, the method 500 is described with respect to the networked system 100 of fig. 1. It is understood that in other embodiments, the method 500 may be practiced with other system configurations.

In operation 502, a computing device (e.g., client device 110) detects movement of the computing device during playback of media content (or a stream of media content) on a display of the computing device. For example, the user 106 may watch a video captured in a circular video format. The user 106 may tilt or rotate the device to view other areas of the circular video format. The computing device (e.g., via a rotation player) may detect movement of the computing device via one or more sensors of the computing device. For example, the computing device may include an accelerometer sensor, a gyroscope sensor, and/or other motion sensors. The accelerometer sensor measures the acceleration (e.g., rate of change of velocity) of the computing device. The accelerometer sensor can also be used to determine the orientation of the computing device along its three axes. For example, movement data from an accelerometer may be used to indicate whether the computing device is in portrait mode or landscape mode, and so on. The gyroscope sensor may also provide orientation information and measure any changes in orientation.

In operation 504, the computing device receives movement data from one or more sensors of the computing device. In one example, the movement data is received from an accelerometer sensor, and the movement data includes an orientation of the computing device including a rotation of an axis of the computing device relative to a downward force of gravity. For example, the orientation of the computing device may include a rotation of the computing device about its z-axis, which extends away from the display of the computing device. Movement data may also or alternatively be received from a gyroscope sensor, a gravity sensor, or another motion sensor or tool.

In operation 506, the computing device analyzes the movement data to determine a direction of the movement. For example, a user may tilt the computing device to the right, and the rotation player of the computing device may detect how far the user has tilted and turned the computing device to the right. The rotation player will then rotate the displayed media content (e.g., video) in the opposite direction so that more of the media content appears to be revealed as the computing device is tilted or rotated. In one example, the computing device may analyze movement data such as data from a gyroscope indicating that the computing device has rotated 9 degrees about the z-axis, and data from an accelerometer indicating the speed at which the user turned the phone. The computing device analyzes these outputs from the sensors to determine how much the computing device has actually rotated.

In operation 508, the computing device calculates a rotation of the display of the media content based on the direction of movement. For example, the computing device may determine that the phone has rotated 20 degrees about the z-axis, which means that the user has rotated the computing device 20 degrees, and thus the media content should be rotated 20 degrees in the opposite direction.
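
By way of illustration only, operations 502-510 might be sketched on iOS using Apple's Core Motion framework as follows, where the gyroscope's z-axis rotation rate is integrated and the content view is counter-rotated; the class and property names are hypothetical, and a full implementation would also incorporate accelerometer data as described in the examples below.

#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>

// Hypothetical "rotation player" sketch: integrate the z-axis rotation rate
// and rotate the content view by the opposite amount so that the media
// content appears to stay fixed while the device turns.
@interface RotationPlayer : NSObject
@property (nonatomic, strong) CMMotionManager *motionManager;
@property (nonatomic, strong) UIView *contentView;   // view displaying the circular video
@property (nonatomic, assign) CGFloat contentAngle;  // accumulated rotation, in radians
@end

@implementation RotationPlayer

- (void)startTrackingWithContentView:(UIView *)contentView {
    self.contentView = contentView;
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    __weak typeof(self) weakSelf = self;
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (motion == nil) { return; }
        // rotationRate.z is the rate of rotation around the z-axis in radians per second.
        CGFloat delta = motion.rotationRate.z * weakSelf.motionManager.deviceMotionUpdateInterval;
        // Rotate the content in the direction opposite to the device movement.
        weakSelf.contentAngle -= delta;
        weakSelf.contentView.transform = CGAffineTransformMakeRotation(weakSelf.contentAngle);
    }];
}

@end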

In one example, movement data is received from an accelerometer sensor and a gyroscope sensor. The computing device may determine that the computing device is being held by the user in a planar (flat) orientation, and thus the gravity-based orientation is ambiguous. In this example, the computing device calculates the rotation of the display of the media content based on the movement data from the gyroscope sensor.

In another example, the computing device determines that the computing device is being held by the user in an approximately planar orientation, and thus uses a combination of measurements from the accelerometer and the gyroscope to calculate the rotation (e.g., using movement data from both the accelerometer sensor and the gyroscope sensor).

In another example, the computing device determines that the computing device is being held by the user in an approximately planar orientation, but only has movement data from a gyroscope sensor (e.g., the computing device may not have an accelerometer sensor). In this example, the rotation of the display of the media content is calculated based on the movement data from the gyroscope sensor.

In operation 510, the computing device rotates the display of the media content relative to the direction of movement to display a portion of the media content associated with the direction of movement. In one example, displaying a portion of the media content includes displaying a portion of the media content that was previously not visible on a display of the computing device. In this manner, the user 106 may turn the computing device to utilize the circular video format 402 and view or reveal more areas of the circular video format.

In one example, the media content is rotated relative to the ground within the display on the computing device. In other words, the anchor point is not set based on the user's starting angular position. For example, if the computing device is held upright, the bottom of the media content will be at the bottom of the computing device. If the user turns the computing device upside down, the media content will flip so that its bottom is again at the "bottom" of the device (which is the top of the device when it is held upside down).
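
By way of illustration only, the gravity-anchored behavior described above might be sketched as follows, again assuming Core Motion; the helper name is hypothetical.

#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>
#include <math.h>

// Hypothetical sketch: anchor the content rotation to gravity rather than to
// the starting angle of the device. motion.gravity is expressed in the
// device's coordinate frame, so the angle of the gravity vector in the x-y
// plane indicates how far the device has rolled away from upright.
static CGFloat contentAngleFromGravity(CMDeviceMotion *motion)
{
    // When the device is upright, gravity is roughly (0, -1, 0); atan2 then
    // returns 0 and the content is not rotated. Turning the device upside
    // down yields an angle near pi, which flips the content.
    return (CGFloat)atan2(motion.gravity.x, -motion.gravity.y);
}

// Usage inside a device-motion handler:
//   contentView.transform = CGAffineTransformMakeRotation(contentAngleFromGravity(motion));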

In another example, if the computing device is near horizontal, the media content rotation does not track gravity 1:1 but instead tracks the gyroscope data of the computing device. This may be done to prevent small changes in the orientation of the computing device from mapping to large rotational changes.

In another example, any media overlay (e.g., text, stickers, special effects, geographic filters, etc.) or other creative tool may rotate as the media content rotates. For example, if the user places a sticker in the lower right of the video while holding the computing device upright, the user will no longer be able to see the sticker after turning the computing device 90 degrees, since the sticker will be off the screen/display. In this way, the media overlay remains in the actual position in which it was placed even as the media content rotates.

In another example, if a sharp change in rotation is detected (e.g., when a user quickly turns the computing device), the computing device should update the rotation and animate the transition so that it appears smooth (e.g., smoothly following the rotation of the computing device). For large rotational changes, the action may be delayed for a short time while the rotation of the device is eased. In one example, a sharp rotational change may be considered any rotation greater than ten or twenty degrees.

Some actions by the user may pause the rotation function. For example, adding creative content (e.g., media overlays) may automatically pause the rotation. After the creative content is added or placed on the media content, rotation may again be automatically enabled.

In one example, if the computing device does not have a way to measure its rotation, the rotation function is disabled and the display of the media content is shown in full screen mode.

Some example embodiments may include a parallax effect. For example, if the user tilts the computing device slightly, a small portion of the video in the direction of the tilt is revealed. In this manner, as the user rotates the computing device back and forth, the media content moves (e.g., slides) in an up-and-down and side-to-side motion. This is intended to make it feel more as if the user is looking around. In one example, the user will never see the edges of the video no matter how far the user tilts the computing device. When a user begins viewing media content captured in a circular video format, the very center of the media content is displayed, unless the user previously viewed another media content item in a circular video format, in which case the position of the video on the screen/display is inherited from the previous video.

Fig. 6 shows an example parallax region diagram 600. The diagram 600 in fig. 6 is not drawn to scale. The diagram 600 shows a display area on a computing device screen 602 that is slightly smaller than a parallax region 604 (e.g., a full-screen area). In one example, 86% of the media content (e.g., video) is visible without parallax. The media content may be in a circular video format 606.

The parallax region 604 is the region within which the screen of the computing device can float around. For example, the parallax region 604 is the region in which the screen can move relative to the video without exposing any edges of the video. Thus, as the user tilts the computing device, the display area on the screen 602 may move (e.g., slide) toward a different edge of the parallax region 604. In this way, it appears that the user is looking at more of the video. This can make the movement feel more natural and more three-dimensional. For example, if the user turns the computing device to the right, he will see some video to the left that he would not normally see. Without the parallax effect, the video would occupy the full-screen area (e.g., parallax region 604).

In one example, the size of the media content circle is determined by a cross-section of the screen of the computing device (e.g., the media content size relative to the computing device screen size). In another example, the diameter of the video circle is 7% greater (e.g., 3.5% greater in radius) than the cross-sectional distance of the screen of the computing device, shown as the parallax radius increase 608. This value is intended to give a subtle effect.

In one example, the rotation of the device for the parallax effect is reported in units of radians per second. For example, the media content in the computing device screen 602 may move on the screen at a rate of 2 points per radian (a 2x multiplier).

In another example, the translation of the media content follows a quadratic curve when moving away from the center and follows a linear curve when moving toward the center. This gives a feeling of smoother movement. For example, movement away from the center starts slowly and speeds up, while movement toward the center proceeds at the same speed throughout.
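
By way of illustration only, the two translation curves might be expressed as simple easing functions, where t is the normalized progress of the translation (0 at the start, 1 at the end); the function names are hypothetical.

// Hypothetical sketch of the two easing behaviors described above.
// Moving away from the center: quadratic (starts slowly, speeds up).
static float easeOutwardProgress(float t)
{
    return t * t;
}

// Moving back toward the center: linear (constant speed throughout).
static float easeInwardProgress(float t)
{
    return t;
}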

In another example, the rotation in the parallax effect is not based on a set of absolute values or on gravity; instead, the screen translation reacts to any change in rotation. For example, if the user tilts the device 90 degrees to the right and then slightly back to the left, the screen will not be at the right edge of the parallax region 604.

In another example, if an abrupt change in translation due to parallax is necessary, such as when the user quickly tilts the computing device, the computing device should update the translation as explained above with respect to an abrupt change in rotation. In one example, an abrupt change in translation is any change greater than 1/3 of the translation range in at least one axis. If this occurs simultaneously with a sharp change in motion, a rolling animation may be used.

Fig. 7 is a flow diagram illustrating aspects of a method 700 for detecting parallax motion and sliding a display accordingly, according to some example embodiments. For illustrative purposes, the method 700 is described with respect to the networked system 100 of fig. 1. It is understood that in other embodiments, method 700 may be implemented with other system configurations.

In operation 702, a computing device (e.g., client device 110) receives movement data from one or more sensors. In one example, parallax motion is detected by monitoring a gyroscope sensor, which measures the rate of rotation of the computing device about its axes. For example, rotation about the y-axis controls horizontal (side-to-side) parallax motion, and rotation about the x-axis controls vertical (up-down) parallax motion. In one example, the parallax motion decelerates as it eases out toward the edge of the circular video format and moves at a constant rate as it re-centers.

In operation 704, the computing device detects parallax motion from the movement data, and in operation 706, the computing device analyzes the movement data to determine a direction of movement of the computing device. In operation 708, the computing device causes the display of the media content to slide in a direction opposite to the direction of movement.
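
By way of illustration only, operations 702-708 might be sketched as follows, assuming Core Motion and the example figures given above (a video circle roughly 7% wider than the screen cross-section and a translation rate of about 2 points per radian); the function names, the clamping scheme, and the interpretation of the cross-section as the screen diagonal are illustrative assumptions rather than a definitive implementation.

#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>
#include <math.h>

// Hypothetical parallax sketch: rotation around the y-axis slides the content
// horizontally and rotation around the x-axis slides it vertically, in the
// direction opposite to the device movement, clamped to the parallax region.
static const CGFloat kPointsPerRadian = 2.0;

static CGPoint parallaxOffset(CGPoint currentOffset, CMDeviceMotion *motion,
                              NSTimeInterval dt, CGSize screenSize)
{
    // The video circle is about 7% wider than the screen cross-section (here
    // taken to be the screen diagonal), so the screen can slide roughly half
    // of that extra diameter in each direction.
    CGFloat crossSection = hypot(screenSize.width, screenSize.height);
    CGFloat maxOffset = (crossSection * 0.07) / 2.0;

    CGFloat dx = currentOffset.x - motion.rotationRate.y * dt * kPointsPerRadian;
    CGFloat dy = currentOffset.y - motion.rotationRate.x * dt * kPointsPerRadian;

    dx = MAX(-maxOffset, MIN(maxOffset, dx));
    dy = MAX(-maxOffset, MIN(maxOffset, dy));
    return CGPointMake(dx, dy);
}

// Usage inside a device-motion handler:
//   offset = parallaxOffset(offset, motion, dt, self.view.bounds.size);
//   contentView.transform = CGAffineTransformMakeTranslation(offset.x, offset.y);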

Some example embodiments allow a user to use a gesture as an input on a computing device, such as a "pinch" on a display of the computing device (e.g., client device 110) while viewing media content in a circular video format, to switch between viewing the media content in a full-screen mode and viewing the media content as a circle. FIG. 8 illustrates example displays 802, 804, and 806 of media content based on the user pinching in or out on the display. For example, as shown in example display 802, a user may view media content (e.g., a video) in a full-screen mode on a display of a computing device. The display of the computing device may be a touch screen display, and the user may use his fingers to pinch on the display. When the user pinches in and releases his fingers while in the full-screen mode of 802, the display of the computing device will change to a circular display mode, as shown in example display 804. When the user pinches out and releases his fingers while in the circular display mode shown in 804, the display of the computing device will change to full-screen mode, as shown in example display 806. In one example, the user may pinch and hold, and the media content will continue to be displayed at the size the user holds. For example, the user may pinch and hold the video display at a smaller circular size, and the video will continue to be displayed at that size until the user releases the pinch.

In one example, media content in circular video format may default to full screen mode display. When the user pinches inward on any full-screen mode display of media content in circular video format, the media content shrinks into a small circle, referred to herein as a pinch mode. When the user pinches out on any pinch-mode media content, the media content expands to full-screen mode. The media content may remain in the pinch mode until the media content is displayed in a format other than circular video or the user ends a full screen viewing session. In one example, when the media content is in the pinch mode, an entire view of the circular video format is displayed. In another example, when the media content is in full screen mode, only a center portion of the circular video format is displayed.

In one example, as the user pinches down toward the pinch mode size, the parallax effect is linearly reduced. For example, if the circle has been pinched 80% of the way toward the default pinch mode size, the parallax effect will have 20% of its normal effect on the video translation. In one example, the parallax effect may be limited to between 100% and 0%, so that the user cannot get a stronger effect by pinching out beyond the default full-screen mode size.
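
By way of illustration only, the linear fade of the parallax effect might be computed as follows, assuming the current circle scale is expressed relative to the full-screen size (1.0); the names are hypothetical, and the clamping mirrors the 100%/0% limits mentioned above.

#import <UIKit/UIKit.h>

// Hypothetical sketch: as the circle is pinched from full-screen scale (1.0)
// down to the pinch mode scale, the strength of the parallax effect drops
// linearly from 100% to 0%, and is clamped so over-pinching cannot exceed it.
static CGFloat parallaxStrength(CGFloat currentScale, CGFloat pinchModeScale)
{
    CGFloat strength = (currentScale - pinchModeScale) / (1.0 - pinchModeScale);
    return MAX(0.0, MIN(1.0, strength));
}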

FIG. 9 is a flow diagram illustrating aspects of a method 900 for manipulating a view of media content via a pinch gesture, according to some example embodiments. For illustrative purposes, the method 900 is described with respect to the networked system 100 of fig. 1. It is understood that in other embodiments, method 900 may be practiced with other system configurations.

In operation 902, a computing device (e.g., client device 110) receives pinch gesture data. For example, there may be a tool included in an operating system of the computing device that can recognize the pinch gesture and send data regarding the pinch gesture. The computing device (e.g., via the client application 114) may request pinch gesture data and receive the pinch gesture data when the pinch gesture data is available.

In operation 904, the computing device analyzes the pinch gesture data to determine a pinch ratio and a pinch speed. For example, the pinch gesture data may include a pinch ratio (e.g., the distance the user's fingers move from a start position to a hold or release position) and a pinch speed (e.g., how fast the user spreads, contracts, or releases his fingers). In operation 906, the computing device calculates a media content (e.g., video) display size based on the pinch ratio and the pinch speed. For example, if the user starts from full-screen mode and begins to pinch in on the screen, the display of the media content will transition from full-screen mode toward the circular format at the size and rate of the user's pinch. And when the user is in the pinch mode and begins to pinch out on the display, the display of the media content in the smaller circle will grow toward the larger full-screen display at the size and rate of the user's pinch. After the user releases his fingers, the media content display size transitions to either full-screen mode or pinch mode, depending on where the user releases his fingers and how fast he releases them.

For example, if the user was in full-screen mode before beginning to pinch, and the user releases his fingers within 60% of the full-screen mode scale size with a pinch speed greater than a predetermined speed (e.g., > -1/s), the display is animated (e.g., transitioned) to full-screen mode. If the user releases his fingers at a pinch speed less than the predetermined speed (e.g., < -1/s), the display is animated to pinch mode. If the user releases his fingers within 40% of the pinch mode scale size, the display is animated to pinch mode.

In another example, if the user was in pinch mode before beginning to pinch, and the user releases his fingers within 40% of the pinch mode scale size with a pinch speed less than a predetermined speed (e.g., < 1/s), the display is animated to pinch mode. If the user releases his fingers at a pinch speed greater than the predetermined speed (e.g., > 1/s), the display is animated to full-screen mode. If the user releases his fingers within 60% of the full-screen mode scale size, the display is animated to full-screen mode.
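
By way of illustration only, the pinch handling described in operations 902-906 might be wired to a standard UIPinchGestureRecognizer, which reports the pinch ratio (scale) and pinch speed (velocity); the sketch below assumes the gesture starts from full-screen mode and uses one simplified reading of the release rules above, and the class and property names are hypothetical.

#import <UIKit/UIKit.h>

// Hypothetical sketch: UIPinchGestureRecognizer already supplies the pinch
// ratio (scale, relative to the start of the gesture) and pinch speed
// (velocity, in scale units per second), which map onto operations 902-906.
@interface CircularPlayerViewController : UIViewController
@property (nonatomic, strong) UIView *contentView;     // view showing the circular video
@property (nonatomic, assign) CGFloat pinchModeScale;  // scale of the default pinch (circle) mode
@end

@implementation CircularPlayerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPinchGestureRecognizer *pinch =
        [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
    [self.view addGestureRecognizer:pinch];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    CGFloat scale = recognizer.scale;       // pinch ratio
    CGFloat velocity = recognizer.velocity; // pinch speed

    if (recognizer.state == UIGestureRecognizerStateChanged) {
        // Track the user's fingers while the pinch is held.
        self.contentView.transform = CGAffineTransformMakeScale(scale, scale);
    } else if (recognizer.state == UIGestureRecognizerStateEnded) {
        // One possible reading of the release rules: a fast outward pinch (or a
        // release near full-screen size) lands in full-screen mode, otherwise
        // the display lands in pinch mode.
        BOOL toFullScreen = (velocity > 1.0) || (scale >= 0.6 && velocity > -1.0);
        CGFloat targetScale = toFullScreen ? 1.0 : self.pinchModeScale;
        // Animate to the target mode (see the spring animation sketch below).
        self.contentView.transform = CGAffineTransformMakeScale(targetScale, targetScale);
    }
}

@end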

In one example, after the user releases his fingers, the display is animated between modes using a spring animation. This may include a damping ratio of 0.75 and an initial spring velocity taken from the velocity of the pinch gesture (which may be converted to points on iOS). In one example, the duration may be 0.3 seconds when abs(pinch velocity) > 1. In another example, the duration may be 0.4 seconds when abs(pinch velocity) < 1 (e.g., slightly longer to allow the animation to build up speed).
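
By way of illustration only, the spring transition might use UIKit's spring animation API with the damping, velocity, and durations given above; the function and variable names are hypothetical.

#import <UIKit/UIKit.h>
#include <math.h>

// Hypothetical sketch of the spring transition between pinch and full-screen
// modes. pinchVelocity is the velocity reported by the pinch gesture, reused
// as the initial spring velocity, and the duration is slightly longer for
// slow releases.
static void animateToScale(UIView *contentView, CGFloat targetScale, CGFloat pinchVelocity)
{
    NSTimeInterval duration = (fabs(pinchVelocity) > 1.0) ? 0.3 : 0.4;
    [UIView animateWithDuration:duration
                          delay:0
         usingSpringWithDamping:0.75
          initialSpringVelocity:fabs(pinchVelocity)
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         contentView.transform = CGAffineTransformMakeScale(targetScale, targetScale);
                     }
                     completion:nil];
}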

In operation 908, the computing device causes the media content to be displayed based on the media content display size.

In one example, if the user pinches beyond the minimum and maximum circle sizes, the user may see a rubber-band effect. The computing device may use a logarithmic curve for over-pinching. In one example, a logarithm base of 15 may be used for over-pinching in pinch mode, and a logarithm base of 40 may be used for over-pinching in full-screen mode.

For example:

// Pinch mode
...
CGFloat naturalOverScroll = fabs([self pinchingModeScale] - pinchScale);
CGFloat adjustedOverscroll = logx(1 + naturalOverScroll, 15);
CGFloat adjustedScale = [self pinchingModeScale] - adjustedOverscroll;
...

// Full-screen mode
...
CGFloat naturalOverScroll = pinchScale - 1.f;
CGFloat adjustedOverscroll = logx(1 + naturalOverScroll, 40);
CGFloat adjustedScale = adjustedOverscroll + 1.f;
...

// Sample logarithm function
#include <math.h>

float logx(float value, float base)
{
    return log10f(value) / log10f(base);
}


Fig. 10 shows an example of a red-line display for determining how to display media content in a circular video format on a display of a computing device.

As explained above, the media content being displayed may be one of a plurality of media content items (e.g., as part of a gallery). The media content may include video, images (e.g., photos or other images), and so forth. Videos may be in a conventional video format or in a circular video format. The gallery may include a plurality of media content items of different types and formats. The display of the media content item in the gallery may automatically transition to the next media content item in the gallery, or the user may jump to the next media content item via a gesture, control, or other means of indicating that he wishes to view the next media content item in the gallery.

In one example, if the user views a media content item captured in a circular video format and the media content item ends, or the user taps or otherwise indicates to proceed to the next media content item in the gallery, the next media content item will be displayed to the user in the same mode as the previous media content item if the next media content item was also captured in a circular video format. For example, if the user is pinching the previous media content item, the next media content item will continue to be displayed in exactly the same size, rotation, parallax, and so forth as the previous media content item that the user was continuously pinching. In other words, the pinch gesture is not interrupted.

In another example, the media content item remains in the pinch mode until the user ends the full-screen viewing session, until a media content item in a non-circular video format appears, or until the user selects another story or gallery. FIG. 11 illustrates an example flow of media content items in a gallery. For example, a user may view a first media content item 1102 of a plurality of media content items in a gallery. The first media content item 1102 may be in a conventional video format (e.g., not a circular video format). The user may tap to move to the second media content item 1104. The second media content item 1104 may be a video in a circular video format that is displayed in full-screen mode. The user may pinch to view the second media content item in the pinch mode 1106. The user may tap to move to the third media content item 1108, which may also be a video in a circular video format. The third media content item 1108 will also be displayed in the pinch mode. The user may tap to move to the fourth media content item 1110. The fourth media content item 1110 may be in a conventional video format and is therefore displayed in a normal full-screen mode. The user may tap to move to the fifth media content item 1112, which may be a video in a circular video format. The fifth media content item 1112 is displayed in full-screen mode.

In another example, the media content item maintains its viewing mode for the entire viewing session. For example, if the user taps to view a previous media content item, the previous media content item appears in whatever mode (e.g., pinch mode or full screen mode) the user was viewing it in during the most recent viewing. In another example, if a user views media content items in a first gallery in the pinch mode, then goes to a second gallery, and then returns to the first gallery, the media content items in the first gallery will still be displayed in the pinch mode.
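One way to remember the viewing mode across galleries and viewing sessions is a small store keyed by gallery identifier; the sketch below uses hypothetical names (GalleryModeStore, saveMode:forGallery:, modeForGallery:) and simply illustrates the bookkeeping:

#import <Foundation/Foundation.h>

// Hypothetical per-gallery record of the last viewing mode, so returning to a
// gallery restores pinch mode or full screen mode as it was last seen.
@interface GalleryModeStore : NSObject
- (void)saveMode:(NSString *)mode forGallery:(NSString *)galleryId;
- (NSString *)modeForGallery:(NSString *)galleryId;   // nil if never viewed
@end

@implementation GalleryModeStore {
    NSMutableDictionary<NSString *, NSString *> *_modes;
}

- (instancetype)init
{
    if ((self = [super init])) {
        _modes = [NSMutableDictionary new];
    }
    return self;
}

- (void)saveMode:(NSString *)mode forGallery:(NSString *)galleryId
{
    _modes[galleryId] = mode;
}

- (NSString *)modeForGallery:(NSString *)galleryId
{
    return _modes[galleryId];
}

@end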

In another example, if the user decides to exit viewing the media content, the media content item may shrink, and the black background fades linearly as the media content item shrinks. In another example, the user may not be able to pinch in or pinch out on the media content item until it has finished loading.
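The exit described above maps onto a standard UIKit animation; the following is a minimal sketch only, assuming the media content is hosted in a mediaView layered over a black backgroundView (both hypothetical names):

#import <UIKit/UIKit.h>

// Minimal sketch of the exit described above: the media view shrinks while the
// black background fades out linearly over the same duration.
static void DismissMediaContent(UIView *mediaView, UIView *backgroundView)
{
    [UIView animateWithDuration:0.25
                          delay:0
                        options:UIViewAnimationOptionCurveLinear
                     animations:^{
                         mediaView.transform = CGAffineTransformMakeScale(0.1f, 0.1f);
                         backgroundView.alpha = 0.f;   // linear fade of the black background
                     }
                     completion:^(BOOL finished) {
                         [mediaView removeFromSuperview];
                         [backgroundView removeFromSuperview];
                     }];
}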

FIG. 12 is a block diagram 1200 illustrating a software architecture 1202 that may be installed on any one or more of the devices described above. For example, in various embodiments, client device 110 and server systems 102, 120, 122, and 124 may be implemented using some or all of the elements of software architecture 1202. FIG. 12 is merely a non-limiting example of a software architecture, and it will be understood that many other architectures may be implemented to facilitate the functionality described herein. In various embodiments, the software architecture 1202 is implemented by hardware, such as the machine 1300 of FIG. 13, which includes a processor 1310, memory 1330, and I/O components 1350. In this example, the software architecture 1202 may be conceptualized as a stack of layers, where each layer may provide specific functionality. For example, the software architecture 1202 includes layers such as an operating system 1204, libraries 1206, frameworks 1208, and applications 1210. Operationally, consistent with some embodiments, an application 1210 invokes Application Programming Interface (API) calls 1212 through the software stack and receives messages 1214 in response to the API calls 1212.

In various embodiments, the operating system 1204 manages hardware resources and provides common services. The operating system 1204 includes, for example, a kernel 1220, services 1222, and drivers 1224. Consistent with some embodiments, the kernel 1220 acts as an abstraction layer between the hardware and the other software layers. For example, the kernel 1220 provides memory management, processor management (e.g., scheduling), component management, network connectivity, and security settings, among other functions. The services 1222 may provide other common services for the other software layers. The drivers 1224 are responsible for controlling or interfacing with the underlying hardware, according to some embodiments. For example, the drivers 1224 may include a display driver, a camera driver, a low-power-consumption driver, a flash memory driver, a serial communications driver (e.g., a Universal Serial Bus (USB) driver), an audio driver, a power management driver, and so forth.

In some embodiments, the libraries 1206 provide a low-level common infrastructure utilized by the applications 1210. The libraries 1206 may include system libraries 1230 (e.g., a C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. Further, the libraries 1206 may include API libraries 1232, such as media libraries (e.g., libraries that support the rendering and manipulation of various media formats, such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), or Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render two-dimensional (2D) and three-dimensional (3D) graphical content on a display), database libraries (e.g., SQLite to provide various relational database functions), web libraries (e.g., WebKit to provide web browsing functionality), and the like. The libraries 1206 may likewise include a wide variety of other libraries 1234 to provide many other APIs to the applications 1210.

According to some embodiments, framework 1208 provides a high-level public architecture that can be utilized by applications 1210. For example, the framework 1208 provides various Graphical User Interface (GUI) functions, advanced resource management, advanced location services, and the like. The framework 1208 can provide a wide range of other APIs that can be utilized by the applications 1210, some of which can be specific to a particular operating system 1204 or platform.

In the example embodiment, the applications 1210 include a home application 1250, a contacts application 1252, a browser application 1254, a book reader application 1256, a location application 1258, a media application 1260, a messaging application 1262, a gaming application 1264, and a broad assortment of other applications such as a third-party application 1266 and a media content application 1267. According to some embodiments, the applications 1210 are programs that execute functions defined in the programs. One or more of the applications 1210, structured in a variety of ways, may be created using various programming languages, such as an object-oriented programming language (e.g., Objective-C, Java, or C++) or a procedural programming language (e.g., C or assembly language). In a specific example, the third-party application 1266 (e.g., an application developed by an entity other than the vendor of the particular platform using the ANDROID™ or IOS™ Software Development Kit (SDK)) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third-party application 1266 may invoke the API calls 1212 provided by the operating system 1204 in order to perform the functions described herein.

As described above, some embodiments may specifically include a messaging application 1262. In some embodiments, this may be a standalone application for managing communications with a server system, such as server system 102. In other embodiments, this functionality may be integrated with another application, such as the media content application 1267. The messaging application 1262 may request and display various media content items and may provide the user with the ability to input data related to the media content items via a touch interface or keyboard or using a camera device of the machine 1300, to communicate with a server system via the I/O components 1350, and to receive and store media content items in the memory 1330. Presentation of the media content items and user inputs associated with the media content items may be managed by the messaging application 1262 using different frameworks 1208, library 1206 elements, or operating system 1204 elements operating on the machine 1300.

FIG. 13 is a block diagram illustrating components of a machine 1300 capable of reading instructions from a machine-readable medium (e.g., a machine-readable storage medium) and performing any one or more of the methodologies discussed herein, in accordance with some embodiments. In particular, FIG. 13 shows a schematic diagram of the machine 1300 in the form of an example computer system within which instructions 1316 (e.g., software, a program, an application 1210, an applet, an app, or other executable code) may be executed for causing the machine 1300 to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine 1300 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 1300 may operate in the capacity of a server machine 102, 120, 122, 124, and the like, or a client device 110 in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1300 may include, but is not limited to, a server computer, a client computer, a Personal Computer (PC), a tablet computer, a laptop computer, a netbook, a Personal Digital Assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a network device, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1316, sequentially or otherwise, that specify actions to be taken by the machine 1300. Moreover, while only a single machine 1300 is illustrated, the term "machine" shall also be taken to include a collection of machines 1300 that individually or jointly execute the instructions 1316 to perform any one or more of the methodologies discussed herein.

In various embodiments, the machine 1300 includes a processor 1310, a memory 1330, and I/O components 1350, which may be configured to communicate with each other via a bus 1302. In an example embodiment, processor 1310 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) includes, for example, processor 1312 and processor 1314, which may execute instructions 1316. The term "processor" is intended to include a multi-core processor 1310, which multi-core processor 1310 may include two or more separate processors 1312, 1314 (also referred to as "cores") that may execute instructions 1316 simultaneously. Although fig. 13 illustrates multiple processors 1310, the machine 1300 may include a single processor 1310 having a single core, a single processor 1310 having multiple cores (e.g., a multi-core processor 1310), multiple processors 1312, 1314 having a single core, multiple processors 1310, 1312 having multiple cores, or any combination thereof.

Memory 1330 includes main memory 1332, static memory 1334, and storage unit 1336, which are accessible to the processor 1310 via the bus 1302, according to some embodiments. The storage unit 1336 may include a machine-readable storage medium 1338 on which are stored the instructions 1316 embodying any one or more of the methodologies or functions described herein. The instructions 1316 may likewise reside, completely or at least partially, within the main memory 1332, within the static memory 1334, within at least one of the processors 1310 (e.g., within a processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1300. Accordingly, in various embodiments, the main memory 1332, the static memory 1334, and the processor 1310 are considered machine-readable media 1338.

As used herein, the term "memory" refers to a machine-readable medium 1338 that can store data either temporarily or permanently, and can be considered to include, but is not limited to, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, and cache memory. While the machine-readable medium 1338 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) that are capable of storing the instructions 1316. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1316) for execution by a machine (e.g., machine 1300), such that the instructions 1316, when executed by one or more processors (e.g., processors 1310) of the machine 1300, cause the machine 1300 to perform any one or more of the methodologies described herein. Thus, "machine-readable medium" refers to a single storage device or appliance, as well as a "cloud-based" storage system or storage network that includes multiple storage devices or appliances. Thus, the term "machine-readable medium" can be taken to include, but is not limited to, one or more data repositories in the form of a solid-state memory (e.g., flash memory), an optical medium, a magnetic medium, other non-volatile memory (e.g., erasable programmable read-only memory (EPROM)), or any suitable combination thereof. The term "machine-readable medium" specifically excludes transitory signals per se.

I/O components 1350 include various components for receiving input, providing output, generating output, sending information, exchanging information, collecting measurements, and so forth. In general, it will be appreciated that the I/O components 1350 may include many other components not shown in FIG. 13. The I/O components 1350 are grouped by function, merely to simplify the following discussion, and the grouping is in no way limiting. In various example embodiments, the I/O components 1350 include output components 1352 and input components 1354. The output components 1352 include visual components (e.g., a display such as a Plasma Display Panel (PDP), a Light Emitting Diode (LED) display, a Liquid Crystal Display (LCD), a projector, or a Cathode Ray Tube (CRT)), auditory components (e.g., speakers), tactile components (e.g., a vibrating motor), other signal generators, and so forth. The input components 1354 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, an electro-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., physical buttons, a touch screen that provides the location and force of touch gestures or touches, or other tactile input components), audio input components (e.g., a microphone), and so forth.

In some further example embodiments, the I/O components 1350 include biometric components 1356, motion components 1358, environmental components 1360, or position components 1362, among a variety of other components. For example, the biometric components 1356 include components that detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, sweat, or brain waves), identify a person (e.g., voice recognition, retinal recognition, facial recognition, fingerprint recognition, or electroencephalogram-based recognition), and so forth. The motion components 1358 include acceleration sensor components (e.g., accelerometers), gravity sensor components, rotation sensor components (e.g., gyroscopes), and the like. The environmental components 1360 include, for example, lighting sensor components (e.g., photometers), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometers), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensor components (e.g., machine olfaction detection sensors, gas detection sensors for detecting hazardous gas concentrations for safety or measuring pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to the surrounding physical environment. The position components 1362 include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.

Communication may be accomplished using a wide variety of technologies. The I/O components 1350 may include communication components 1364 operable to couple the machine 1300 to a network 1380 or to devices 1370 via a coupling 1382 and a coupling 1372, respectively. For example, the communication components 1364 include a network interface component or another suitable device that interfaces with the network 1380. In further examples, the communication components 1364 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components that provide communication via other modes. The devices 1370 may be another machine 1300 or any of a variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).

Further, in some embodiments, the communication components 1364 detect identifiers or include components operable to detect identifiers. For example, the communication components 1364 include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., optical sensors for detecting one-dimensional bar codes such as Universal Product Code (UPC) bar codes, and multi-dimensional bar codes such as Quick Response (QR) codes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar codes, and other optical codes), acoustic detection components (e.g., microphones for identifying tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 1364, such as location via Internet Protocol (IP) geolocation, location via signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.

In various example embodiments, one or more portions of the network 1380 may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), the Internet, a portion of the Public Switched Telephone Network (PSTN), a Plain Old Telephone Service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 1380 or a portion of the network 1380 may include a wireless or cellular network, and the coupling 1382 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 1382 may implement any of various types of data transmission technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standards, other standards defined by various standards-setting organizations, other long-range protocols, or other data transmission technology.

In an example embodiment, the instructions 1316 are transmitted or received over the network 1380 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1364) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Similarly, in other example embodiments, the instructions 1316 are transmitted to or received from the devices 1370 via the coupling 1372 (e.g., a peer-to-peer coupling) using a transmission medium. The term "transmission medium" may be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 1316 for execution by the machine 1300, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.

Further, because the machine-readable medium 1338 does not embody a propagated signal, the machine-readable medium 1338 is non-transitory (in other words, does not have any transitory signals). However, labeling the machine-readable medium 1338 as "non-transitory" should not be construed to mean that the medium cannot move. The medium 1338 should be considered as being transportable from one physical location to another. Additionally, because the machine-readable storage medium 1338 is tangible, the medium 1338 may be considered a machine-readable device.

Throughout the specification, multiple instances may implement a component, an operation, or a structure described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed in parallel, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the example configurations may be implemented as a combined structure or component. Similarly, structure and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.

Although the summary of the present subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to the embodiments without departing from the broader scope of the embodiments of the disclosure.

The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the disclosed teachings. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The detailed description is, therefore, not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.

As used herein, the term "or" may be interpreted in an inclusive or exclusive manner. Further, multiple instances may be provided as a single instance for a resource, operation, or structure described herein. Further, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are contemplated and may fall within the scope of various embodiments of the disclosure. In general, the structures and functionality presented as separate resources in an example configuration can be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within the scope of the embodiments of the disclosure as represented by the claims that follow. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
