Weather-based avatar

Document No.: 91103  Publication date: 2021-10-08

Reading note: This technology, "Weather-based avatar", was designed and created by D. Amitai, M. C. Grantham, and D. White on 2020-02-26 (inventor names transliterated from the Chinese source). Its main content includes: Aspects of the present disclosure relate to a system and method for generating an avatar based on weather conditions, the system including a computer-readable storage medium storing a program. The program and method include determining a current location of a user device; obtaining weather conditions at the current location of the user device; automatically generating a weather-based avatar for a person associated with the user device, the weather-based avatar having visual attributes corresponding to the weather conditions; and, in response to a request from a requesting device, causing the weather-based avatar to be displayed on the requesting device.

1. A method, comprising:

determining, by one or more processors, a current location of a user device;

retrieving, by the one or more processors, weather conditions at the current location of the user device;

automatically generating, by the one or more processors, a weather-based avatar for a person associated with the user device, the weather-based avatar having visual attributes corresponding to the weather conditions; and

displaying, by the one or more processors, the weather-based avatar on a requesting device in response to a request from the requesting device.

2. The method of claim 1, wherein the request is a request for weather information, wherein retrieving the weather conditions and generating the weather-based avatar are performed in response to the request for weather information, the method further comprising:

modifying an existing avatar according to the weather conditions to generate the weather-based avatar; and

generating the weather-based avatar and a visual representation of the weather conditions for display on the requesting device.

3. The method of claim 1, further comprising:

obtaining a background corresponding to the weather conditions; and

generating an avatar having the visual attributes with the background for display.

4. The method of claim 1, further comprising:

storing a plurality of backgrounds associated with the weather conditions; and

selecting a background sequentially or randomly from the plurality of backgrounds.

5. The method of claim 1, further comprising:

storing a plurality of visual attributes of the avatar associated with the weather conditions; and

selecting a visual attribute sequentially or randomly from the plurality of visual attributes.

6. The method of claim 1, wherein the weather conditions comprise a multi-day weather forecast, wherein the weather-based avatar is displayed with a visual representation of the weather conditions, and wherein the visual representation of the weather conditions comprises, for each day of the forecast, at least one of: a temperature indicator, a day indicator, and a graphical and textual description of the weather conditions.

7. The method of claim 1, further comprising: animating the avatar based on the weather conditions.

8. The method of claim 7, further comprising: determining a current context of the user device, wherein the avatar is animated based on the current context of the user device.

9. The method of claim 8, further comprising:

determining that the user device is in a vehicle based on the current context;

selecting a graphical representation from a plurality of graphical representations of the vehicle based on the weather conditions, wherein a first graphical representation of the plurality of graphical representations of the vehicle visually depicts the vehicle in a first state associated with a first weather condition, and wherein a second graphical representation of the plurality of graphical representations of the vehicle visually depicts the vehicle in a second state associated with a second weather condition; and

generating the avatar for display in the graphical representation of the vehicle selected based on the weather conditions.

10. The method of claim 1, wherein the avatar is a first avatar and the weather condition is a first weather condition, and wherein the avatar having the visual attribute is displayed as a first page of a plurality of pages on the user device with a visual representation of the weather condition, the method further comprising:

receiving a request from the requesting device to access a second page of the plurality of pages;

identifying a second user associated with the second page; and

generating, for display, a second avatar having visual attributes corresponding to a second weather condition at a first location of the second user and a given visual representation of the second weather condition in the second page.

11. The method of claim 10, further comprising: receiving, from the requesting device, a user selection of a set of users for which weather information is provided, wherein the second user is identified based on the set of users, wherein the plurality of pages are arranged based on a group in the set of users.

12. The method of claim 11, further comprising:

determining that users in a first group of the set of users are within a specified range of the first location;

determining that users in a second group of the set of users are within a specified range of second locations;

associating the first group of users with the second page such that a plurality of avatars associated with the first group of users are included with the second avatar in the second page; and

associating the second group of users with a third page of the plurality of pages.

13. The method of claim 12, wherein each of the plurality of avatars includes a visual attribute corresponding to the second weather condition, and wherein the second page visually identifies the first location, and wherein the second group is associated with the third page of the plurality of pages.

14. The method of claim 11, further comprising:

accessing current weather conditions at a plurality of locations associated with the set of users;

obtaining historical weather information at the plurality of locations;

comparing the current weather condition with the historical weather information;

determining that a given weather condition at a given location of the plurality of locations associated with a third user differs from the historical weather information at the given location by more than a specified amount; and

modifying the arrangement of the plurality of pages in response to determining that the given weather condition differs from the historical weather information at the given location by more than the specified amount.

15. The method of claim 14, wherein modifying the arrangement comprises: in response to determining that the given weather condition differs from the historical weather information at the given location by more than the specified amount, advancing one of the plurality of pages to an earlier location in a sequence corresponding to the plurality of pages.

16. The method of claim 14, further comprising: determining whether a user of the requesting device is interested in receiving weather information for the given location of the third user, wherein modifying the arrangement is performed in response to determining that the user is interested in receiving weather information for the given location of the third user.

17. The method of claim 14, further comprising: in response to determining that the given weather condition differs from the historical weather information at the given location by more than the specified amount, generating, on the requesting device, for display, an option to send a message to the third user.

18. A system, comprising:

a processor configured to perform operations comprising:

determining a current location of a user device;

retrieving weather conditions at the current location of the user device;

automatically generating a weather-based avatar for a person associated with the user device, the weather-based avatar having visual attributes corresponding to the weather conditions; and

causing display of the weather-based avatar on a requesting device in response to a request from the requesting device.

19. The system of claim 18, wherein the operations further comprise:

obtaining a background corresponding to the weather condition; and

generating an avatar having the visual attribute with the background for display.

20. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:

determining a current location of a user device;

retrieving weather conditions at the current location of the user device;

automatically generating a weather-based avatar for a person associated with the user device, the weather-based avatar having visual attributes corresponding to the weather conditions; and

causing display of the weather-based avatar on a requesting device in response to a request from the requesting device.

Technical Field

The present disclosure relates generally to generating avatars and providing weather information.

Background

Weather sites are among the most popular, if not the most popular, sites accessed on the internet. Weather affects our daily lives, so it is not surprising that people use the internet to obtain weather information. Local conditions and forecasts are typically aggregated into a web page for each city or location. A consumer accesses a weather site on the internet, or through an application installed on a mobile device, enters a city or zip code, and obtains a local forecast and current conditions for that city. Such weather conditions are often presented in a generic form indicating the temperature and the likelihood of rain or snow.

Drawings

In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. To facilitate a discussion of any particular element or act, one or more of the most significant digits in a reference number refer to the drawing number in which that element is first introduced. Some embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:

fig. 1 is a block diagram illustrating an example messaging system for exchanging data (e.g., messages and related content) over a network in accordance with an example embodiment.

Fig. 2 is a schematic diagram illustrating data that may be stored in a database of a messaging server system, according to an example embodiment.

Fig. 3 is a diagram illustrating a message structure generated by a messaging client application for communication, according to an example embodiment.

FIG. 4 is a block diagram illustrating an example weather avatar generation system in accordance with an example embodiment.

FIG. 5 is a flowchart illustrating example operations of a weather avatar generation system in accordance with example embodiments.

Fig. 6A, 6B, 7, and 8 are illustrative inputs and outputs of a weather avatar generation system according to an example embodiment.

FIG. 9 is a block diagram illustrating a representative software architecture that may be used in conjunction with the various hardware architectures described herein, according to an example embodiment.

Fig. 10 is a block diagram illustrating components of a machine capable of reading instructions from a machine-readable medium (e.g., a machine-readable storage medium) and performing any one or more of the methodologies discussed herein, according to an example embodiment.

Detailed Description

The following description includes systems, methods, techniques, instruction sequences, and computer machine program products that embody illustrative embodiments of the present disclosure. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of the various embodiments. It will be apparent, however, to one skilled in the art that the embodiments may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.

Typically, weather information is presented to a user in a generic form that specifies the current temperature at a particular location and the likelihood of rain or snow. While such systems may present general weather information well, their lack of visual appeal and of any connection to specific users makes them less attractive and less intuitive to use, which increases their overall complexity. This is because knowing only the temperature and the likelihood of rain or snow may not be sufficient for each user to fully appreciate the significance of the weather conditions (e.g., for selecting appropriate clothing or judging the severity of the conditions). For example, a user who lives in a warm region throughout the year may not realize the importance of temperatures below 40 degrees Fahrenheit when selecting suitable clothing.

In addition, typical weather applications and websites require a user to enter a particular location to determine the weather conditions at that location. Such interfaces make it difficult and burdensome for users to view weather information for places where their friends and family reside. For example, a user may need to select a friend, determine where that friend lives, and then enter the friend's location into a weather website or application to obtain weather information for that location. Even then, when the user eventually finds weather information for the friend's location, the user may still not appreciate the severity of the weather conditions if the user has not typically experienced such conditions. For example, a user who lives in a warm region throughout the year may not be aware of the significance of an upcoming storm where a friend of the user lives.

The disclosed embodiments improve the efficiency of using an electronic device by incorporating one or more avatars into a weather application to visually depict weather conditions at one or more given locations. In particular, according to some embodiments, a request for weather information is received from a user device. In response to the request, weather conditions at a current location of the user device are retrieved, and an avatar associated with the user device that includes visual attributes corresponding to the weather conditions is generated. For example, when the weather condition at the current location of the user device is rain, an avatar representing the user of the user device is generated for display, wherein the avatar's face appears frowning and the avatar holds an umbrella. In some embodiments, an avatar may be placed on a background displaying rainy weather conditions, and the avatar and background are presented to the user along with a visual representation of the weather conditions.
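The avatar-generation step described above can be sketched as a lookup from a weather condition to a set of visual attributes. This is an illustrative assumption, not the patent's actual implementation; the condition names, attribute fields, and lookup table below are invented for the example.

```python
# Illustrative sketch (not the disclosure's actual implementation): map
# a weather condition to the visual attributes an avatar should display.
# Condition names, attribute fields, and the table below are invented.
WEATHER_ATTRIBUTES = {
    "rain":  {"expression": "frowning",  "accessory": "umbrella",   "background": "rainy_sky"},
    "sunny": {"expression": "smiling",   "accessory": "sunglasses", "background": "clear_sky"},
    "snow":  {"expression": "shivering", "accessory": "scarf",      "background": "snowfall"},
}

def generate_weather_based_avatar(base_avatar: dict, condition: str) -> dict:
    """Return a copy of the avatar augmented with weather-based attributes."""
    avatar = dict(base_avatar)  # leave the stored avatar unchanged
    avatar.update(WEATHER_ATTRIBUTES.get(condition, {}))
    return avatar

avatar = generate_weather_based_avatar({"name": "Alex"}, "rain")
# avatar["accessory"] == "umbrella"; avatar["expression"] == "frowning"
```

An unrecognized condition simply leaves the base avatar unchanged, mirroring the fallback behavior a real system would need when weather data is missing.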

By presenting the user with the requested weather information for a given location together with an avatar having attributes associated with the weather conditions at that location, the user is provided with a clearer understanding and appreciation of the significance of the weather conditions. That is, the user obtains the weather conditions at the requested location with minimal user input and can intuitively determine, from the avatar's clothing, actions, and facial expressions, what it feels like to experience those weather conditions.

In some embodiments, weather information for locations where a user has friends or family is provided in one or more pages associated with those friends or family members. Such weather information is presented in each page using one or more avatars of the friends or family, the avatars having attributes associated with the weather conditions at their location. To view current or future weather conditions at a location where the user has friends or family, the user accesses the corresponding page by performing a particular gesture (e.g., swiping left or right on the screen). Further, a set of friends or family members who live at the same location, or within a specified range of the same location, are grouped into the same page. In this way, rather than obtaining weather information for a place where a friend or family member lives by scrolling through multiple screens of information, meaningful weather information, including an avatar that represents the weather using visual attributes, is quickly and simply presented to the user, who performs a given gesture to navigate to the page containing weather information for that location.

In some embodiments, to further improve the speed and simplicity of accessing relevant weather information for the locations of a user's friends or family members, the pages are ordered and organized based on the user's interest in the friends or family members and/or the severity of the weather conditions at each location. For example, if it is determined that abnormal weather (e.g., a snowstorm or hurricane or another weather condition different from an average or normal weather condition) occurs at a given location, the page associated with that location is repositioned and sequenced before other pages. In this way, when a user browses different pages containing weather information for the different locations where friends and family of the user live, the page for the location where abnormal weather is determined to occur is presented first, or before other pages. Further, when it is determined that abnormal weather is occurring, the page may include an option for the user to send a message to the friends and family residing at the location associated with the page. In some implementations, when it is determined that abnormal weather occurs at a given location, the page associated with that location is automatically presented to the user as the initial landing page when the user accesses or launches the weather application.
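The page reordering described above can be sketched as a stable partition that moves pages flagged with abnormal weather ahead of the rest. The page dictionaries and the `abnormal_weather` flag are assumptions for this example, not structures defined by the disclosure.

```python
# Illustrative sketch: move pages flagged with abnormal weather ahead of
# the others while keeping each group's relative order. The page dicts
# and the "abnormal_weather" flag are invented for this example.
def order_pages(pages: list[dict]) -> list[dict]:
    """Place abnormal-weather pages before all other pages (stable)."""
    abnormal = [p for p in pages if p.get("abnormal_weather")]
    normal = [p for p in pages if not p.get("abnormal_weather")]
    return abnormal + normal

pages = [
    {"location": "Los Angeles", "abnormal_weather": False},
    {"location": "New York", "abnormal_weather": True},
]
# order_pages(pages)[0]["location"] == "New York"
```

Making the first reordered page the landing page then follows directly: the application simply renders `order_pages(pages)[0]` on launch.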

Fig. 1 is a block diagram illustrating an example messaging system 100 for exchanging data (e.g., messages and associated content) over a network 106. The messaging system 100 includes a plurality of client devices 102, each of which hosts a plurality of applications including a messaging client application 104 and a weather application 105. Each messaging client application 104 is communicatively coupled to other instances of the messaging client application 104, to the weather application 105, and to the messaging server system 108 over a network 106 (e.g., the internet).

Thus, each messaging client application 104 and weather application 105 is able to communicate and exchange data with another messaging client application 104 and weather application(s) 105 and with the messaging server system 108 via the network 106. The data exchanged between the messaging client application 104 and the weather application 105 and between the messaging client application 104 and the messaging server system 108 includes functions (e.g., commands to invoke functions) and payload data (e.g., text, audio, video, or other multimedia data).

The weather application 105 is an application that includes a set of functions that allow the client device 102 to access the weather avatar generation system 124. In some implementations, the weather application 105 is a component or feature that is part of the messaging client application 104. The weather application 105 allows a user to access weather information for the user's location and/or the locations where friends and family of the user reside. The weather application 105 obtains weather conditions at the user's location by accessing the weather avatar generation system 124 and presents those weather conditions with avatars having attributes associated with the current weather conditions. For example, if the weather application 105 determines that it is currently raining at the user's location, or that rain is forecast, the weather application 105 indicates the likelihood of rain and presents the user's avatar frowning and holding an umbrella. The weather application 105 may retrieve an avatar associated with the user from the weather avatar generation system 124 and access avatar attributes associated with the current weather. Using the avatar attributes obtained from the weather avatar generation system 124, the weather application 105 adjusts the retrieved avatar's characteristics or attributes to include the retrieved attributes associated with the weather.

The embodiments discussed herein are examples of using an avatar to provide weather information in a weather application 105. It should be understood that the same type of information may be provided to a user in any other type of application. For example, the same techniques may be employed in other social media applications. In this case, user avatars depicting weather at a given location may be presented in a chat interface, social media activity or data feed, or on a map-based interface depicting their avatars on a map based on the current location of the user's friends. In a map-based interface, each avatar of a user friend may be modified to depict attributes associated with weather at the avatar's location.

In some embodiments, the weather application 105 determines the current context of the client device 102 that the user is using. For example, the weather application 105 determines that the client device 102 is in a car or on an airplane. In this case, the weather application 105 retrieves a graphic associated with the context (e.g., an image of a convertible car) and adjusts the attributes of the graphic to match the current weather conditions. For example, if the weather application 105 determines that it is currently raining, the weather application 105 presents the car as a convertible with its top up and its windshield wipers on. On the other hand, if the weather application 105 determines that it is currently sunny, the weather application 105 renders the car as a convertible with the top down. The weather application 105 inserts an avatar having attributes associated with the weather conditions into the current context graphic (e.g., inside the car).
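The context-dependent graphic selection can be sketched as a lookup keyed on the current weather, as in the convertible example. The graphic identifiers and the fallback value are invented for this sketch.

```python
# Illustrative sketch: choose a vehicle graphic matching the current
# weather, as in the convertible example. The graphic identifiers and
# the fallback value are invented for this sketch.
VEHICLE_GRAPHICS = {
    "rain":  "convertible_top_up_wipers_on",  # top up, wipers running
    "sunny": "convertible_top_down",          # top down
}

def select_vehicle_graphic(weather: str) -> str:
    """Return the vehicle graphic for the weather, defaulting to a closed top."""
    return VEHICLE_GRAPHICS.get(weather, "convertible_top_up")
```

The avatar generated for the current weather would then be composited into the selected graphic (e.g., seated inside the car).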

In some embodiments, the weather application 105 retrieves a context associated with the current weather condition. The weather application 105 presents an avatar having attributes associated with the current weather conditions and/or a graphic representing the current context of the client device 102.

In some embodiments, the weather application 105 presents weather conditions at the locations where the user's friends/family reside. Such weather conditions are presented in the form of a separate page dedicated to a particular location. The user may initially select a group of users that includes the user's friends/family, and the weather application 105 may determine the location of each selected friend/family member. The location may be determined by communicating with the messaging client application 104 and/or the social networking system 122 in which the profiles of the selected friends/family are stored. For each unique location, or each location where the number of friends/family exceeds a specified threshold, the weather application 105 generates a separate page. The pages are ordered based on the importance of, or the user's level of interest in, the friends at the location associated with each page and/or based on the number of friends/family members the user has at that location. For example, if the user has 10 friends in Los Angeles and 3 friends in New York, the page associated with Los Angeles is located earlier in the page sequence than the page for New York.
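The grouping and ordering just described can be sketched as follows: friends are grouped by residence location, one page is created per location, and the pages are ordered by descending friend count. The input and output shapes are assumptions for this example.

```python
# Illustrative sketch: group friends by residence location into one page
# per location, then order pages by descending friend count. The input
# and output shapes are assumptions for this example.
from collections import defaultdict

def build_pages(friends: list[dict]) -> list[dict]:
    groups: dict[str, list[str]] = defaultdict(list)
    for friend in friends:
        groups[friend["location"]].append(friend["name"])
    # more friends at a location -> earlier page in the sequence
    ordered = sorted(groups.items(), key=lambda item: -len(item[1]))
    return [{"location": loc, "friends": names} for loc, names in ordered]

friends = [{"name": f"la{i}", "location": "Los Angeles"} for i in range(10)]
friends += [{"name": f"ny{i}", "location": "New York"} for i in range(3)]
# build_pages(friends)[0]["location"] == "Los Angeles"
```

A production version would additionally merge locations that fall within the specified range of one another before grouping.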

The user navigates between different pages by performing a particular gesture (e.g., swiping left/right on the screen). When a given page is accessed, weather information for the location associated with the page is presented with one or more avatars of the friends/family members living at that location. Each avatar presented on the page may include attributes associated with the current weather conditions at the location. For example, if a user navigates to a page associated with Los Angeles, where ten friends of the user live, the weather application 105 presents ten different avatars representing each of the friends. When the weather application 105 determines that the current weather conditions in Los Angeles are clear and the temperature is above 75 degrees Fahrenheit, the ten avatars may be shown wearing beach gear and playing beach volleyball. If the user then navigates to another page associated with New York City, where three of the user's friends live, the weather application 105 presents three different avatars representing each of the three friends. When the weather application 105 determines that the current weather conditions in New York are cold and rainy and the temperature is below 45 degrees Fahrenheit, the three avatars may be shown wearing coats and frowning.

In some embodiments, the weather application 105 receives an alert from the weather avatar generation system 124 indicating that an abnormal weather condition is detected at a location associated with one of the pages of the weather application 105. The abnormal weather condition may be a weather condition that differs by more than a specified amount from a historical average weather condition for the same time period (e.g., the same month, the same season, or the same day). For example, the weather avatar generation system 124 sends an alert to the weather application 105 indicating that a snowstorm is detected in New York City, where the user's three friends live. In this case, the weather application 105 reorganizes its pages to position the page associated with New York City before all or some of the other pages (e.g., the page associated with Los Angeles).
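The abnormal-weather test above can be sketched as a comparison of the current reading against a historical average for the same period. The 20-degree threshold and the use of temperature as the compared metric are assumptions for this example, not values taken from the disclosure.

```python
# Illustrative sketch: a condition is "abnormal" when it differs from
# the historical average for the same period by more than a specified
# amount. The 20-degree threshold and the use of temperature as the
# compared metric are assumptions, not values from the disclosure.
ABNORMAL_THRESHOLD_DEGREES = 20.0

def is_abnormal(current_temp: float, historical_avg_temp: float,
                threshold: float = ABNORMAL_THRESHOLD_DEGREES) -> bool:
    """True if the current reading deviates from history beyond the threshold."""
    return abs(current_temp - historical_avg_temp) > threshold

# is_abnormal(10.0, 45.0) -> True (snowstorm-like reading vs. mild average)
# is_abnormal(44.0, 45.0) -> False
```

When this test returns true for a location, the weather avatar generation system would send the alert that triggers the page reorganization described above.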

In some implementations, in response to receiving an alert indicating an abnormal weather condition, the page associated with New York City may be positioned as the home page of the weather application 105. The page may indicate that abnormal weather conditions exist in New York City and may include the three avatars having attributes associated with the abnormal weather conditions. The page may now also include an option to send a message to all or a selected subset of the friends/family members residing at the location associated with the displayed page. In response to receiving the user's selection of the option, the user may compose a message to send to the selected friend/family member using the messaging client application 104.

The messaging server system 108 provides server-side functions to the particular messaging client application 104 via the network 106. Although certain functions of the messaging system 100 are described herein as being performed by the messaging client application 104 or by the messaging server system 108, it should be understood that the location of certain functions within the messaging client application 104 or the messaging server system 108 is a design choice. For example, it is technically preferable to first deploy certain technologies and functions within the messaging server system 108, and then migrate the technologies and functions to the messaging client application 104 where the client device 102 has sufficient processing power.

The messaging server system 108 supports various services and operations provided to the messaging client application 104. Such operations include sending data to the messaging client application 104, receiving data from the messaging client application 104, and processing data generated by the messaging client application 104. By way of example, the data may include message content, client device information, geo-location information, media annotations and overlays (overlays), virtual objects, message content persistence conditions, social network information, and live event information. Data exchange in the messaging system 100 is invoked and controlled through functions available via a User Interface (UI) of the messaging client application 104.

Turning now specifically to messaging server system 108, an Application Program Interface (API) server 110 is coupled to an application server 112 and provides a programming interface to application server 112. The application server 112 is communicatively coupled to a database server 118, the database server 118 facilitating access to a database 120, the database 120 having stored therein data associated with messages processed by the application server 112.

The API server 110 receives and transmits message data (e.g., commands and message payloads) between the client device 102 and the application server 112. In particular, the API server 110 provides a set of interfaces (e.g., routines and protocols) that can be called or queried by the messaging client application 104 to invoke functions of the application server 112. The API server 110 exposes various functions supported by the application server 112, including: account registration; login functionality; sending a message from a particular messaging client application 104 to another messaging client application 104 via the application server 112; sending a media file (e.g., an image or video) from the messaging client application 104 to the messaging server application 114 for possible access by another messaging client application 104; setting a collection of media data (e.g., a story); retrieval of such collections; retrieval of a buddy list of a user of the client device 102; retrieval of messages and content; adding and deleting friends in the social graph; locating friends within the social graph; access to user conversation data; access to avatar information stored on the messaging server system 108; and opening application events (e.g., relating to the messaging client application 104).

The application server 112 hosts a number of applications and subsystems, including a messaging server application 114, an image processing system 116, a social networking system 122, and a weather avatar generation system 124. The messaging server application 114 implements a number of message processing techniques and functions, which are particularly concerned with the aggregation and other processing of content (e.g., text and multimedia content) included in messages received from multiple instances of the messaging client application 104. As will be described in further detail, text and media content from multiple sources may be aggregated into a collection of content (e.g., referred to as a story or gallery). The messaging server application 114 then makes these sets available to the messaging client application 104. In view of the hardware requirements of such processing, messaging server application 114 may also perform other processor and memory intensive processing of data at the server side.

The application server 112 also includes an image processing system 116 that is dedicated to performing various image processing operations generally with respect to images or video received within the payload of a message at the messaging server application 114. Portions of the image processing system 116 may also be implemented by the weather avatar generation system 124.

The social networking system 122 supports and makes available various social networking functions and services to the messaging server application 114. To this end, the social networking system 122 maintains and accesses an entity graph within the database 120. Examples of functions and services supported by social-networking system 122 include identifying other users of the messaging system 100 with whom a particular user has a relationship or whom the particular user is "following," and also identifying other entities and interests of the particular user. Such other users may be referred to as friends of the user. Social-networking system 122 may access location information associated with each of the user's friends to determine their residence or current geographic location. Social-networking system 122 may maintain a location profile for each friend of the user indicating the geographic location where that friend lives.

The application server 112 is communicatively coupled to a database server 118, the database server 118 facilitating access to a database 120 having stored in the database 120 data associated with messages processed by the messaging server application 114.

Fig. 2 is a diagram 200 illustrating data that may be stored in the database 120 of the messaging server system 108, according to some example embodiments. Although the contents of database 120 are shown as including a plurality of tables, it should be understood that data may be stored in other types of data structures (e.g., as an object-oriented database).

The database 120 includes message data stored in a message table 214. The entity table 202 stores entity data, including an entity graph 204. The entities for which records are maintained in the entity table 202 may include individuals, corporate entities, organizations, objects, places, events, and so forth. Regardless of type, any entity regarding which the messaging server system 108 stores data may be a recognized entity. Each entity has a unique identifier, as well as an entity type identifier (not shown).

The entity graph 204 also stores information about relationships and associations between entities. By way of example only, such relationships may be social, professional (e.g., working in the same company or organization), interest-based, or activity-based.

The message table 214 may store a set of conversations between the user and one or more friends or entities. The message table 214 may include various attributes for each conversation, such as a list of participants, a size of the conversation (e.g., number of users and/or number of messages), a chat color of the conversation, a unique identifier of the conversation, and any other characteristics associated with the conversation.

The database 120 also stores annotation data, in the example form of filters, in an annotation table 212. The database 120 also stores received annotated content in the annotation table 212. Filters for which data is stored in the annotation table 212 are associated with and applied to videos (for which data is stored in the video table 210) and/or images (for which data is stored in the image table 208). In one example, a filter is an overlay that is displayed as overlaid on an image or video during presentation to a recipient user. Filters may be of various types, including user-selected filters from a gallery of filters presented to a sending user by the messaging client application 104 when the sending user is composing a message. Other types of filters include geolocation filters (also referred to as geo-filters), which may be presented to a sending user based on geographic location. For example, based on geographic location information determined by a Global Positioning System (GPS) unit of the client device 102, the messaging client application 104 may present a neighborhood- or location-specific geolocation filter within the UI. Another type of filter is a data filter, which may be selectively presented to a sending user by the messaging client application 104 based on other inputs or information gathered by the client device 102 during the message creation process. Examples of data filters include the current temperature at a specific location, the current speed at which the sending user is traveling, the battery life of the client device 102, or the current time.

Other annotation data that may be stored within the image table 208 is so-called "lens" data. A "lens" may be a real-time special effect and sound that may be added to an image or video.

As described above, video table 210 stores video data that, in one embodiment, is associated with messages for which records are maintained within message table 214. Similarly, image table 208 stores image data associated with messages for which message data is stored within entity table 202. Entity table 202 may associate various annotations from annotation table 212 with various images and videos stored in image table 208 and video table 210.

Avatar weather attributes 207 store the avatar attributes or parameters that the weather avatar generation system 124 uses to generate avatars representing different weather conditions. For example, the avatar weather attributes 207 associate a first plurality of avatar attributes with a first weather condition and a second plurality of avatar attributes with a second weather condition. The avatar attributes may specify facial expressions, animated features, avatar accessories (e.g., umbrellas), avatar apparel, and avatar poses. Each weather condition may include a different set or combination of attributes. Avatar attributes may be stored as general instructions for modifying a particular avatar to depict a given set of avatar attributes. For example, a first avatar including features specific to a first user (e.g., hair style and skin tone) may be adjusted based on a first set of avatar attributes to depict a particular pose and have a particular set of apparel associated with the first set of avatar attributes. A second avatar including features specific to a second user may be adjusted based on the same first set of avatar attributes to depict the same particular pose as the first avatar and to have the same particular set of apparel associated with the first set of avatar attributes, while maintaining features unique to the second user.
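The lookup-and-modify behavior described above can be sketched as follows. This is a minimal illustration under assumed names: the `Avatar` record, the attribute keys, and the `apply_weather` helper are hypothetical, since the patent does not specify a data model.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Avatar:
    user_id: str
    hair_style: str          # user-specific feature, preserved across weather changes
    skin_tone: str           # user-specific feature, preserved across weather changes
    pose: str = "standing"
    apparel: str = "casual"
    accessory: str = ""

# Illustrative stand-in for the avatar weather attributes 207: each weather
# condition maps to generic modification instructions applicable to any avatar.
AVATAR_WEATHER_ATTRIBUTES = {
    "rain":  {"pose": "holding_umbrella", "apparel": "raincoat", "accessory": "umbrella"},
    "sunny": {"pose": "playing_volleyball", "apparel": "beachwear", "accessory": "sunglasses"},
    "snow":  {"pose": "building_snowman", "apparel": "sweater", "accessory": ""},
}

def apply_weather(avatar: Avatar, condition: str) -> Avatar:
    """Apply the generic weather attribute set while keeping user-specific features."""
    return replace(avatar, **AVATAR_WEATHER_ATTRIBUTES[condition])
```

Because the attribute set holds only generic fields (pose, apparel, accessory), applying the same set to two different users' avatars yields the same pose and clothing while each avatar keeps its own hair style and skin tone, matching the first-avatar/second-avatar example above.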

The avatar weather attributes 207 may store graphics or contextual attributes that the weather avatar generation system 124 uses to generate a graphical representation of a scene. For example, the avatar weather attributes 207 may store a first set of parameters for a car driving in a first weather condition (e.g., rainy days) and a second set of parameters for a car driving in a second weather condition (e.g., sunny days). A first parameter set may indicate that windshield wipers are running and the roof of the convertible is up, while a second parameter set may indicate that the roof of the convertible is down or open.

The landscape weather attributes 209 may store backgrounds that the weather avatar generation system 124 uses to provide weather information with an avatar. For example, the landscape weather attributes 209 associate a first plurality of backgrounds with a first weather condition and a second plurality of backgrounds with a second weather condition. The first weather condition may be light rain, in which case the first plurality of backgrounds may include one background depicting a rainbow and the sun and another background depicting grey clouds blocking the sun. The second weather condition may be snowy weather, in which case the second plurality of backgrounds may include one background depicting snow falling from the sky and another background depicting a snowman on the ground.

The story table 206 stores data relating to a set of messages and associated image, video or audio data that are compiled into a set (e.g., a story or gallery). Creation of a particular collection may be initiated by a particular user (e.g., each user for whom a record is maintained in entity table 202). A user may create a "personal story" in the form of a collection of content that the user has created and transmitted/broadcast. To this end, the UI of the messaging client application 104 may include user-selectable icons to enable the sender user to add specific content to his or her personal story.

Collections can also constitute "live stories," which are collections of content from multiple users that are created manually, automatically, or using a combination of manual and automatic techniques. For example, a "live story" may constitute a curated stream of user-submitted content from various locations and events. For example, a user whose client device has location services enabled and is at a common location event at a particular time may be presented with options via the UI of the messaging client application 104 to contribute content to a particular live story. The live story may be identified to the user by messaging client application 104 based on his or her location. The end result is a "live story" from a community perspective.

Another type of content collection is referred to as a "location story," which enables users whose client devices 102 are located within a particular geographic location (e.g., within a university or college campus) to contribute to a particular collection. In some embodiments, contribution to the location story may require secondary authentication to verify that the end user belongs to a particular organization or other entity (e.g., is a student on a college campus).

Fig. 3 is a schematic diagram illustrating the structure of a message 300, the message 300 being generated by a messaging client application 104 for communication with another messaging client application 104 or a messaging server application 114, in accordance with some embodiments. The contents of a particular message 300 are used to populate a message table 214 stored in the database 120 accessible to the messaging server application 114. Similarly, the contents of message 300 are stored in memory as "in-transit" or "in-flight" data for client device 102 or application server 112. Message 300 is shown to include the following components:

message identifier 302: a unique identifier that identifies the message 300.

Message text payload 304: text to be generated by the user through the user interface of the client device 102 and included in the message 300.

Message image payload 306: image data captured by a camera component of the client device 102 or retrieved from a memory of the client device 102 and included in the message 300.

Message video payload 308: video data captured by the camera component or retrieved from a memory component of the client device 102 and included in the message 300.

Message audio payload 310: audio data collected by a microphone or retrieved from a memory component of the client device 102 and included in the message 300.

Message annotation 312: annotation data (e.g., a filter, sticker, or other enhancement function) representing annotations to be applied to message image payload 306, message video payload 308, or message audio payload 310 of message 300.

Message duration parameter 314: a parameter value that indicates an amount of time in seconds for the content of the message (e.g., message image payload 306, message video payload 308, message audio payload 310) to be presented to or made accessible to the user via messaging client application 104.

Message geo-location parameter 316: geographic location data (e.g., latitude and longitude coordinates) associated with the content payload of the message. A plurality of message geo-location parameter 316 values may be included in the payload, each of which is associated with a respective content item included in the content (e.g., a particular image within message image payload 306, or a particular video in message video payload 308).

Message story identifier 318: an identifier value that identifies one or more collections of content (e.g., "stories") associated with a particular content item in the message image payload 306 of the message 300. For example, multiple images within message image payload 306 may each be associated with multiple sets of content using an identifier value.

Message tag 320: each message 300 may be tagged with a plurality of tags, each of which indicates the subject matter of the content included in the message payload. For example, where a particular image included in the message image payload 306 depicts an animal (e.g., a lion), a tag value indicative of the relevant animal may be included within the message tag 320. Tag values may be generated manually based on user input or may be generated automatically using, for example, image recognition.

Message sender identifier 322: an identifier (e.g., a messaging system identifier, an email address, or a device identifier) indicating the user of the client device 102 on which the message 300 was generated and from which the message 300 was sent.

Message recipient identifier 324: an identifier (e.g., a messaging system identifier, an email address, or a device identifier) indicating the user of the client device 102 to which the message 300 is addressed. In the case of a conversation between multiple users, the identifier may identify each user involved in the conversation.

The content (e.g., values) of the various components of the message 300 may be pointers to locations in tables within which the content data values are stored. For example, an image value in the message image payload 306 may be a pointer to (or an address of) a location within the image table 208. Similarly, values within the message video payload 308 may point to data stored within the video table 210, values stored within the message annotation 312 may point to data stored in the annotation table 212, values stored within the message story identifier 318 may point to data stored in the story table 206, and values stored within the message sender identifier 322 and the message recipient identifier 324 may point to user records stored within the entity table 202.
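The pointer-based layout described above might be sketched as follows. The field subset, the table stand-in, and the `resolve_image` helper are illustrative assumptions, not the patent's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in for the image table 208: content lives here, not in the message.
IMAGE_TABLE = {"img-1": b"<jpeg bytes>"}

@dataclass
class Message:
    message_id: str           # message identifier 302
    text: str                 # message text payload 304
    image_ptr: Optional[str]  # pointer into the image table 208 (payload 306)
    duration_s: int           # message duration parameter 314
    sender_id: str            # message sender identifier 322
    recipient_id: str         # message recipient identifier 324

def resolve_image(msg: Message) -> Optional[bytes]:
    """Dereference the image payload pointer against the image table."""
    return IMAGE_TABLE.get(msg.image_ptr) if msg.image_ptr else None
```

The indirection means a message row stays small and the same stored image can be referenced by many messages.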

FIG. 4 is a block diagram illustrating an example weather avatar generation system 124, according to an example embodiment. The weather avatar generation system 124 includes a weather conditions module 414, a user location module 419, an avatar attribute selection module 416, a background selection module 418, and an avatar display module 420. The weather avatar generation system 124 optionally also includes an abnormal weather determination module 422, a contextual determination module 412, and a contextual graphic selection module 413.

The user location module 419 accesses a list of the friends/family members that the user has specified to be included in the weather application 105. Specifically, the user may open the weather application 105 and select an option to add friends. The add friends option may retrieve a list of all of the user's friends from the social networking system 122. The user may select which of those friends the user is most interested in including in the weather application 105. In particular, any friend selected from the list can be used to generate a corresponding page for the location associated with that friend. The user location module 419 accesses the friends list and communicates with the social networking system 122 to determine the geographic location of each selected friend.

In some embodiments, the geographic location obtained by the user location module 419 represents the real-time current location of the device associated with each accessed friend and/or a pre-stored or pre-designated location associated with the friend that represents where the friend lives or dwells. The user location module 419 may group the friends selected by the user based on the proximity of the friends' locations. For example, friends associated with locations within a specified range of a given city (e.g., less than 50 miles) may be grouped together. The specified range may be input by the user or may be selected by an operator of the weather avatar generation system 124. The user location module 419 may output a plurality of groups of friends, each group being associated with a common location shared by its members. The user location module 419 generates a different page for each group of friends, such that each page represents a given location.
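The distance-based grouping could be sketched as below. The haversine distance and the greedy seed-based grouping are assumptions; the patent specifies only that friends within a range (e.g., 50 miles) of a common location are grouped together.

```python
import math

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(a, b):
    """Great-circle distance between two (lat, lon) pairs, in miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(h))

def group_by_location(friends, max_miles=50.0):
    """Greedily group (name, (lat, lon)) pairs within max_miles of a group's seed."""
    groups = []
    for name, loc in friends:
        for group in groups:
            if haversine_miles(group[0][1], loc) <= max_miles:
                group.append((name, loc))
                break
        else:
            groups.append([(name, loc)])   # start a new group seeded at this friend
    return groups
```

Each resulting group would then back one page of the weather application, as described above.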

The pages generated by the user location module 419 are provided to the weather conditions module 414 along with the current geographic location or predetermined location of the user of the client device 102. The weather conditions module 414 accesses a third-party website or application to obtain weather information for each location associated with the pages provided by the user location module 419. For example, the weather conditions module 414 may determine that the user of the client device 102 is currently in New York and accordingly obtain, from a third-party weather website or application, weather conditions that include a weather forecast for New York City. The weather conditions module 414 can determine that a group of friends indicated by the user location module 419 is associated with a page representing the city of Los Angeles, and thus the weather conditions module 414 can communicate with a third-party weather website or application to obtain weather conditions (e.g., current weather and/or a weather forecast) for the city of Los Angeles. In some embodiments, the weather conditions module 414 may obtain the weather conditions continuously in real time, periodically, or in response to receiving a user request to access the weather application 105.

The weather conditions module 414 provides the weather conditions to the avatar attribute selection module 416, the abnormal weather determination module 422, and the context determination module 412. The avatar attribute selection module 416 accesses the avatar weather attributes 207. In particular, the avatar attribute selection module 416 retrieves, from the avatar weather attributes 207, a list of avatar attributes associated with the weather conditions received from the weather conditions module 414. The avatar attribute selection module 416 selects, in a random, pseudo-random, or round-robin manner (e.g., sequentially), attributes from among those associated with the weather conditions obtained from the avatar weather attributes 207.

In some embodiments, the avatar attribute selection module 416 determines that multiple users are represented by a given page associated with the same weather conditions. For example, three users may be associated with the Los Angeles city page, for which the current weather condition indicated by the weather conditions module 414 is sunny. In this case, the avatar attribute selection module 416 selects three different avatar attributes, in a random, pseudo-random, or round-robin fashion, from the avatar weather attributes 207 associated with sunny weather conditions. In some embodiments, multiple avatars may be displayed together for a given location (e.g., on a single page) based on the multiple different avatar attributes selected.
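The round-robin variant of the selection, which guarantees that several avatars on the same page receive different attribute sets for the same weather condition, might look like this sketch; the attribute names are illustrative.

```python
import itertools

# Illustrative attribute sets stored for the "sunny" condition.
SUNNY_ATTRIBUTE_SETS = ["volleyball_pose", "jogging_pose", "sunglasses_and_short_sleeves"]

def assign_round_robin(user_ids, attribute_sets):
    """Cycle through attribute sets so each user on a page gets the next one in sequence."""
    cycle = itertools.cycle(attribute_sets)
    return {uid: next(cycle) for uid in user_ids}
```

With three users and three attribute sets, each user receives a distinct set; a fourth user on the same page would wrap around to the first set again.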

The avatar attribute selection module 416 obtains the avatars associated with the particular users represented by a given page. For example, the avatar attribute selection module 416 obtains the avatar of the user of the client device 102, represented by a first page, and obtains the three avatars of the three users represented by a second page corresponding to the city of Los Angeles. The avatar attribute selection module 416 modifies the obtained avatars to include the attributes selected from the avatar weather attributes 207. For example, the avatar attribute selection module 416 adjusts the pose, accessories, and clothing of a given avatar to represent the weather conditions at the location represented by the page on which the avatar is displayed.

The avatar attribute selection module 416 provides the weather conditions and the modified avatars for each page to the background selection module 418. The background selection module 418 accesses the landscape weather attributes 209. In particular, the background selection module 418 retrieves, from the landscape weather attributes 209, a list of backgrounds associated with the weather conditions received from the avatar attribute selection module 416. The background selection module 418 selects, in a random, pseudo-random, or round-robin manner (e.g., sequentially), a background from among the backgrounds associated with the weather conditions obtained from the landscape weather attributes 209. The background selection module 418 selects a different background from the landscape weather attributes 209 for each page for which weather conditions are obtained.

The background selection module 418 combines the modified avatars provided by the avatar attribute selection module 416 with the selected corresponding backgrounds. For example, a given page corresponds to the city of Los Angeles, where three friends of the user live. In this case, the avatar attribute selection module 416 generates three avatars depicting the three friends wearing beach attire and playing volleyball (e.g., having a volleyball-specific pose), and the background selection module 418 selects a beach as the background. The background selection module 418 combines the three beach-attired, volleyball-playing avatars with the beach background.

The background selection module 418 provides the combined avatars and background for each page to the avatar display module 420. The avatar display module 420 arranges the received backgrounds and avatars into a collection or set of pages of the weather application 105. For example, the avatar display module 420 presents, as a first page, the avatar of the user of the client device 102, having attributes associated with the weather conditions at the user's location, together with a background representing those weather conditions. The avatar display module 420 may present, as a second page, the three avatars associated with the users living in Los Angeles or within the specified range of Los Angeles, the avatars having attributes associated with the current weather conditions in Los Angeles, combined with a background representing those weather conditions.

The context determination module 412 accesses the social networking system 122 to obtain current context information for the users selected to be included in the weather application 105. In some embodiments, the context determination module 412 communicates with the device of each user selected to be included in the weather application 105 to infer a context. For example, the context determination module 412 may communicate with an accelerometer of a given user device to determine the speed at which the device is moving. If the speed exceeds a first specified amount, the context determination module 412 determines that the device and the user are in an automobile. If the speed exceeds a second specified amount that is greater than the first amount, the context determination module 412 determines that the device and the user are on an airplane. As another example, the context determination may be based on location alone or in combination with speed. For example, the context determination module 412 may determine that the user is located at sea or on a lake and, in response, present an avatar depicting the user on a boat, yacht, or the like.
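The speed-threshold inference just described might be sketched as follows. The numeric thresholds are assumptions, since the patent refers only to a first specified amount and a greater second specified amount.

```python
# Assumed thresholds; the patent does not give concrete values.
CAR_SPEED_MPH = 20.0      # "first specified amount"
PLANE_SPEED_MPH = 150.0   # "second specified amount" (greater than the first)

def infer_context(speed_mph: float, on_water: bool = False) -> str:
    """Infer a travel context from device speed and a coarse location flag."""
    if on_water:              # location-based rule: at sea or on a lake
        return "boat"
    if speed_mph > PLANE_SPEED_MPH:
        return "airplane"
    if speed_mph > CAR_SPEED_MPH:
        return "car"
    return "none"
```

The returned label would then drive which contextual graphic (car, airplane, boat) is selected for the avatar.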

The context determination module 412 provides the determined context to the contextual graphic selection module 413, which retrieves a graphic (e.g., an avatar of a car or airplane) representing the determined context. The context determination module 412 provides the determined context and the current weather conditions to the avatar weather attributes 207 to obtain graphic attributes associated with the weather conditions. The contextual graphic selection module 413 selects a given graphic attribute from the avatar weather attributes 207 and modifies the graphic based on the selected graphic attribute. For example, the contextual graphic selection module 413 modifies the graphic representing a car avatar to put the convertible roof down or open when the weather condition is sunny, and modifies the graphic representing the car avatar to prop up or close the roof and animate the windshield wipers when the weather condition is rain. As another example, the contextual graphic selection module 413 modifies a graphic representing a boat on water to depict rough waters (on rainy days) or calm waters (on sunny days).

The modified graphic is provided to the avatar display module 420, which integrates one or more avatars on a given page with the modified graphic. For example, the avatar display module 420 identifies which avatar on a given page is associated with the graphic received from the contextual graphic selection module 413. The avatar display module 420 may then insert that avatar on the page into the modified graphic. For example, in sunny conditions, the avatar display module 420 generates a page in which a beach-attired avatar is added or placed within an open-roof or top-down car avatar.

The abnormal weather determination module 422 receives the weather conditions for each location for which the weather conditions module 414 generated a page. The abnormal weather determination module 422 compares the weather conditions at the corresponding location to a historical average to determine whether the current conditions differ from the historical average by more than a specified amount. For example, the abnormal weather determination module 422 stores a database with historical weather averages for various geographic locations. The averages may span any time range, such as seasonal, daily, monthly, or yearly.

In one example, the abnormal weather determination module 422 can determine that the current weather conditions for a given city (e.g., Los Angeles) for which a page was generated include a temperature in excess of 99 degrees Fahrenheit. The abnormal weather determination module 422 retrieves, from the stored historical averages, the average daily temperature for the current day or month of the year and determines that the average temperature for Los Angeles at this time of year is 65 degrees Fahrenheit. The abnormal weather determination module 422 may determine that the current weather conditions in Los Angeles are more than 20 degrees Fahrenheit above the average and may therefore generate an indication or alert that the weather conditions are abnormal.

In another example, the abnormal weather determination module 422 can determine that the current weather conditions for a given city (e.g., New York) for which a page was generated include a temperature below negative 5 degrees Fahrenheit. The abnormal weather determination module 422 retrieves, from the stored historical averages, the average daily temperature for the current day or month of the year and determines that the average temperature for New York at this time of year is 32 degrees Fahrenheit. The abnormal weather determination module 422 may determine that the current weather conditions in New York are more than 15 degrees Fahrenheit below the average and may therefore generate an indication or alert that the weather conditions are abnormal.

In another example, the abnormal weather determination module 422 can determine that the current weather conditions for a given city (e.g., New York) for which a page was generated include more than 25 inches of snow. The abnormal weather determination module 422 retrieves, from the stored historical averages, the average daily snowfall for the current day or month of the year and determines that the average snowfall for New York at this time of year is 2 inches. The abnormal weather determination module 422 may determine that the current weather conditions in New York are more than 10 inches of snow above the average and may therefore generate an indication or alert that the weather conditions are abnormal.
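The comparison performed in these examples can be sketched as below. The stored averages and thresholds mirror the figures above, while the function shape and names are assumptions for illustration.

```python
# Illustrative stand-in for the stored historical averages.
HISTORICAL_AVG = {
    ("Los Angeles", "temp_f"): 65.0,
    ("New York", "temp_f"): 32.0,
    ("New York", "snow_in"): 2.0,
}

# Per-metric "specified amounts" taken from the examples above.
THRESHOLDS = {"temp_above_f": 20.0, "temp_below_f": 15.0, "snow_in": 10.0}

def is_abnormal(city: str, temp_f=None, snow_in=None) -> bool:
    """Flag current conditions that deviate from the historical average by
    more than the per-metric threshold."""
    if temp_f is not None:
        avg = HISTORICAL_AVG[(city, "temp_f")]
        if temp_f - avg > THRESHOLDS["temp_above_f"]:
            return True
        if avg - temp_f > THRESHOLDS["temp_below_f"]:
            return True
    if snow_in is not None:
        if snow_in - HISTORICAL_AVG[(city, "snow_in")] > THRESHOLDS["snow_in"]:
            return True
    return False
```

The three worked examples above (99°F in Los Angeles, -5°F in New York, 25 inches of snow in New York) would all be flagged by this check.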

An alert or indication of an abnormal weather condition is provided to the avatar display module 420. The avatar display module 420 may rearrange or reorder the pages based on the alert or indication. For example, the abnormal weather determination module 422 indicates that New York has an abnormal weather condition. In this case, the avatar display module 420 positions or places the page associated with New York City before all other pages associated with other cities. Thus, when the user navigates from the first page to subsequent pages of the weather application 105, the page associated with New York City is navigated to and presented first, before the pages for the other cities.
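The reordering can be sketched with a stable sort keyed on the abnormality flag, so that flagged pages move to the front while the original relative order is preserved within each group; the page representation here is an assumption.

```python
def order_pages(pages, abnormal_cities):
    """Place pages for cities flagged with abnormal weather before all others.

    Python's sort is stable, so pages keep their original relative order
    within the abnormal group and within the normal group.
    """
    return sorted(pages, key=lambda city: city not in abnormal_cities)
```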

FIG. 5 is a flowchart illustrating example operations of the weather avatar generation system 124 in performing a process 500, according to an example embodiment. The process 500 may be embodied in computer-readable instructions for execution by one or more processors such that the operations of the process 500 may be performed in part or in whole by the functional components of the messaging server system 108 and/or the weather application 105; accordingly, the process 500 is described below by way of example with reference thereto. However, in other embodiments, at least some of the operations of the process 500 may be deployed on various other hardware configurations. The process 500 is therefore not intended to be limited to the messaging server system 108 and may be implemented, in whole or in part, by any other component. Some or all of the operations of the process 500 may be performed in parallel, out of order, or omitted entirely.

At operation 502, the weather avatar generation system 124 determines a current location of the user device. For example, the user location module 419 determines the current location of the client device 102 of the user's friend (e.g., by obtaining GPS coordinates of the client device 102).

At operation 503, the weather avatar generation system 124 retrieves weather conditions at the current location of the user device. For example, the weather condition module 414 determines a city associated with the current location of the client device 102 or the GPS coordinates of the client device 102. The weather conditions module 414 communicates with a third party weather service (e.g., a website or third party weather application) to obtain current weather conditions and/or weather forecasts for the determined city or GPS coordinates.

At operation 504, the weather avatar generation system 124 automatically generates, for a person associated with the user device, a weather-based avatar having visual attributes corresponding to the weather conditions. For example, the avatar attribute selection module 416 receives the current weather conditions from the weather conditions module 414 and selects avatar attributes associated with the received weather conditions. The selected avatar attributes are used to modify the avatar of the user's friend associated with the client device 102 to represent the weather conditions (e.g., to modify the avatar's pose and clothing to represent the current weather conditions).

At operation 505, the weather avatar generation system 124 causes a weather-based avatar to be displayed on the requesting device in response to receiving the request from the requesting device. For example, the weather application 105 presents a page to the user of the other client device 102 that includes the weather conditions (e.g., temperature) obtained by the weather conditions module 414 and the modified avatar generated by the avatar attribute selection module 416.
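Operations 502 through 505 can be compressed into a minimal end-to-end sketch, with the location determination and the third-party weather lookup stubbed out; all names and the page structure are illustrative assumptions.

```python
# Stub for the third-party weather service queried at operation 503.
WEATHER_BY_CITY = {"New York": "snow", "Los Angeles": "sunny"}

# Stub for the avatar weather attributes 207 used at operation 504.
ATTRS = {
    "snow":  {"apparel": "sweater", "pose": "building_snowman"},
    "sunny": {"apparel": "beachwear", "pose": "playing_volleyball"},
    "rain":  {"apparel": "raincoat", "pose": "holding_umbrella"},
}

def generate_weather_page(user_id: str, city: str) -> dict:
    """Run the pipeline for one user whose device location resolved to `city`."""
    condition = WEATHER_BY_CITY.get(city, "rain")          # operation 503: fetch weather
    avatar = {"user_id": user_id, **ATTRS[condition]}      # operation 504: weather-based avatar
    return {"location": city, "weather": condition, "avatar": avatar}  # operation 505: page
```

A requesting device would receive the returned page, which pairs the weather conditions with the modified avatar, as in the display examples that follow.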

Figs. 6A, 6B, 7, and 8 show illustrative inputs and outputs of the weather avatar generation system 124, in accordance with example embodiments. The inputs and outputs shown in Figs. 6A, 6B, 7, and 8 may be implemented by the weather application 105. The weather application 105 may receive a user request for weather. In response, the weather application 105 determines that the user lives in New York and accesses weather information for the user's location. The weather application 105 generates a display 601 in which the user's current location 612 is indicated along with the current weather 614. The weather application 105 presents a background 610 associated with the current weather 614. In this case, the background 610 shows snow and a snowman, because the current weather 614 indicates a temperature of 30 degrees Fahrenheit and snowfall. The weather application 105 presents an avatar 616 with attributes associated with snowy weather (e.g., the avatar wears a sweater and is animated to play with the snowman). On another day, the weather application 105 determines that the weather is rainy. In this case, the weather application 105 presents a background 620 depicting rain and an avatar 620 that frowns and holds an umbrella.

In response to the weather application 105 receiving a gesture in which the user swipes to the left, the weather application 105 accesses a second page, in which an avatar 632 associated with a friend of the user is presented in a screen 602. An indicator 624 at the bottom of the screen 602 uses a plurality of dots to display how many pages are available in the weather application 105 and highlights the dot associated with the current page being displayed. The weather application 105 determines that the user's friend lives in Montana, where the current weather includes a thunderstorm. Thus, the background 630 depicts lightning, and the avatar 632 of the friend is depicted as crouching down in fear.

In response to the weather application 105 receiving the gesture in which the user swipes to the left, the weather application 105 accesses a third page, in which an avatar associated with the user's second friend is presented in screen 603. The weather application 105 determines that the second friend lives in Jamaica, where the current weather is sunny and the temperature exceeds 60 degrees. Accordingly, a first background 640 of the plurality of backgrounds associated with such a condition (e.g., a sunny day with a temperature above 60 degrees) is selected, wherein a sunset and a beach are depicted. A first avatar attribute 642 of the plurality of avatar attributes associated with such a condition is selected, wherein the avatar is playing volleyball, and the first avatar attribute 642 is used to modify the avatar of the second friend.

In response to the weather application 105 receiving the gesture in which the user swipes to the left, the weather application 105 accesses a fourth page, in which an avatar associated with the user's third friend is presented in screen 604. The weather application 105 determines that the third friend lives in California, where the current weather is sunny and the temperature is above 60 degrees. Accordingly, a second background 650 of the plurality of backgrounds associated with such a condition (e.g., a sunny day with a temperature above 60 degrees) is selected, wherein an urban landscape with a clear sky is depicted. A second avatar attribute 652 of the plurality of avatar attributes associated with such a condition is selected, wherein the avatar wears short sleeves and sunglasses, and the second avatar attribute 652 is used to modify the avatar of the third friend.

In response to the weather application 105 receiving the gesture in which the user swipes to the left, the weather application 105 accesses a fifth page, in which an avatar associated with the user's fourth friend is presented in the screen 605. The weather application 105 determines that the fourth friend lives in Washington, where the current weather is sunny and the temperature exceeds 60 degrees. Accordingly, a third background 660 of the plurality of backgrounds associated with such a condition (e.g., a sunny day with a temperature above 60 degrees) is selected, wherein flowers, grass, and trees are depicted. A third avatar attribute 662 of the plurality of avatar attributes associated with such a condition is selected, wherein the avatar is jogging, and the third avatar attribute 662 is used to modify the avatar of the fourth friend.
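Screens 603 through 605 show that a single condition (sunny, above 60 degrees) maps to a plurality of backgrounds and avatar attributes, with different friends receiving different variants. The text does not say how a variant is chosen; one plausible scheme, shown here purely as an assumption, is a deterministic per-user hash, so that a given friend always receives the same variant for a given condition:

```python
import hashlib

# Illustrative candidate lists for the "sunny, above 60 degrees" condition.
SUNNY_WARM_BACKGROUNDS = ["sunset_beach", "city_skyline_clear_sky", "flowers_grass_trees"]
SUNNY_WARM_ATTRIBUTES = ["playing_volleyball", "short_sleeves_sunglasses", "jogging"]

def pick_variant(candidates: list, user_id: str) -> str:
    """Deterministically select one candidate per user, so each friend's
    variant is stable across repeated requests."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return candidates[digest[0] % len(candidates)]
```

Random selection per request would also satisfy the text; the hash-based choice merely avoids a friend's page changing appearance every time it is opened.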

FIG. 7 illustrates another screen 710 for presenting weather information to the user in response to receiving a user request for weather information. The weather application 105 determines that the temperature at the user's location is below freezing. Accordingly, the avatar 720 is presented with an attribute in which the avatar 720 shivers inside a block of ice. The avatar 720 is presented along with the weather forecast 730 for the next few days at that location. An indicator 712 presented at the top of the screen 710 shows that the current page represents the user's location, and includes a number of dots indicating how many more pages are available that include weather information for the locations where the user's friends live. If the user swipes up or down, more detailed weather information about the location corresponding to the current page is displayed. When the user swipes to the left, FIG. 8 shows the transition from the screen 710 (where the page representing the user's location is displayed) to an adjacent page 820 (where the location of the user's friend is displayed). The indicator 712 is updated to an indicator 810 that identifies, by name, the friend of the user for whom weather information is provided. The friend's avatar 830 is also displayed in the page 820, with attributes associated with the weather at the friend's location.

In some embodiments, the user may jump directly to a particular page by tapping the corresponding dot shown in the indicator 810. That is, rather than swiping left or right multiple times to reach a given page, the user may tap the dot corresponding to the page that the user wants to access. For example, if the user is on the first page and there are six pages in total, the user may directly access the sixth page by tapping the sixth dot while viewing the first page, rather than swiping five times.
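The tap-to-jump behavior above amounts to a hit test that maps the tap's x-coordinate onto a dot index. A minimal sketch follows; the geometry parameters (position of the first dot, spacing between dots) are assumptions about the indicator's layout:

```python
def page_for_tap(tap_x: float, first_dot_x: float, dot_spacing: float, num_pages: int) -> int:
    """Map a tap on the dot indicator to a 0-based page index, clamped so a
    tap slightly outside the row resolves to the nearest valid page."""
    index = round((tap_x - first_dot_x) / dot_spacing)
    return max(0, min(num_pages - 1, index))
```

The returned index would then be handed to the paging view (e.g., a "scroll to page" call) to replace the repeated left/right swipes.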

Fig. 9 is a block diagram illustrating an example software architecture 906, which example software architecture 906 may be used in conjunction with the various hardware architectures described herein. Fig. 9 is only a non-limiting example of a software architecture, and it will be understood that a number of other architectures may be implemented to facilitate the functionality described herein. The software architecture 906 may be executed on hardware, such as the machine 1000 of fig. 10, the machine 1000 including, among other things, a processor 1004, a memory 1014, and input/output (I/O) components 1018. A representative hardware layer 952 is shown and may represent, for example, the machine 1000 of fig. 10. The representative hardware layer 952 includes one or more processing units 954 with associated executable instructions 904. Executable instructions 904 represent executable instructions of software architecture 906, including implementations of the methods, components, etc., described herein. The hardware layer 952 also includes a memory and/or storage module 956 that also has executable instructions 904. The hardware layer 952 may also include other hardware 958.

In the example architecture of fig. 9, the software architecture 906 may be conceptualized as a stack of layers, where each layer provides a particular function. For example, the software architecture 906 may include layers such as an operating system 902, libraries 920, framework/middleware 918, applications 916, and presentation layers 914. Operationally, an application 916 or other component within these layers may invoke an API call 908 through the software stack and, in response to the API call 908, receive a message 912. The layers shown are representative in nature and not all software architectures have all layers. For example, some mobile or dedicated operating systems may not provide framework/middleware 918, while other operating systems may provide such layers. Other software architectures may include additional or different layers.

The operating system 902 may manage hardware resources and provide common services. The operating system 902 may include, for example, a kernel 922, services 924, and drivers 926. The kernel 922 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 922 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so forth. The services 924 may provide other common services for the other software layers. The drivers 926 are responsible for controlling or interfacing with the underlying hardware. For example, the drivers 926 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth, depending on the hardware configuration.

The library 920 may provide a common infrastructure that may be used by the applications 916 and/or other components and/or layers. The library 920 generally provides the following functions: allowing other software components to perform tasks in an easier manner than by interfacing directly with the underlying operating system 902 functions (e.g., kernel 922, services 924, and/or drivers 926). The library 920 may include a system library 944 (e.g., a C-standard library), which system library 944 may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. Further, the libraries 920 may include API libraries 946 such as media libraries (e.g., libraries to support the rendering and operation of various media formats (e.g., MPEG4, h.264, MP3, AAC, AMR, JPG, PNG)), graphics libraries (e.g., OpenGL framework that may be used to render two-dimensional and three-dimensional graphical content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functions), and so forth. The library 920 may also include a variety of other libraries 948 to provide a number of other APIs to the application 916 and other software components/modules.

Framework/middleware 918 (also sometimes referred to as middleware) provides a high-level general-purpose infrastructure that can be used by applications 916 and/or other software components/modules. For example, the framework/middleware 918 can provide various Graphical User Interface (GUI) functionality, advanced resource management, advanced location services, and the like. The framework/middleware 918 can provide a wide variety of other APIs that can be used by applications 916 and/or other software components/modules, some of which may be specific to a particular operating system 902 or platform.

The applications 916 include built-in applications 938 and/or third-party applications 940. Examples of representative built-in applications 938 may include, but are not limited to: a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a gaming application. The third-party applications 940 may include an application developed by an entity other than the vendor of the particular platform using the Android™ or iOS™ Software Development Kit (SDK), and may be mobile software running on a mobile operating system such as iOS™, Android™, or another mobile operating system. The third-party applications 940 may invoke the API calls 908 provided by the mobile operating system (such as the operating system 902) to facilitate the functionality described herein.

The applications 916 may utilize built-in operating system functions (e.g., kernel 922, services 924, and/or drivers 926), libraries 920, and framework/middleware 918 to create a UI to interact with a user of the system. Alternatively or additionally, in some systems, interaction with the user may occur through a presentation layer, such as presentation layer 914. In these systems, the application/component "logic" may be separated from aspects of the application/component that interact with the user.

Fig. 10 illustrates a block diagram of components of a machine 1000 capable of reading instructions from a machine-readable medium (e.g., a machine-readable storage medium) and performing any one or more of the methodologies discussed herein, according to some example embodiments. In particular, fig. 10 shows a diagrammatic representation of the machine 1000 in the example form of a computer system within which instructions 1010 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed herein may be executed. As such, the instructions 1010 may be used to implement the modules or components described herein. The instructions 1010 transform the general-purpose, unprogrammed machine 1000 into a particular machine 1000 programmed to perform the described and illustrated functions in the manner described. In alternative embodiments, the machine 1000 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1000 may include, but is not limited to: a server computer, a client computer, a Personal Computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a Personal Digital Assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a network appliance, a network router, a network switch, a network bridge, or any machine capable of executing, sequentially or otherwise, the instructions 1010 that specify actions to be taken by the machine 1000.
Further, while only a single machine 1000 is illustrated, the term "machine" shall also be taken to include a collection of machines that individually or jointly execute the instructions 1010 to perform any one or more of the methodologies discussed herein.

The machine 1000 may include a processor 1004, memory/storage 1006, and I/O components 1018, which may be configured to communicate with each other, e.g., via a bus 1002. In an example embodiment, processor 1004 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 1008 and processor 1012 that may execute instructions 1010. The term "processor" is intended to include multicore processor 1004, which may include two or more independent processors (sometimes referred to as "cores") that may execute instructions concurrently. Although fig. 10 illustrates multiple processors 1004, the machine 1000 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.

The memory/storage 1006 may include a memory 1014 (such as a main memory, or other memory storage device) and a storage unit 1016, all of which may be accessed by the processor 1004, such as via the bus 1002. The storage unit 1016 and memory 1014 store instructions 1010 embodying any one or more of the methodologies or functions described herein. The instructions 1010 may also reside, completely or partially, within the memory 1014, within the storage unit 1016, within at least one of the processors 1004 (e.g., within a cache memory of the processor), or any combination thereof during execution by the machine 1000. Thus, the memory 1014, the storage unit 1016, and the memory of the processor 1004 are examples of machine-readable media.

The I/O components 1018 may include a variety of components to receive input, provide output, generate output, send information, exchange information, collect measurements, and so forth. The particular I/O components 1018 included in a particular machine 1000 will depend on the type of machine. For example, a portable machine such as a mobile phone will likely include a touch input device or other such input mechanism, while a headless server machine will likely not include such a touch input device. It should be understood that the I/O components 1018 may include many other components not shown in FIG. 10. The I/O components 1018 are grouped by function merely to simplify the following discussion, and the grouping is in no way limiting. In various example embodiments, the I/O components 1018 may include output components 1026 and input components 1028. The output components 1026 may include visual components (e.g., a display such as a Plasma Display Panel (PDP), a Light Emitting Diode (LED) display, a Liquid Crystal Display (LCD), a projector, or a Cathode Ray Tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibration motor, a resistance mechanism), other signal generators, and so forth. The input components 1028 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., physical buttons, a touch screen that provides the location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.

In further example embodiments, the I/O components 1018 may include a biometric component 1039, motion components 1034, environmental components 1036, or location components 1038, among a wide array of other components. For example, the biometric component 1039 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure bio-signals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram-based identification), and the like. The motion components 1034 may include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, rotation sensor components (e.g., a gyroscope), and so forth. The environmental components 1036 may include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., an infrared sensor that detects nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to the surrounding physical environment. The location components 1038 may include location sensor components (e.g., a GPS receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure, from which altitude may be derived), orientation sensor components (e.g., a magnetometer), and the like.

Communication may be accomplished using a variety of techniques. The I/O components 1018 may include a communications component 1040 operable to couple the machine 1000 to a network 1037 or a device 1029 via a coupling 1024 and a coupling 1022, respectively. For example, the communications component 1040 may include a network interface component or another suitable device that interfaces with the network 1037. In further examples, the communications component 1040 may include a wired communications component, a wireless communications component, a cellular communications component, a Near Field Communication (NFC) component, a Bluetooth® component (e.g., Bluetooth® Low Energy), a Wi-Fi® component, and other communications components that provide communication via other modalities. The device 1029 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via USB).

Further, the communications component 1040 can detect identifiers or include components operable to detect identifiers. For example, the communications component 1040 may include a Radio Frequency Identification (RFID) tag reader component, an NFC smart tag detection component, an optical reader component (e.g., an optical sensor for detecting one-dimensional barcodes such as Universal Product Code (UPC) barcodes, multi-dimensional barcodes such as Quick Response (QR) codes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, and UCC RSS-2D barcodes, and other optical codes), or an acoustic detection component (e.g., a microphone for identifying tagged audio signals). In addition, various information can be derived via the communications component 1040, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.

Glossary

As used herein, a "carrier wave signal" refers to any intangible medium that is capable of storing, encoding, or carrying transient or non-transient instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. The instructions may be transmitted or received over a network using a transitory or non-transitory transmission medium through a network interface device and using any one of a number of well-known transmission protocols.

In this context, "client device" refers to any machine that interfaces with a communication network to obtain resources from one or more server systems or other client devices. The client device may be, but is not limited to: a mobile phone, desktop computer, laptop, PDA, smart phone, tablet, ultrabook, netbook, notebook, multiprocessor system, microprocessor-based or programmable consumer electronics, gaming console, set-top box, or any other communication device that a user may use to access a network.

In this context, "communication network" refers to one or more portions of a network, which may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), the Internet, a portion of the Public Switched Telephone Network (PSTN), a Plain Old Telephone Service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network or a portion of the network may comprise a wireless or cellular network, and the coupling may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling may implement any of a number of types of data transmission technology, such as single carrier radio transmission technology (1xRTT), evolution-data optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, enhanced data rates for GSM evolution (EDGE) technology, third Generation Partnership Project (3GPP) technology including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), the Long Term Evolution (LTE) standard, other standards defined by various standard-setting organizations, other long-range protocols, or other data transmission technology.

In this context, "ephemeral message" refers to a message that is accessible for a time-limited duration. The ephemeral message may be text, an image, a video, etc. The access time for the ephemeral message may be set by the message sender. Alternatively, the access time may be a default setting or a setting specified by the recipient. Regardless of the setting technique, the message is temporary.

In this context, a "machine-readable medium" refers to a component, device, or other tangible medium capable of storing instructions and data, temporarily or permanently, and may include, but is not limited to: Random Access Memory (RAM), Read Only Memory (ROM), cache memory, flash memory, optical media, magnetic media, other types of storage devices (e.g., Erasable Programmable Read Only Memory (EPROM)), and/or any suitable combination thereof. The term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., code) for execution by the machine, such that the instructions, when executed by one or more processors of the machine, cause the machine to perform any one or more of the methodologies described herein. Accordingly, a "machine-readable medium" refers to a single storage apparatus or device, as well as a "cloud-based" storage system or storage network that includes multiple storage apparatuses or devices. The term "machine-readable medium" excludes signals per se.

In this context, a "component" refers to a device, physical entity, or logic having boundaries defined by function or subroutine calls, branch points, APIs, or other techniques that provide partitioning or modularization of specific processing or control functions. Components may be combined through their interfaces with other components to perform a machine process. A component may be a packaged functional hardware unit designed to be used with other components and portions of programs that typically perform the specified functions of the associated function. The components may constitute software components (e.g., code embodied on a machine-readable medium) or hardware components. A "hardware component" is a tangible unit that is capable of performing certain operations and may be configured or arranged in some physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware components of a computer system (e.g., a processor or a set of processors) may be configured by software (e.g., an application or application portion) as a hardware component that operates to perform certain operations described herein.

Hardware components may also be implemented mechanically, electronically, or in any suitable combination thereof. For example, a hardware component may comprise dedicated circuitry or logic that is permanently configured to perform certain operations. A hardware component may be a special-purpose processor, such as a Field Programmable Gate Array (FPGA) or an ASIC. A hardware component may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware component may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, the hardware components become a particular machine (or particular components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware component mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations. Accordingly, the phrase "hardware component" (or "hardware-implemented component") should be understood to encompass a tangible entity, be it an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware components are temporarily configured (e.g., programmed), each of the hardware components need not be configured or instantiated at any one instance in time. For example, where the hardware components include a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured at different times as respectively different special-purpose processors (e.g., comprising different hardware components).
Thus, software configures a particular processor or processors accordingly, e.g., to constitute one particular hardware component at one time and another different hardware component at a different time.

A hardware component may provide information to and receive information from other hardware components. Thus, the described hardware components may be considered to be communicatively coupled. In the case where a plurality of hardware components exist at the same time, communication may be achieved by signal transmission (for example, through an appropriate circuit and bus) between two or more hardware components. In embodiments in which multiple hardware components are configured or instantiated at different times, communication between such hardware components may be achieved, for example, by storing and retrieving information in memory structures accessible to the multiple hardware components. For example, one hardware component may perform an operation and store the output of the operation in a memory device to which it is communicatively coupled. Another hardware component may then access the memory device at a later time to retrieve and process the stored output.

The hardware components may also initiate communication with an input or output device and may operate on a resource (e.g., a collection of information). Various operations of the example methods described herein may be performed, at least in part, by one or more processors that are temporarily configured (e.g., via software) or permanently configured to perform the relevant operations. Whether temporarily configured or permanently configured, such a processor may constitute a processor-implemented component that operates to perform one or more operations or functions described herein. As used herein, "processor-implemented component" refers to a hardware component that is implemented using one or more processors. Similarly, the methods described herein may be implemented at least in part by a processor, where a particular processor or processors are examples of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented components. Further, the one or more processors may also be operable to support performance of related operations in a "cloud computing" environment or as a "software as a service" (SaaS). For example, at least some of the operations may be performed by a set of computers (as an example of a machine including processors), where the operations may be accessed via a network (e.g., the internet) and via one or more appropriate interfaces (e.g., APIs). Execution of certain operations may be distributed among processors, not only residing within a single machine, but also being deployed across multiple machines. In some example embodiments, the processors or processor-implemented components may be located in a single geographic location (e.g., in a home environment, an office environment, or a server farm). In other example embodiments, the processor or processor-implemented component may be distributed across multiple geographic locations.

In this context, "processor" refers to any circuit or virtual circuit (a physical circuit emulated by logic executing on an actual processor) that manipulates data values based on control signals (e.g., "commands," "opcodes," "machine code," etc.) and generates corresponding output signals suitable for operating a machine. The processor may be, for example, a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an ASIC, a Radio Frequency Integrated Circuit (RFIC), or any combination thereof. The processor may further be a multi-core processor having two or more independent processors (sometimes referred to as "cores") that may execute instructions simultaneously.

In this context, "timestamp" refers to a series of characters or encoded information that identifies when a particular event occurred, such as a given date and time, sometimes accurate to a fraction of a second.

Variations and modifications may be made to the disclosed embodiments without departing from the scope of the disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure, as expressed in the following claims.
