Manipulation of graphical icons

Document No.: 1676837 | Published: 2019-12-31

Note: This technology, "Manipulation of graphical icons," was created by D.J. Murphy, A.L. Gardner III, and A.B. Sachter-Zeltzer on 2018-07-17. Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for graphical icon manipulation. In one aspect, a method includes an act of receiving user input corresponding to selection of a graphical item located in a first portion of a graphical interface. The actions further include receiving a movement input corresponding to a dragging of the graphical item. The actions further include determining that a location of the selection input is in a second portion of the graphical interface. The actions further include updating the graphical item by providing, in place of the graphical item, a representation of changes to the graphical item based on the graphical item being placed in the second portion. The actions further include determining that the selection input has ceased. The actions further include providing the changes to the graphical item for output.

1. A computer-implemented method, comprising:

receiving, by a graphical interface, a user input corresponding to a selection of a graphical item located in a first portion of the graphical interface;

while receiving the selection input, receiving, by the graphical interface, a movement input corresponding to placement of the graphical item in a second portion of the graphical interface;

while receiving the selection input, determining that a location of the selection input is in the second portion of the graphical interface;

updating the graphical item by providing a representation of changes to the graphical item in place of the graphical item based on determining that the location of the selection input is in the second portion of the graphical interface and while receiving the selection input;

determining that the selection input has ceased; and

based on determining that the selection input has ceased, providing the change to the graphical item for output.

2. The method of claim 1, comprising:

based on determining that the selection input has ceased, ceasing to provide the graphical item for output without changing the graphical item.

3. The method of claim 1 or 2, wherein the first portion of the graphical interface corresponds to a first application and the second portion of the graphical interface corresponds to a second, different application.

4. The method of claim 3, comprising:

in response to receiving a movement input corresponding to a dragging of the graphical item through the graphical interface, accessing an object represented by the graphical item;

receiving, from the first application, a first instruction to render the object while receiving the selection input; and

rendering the object according to the first instruction while receiving the selection input and before determining that the location of the selection input is in the second portion of the graphical interface.

5. The method of claim 3 or 4, comprising:

in response to determining that the location of the selection input is in the second portion of the graphical interface, receiving, from the second application, a second instruction to render the object while receiving the selection input,

wherein the graphical item is updated based on the second instruction.

6. The method of claim 3, 4 or 5, comprising:

determining that the second application is configured to provide a plurality of changes to the graphical item; and

selecting the change to the graphical item from among the plurality of changes to the graphical item.

7. The method of claim 6, wherein the change to the graphical item is selected from among the plurality of changes to the graphical item based on user input.

8. The method of claim 7, wherein the user input comprises an input representing a continued movement of the graphical item and/or of the representation of the change to the graphical item to a different region of the second portion of the graphical user interface.

9. A method as claimed in claim 8 when dependent on claim 3, wherein the different regions of the second portion represent different functional regions of the second application.

10. The method of any preceding claim, wherein the first portion of the graphical interface corresponds to a first portion of an application and the second portion of the graphical interface corresponds to a second, different portion of the application.

11. A system, comprising:

one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, are operable to cause the one or more computers to perform operations comprising:

receiving, by a graphical interface, a user input corresponding to a selection of a graphical item located in a first portion of the graphical interface;

while receiving the selection input, receiving, by the graphical interface, a movement input corresponding to placement of the graphical item in a second portion of the graphical interface;

while receiving the selection input, determining that a location of the selection input is in the second portion of the graphical interface;

updating the graphical item by providing a representation of changes to the graphical item in place of the graphical item based on determining that the location of the selection input is in the second portion of the graphical interface and while receiving the selection input;

determining that the selection input has ceased; and

based on determining that the selection input has ceased, providing the change to the graphical item for output.

12. The system of claim 11, wherein the operations further comprise:

based on determining that the selection input has ceased, ceasing to provide the graphical item for output without changing the graphical item.

13. The system of claim 11 or 12, wherein the first portion of the graphical interface corresponds to a first application and the second portion of the graphical interface corresponds to a second, different application.

14. The system of claim 13, wherein the operations further comprise:

accessing an object represented by the graphical item in response to receiving a movement input corresponding to a dragging of the graphical item through the graphical interface;

receiving, from the first application, a first instruction to render the object while receiving the selection input; and

rendering the object according to the first instruction while receiving the selection input and the movement input.

15. The system of claim 13 or 14, wherein the operations further comprise:

in response to determining that the location of the selection input is in the second portion of the graphical interface, receiving, from the second application, a second instruction to render the object while receiving the selection input,

wherein the graphical item is updated based on the second instruction.

16. The system of claim 13, 14, or 15, wherein the operations further comprise:

determining that the second application is configured to provide a plurality of changes to the graphical item; and

selecting the change to the graphical item from among the plurality of changes to the graphical item.

17. The system of claim 16, wherein the change to the graphical item is selected from among the plurality of changes to the graphical item based on user input.

18. The system of any of claims 11 to 17, wherein the first portion of the graphical interface corresponds to a first portion of an application and the second portion of the graphical interface corresponds to a second, different portion of the application.

19. A non-transitory computer-readable medium storing software, the software comprising instructions executable by one or more computers, the instructions, when so executed, causing the one or more computers to perform operations comprising:

receiving, by a graphical interface, a user input corresponding to a selection of a graphical item located in a first portion of the graphical interface;

while receiving the selection input, receiving, by the graphical interface, a movement input corresponding to placement of the graphical item in a second portion of the graphical interface;

while receiving the selection input, determining that a location of the selection input is in the second portion of the graphical interface;

updating the graphical item by providing a representation of changes to the graphical item in place of the graphical item based on determining that the location of the selection input is in the second portion of the graphical interface and while receiving the selection input;

determining that the selection input has ceased; and

based on determining that the selection input has ceased, providing the change to the graphical item for output.

20. The medium of claim 19, wherein the operations further comprise:

based on determining that the selection input has ceased, ceasing to provide the graphical item for output without changing the graphical item.

21. The medium of claim 19 or 20, wherein the first portion of the graphical interface corresponds to a first application and the second portion of the graphical interface corresponds to a second, different application.

22. The medium of claim 19, 20, or 21, wherein the first portion of the graphical interface corresponds to a first portion of an application and the second portion of the graphical interface corresponds to a second, different portion of the application.

Technical Field

This specification relates generally to graphical user interfaces.

Background

A Graphical User Interface (GUI) is a type of user interface that allows a user to interact with an electronic device through graphical icons and visual indicators, rather than through a text-based user interface such as typed commands. The user performs actions by directly manipulating the graphical icons and visual elements.

Disclosure of Invention

The user may move an icon around the GUI by dragging the icon from one location to another. Typically, the operating system may provide an indication as to whether the user's action will result in cutting and pasting the icon and its underlying file or other type of data, copying the icon and its underlying file, or creating a shortcut to the underlying file. For example, where the underlying file is to be copied to the location where the user drops the icon, the operating system may augment the icon with a plus sign.

To give the user a better understanding of the meaning of moving an icon and its underlying file between applications, the operating system may adjust the icon while the user hovers the icon over the destination application. The adjustment may include changing the icon to show a representation of what the destination application would show if the user placed the icon into the destination application. For example, a user may move an icon corresponding to a photo into a photo editor. The user drags the icon into the photo editor application. As the icon crosses the graphical boundary of the portion of the screen on which the photo editor is displayed, the operating system transforms the icon to display the content of the underlying photograph's image. The user is still able to move the icon, as is typical during a drag operation. In this case, the user can preview the meaning of dropping the icon into the photo editor before actually doing so.
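The hover-time icon adjustment described above can be sketched roughly as follows; the names (`Icon`, `on_drag_move`, the preview callback) and the rectangle-based boundary test are illustrative assumptions rather than part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Icon:
    appearance: str    # what the GUI currently draws for this icon
    linked_data: dict  # underlying data the icon links to

def on_drag_move(icon, position, dest_bounds, dest_preview):
    """Return the icon to display while dragging: swap in the destination's
    preview whenever the pointer is inside the destination's bounds."""
    x, y = position
    left, top, right, bottom = dest_bounds
    if left <= x <= right and top <= y <= bottom:
        # Show what the destination application would display on drop.
        return Icon(dest_preview(icon.linked_data), icon.linked_data)
    return icon  # outside the destination: keep the original appearance

# Hypothetical photo editor preview: it shows the full image behind the icon.
photo_preview = lambda data: "full image: " + data["image"]
icon = Icon("cropped thumbnail", {"image": "album_cover.png"})
inside = on_drag_move(icon, (120, 80), (100, 0, 300, 200), photo_preview)
outside = on_drag_move(icon, (10, 80), (100, 0, 300, 200), photo_preview)
```

The original icon is restored as soon as the drag leaves the destination's bounds, matching the album-icon example that follows.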

Previewing the meaning of a drag and drop operation may be helpful in situations where the user is unaware of how the icon and its underlying file will interact with the destination application. For example, the user may drag an album icon into the photo editor. The album icon may correspond to music tracks and an album cover image. When the user drags the album icon over the photo editor, the operating system may transition the album icon to display the content of the album art image. If the user does not want to edit the image of the album art, the user may drag the album icon out of the graphical boundary of the photo editor without the photo editor opening the image of the album art.

By providing a preview of the meaning of the drag and drop operations, the operating system may provide improved security and/or protect the privacy of the user by restricting which applications have access to user files. While the user hovers the icon over the destination application, the destination application may not have access to the underlying file. Instead, the destination application provides instructions to the operating system for rendering the underlying file. Thus, the user can preview the meaning of the drag and drop operations and decide not to drop the icon into the destination application, all without the destination application loading the underlying file.
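One way to sketch the access restriction just described is to have the destination application receive only metadata and return rendering instructions, while the operating system alone reads the underlying file. The function names and the instruction format are assumptions for illustration:

```python
def request_preview_instructions(dest_app, item_metadata):
    """Ask the destination app how to render a preview; it sees metadata only,
    never the underlying file."""
    return dest_app(item_metadata)

def os_render_preview(instructions, file_loader):
    """The OS resolves the instructions against the file that it alone reads."""
    rendered = []
    for op in instructions:
        if op["op"] == "draw_image":
            rendered.append(("image", file_loader(op["source"])))
        elif op["op"] == "hide_text":
            rendered.append(("text", None))
    return rendered

# A hypothetical photo editor asks for the linked image to be drawn.
photo_editor = lambda meta: [{"op": "draw_image", "source": meta["id"]}]
instructions = request_preview_instructions(photo_editor, {"id": "item-42"})
rendered = os_render_preview(instructions, file_loader=lambda i: "pixels<" + i + ">")
```

If the user drags the icon back out, the destination application never observes the file contents, only the metadata it was given.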

According to an innovative aspect of the subject matter described in the present application, a method for graphical icon manipulation includes the acts of: receiving, by a graphical interface, a user input corresponding to a selection of a graphical item located in a first portion of the graphical interface; receiving a movement input corresponding to a dragging of the graphical item through the graphical interface while receiving the selection input; while receiving the selection input and the movement input, determining that a location of the selection input is in a second portion of the graphical interface; based on determining that the location of the selection input is in the second portion of the graphical interface and while receiving the selection input, updating the graphical item by providing, in place of the graphical item, a representation of changes to the graphical item based on the graphical item being placed in the second portion; determining that the selection input has ceased; and providing the change to the graphical item for output based on determining that the selection input has ceased.
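The sequence of acts above can be sketched as a small drag-session state machine; the class name, event methods, and rectangle-based test for the second portion are illustrative assumptions:

```python
class DragSession:
    """Tracks one selection from touch-down to release, swapping the displayed
    item for a change representation while it is over the second portion."""

    def __init__(self, item, second_portion_bounds, change_renderer):
        self.item = item
        self.bounds = second_portion_bounds   # (left, top, right, bottom)
        self.change_renderer = change_renderer
        self.display = item                   # what the interface currently shows
        self.selecting = True

    def on_move(self, x, y):
        """Movement input received while the selection input continues."""
        left, top, right, bottom = self.bounds
        if self.selecting and left <= x <= right and top <= y <= bottom:
            # Provide a representation of changes in place of the item.
            self.display = self.change_renderer(self.item)

    def on_release(self):
        """The selection input has ceased: provide the change for output."""
        self.selecting = False
        return self.display

session = DragSession("icon", (50, 0, 100, 100), lambda i: "changed(" + i + ")")
session.on_move(10, 10)   # still in the first portion: item unchanged
session.on_move(60, 40)   # entered the second portion: preview shown
result = session.on_release()
```

Releasing before the item enters the second portion would return the unchanged item, matching the optional feature in which the item is provided for output without change.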

These and other embodiments may each optionally include one or more of the following features. The actions further include, based on determining that the selection input has ceased, ceasing to provide the graphical item for output without changing the graphical item. The first portion of the graphical interface corresponds to a first application and the second portion of the graphical interface corresponds to a second, different application. The actions further include, in response to receiving a movement input corresponding to a dragging of the graphical item through the graphical interface, accessing an object represented by the graphical item; receiving, from the first application, a first instruction to render the object while receiving the selection input; and rendering the object according to the first instruction while receiving the selection input and the movement input.

The actions further include, in response to determining that the location of the selection input is in the second portion of the graphical interface, receiving, from the second application, a second instruction to render the object while receiving the selection input. The graphical item is updated based on the second instruction. The actions further include determining that the second application is configured to provide a plurality of changes to the graphical item, and selecting the change to the graphical item from among the plurality of changes to the graphical item. The change to the graphical item is selected from among the plurality of changes based on user input. The first portion of the graphical interface corresponds to a first portion of an application and the second portion of the graphical interface corresponds to a second, different portion of the application. The user input may include an input representing a continued movement of the graphical item and/or of the representation of the change to the graphical item to a different region of the second portion of the graphical user interface. The different regions of the second portion may represent different functional regions of the second application.

Other embodiments of this aspect include corresponding systems, apparatus, and computer programs, recorded on computer storage devices, configured to perform the operations of the methods.

The subject matter described in this application can have one or more of the following advantages. The system can provide the user with a preview of what would happen if the user placed an icon in an application, without the user actually placing the icon in the application. In this way, the system provides continuous feedback that guides more accurate and efficient human-computer interaction. For example, by providing previews, the system enables more accurate human-computer interaction by helping the user avoid placing icons in incorrect or undesirable applications. Further, by improving accuracy, the user may not have to perform undo operations on unintended drops, which results in both an increase in task efficiency and a reduction in the use of computing resources. Loading content into applications and undoing operations can be computationally expensive and use additional computing resources. For example, when content is loaded into an application, at least some of the content may be fetched from storage and loaded into memory, an operation that requires CPU clock cycles, bus bandwidth, and space in memory. In contrast, according to at least some aspects described herein, a user may view the icon changes and decide whether to drop the icon. Further, when content is opened within an application, the application may perform processing on the content. The ability of an application to process content may present security issues in the event that opening it is inadvertent or undesirable. In at least some aspects described herein, the system can restrict an application's access to any data to which an icon is linked while a preview is displayed. Restricting access to data may improve security and/or protect the privacy of the user.

The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.

Drawings

FIGS. 1A to 1E and 2A to 2F illustrate example interfaces for graphical icon manipulation.

FIG. 3 shows a swim lane diagram of graphical icon manipulation.

FIG. 4 illustrates an example process of graphical icon manipulation.

FIG. 5 illustrates an example of a computing device and a mobile computing device.

In the drawings, like reference numerals designate corresponding parts throughout the several views.

Detailed Description

FIGS. 1A-1E illustrate example interfaces for graphical icon manipulation. Briefly, and as described in more detail below, a user selects an icon 110 on a GUI of a mobile device 102 having a touch screen display 104. The icon 110 links to structured data that can be interpreted in one or more ways by one or more applications. The user selects the icon 110 located within the source application 106, which creates a temporary icon that links to the selected icon 110 and has an appearance matching the selected icon 110. If the user drags this temporary icon into the destination application 108, the temporary icon changes appearance based on the data that is linked to the selected icon 110 and identified by the destination application 108. The user places the temporary icon into the destination application 108, which creates a new image or icon in the destination application 108 that links to a copy of some or all of the data to which the selected icon 110 links. In this and other examples, an icon may refer to an image or graphical representation of an object or idea. An icon may also refer to a more complex widget, such as a music album widget consisting of multiple views, text widgets, or any similar widgets.

FIG. 1A illustrates an example mobile device 102 having a touch screen display 104 displaying a GUI. In the example of fig. 1A, the mobile device 102 runs two or more applications to simultaneously display at least two of the applications (the source application 106 and the destination application 108) on the GUI of the touch screen display 104 in a split-screen layout. Although the displayed GUI is shown on the touch screen display 104 of the mobile device 102, it may be viewed on any computing device (with or without a touch screen display) configured to display two or more applications.

In the example of FIG. 1A, the source application 106 is a music library application that contains a plurality of selectable icons 110, 112, 114, and 116. Specifically, the icons 110, 112, 114, and 116 represent music albums and link to various types of data including, for example, image data and music data. In this example, the image data is a representation of the respective album art, and the music data is one or more songs on the respective album.

The icon 110 identifies an album and has an album layout including an artwork area 118 and a title area 120. The artwork area 118 displays image data, such as the cover art of the corresponding album. The title area 120 may be left blank or may display information related to the corresponding album, such as the name of the album, the name of the band or artist that produced the album, or the names of one or more of the songs on the album. In the example of FIG. 1A, the artwork area 118 may display only a portion of the album art due to limitations on the size and dimensions of the icon 110 and corresponding limitations on the artwork area 118.

As shown in FIG. 1A, icons 110, 112, 114, and 116 are visually represented on the touch screen display 104 within the display area of the source application 106. In the example of FIG. 1A, icons 110, 112, 114, and 116 have similar sizes and dimensions. The icons in a displayed application, such as icons 110, 112, 114, and 116, may have relative or fixed sizes and dimensions. In some implementations, the size of the icons may depend on the number of other icons displayed in the respective application, such that the more icons are displayed in a given application, the smaller the size of each icon. For example, if the source application 106 displayed a single icon, that icon might occupy a majority of the display area of the source application 106, as compared to icons 110, 112, 114, and 116, which each occupy approximately one-fourth of the display area of the source application 106. Similarly, the size of an icon may also, or alternatively, depend on the size of the display area allocated to the corresponding application (e.g., the source application 106 in which the icon is found). The display area of a given application may be larger or smaller for a number of reasons, including, for example, the number of applications in the split-screen mode, the size of the display screen of the computing device, and the user resizing the display area of the particular application.
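A possible sizing rule consistent with this paragraph divides the application's display area evenly among the displayed icons, so that a single icon occupies most of the area while four icons each take roughly a quarter. The specific formula and the gap fraction are assumptions, not taken from the description:

```python
def icon_size(display_area_px, icon_count, gap_fraction=0.1):
    """Split an application's display area evenly among its icons,
    reserving a fraction of the area for gaps between them."""
    if icon_count < 1:
        raise ValueError("icon_count must be >= 1")
    usable = display_area_px * (1.0 - gap_fraction)
    return usable / icon_count

# One icon fills most of the area; four icons each take about a quarter.
full_size = icon_size(100_000, 1)     # single icon dominates the area
quarter_size = icon_size(100_000, 4)  # each of four icons gets a quarter
```

A shrinking split-screen pane would simply pass a smaller `display_area_px`, scaling every icon down proportionally.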

In the example of fig. 1A, the destination application 108 is an image editor application and does not contain an icon. In some implementations, the destination application 108 may already contain one or more icons similar to the icons 110, 112, 114, and 116 in the source application 106. Because the destination application 108 is an image editor, the icons may be linked to image data rather than another type of data.

FIG. 1B shows a user of the mobile device 102 using a finger 124 to submit a touch input 122 on the touch screen display 104. The touch input 122 selects the icon 110 in the source application 106. In some implementations, the touch input 122 is made by the user touching the touch screen display 104 with a stylus rather than the finger 124. Alternatively, the user may select the icon 110 without the touch input 122, by placing a cursor over the icon or by highlighting the icon using a keyboard and clicking a button. For example, the user may use a mouse to move a cursor over the icon 110 and click the left mouse button to select the icon.

Selection of the icon 110 by the user's touch input 122 creates a conceptual icon 110A that links to the icon 110. As shown in FIG. 1B, the conceptual icon 110A has an album layout with an artwork area 118A displaying the same image data that is displayed in the artwork area 118 of the selected icon 110 as shown in FIG. 1A, and a title area 120A that may be left blank or may display the same data that may be displayed in the title area 120 of the selected icon 110 as shown in FIG. 1A. Thus, in the example of FIG. 1B, the conceptual icon 110A has an appearance that matches the appearance of the selected icon 110 as shown in FIG. 1A. In some implementations, the conceptual icon 110A may be emphasized once it is created. For example, the conceptual icon 110A may be highlighted with a border around it, or the portions of the display area of the source application 106 other than the conceptual icon 110A may be darkened. In some implementations, the icon 110 may remain on the display 104 as the user moves the conceptual icon 110A around the area of the source application 106.

FIG. 1C shows that the user has started a drag motion on the conceptual icon 110A. The user may drag the conceptual icon towards the destination application 108. The user performs this drag motion by maintaining contact with the touch screen display 104 from the time of the touch input 122 and moving the finger 124 to a location different from the point of the touch input. As the user drags the finger 124 across the touch screen display 104, the conceptual icon 110A moves with the user's finger 124. In some implementations, this drag motion is performed by the user using a stylus rather than the finger 124. Alternatively, when the user selects the icon 110 without the touch input 122, the drag motion may be performed by the user holding down the mouse button clicked to select the icon and moving the mouse, or by pressing arrow keys, to move the created conceptual icon 110A. In some implementations, as discussed above, the conceptual icon 110A may continue to be emphasized during the drag motion.

FIG. 1D shows that the user has continued to drag the conceptual icon 110B into the destination application 108. The conceptual icon 110B crosses the boundary of the destination application 108, and the appearance of the conceptual icon 110B changes. The change in appearance of the conceptual icon 110B reflects the data linked to the selected icon 110 that the destination application 108 is able to recognize. In this example, because the destination application 108 is an image editor, the destination application 108 is able to identify the image data linked to the selected icon 110. In the example of FIG. 1D, this image data is a representation of the album art of the music album that the selected icon 110 represents.

Once the conceptual icon 110B crosses into the boundary of the destination application 108, the conceptual icon 110B changes layout from an album layout depicting the appearance of the selected icon 110 to an image layout depicting the image data linked to the selected icon 110. This change in icon layout corresponds to the removal of the title area 120A, as shown in FIG. 1C, from the conceptual icon 110B. Thus, in this example, the conceptual icon 110B now has a single area, the artwork area 118B. The artwork area 118B displays the image data linked to the selected icon 110. In some implementations, the change in the appearance of the conceptual icon 110B can result in a change in the size and/or dimensions of the conceptual icon 110B. In some implementations, the change in the appearance of the conceptual icon 110B may not result in a change in the size or dimensions of the conceptual icon 110B, but does result in an increase in the size of the artwork area 118B. In such an implementation, the increase in the size of the artwork area 118B is equal to the size of the preexisting title area 120A as shown in FIG. 1C. The increased size of the artwork area 118B may enable the conceptual icon 110B to display a greater portion, if not all, of the image data linked to the selected icon 110. For example, a greater portion of the album art is displayed on the touch screen display 104 in the artwork area 118B of the conceptual icon 110B (as shown in FIG. 1D) than in the artwork area 118A of the conceptual icon 110A (as shown in FIG. 1C). The source application 106 may use a cropped version of the album art in the icon 110, and the device 102 may display the full album art in the conceptual icon 110B.
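The layout change described above, in which the title area is removed and the artwork area grows by the freed space, can be sketched as follows; the dictionary-based layout and the region names are illustrative assumptions:

```python
def enter_destination_layout(layout):
    """Switch an album layout to an image layout: drop the title area and
    give its height to the artwork area, keeping the icon's total size."""
    layout = dict(layout)  # work on a copy; the source keeps its own layout
    freed = layout.pop("title_height", 0)
    return {"artwork_height": layout["artwork_height"] + freed}

album_layout = {"artwork_height": 80, "title_height": 20}
image_layout = enter_destination_layout(album_layout)
```

The overall icon height (100 units in this sketch) is unchanged; only the internal split between regions is, mirroring the implementation in which the icon's dimensions stay fixed while the artwork area grows.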

In some implementations, once the conceptual icon 110B is brought into the boundary of the destination application 108, the conceptual icon 110B may continue to be emphasized. For example, the conceptual icon 110B may continue to be highlighted with a border around it, or everything other than the conceptual icon 110B within the display area of the destination application 108 may be darkened. In some implementations, if the conceptual icon 110B is emphasized by darkening the surrounding display area of the source application 106, then once the conceptual icon 110B is brought within the boundary of the destination application 108, the darkening of the surrounding display area of the source application 106 can end as the darkening of the surrounding display area of the destination application 108 begins. In some implementations, the darkening of the surrounding display area of the source application 106 can end once the conceptual icon 110A (shown in FIG. 1C) is brought outside, or substantially outside, the boundaries of the source application 106.

By changing the display of the conceptual icon from the conceptual icon 110A to the conceptual icon 110B, the device 102 provides the user with a preview of the action that the device 102 may perform if the user drops the conceptual icon 110B into the destination application 108. In the example of FIG. 1D, as the user drags the conceptual icon 110A into the destination application 108, the device 102 changes the icon from displaying a cropped version of the music album cover to displaying the full album cover image.

It should be understood that while a drag operation conventionally involves constant contact with the touch screen display or holding down a mouse button, other embodiments are possible. For example, selection of a conceptual icon may be toggled (e.g., by selecting an option in a context menu or pressing a keyboard key) such that, once toggled, continued selection does not require continued user input. Similarly, while the movement of the conceptual icon to the second application may be indicated by the user dragging their finger or a mouse across the display, it should be understood that other embodiments are possible. For example, in the case of toggled selection of a conceptual icon, the movement of the conceptual icon to the destination application may be indicated by a user input in the destination application. For example, selection of a conceptual icon may be toggled while the source application is displayed, and the destination application may then be caused to be displayed (e.g., by way of a task switching tool). The display of the destination application itself may provide a representation of dragging the conceptual icon to the destination application. Alternatively, an additional user input (e.g., tapping the display or clicking a mouse) may provide a representation of dragging the conceptual icon to the destination application. It should be understood that other user inputs may be provided for representing dragging a conceptual icon from a source application to a destination application.

FIG. 1E shows that the user has performed a drop operation by ending the drag motion of conceptual object 110B (shown in FIG. 1D) within the boundaries of destination application 108, and has lifted finger 124 from touch screen display 104. When the user lifts the finger 124 off the touch screen display 104, the conceptual icon 110B, as shown in FIG. 1D, is converted into a new image 126 in the destination application 108. In some implementations, the new image 126 can be created by the user lifting a stylus from the touch screen display 104 instead of lifting the finger 124. Alternatively, creation of the new image 126 may be effectuated without the use of a touch screen display by the user releasing the button that was clicked to select the icon 110. For example, once the conceptual icon 110B has been dragged into the boundary of the destination application 108, a user who selected the icon 110 with a mouse by clicking the left mouse button can create the new image 126 by releasing the left mouse button.

In the example of FIG. 1E, the new image 126 may be a copy of the same image data linked to the selected icon 110. In some implementations, as shown in FIG. 1D, the new image 126 has a different size and/or dimensions than the conceptual icon 110B. The new image 126 may be larger than the conceptual icon 110B. In some implementations, the new image 126 automatically fills most of the display area of the destination application 108 as it is created. In some implementations, the new image 126 can have a size that is proportional to the size of the conceptual icon 110B as shown in fig. 1D.

In some implementations, where the conceptual icon 110B as shown in FIG. 1D is emphasized, whether by highlighting the conceptual icon 110B or by darkening the surrounding display area of the destination application 108, the emphasis ends when the conceptual icon 110B is converted to the new image 126. In some implementations, other icons are already present within the destination application 108, and creation of the new image 126 can result in reorganizing and/or resizing the existing icons. Such reorganization and/or resizing is discussed in more detail below in connection with FIG. 2. In some implementations, the number of other icons already present within the destination application 108 can determine the size and/or dimensions of the new image 126.

In some implementations, the location of the user's handle on the conceptual icon 110B can help determine what icon layout of the conceptual icon 110B is displayed in the destination application 108. This handle position may also help determine what the new image 126 will be. The handle position on the conceptual icon 110B is determined by the location of the user's touch input 122. In some embodiments, the handle position is where the user places the cursor over a selectable icon (e.g., icon 110) and clicks a button. This feature is employed, in particular, when the destination application 108 can identify multiple types of data that are linked to the selected icon 110. For example, if the selected icon 110 represents an album and the destination application 108 is a library application capable of organizing music data (such as songs) and image data (e.g., pictures), it must be determined whether the conceptual icon 110B has an image layout (as shown in FIG. 1D) or a music layout (as will be discussed in more detail below in connection with FIG. 2). In addition, it must be determined whether the new image 126 will link to the music data (in which case the new item 126 would no longer be an image) or to the image data to which the selected icon 110 links. The handle position on the conceptual icon 110B may assist in making these determinations.

The conceptual icon 110A may be divided into different portions (such as a left portion and a right portion) or into multiple quadrants. In some implementations, these portions are not created until the conceptual icon 110A crosses into the boundary of the destination application 108 and becomes the conceptual icon 110B. In some implementations, the number of data types linked to the selected icon 110 that the destination application 108 can identify determines the number of portions into which the conceptual icon 110B is divided. For example, if the destination application 108 is capable of identifying two different data types linked to the selected icon 110, such as if the destination application 108 is a library application, the conceptual icon 110B is divided into two portions, e.g., a left portion and a right portion. Thus, if the user makes the touch input 122 on the right side of the selected icon 110, this may cause the conceptual icon 110B to change from the album layout as shown in FIG. 1C to the image layout as shown in FIG. 1D once the conceptual icon 110B is brought within the boundaries of the destination application 108. With the handle position on the right side of the conceptual icon 110B, when the user lifts the finger 124 off the touch screen display 104, a new image 126 is created that links to the same image data to which the selected icon 110 is linked.

In continuation of the above example, if the user instead makes a touch input 122 on the left side of the selected icon 110, this may cause the conceptual icon 110B to change from the album layout as shown in fig. 1C to the music layout discussed below in connection with fig. 2 once the icon is brought into the boundary of the destination application 108. With the handle position on the left side of the conceptual icon 110B, when the user lifts the finger 124 off the touch screen display 104, one or more new images are created that link to the same music data to which the selected icon 110 is linked. In this example, because the selected icon 110 represents an album and may contain music data for more than one song, once a user with a handle position on the left side of the conceptual icon 110B lifts a finger 124 off the touch screen display 104, a plurality of new icons may be created in the destination application 108, each icon representing a single song from the album represented by the selected icon 110.
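The handle-position logic above can be sketched as a mapping from the touch location to one of several equal-width strips of the icon, one strip per data type the destination application recognizes. This is a minimal illustration of the two-type library example; the function name, the equal-strip division, and the layout table are hypothetical assumptions:

```python
def portion_for_handle(handle_x, icon_x, icon_width, num_recognized_types):
    """Divide the icon into equal vertical strips, one per data type the
    destination application recognizes, and return the index of the strip
    under the handle (the point where the selection input began)."""
    if num_recognized_types < 1:
        raise ValueError("destination recognizes no linked data types")
    fraction = (handle_x - icon_x) / icon_width  # 0.0 (left) .. 1.0 (right)
    index = int(fraction * num_recognized_types)
    return min(max(index, 0), num_recognized_types - 1)

# Example: a library application recognizing two data types splits the
# icon into a left (music) portion and a right (image) portion.
LAYOUTS = ["music", "image"]
```

With this sketch, a touch input on the right half of the icon selects the image layout and a touch input on the left half selects the music layout, as in the example above.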

Fig. 2A-2F illustrate example interfaces for graphical icon manipulation. Briefly, and as described in more detail below, a user selects an icon 210 on a GUI of a mobile device 202 having a touch screen display 204. The icon 210 links to one or more types of data. The user selects an icon 210 located within the source application 206 and creates a temporary icon that links to the selected icon 210 and has an appearance that matches the selected icon 210. If the user drags this temporary icon into the destination application 208, the temporary icon changes appearance based on the type of data that is linked to the selected icon 210 and identified by the destination application 208. The user drags the temporary icon into the icon area of the destination application 208 and the temporary icon may change appearance a second time to depict data that is linked to the selected icon 210 that is of the data type identified by the destination application 208. The user places the temporary icon into the destination application 208 and may create one or more new icons in the destination application 208 that link to copies of some or all of the data to which the selected icon 210 links.

FIG. 2A illustrates an example mobile device 202 having a touch screen display 204 similar to that described above with respect to FIG. 1A. The mobile device 202 runs two or more applications while displaying at least two of the applications, namely a source application 206 and a destination application 208. Although this example is shown on the touch screen display 204 of the mobile device 202, other examples may be shown on any computing device (with or without a touch screen display) configured to display two or more applications.

In the example of fig. 2A, the source application 206 is a music library application that contains a plurality of selectable icons 210, 212, 214, and 216. This source application 206 may be similar to the source application 106 as shown in FIG. 1A. Specifically, the icons 210, 212, 214, and 216 represent music albums and are linked to various types of data including, for example, image data and music data. In some implementations, the image data is a representation of the cover art of the respective album, and the music data is one or more songs of the respective album.

Similar to the icon 110 discussed above with reference to FIG. 1A, the icon 210 represents an album, and thus has an album layout that includes an artwork area 218 and a title area 220. The artwork area 218 may be similar to the artwork area 118 as shown in FIG. 1A. The title area 220 may be similar to the title area 120 shown in FIG. 1A.

As shown in FIG. 2A, icons 210, 212, 214, and 216 are visually represented on the touch screen display 204 within the display area of the source application 206. The icons 210, 212, 214, and 216 may be similar to the icons 110, 112, 114, and 116 discussed above and shown in FIG. 1A.

In the example of fig. 2A, the destination application 208 is a playlist application that contains icons 228, 230, 232, and 234 arranged vertically in a list. Each of the icons 228, 230, 232, and 234 represents a song and is linked to music data of the corresponding song. In addition, icons 228, 230, 232, and 234 have a song layout with a text area (e.g., text area 236) that displays the song name and the number corresponding to the order of the icons in the list of icons of destination application 208.

FIG. 2B shows a user of the mobile device 202 using a finger 224 to provide a touch input 222 on the touch screen display 204. The touch input 222 is used to select an icon 210 in the source application 206. In some implementations, the touch input 222 is made by the user touching the touch screen display 204 with a stylus rather than a finger 224. Alternatively, the user may select an icon, such as icon 210, without touch input 222 in a manner similar to that discussed above with reference to fig. 1B.

Selection of the icon 210 by the user's touch input 222 creates a conceptual icon 210A linked to the icon 210. As shown in FIG. 2B, the conceptual icon 210A has an album layout with an artwork area 218A, which displays the same image data displayed in the artwork area 218 of the selected icon 210, and a title area 220A, which may be left blank or may display the same data displayed in the title area 220 of the selected icon 210. The conceptual icon 210A may have an appearance that matches the appearance of the selected icon 210, as shown in FIG. 2A.

FIG. 2C shows that the user has started a drag motion on the conceptual icon 210A. The user may drag the conceptual icon 210A towards the destination application 208. This dragging motion is performed with the user's finger 224 in a manner similar to that discussed above with reference to FIG. 1C. In some embodiments, this dragging motion is performed by the user using a stylus rather than their finger 224. Alternatively, such a drag motion may be performed without touch input 222 in a manner similar to that discussed above with reference to FIG. 1C.

FIG. 2D shows that the user has continued to drag the conceptual icon 210B into the destination application 208. As the conceptual icon 210B crosses into the boundary of the destination application 208, the appearance of the conceptual icon 210B changes. The conceptual icon 210B crosses into the boundary of the destination application 208 when a majority of the conceptual icon 210B crosses into the boundary of the destination application 208, or when the handle position of the conceptual icon 210B (e.g., the position on the icon 210 where the user initiated the touch input 222) crosses into the boundary of the destination application 208. The handle position is discussed in more detail above with reference to FIGS. 1B-1E. In some implementations, the conceptual icon 210B does not change layout until the icon is completely brought into the boundary of the destination application 208.
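The two boundary-crossing conditions described above (a majority of the icon's area inside the destination, or the handle position inside the destination) can be sketched as a simple geometric test. The rectangle representation and the 0.5 area threshold are illustrative assumptions:

```python
def crossed_into(icon, handle, dest):
    """icon and dest are (x, y, width, height) rectangles; handle is an
    (x, y) point. The icon is treated as having crossed into the
    destination when a majority of its area overlaps the destination,
    or when the handle point lies inside the destination."""
    ix, iy, iw, ih = icon
    dx, dy, dw, dh = dest
    # Overlap of the two rectangles along each axis (zero if disjoint).
    ox = max(0, min(ix + iw, dx + dw) - max(ix, dx))
    oy = max(0, min(iy + ih, dy + dh) - max(iy, dy))
    majority = ox * oy > 0.5 * iw * ih
    hx, hy = handle
    handle_inside = dx <= hx <= dx + dw and dy <= hy <= dy + dh
    return majority or handle_inside
```

An implementation that waits until the icon is completely inside the destination, as in the last variation above, would instead require the overlap to equal the full icon area.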

In the example of FIG. 2D, the change in appearance of the conceptual icon 210B when crossing the boundary into the destination application 208 reflects the type of data linked to the selected icon 210 that the destination application 208 is able to recognize. In particular, because the destination application 208 is a playlist application, it can recognize that the selected icon 210 is linked to music data. Because the playlist application can recognize that the selected icon 210 is linked to music data, the conceptual icon 210B changes from the album layout as shown in FIG. 2C to a music layout that displays a visual representation of a list of songs. The song list may be bulleted (as shown in FIG. 2D) or may be numbered. In some implementations, the conceptual icon 210B displays a generic music layout with default imagery that does not reflect any actual data to which the selected icon 210 is linked. This is in contrast to the conceptual icon 110B shown in FIG. 1D, which displays the actual image data to which the selected icon 110 is linked. Alternatively, the conceptual icon 210B may depict the actual music data that is linked to the selected icon 210 and, when the conceptual icon 210B is brought into the boundary of the destination application 208, display some or all of the song titles of the songs of the corresponding album represented by the selected icon 210.

When the conceptual icon 210B crosses into the boundary of the destination application 208, it changes layout from the album layout depicting the appearance of the selected icon 210 to the music layout depicting a visual representation of the list of songs. This change in layout corresponds to both the artwork area 218A and the title area 220A being replaced by a single song list area 238. In the example of FIG. 2D, the change in the appearance of the conceptual icon 210B may result in a change in the size or dimensions of the conceptual icon 210B. Thus, the song list area 238 may be larger or smaller in size than the combination of the artwork area 218A and the title area 220A as shown in FIG. 2C. In some implementations, the song list area 238 may be equal in size to the combination of the artwork area 218A and the title area 220A as shown in FIG. 2C. In some implementations, where the conceptual icon 210B contains actual music data that is linked to the selected icon 210, the size and/or dimensions of the conceptual icon 210B may change when dragged substantially into the boundary of the destination application 208 based on the number of songs in the corresponding album represented by the selected icon 210.

Fig. 2E shows that the user has continued the dragging motion of the conceptual icon 210C within the destination application 208. In some implementations, the conceptual icon 210C may change appearance a second time when the conceptual icon 210C is brought near icons (e.g., icons 228, 230, 232, and 234 as shown in fig. 2D) within the destination application 208. While the conceptual icon 210C maintains the music layout, its size and scale may change, and the conceptual icon 210C now depicts the music data linked to the selected icon 210 by displaying some or all of the song titles of the songs associated with the corresponding album represented by the selected icon 210. In some implementations, the conceptual icon 210C has been widened to the width of the display area of the destination application 208. In some implementations, a second change in the conceptual icon (e.g., as shown by conceptual icon 210C) may not occur where the conceptual icon 210B contains the actual music data that is linked to the selected icon 210.

The size and/or dimensions of the conceptual icon 210C may depend on the number of songs associated with the respective album represented by the selected icon 210. For example, the conceptual icon 210C may be larger where there are more songs associated with the respective album, or the conceptual icon 210C may be smaller where there are fewer songs associated with the respective album. In some implementations, the conceptual icon 210C may display the maximum number of song titles of the songs of the corresponding album. For example, the conceptual icon 210C may be limited to displaying only the names of three songs, although the selected icon 210 represents an album with more than three songs. When the conceptual icon 210C does not display all of the song titles of the songs of the corresponding album, the conceptual icon 210C may also include a symbol, such as a vertical ellipsis, to signal that the conceptual icon 210C represents more songs than are being displayed.
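The capped song-title display with an ellipsis marker can be sketched as follows. The three-title cap and the vertical-ellipsis symbol come from the example above, while the function name is hypothetical:

```python
def song_list_preview(titles, max_titles=3, more_symbol="\u22ee"):
    """Return the lines shown in the conceptual icon's music layout:
    at most max_titles song titles, followed by a vertical-ellipsis
    marker when the album has more songs than are displayed."""
    lines = titles[:max_titles]
    if len(titles) > max_titles:
        lines = lines + [more_symbol]
    return lines
```

An album with four songs would therefore show three titles plus the ellipsis, while an album with two songs would show both titles and no marker.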

When the conceptual icon 210C is brought into proximity of icons within the destination application 208 (e.g., icons 228, 230, 232, and 234 shown in FIG. 2D), it is placed in alignment with those icons. This may be contrasted with previous representations of the conceptual icon, such as conceptual icons 210A and 210B shown in FIGS. 2B-2C, which hover over the icons. When the conceptual icon 210C is placed in alignment with the icons within the destination application 208, it may force one or more of the icons to change position. In the example of FIG. 2E, icon 234 may be moved out of the display area of the destination application 208 and icon 232 may be repositioned near the bottom of the display area of the destination application 208. Further, as the conceptual icon 210C is brought into proximity of the icons within the destination application 208, the upper and lower boundaries of the conceptual icon 210C may change from solid to dashed to indicate that the conceptual icon 210C has been placed in alignment with the icons within the destination application 208. In some implementations, instead of the icon 234 being pushed out of the display area of the destination application 208 as shown in FIG. 2E, the icons 228, 230, 232, and 234 can be reduced in size so that they all remain visible within the display area of the destination application 208, even when the conceptual icon 210C is placed in alignment with them.

FIG. 2F shows that the user has performed a drop operation by ending the drag motion of conceptual object 210C within the boundaries of destination application 208, and has lifted finger 224 from touch screen display 204. When the user lifts their finger 224 off the touch screen display 204, the conceptual icon 210C is converted into three new icons, namely an icon 240 representing "song X", an icon 242 representing "song Y", and an icon representing "song Z" that is not currently shown in the display area of the destination application 208. Each of these three icons represents one of the songs associated with the respective album associated with the selected icon 210. In the example of FIG. 2F, the album represented by the selected icon 210 contains three songs, with the new icons 240, 242 and the icon representing "song Z" each representing one of the three songs. The new icons 240, 242 and the icon representing "song Z" each have a song layout with a text area (e.g., text area 236) displaying the name of the corresponding song and a number corresponding to the icon's order within the list of icons of the destination application 208. In some implementations, the new icons 240, 242 and the icon representing "song Z" can be created by the user lifting a stylus from the touch screen display 204 instead of lifting their finger 224. Alternatively, the creation of the new icons 240, 242 and the icon representing "song Z" may be carried out without the use of a touch screen display in a manner similar to that discussed above with reference to FIG. 1E.

The new icons 240, 242 and the icon representing "song Z" may be placed in the list of icons within the destination application 208 in an order corresponding to the aligned position of the conceptual icon 210C as shown in FIG. 2E. In the example of FIG. 2F, the user performs the drop operation on conceptual object 210C as shown in FIG. 2E while conceptual object 210C is between icon 230 and icon 232. Thus, when the user performs the drop operation, the new icons 240, 242 and the icon representing "song Z" are placed in the list of icons of the destination application 208 below the icon 230 and above the icon 232. In some implementations, any icons within the destination application 208 that were squeezed out of the display area of the destination application 208 by the conceptual icon 210C as shown in FIG. 2E may not continue to be displayed when the new icons 240, 242 and the icon representing "song Z" are created. In addition, in the example of FIG. 2F, not all new icons may be displayed. For example, in some implementations, the destination application 208 may have a maximum number of icons, such as four icons, that it can display in its display area. In this example, because the icon representing "song Z" would be the fifth icon, it is moved out of the display area of the destination application 208 along with icons 232 and 234. In some implementations, if the aligned placement of the conceptual icon 210C and/or the creation of the new icons 240, 242 and the icon representing "song Z" results in any icons within the destination application 208 being moved out of the display area of the destination application 208, a symbolic icon 246, such as a vertical ellipsis, may be displayed to communicate that some icons have been omitted from the display. In some embodiments, the symbolic icon 246 may be selectable. Where the symbolic icon 246 is selectable, clicking on it causes the previously undisplayed icons within the destination application 208 to be displayed within the display area of the destination application 208.
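The aligned placement and overflow behavior described above can be sketched as a list insertion with a visibility cap. The four-icon maximum mirrors the example, and the function name is illustrative:

```python
def drop_album(playlist, album_songs, slot, max_visible=4):
    """Insert the album's songs into the playlist at the aligned slot,
    then report which icons remain visible and whether an overflow
    indicator (e.g., a vertical ellipsis) is needed."""
    updated = playlist[:slot] + album_songs + playlist[slot:]
    visible = updated[:max_visible]
    overflow = len(updated) > max_visible
    return visible, overflow
```

Dropping a three-song album between the second and third icons of a four-icon playlist reproduces the FIG. 2F example: the first two existing icons and the first two new songs remain visible, while the third song and the last two existing icons move out of the display area.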

In the example of FIG. 2F, the new icons 240, 242 and the icon representing "song Z" each represent a song and link to a copy of the portion of the music data linked to the selected icon 210. In some implementations, the new icons 240, 242 and the icon representing "song Z" collectively have a different size and/or different dimensions than the conceptual icon 210C shown in FIG. 2E. In some implementations, the new icons 240, 242 and the icon representing "song Z" may have an overall size and dimensions similar to the conceptual icon 210C shown in FIG. 2E. Whether the new icons 240, 242 and the icon representing "song Z" are collectively larger or smaller than the conceptual icon 210C as shown in FIG. 2E may depend on the number of icons already present in the destination application 208. For example, if no icons exist in the destination application 208 prior to creating the new icons 240, 242 and the icon representing "song Z", the new icons may have a larger overall size than the conceptual icon 210C shown in FIG. 2E in order to attempt to fill the empty space of the destination application 208. However, if one or more icons are already present in the destination application 208, such as icons 228, 230, 232, and 234 shown in FIG. 2A, before the new icons 240, 242 and the icon representing "song Z" are created, the new icons may have a smaller overall size than the conceptual icon 210C shown in FIG. 2E in order to attempt to display all of the icons within the destination application 208.

In some implementations, the conceptual icons 210A, 210B, or 210C can be highlighted when each icon is being created, when each icon is being dragged in the source application 206, when each icon is being substantially dragged into the destination application 208, and/or when each icon is being placed in alignment with an icon in the destination application 208. The highlighting of the conceptual icon 210A, 210B, or 210C may be done in a manner similar to the manner of the conceptual icons 110A and 110B discussed above with reference to FIGS. 1B-1D.

FIG. 3 shows a swim lane diagram 300 of graphical icon manipulation. In general, swim lane diagram 300 illustrates an example process by which a source application and a destination application communicate with an operating system of a mobile device or other computing device. This process includes a user selecting a graphical item, dragging a representation of the graphical item within a source application, dragging the representation of the graphical item into a destination application, and dropping the representation of the graphical item into the destination application. This process will be explained with frequent reference to FIGS. 1A-1E and FIGS. 2A-2F.

In the example of FIG. 3, the process shown in swim lane diagram 300 can be implemented using a computer-readable storage medium having program code for use by or in connection with a mobile device, a computing device, or any instruction execution system. Additionally, this process may be implemented in the example screen shots of fig. 1A-1E and 2A-2F. For example, as depicted in FIG. 3, a "source application" may refer to source application 106 as shown in FIG. 1A or source application 206 as shown in FIG. 2A; "destination application" may refer to destination application 108 as shown in FIG. 1A, or destination application 208 as shown in FIG. 2A; "operating system" may refer to an operating system of a mobile device, such as mobile device 102 shown in FIG. 1A, or mobile device 202 shown in FIG. 2A; "graphical item" may refer to icon 110 as shown in FIG. 1A, or icon 210 as shown in FIG. 2A; the "representation of a graphical item" may refer to the conceptual icons 110A and 110B as shown in fig. 1A-1D, or the conceptual icons 210A, 210B, or 210C as shown in fig. 2A-2E.

The process illustrated by swim lane diagram 300 may begin at stage 302 where an operating system of a mobile device (e.g., mobile device 102 as shown in FIG. 1A) receives a selection input by a user at stage 302. The user may make the selection input in a number of ways, such as by making a touch input with a finger (e.g., finger 124 shown in FIG. 1B) or with a stylus, such as touch input 122 shown in FIG. 1B. The user may also make selection inputs without touch inputs, as discussed in more detail above with reference to FIG. 1B. The selection input is made on a graphical item within the source application (e.g., icon 110 shown in FIG. 1A). The operating system receives a selection input and indicates to the source application that the user has performed a gesture on the source application at a particular location and requests a link to the selected graphical item (304).

In response to the request (304) by the operating system for a link to the graphical item, the source application sends (306) to the operating system a link to the graphical item on which the user performed the selection input. In addition to sending the link to the graphical item to the operating system, the source application also sends the operating system instructions for rendering a representation of the graphical item (308). The operating system receives the link to the graphical item and the instructions for rendering its representation, and renders the representation of the graphical item according to the instructions from the source application (310). In some implementations, the operating system can also render emphasis on the representation of the graphical item. The emphasis of the representation of the graphical item (e.g., the conceptual icon 110A shown in FIG. 1C) is discussed in more detail above with reference to FIGS. 1B-1D.
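Stages 302-310 can be sketched as the following message flow. The classes, method names, and returned dictionaries are illustrative assumptions about one possible structure, not a real operating system API:

```python
class SourceApp:
    """Hypothetical source application, e.g., a music library."""

    def link_for(self, location):
        # Stage 306: return a link to the graphical item at the
        # selection location, including its linked data types.
        return {"item": "icon110", "data": ["image", "music"]}

    def render_instructions(self, link):
        # Stage 308: instructions for rendering the item's representation.
        return {"layout": "album", "emphasis": "highlight"}


class OperatingSystem:
    """Hypothetical operating system mediating the drag operation."""

    def __init__(self, source):
        self.source = source

    def on_selection_input(self, location):
        # Stage 302: receive the selection input, then stage 304/306:
        # request and receive a link to the selected graphical item.
        link = self.source.link_for(location)
        # Stage 308: receive rendering instructions from the source.
        instructions = self.source.render_instructions(link)
        # Stage 310: render the representation per those instructions.
        return {"link": link, "rendered_with": instructions}
```

In this sketch, a selection input at a display location yields both the link to the graphical item and the representation rendered from the source application's instructions.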

In some implementations, the source application can perform rendering of the representation of the graphical item. In some implementations, the operating system, upon receiving a link to a graphical item, can send the link to a lookup system to find an application to assist in rendering a representation of the graphical item. In such a scenario, the lookup system may select the source application as the application for assisting in rendering the representation of the graphical item. In some implementations, where the operating system locally identifies how to render the representation of the graphical item, the operating system can process the rendering of the representation of the graphical item without sending a link to the graphical item to the lookup system and/or using instructions for rendering the representation of the graphical item sent by the source application to the operating system.

Through stages 302-310 of the process shown in swim lane diagram 300, the GUI of the mobile device may undergo some changes. Examples of these potential variations are provided by fig. 1A-1C and 2A-2C, where upon selection of a graphical item (e.g., icon 110 as shown in fig. 1A), a representation of the graphical item (e.g., conceptual icon 110A as shown in fig. 1B) is created, which can be manipulated by a user within and outside the boundaries of a source application (e.g., source application 106 as shown in fig. 1A).

When the user drags the representation of the graphical item within, or substantially within, the boundaries of the destination application (e.g., destination application 108 as shown in FIG. 1A), the operating system sends the link to the graphical item to the destination application. Dragging a representation of a graphical item (e.g., conceptual icon 210B) substantially into the boundary of a destination application (e.g., destination application 208) is discussed in more detail above with reference to FIG. 2C. Upon receiving the link to the graphical item from the operating system (312), the destination application determines how to render a new representation of the graphical item (314). The destination application makes this determination by identifying the type of data it recognizes that is linked to the graphical item. Based on this identification, the destination application sends an instruction to the operating system to render the new representation of the graphical item (316). Upon receiving the instruction to render the new representation of the graphical item, the operating system renders the new representation of the graphical item (e.g., conceptual icon 110B shown in FIG. 1D) according to the instructions from the destination application (318). In some implementations, the operating system can also provide an animation between the rendering of the representation of the graphical item and the rendering of the new representation of the graphical item. In some implementations, the operating system can also continue to render emphasis of the representation of the graphical item (e.g., the conceptual icon 110B as shown in FIG. 1D), as discussed in more detail above with reference to FIGS. 1B-1D.

In some implementations, the operating system provides a different link to the graphical item to the destination application in stage 312 than the link to the graphical item that the operating system received in stage 306. The operating system may provide a link to the graphical item (which is designed to protect the underlying data) while also providing the destination application with sufficient information to determine how to render the graphical item. An operating system that limits exposure of underlying data may protect files from unnecessary exposure to additional applications and protect the privacy of a user by limiting applications that have access to the underlying data.
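The privacy-preserving link described above can be sketched as an opaque token that the destination application cannot dereference itself: the destination sees only the token and the linked data types, while only the operating system can resolve the token back to the underlying data. This is a hypothetical illustration, not an actual operating system mechanism:

```python
import uuid


class LinkBroker:
    """Hypothetical broker: the OS hands the destination an opaque token
    plus just enough metadata to choose a layout, never the data itself."""

    def __init__(self):
        self._items = {}  # token -> underlying item data (OS-private)

    def share(self, item_data, data_types):
        token = str(uuid.uuid4())
        self._items[token] = item_data
        # The destination application receives only this dictionary.
        return {"token": token, "data_types": data_types}

    def resolve(self, token):
        """Called only by the OS, e.g., at drop time, to copy the data."""
        return self._items[token]
```

Limiting the destination to the token and the data types mirrors the stated goal: the destination can decide how to render the new representation without ever gaining access to the underlying file.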

In some implementations, the destination application can perform rendering of the representation of the graphical item. In some implementations, where the destination application does not recognize the type of data linked to the graphical item, the destination application may indicate to the operating system that it does not know how to render the new representation of the graphical item. The destination application may make this indication by not providing any instructions to the operating system for rendering the graphical item, by requesting the operating system to perform the lookup, and/or providing instructions to the operating system to perform the lookup. During the lookup, the operating system may send a link to the graphical item to the lookup system to find an application to assist in rendering the new representation of the graphical item. In such a scenario, the lookup system may continue to use the source application as an application for assisting in rendering the representation of the graphical item. In some implementations, where the operating system locally identifies how to render the representation of the graphical item, it can handle the rendering of the new representation of the graphical item itself without sending a link to the graphical item to the lookup system. In some implementations, where the destination application does not identify the type of data linked to the graphical item, the operating system can render a default graphical image for the new representation of the graphical item to indicate that the destination application does not identify how to render the representation of the graphical item and that the representation of the graphical item cannot be placed within the boundaries of the destination application.

Through stages 312 to 318 of the process shown in swim lane diagram 300, the GUI of the mobile device may undergo some changes. Examples of these potential variations are provided by fig. 1C-1D and 2C-2E, where as a representation of a graphical item (e.g., conceptual icon 110A as shown in fig. 1C) is dragged into the boundary of the destination application, the representation of the graphical item changes, e.g., conceptual icon 110B as shown in fig. 1D.

At stage 320, when the user places the new representation of the graphical item within the boundaries of the destination application, the destination application renders a new copy of all or a portion of the graphical item. As discussed above with reference to fig. 1E and 2F, the user performs the drop of the representation of the graphical item (e.g., the conceptual icon 110B shown in fig. 1D) by ending their selection input (e.g., the touch input 122 shown in fig. 1B), for example, by lifting their finger or stylus from the touch screen display of the mobile device or by releasing the button they clicked to select the graphical item. As shown in the examples of fig. 1E and 2F above, the new copy of the graphical item may appear similar to the new representation of the graphical item. Creation of a new copy of a graphical item may also create a copy of all or a portion of the data to which the graphical item is linked. In some implementations, only data belonging to the types of data identified by the destination application from the graphical item is copied. The new copy of the graphical item is linked to the new copy of the data.
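The selective copying at drop time can be sketched as follows. The data layout (a dictionary of data types to values) and the function name are illustrative assumptions, not part of the described system.

```python
# Hypothetical sketch: on drop, copy only the data types the destination
# application identifies, and link the new icon to that new copy.
def perform_drop(item, destination_types):
    """Create a new copy of the item containing only recognized data types."""
    copied_data = {t: v for t, v in item["data"].items() if t in destination_types}
    if not copied_data:
        return None  # destination recognizes nothing: no copy is created
    return {"icon": item["icon"], "data": copied_data}

album = {"icon": "music-note",
         "data": {"album-art": "art.png", "tracks": ["t1.mp3", "t2.mp3"]}}

# A photo editor only identifies image data, so only the album art is copied.
copy = perform_drop(album, destination_types={"album-art"})
print(copy)
```

The `None` return models the case, discussed below, where the drop produces no new copy because the destination identifies none of the linked data.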

In some implementations, a user may be prevented from placing a new representation of a graphical item within the boundaries of a destination application without the destination application identifying the type of data linked to the graphical item. A user attempting to perform a drop operation on a new representation of a graphical item within the boundaries of the destination application by ending their selection input may result in a new copy of the graphical item not being created. In some implementations, in the event that the user attempts to perform such a drop operation, the representation of the graphical item may disappear because the operating system will stop rendering it. In this scenario, the user may perform a new selection input on the graphical item to recreate the representation of the graphical item.

By stage 320 of the process shown in swim lane diagram 300, the GUI of the mobile device may undergo some changes. Examples of these potential changes are provided by fig. 1D-1E and 2E-2F, where a new icon (e.g., image 126 as shown in fig. 1E) is created when a representation of a graphical item (e.g., conceptual icon 110B as shown in fig. 1D) is placed within the boundaries of the destination application.

FIG. 4 illustrates an example process 400 for graphical icon manipulation. Generally, the process 400 receives input from a user dragging a graphical item displayed on a graphical interface. The user drags the graphical item to a different portion of the graphical interface, and the process 400 updates the graphical item to show how the graphical item would change if the user dropped it in that different portion. Process 400 will be described as being performed by a computer system comprising one or more computers, such as computing device 100 shown in FIG. 1 or computing device 200 shown in FIG. 2.

The system receives a selection input through the graphical interface that is interpreted as a selection of a graphical item located in the first portion of the graphical interface (410). In some implementations, the graphical item is an icon representing a file (such as a photo, song, album, playlist, text file, spreadsheet file, etc.). In some implementations, the selection input is a touch input on a touch screen display. In some implementations, the selection input is a mouse click. In some implementations, the selection input is a voice input. For example, a user may say, "select and drag to scroll albums". In some implementations, the selection input is a toggle-style input, such as in a virtual reality-based directional user interface. For example, a user may select multiple photos in a photo organization application. The group of photos can then be dragged collectively, with the selection input acting as a toggle rather than a continuous input. In some implementations, the first portion of the graphical interface is a window of an application, such as a music application. In some implementations, the first portion of the graphical interface is a file browser. In some implementations, the first portion of the graphical interface is a particular region of the application. For example, the first portion is a list of images in a photo editor application.
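The input modalities above can be mapped onto a single selection abstraction, as in this hypothetical sketch; the event shape and field names are assumptions for illustration.

```python
# Hypothetical normalization of heterogeneous selection inputs. Continuous
# selections (touch, mouse) end when contact is released; toggle-style
# selections (voice, VR switch) persist until explicitly toggled off.
def normalize_selection(event):
    kind = event["kind"]
    if kind in ("touch", "mouse"):
        return {"selected": True, "continuous": True}
    if kind in ("voice", "switch"):
        return {"selected": True, "continuous": False}
    return {"selected": False, "continuous": False}

print(normalize_selection({"kind": "touch"}))   # continuous selection
print(normalize_selection({"kind": "voice"}))   # toggle-style selection
```

Downstream stages of process 400 can then treat "the selection input has ceased" uniformly, regardless of modality.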

The system receives a movement input through the graphical interface while receiving the selection input, the movement input being interpreted as a dragging of the graphical item (420). For example, the user moves the user's finger across the display while maintaining contact with the display. As another example, a user moves a mouse cursor over the graphical interface while holding the mouse button down. These actions move the graphical item on the graphical interface.

The system determines that the location of the selection input is in the second portion of the graphical interface while receiving the selection input and the movement input (430). In some implementations, the second portion of the graphical interface is a second application that is different from the application in which the graphical item was originally located. In some implementations, the second portion corresponds to a different portion of the same application. For example, the application may be a photo editor with a list of images in the border and a main photo editing space in the center of the photo editor. The border may be the first portion, and the main photo editing space may be the second portion. In some implementations, the first and second portions may not be displayed at the same time; for example, the first and second portions may be displayed at the same coordinates of the display at different times.
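Determining which portion contains the selection input amounts to a hit test over the portions' bounds. The following is a minimal sketch under assumed rectangle bounds and a z-ordered portion list; none of the names come from the described system.

```python
# Hypothetical hit test: return the first portion (in z-order) whose
# rectangular bounds contain the selection input's location.
def portion_at(x, y, portions):
    for name, (left, top, right, bottom) in portions:
        if left <= x < right and top <= y < bottom:
            return name
    return None

# The photo editor example: a border strip (first portion) listed before
# the main editing space (second portion) so it takes hit-test priority.
layout = [
    ("image-list-border", (0, 0, 100, 600)),
    ("editing-space", (0, 0, 800, 600)),
]
print(portion_at(50, 300, layout))   # inside the border strip
print(portion_at(400, 300, layout))  # inside the main editing space
```

Step 430 corresponds to the hit-test result changing from the first portion to the second while the selection input is still active.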

In some implementations, the system receives an instruction to render the graphical item from an application corresponding to the first portion. The system may use the instructions to render the graphical item while the user drags the graphical item around the first portion of the graphical interface. For example, a user may select a playlist icon from a file browser window. While the user drags the playlist icon around the file browser, the system may render the playlist icon according to instructions from the file browser. In some implementations, the instructions are for rendering the graphical item as it appears in the first portion. In the playlist example, the system may render the playlist icon as it appeared in the first portion prior to the user selecting it. In this case, the user appears to be moving the playlist icon around the graphical interface while dragging it.

The system updates the graphical item, based on determining that the location of the selection input is in the second portion of the graphical interface and while receiving the selection input, by replacing the graphical item with a representation of the changes the graphical item would undergo if placed in the second portion (440). In some implementations, the system receives instructions from an application of the second portion to render the graphical item as it crosses into the second portion. The application may receive an object pointed to by the graphical item, and the application may provide instructions for rendering the object. Alternatively or additionally, the application may receive a link, reference, or pointer to the object. The rendering corresponds to how the object will appear in the second portion when the user stops providing the selection input. For example, a user may select an album from a file browser. The album may be represented by an icon of a music note. The user drags the album icon into the photo editor. The album icon crosses the border of the photo editor, and the photo editor provides instructions for rendering the album icon. The instructions correspond to what the photo editor would display if the user placed the album icon in the photo editor. In this case, the photo editor may display album art. Thus, the system changes the icon to album art based on instructions provided by the photo editor. This allows the user to preview how the album icon would appear if the user placed the album icon in the photo editor.
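The album example can be sketched as follows. `PhotoEditor`, `preview_for`, and the dictionary-based item are hypothetical names introduced for the illustration only.

```python
# Hypothetical sketch of step 440: once the drag enters the destination's
# boundary, show the destination's preview in place of the source icon.
class PhotoEditor:
    """A destination that, given a link to an album, would open the album
    art for editing, so its drag preview is the album art."""
    def preview_for(self, item):
        return item.get("album-art")

def icon_during_drag(item, in_destination, destination):
    if in_destination:
        preview = destination.preview_for(item)
        if preview is not None:
            return preview
    return item["icon"]  # outside the destination: keep the source rendering

album = {"icon": "music-note", "album-art": "art.png"}
print(icon_during_drag(album, False, PhotoEditor()))  # still the music note
print(icon_during_drag(album, True, PhotoEditor()))   # swapped to album art
```

The swap is purely a preview: the underlying album data is unchanged until the user ends the selection input.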

In some implementations, the receiving application of the second portion may be capable of displaying the graphical item in more than one way. For example, the album icon may link to an album having a track listing and album art. The user can drag the album icon to a presentation application. The user drags the album icon across the border of the presentation application. The presentation application analyzes the data to which the album icon is linked. The data includes the track listing, music files, and album art. The presentation application determines that, if the user were to place the album icon in it, it is capable of displaying either the track listing or the album art. In one example, the presentation application may inform the system that the presentation application can render the album icon in two different ways. The system may present the two options to the user, and the user may select one. If the user selects the track listing, the system changes the album icon to a representation of the track listing while the user hovers the album icon over the presentation application. If the user selects the album art, the system changes the album icon to a representation of the album art while the user hovers the album icon over the presentation application.

In another example, the presentation application may select how to render the album icon. The presentation application may select an option based on its capabilities and features. Presentation applications may be more commonly used to display images rather than text. In this case, the presentation application may provide instructions to the system to render the album icon as album art. In some implementations, the presentation application can offer the system the option of rendering the album icon as either album art or a track listing. The system may make the selection based on the characteristics of the presentation application. The system may also make the selection based on the load on the system. For example, the system may use fewer computing resources (e.g., memory or processing power or both) to render a representation of the track listing than of the album art. In this case, the system may select the track listing rendering option.
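The selection logic above can be sketched as a small policy. The cost table and the load threshold are made-up illustrative values; a real system would measure actual memory and processing use.

```python
# Hypothetical policy for choosing among the renderings a destination
# offers: honor the application's preference, but under heavy system
# load fall back to the cheapest offered rendering.
RENDER_COST = {"track-listing": 1, "album-art": 5}  # made-up relative costs

def choose_rendering(options, app_preference=None, system_load=0.0):
    if system_load > 0.8:  # assumed threshold for "heavy load"
        return min(options, key=lambda o: RENDER_COST.get(o, 0))
    if app_preference in options:
        return app_preference
    return options[0]

offered = ["album-art", "track-listing"]
print(choose_rendering(offered, "album-art", system_load=0.2))  # app's choice
print(choose_rendering(offered, "album-art", system_load=0.9))  # cheap fallback
```

The same policy could also consult user choice, as in the two-option prompt described in the preceding paragraph.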

The system determines that the selection input has ceased (450). In this case, the user has placed the graphical icon in the second portion. For example, the user may place an album icon in a presentation application or in a photo editor.

The system provides changes to the graphical item for output based on determining that the selection input has ceased (460). For example, a photo editor application may open the album art for editing. The album art may be a separate file from the file used to display the album art within the icon. As another example, the presentation application may display an editable copy of the list of tracks in the album. The presentation application may alternatively display an editable copy of the album art. Where the receiving application is configured to display different representations of the icon, the receiving application may display the representation corresponding to how the icon changed when it crossed the boundary of the receiving application. In some implementations, the receiving application can prompt the user on how the receiving application should display the underlying data.

In some implementations, when a graphical item is placed into a receiving application, the receiving application may not cause a visual change to the graphical item. In an example where the receiving application is a music player application, the music player application may play audio files that the user places into the music player. In this case, the user may hover an icon identifying the audio file over the music player application. The icon identifying the audio file may be converted to a music note icon to indicate to the user that the music player application is to play the audio file. The icon may be animated to indicate music playing. Alternatively or additionally, the system may output a portion of the audio file through the speaker while the user hovers the icon over the music playing application. Instead of a portion of an audio file, the output audio may be a generic audio file. By playing the generic audio file, the music player application may not access the underlying audio file while still providing the user with an indication that the music player application will play the audio file if the user drops the icon.
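The generic-clip preview above can be sketched briefly. The function and file names are hypothetical placeholders, not from any real media API.

```python
# Hypothetical sketch: while the icon hovers over the music player, play a
# generic clip so the player signals playback without touching the real file.
def hover_preview(item, expose_underlying=False):
    if expose_underlying:
        return item["audio-file"]   # play a portion of the real file
    return "generic-preview.wav"    # the underlying file stays private

song = {"icon": "file-icon", "audio-file": "song.mp3"}
print(hover_preview(song))         # the generic clip
print(hover_preview(song, True))   # only with explicit access to the file
```

This mirrors the link-based privacy model described earlier: the receiving application can demonstrate its behavior without ever being handed the underlying data.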

Fig. 5 illustrates an example of a computing device 500 and a mobile computing device 550 that may be used to implement the techniques described herein. Computing device 500 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The mobile computing device 550 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not limiting.

Computing device 500 includes a processor 502, memory 504, a storage device 506, a high-speed interface 508 connecting to memory 504 and a plurality of high-speed expansion ports 510, and a low-speed interface 512 connecting to low-speed expansion ports 514 and storage device 506. Each of the processor 502, the memory 504, the storage device 506, the high-speed interface 508, the high-speed expansion ports 510, and the low-speed interface 512, are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 502 can process instructions for execution within the computing device 500, including instructions stored in the memory 504 or on the storage device 506, to display graphical information for a GUI on an external input/output device, such as display 516 coupled to high speed interface 508. In other embodiments, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).

The memory 504 stores information within the computing device 500. In some implementations, the memory 504 is a volatile memory unit or units. In some implementations, the memory 504 is a non-volatile memory unit or units. Memory 504 may also be another form of computer-readable medium, such as a magnetic or optical disk.

The storage device 506 is capable of providing mass storage for the computing device 500. In some implementations, the storage device 506 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices including devices in a storage area network or other configurations. The instructions may be stored in an information carrier. When executed by one or more processing devices (e.g., processor 502), the instructions perform one or more methods, such as those described above. The instructions may also be stored by one or more storage devices, such as a computer or machine readable medium (e.g., memory 504, storage 506, or memory on processor 502).

The high-speed interface 508 manages bandwidth-intensive operations for the computing device 500, while the low-speed interface 512 manages lower bandwidth-intensive operations. Such allocation of functions is merely an example. In some implementations, the high-speed interface 508 is coupled to memory 504, display 516 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 510, which may accept various expansion cards. In an embodiment, low-speed interface 512 is coupled to storage device 506 and low-speed expansion port 514. The low-speed expansion port 514, which may include various communication ports (e.g., USB, bluetooth, ethernet, wireless ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, for example, through a network adapter.

As shown, computing device 500 may be implemented in a number of different forms. For example, it may be implemented as a standard server 520, or multiple times in a group of such servers. Further, it may be implemented in a personal computer such as laptop computer 522. It may also be implemented as part of a rack server system 524. Alternatively, components from computing device 500 may be combined with other components in a mobile device, such as mobile computing device 550. Each of such devices may contain one or more of computing device 500 and mobile computing device 550, and an entire system may be made up of multiple computing devices in communication with each other.

Mobile computing device 550 includes a processor 552, memory 564, an input/output device such as a display 554, a communication interface 566, and a transceiver 568, among other components. The mobile computing device 550 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. Each of the processor 552, memory 564, display 554, communication interface 566, and transceiver 568 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.

The processor 552 can execute instructions within the mobile computing device 550, including instructions stored in the memory 564. Processor 552 may be implemented as a chipset of chips that include separate pluralities of analog and digital processors. The processor 552 may provide, for example, for coordination of the other components of the mobile computing device 550, such as control of user interfaces, applications run by the mobile computing device 550, and wireless communication by the mobile computing device 550.

The processor 552 may communicate with a user through a control interface 558 and a display interface 556 coupled to a display 554. The display 554 may be, for example, a TFT (thin film transistor liquid crystal display) display or an OLED (Organic light emitting Diode) display, or other suitable display technology. The display interface 556 may comprise appropriate circuitry for driving the display 554 to present graphical and other information to a user. The control interface 558 may receive commands from a user and convert them for submission to the processor 552. In addition, an external interface 562 may provide communication with processor 552, so as to enable near area communication of the mobile computing device 550 with other devices. External interface 562 may provide, for example, for wired communication in some embodiments, or for wireless communication in other embodiments, and multiple interfaces may also be used.

The memory 564 stores information within the mobile computing device 550. The memory 564 can be implemented as one or more of one or more computer-readable media, one or more volatile memory units, or one or more non-volatile memory units. Expansion memory 574 may also be provided and connected to mobile computing device 550 through expansion interface 572, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Expansion memory 574 may provide additional storage space for mobile computing device 550, or may also store applications or other information for mobile computing device 550. Specifically, expansion memory 574 may include instructions to carry out or supplement the processes described above, and may also include secure information. Thus, for example, expansion memory 574 may be provided as a security module for mobile computing device 550, and may be programmed with instructions that permit secure use of mobile computing device 550. In addition, secure applications may be provided via the SIMM card, along with additional information, such as placing identification information on the SIMM card in a non-hackable manner.

As discussed below, the memory may include, for example, flash memory and/or NVRAM memory (non-volatile random access memory). In some embodiments, the instructions are stored in an information carrier. When executed by one or more processing devices (e.g., processor 552), the instructions perform one or more methods, such as those described above. The instructions may also be stored by one or more storage devices, such as one or more computer-or machine-readable media (e.g., memory 564, expansion memory 574, or memory on processor 552). In some implementations, the instructions can be received in a propagated signal, for example, over transceiver 568 or external interface 562.

Mobile computing device 550 may communicate wirelessly through communication interface 566, which communication interface 566 may include digital signal processing circuitry, if necessary. Communication interface 566 may provide for communication under various modes or protocols, such as GSM voice call (Global System for Mobile communication), SMS (Short Message Service), EMS (Enhanced Messaging Service) or MMS Messaging (multimedia Messaging Service), CDMA (Code Division Multiple Access), TDMA (time Division Multiple Access), PDC (Personal digital cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000 or GPRS (General Packet Radio Service), etc. Such communication may occur, for example, through transceiver 568 using radio frequencies. In addition, short-range communications may occur, such as using a bluetooth, WiFi, or other such transceiver. In addition, a GPS (Global Positioning System) receiver module 570 may provide additional navigation-and location-related wireless data to the mobile computing device 550, which may be used as appropriate by applications running on the mobile computing device 550.

The mobile computing device 550 may also communicate audibly using the audio codec 560, which audio codec 560 may receive verbal information from the user and convert it to usable digital information. The audio codec 560 likewise can generate audible sound for a user (such as through a speaker), such as in a handset of the mobile computing device 550. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on the mobile computing device 550.

As shown, the mobile computing device 550 may be implemented in many different forms. For example, it may be implemented as a cellular telephone 580. It may also be implemented as part of a smartphone 582, personal digital assistant, or other similar mobile device.

Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.

These computer programs (also known as programs, software applications or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms machine-readable medium and computer-readable medium refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a Programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.

To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display)) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.

The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a Local Area Network (LAN), a Wide Area Network (WAN), and the internet.

The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Although several embodiments have been described in detail above, other modifications are possible. For example, although the client application is described as accessing the agent(s), in other embodiments the agent(s) may be used by other applications implemented by one or more processors, such as applications executing on one or more servers. Moreover, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other acts may be provided, or eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.
