Drag and drop for touch screen devices

Document No.: 1590863 · Publication date: 2020-01-03

Abstract: This technology, "Drag and drop for touch screen devices" (用于触摸屏设备的拖放), was created by B·D·尼洛, D·拉哈德贾, M·T·图纳, K·A·雷维斯, C·K·托马斯, and S·R·布林 on 2018-04-18. The present disclosure relates to a device that implements drag and drop for a touch screen device. The device may include a processor configured to detect a drag gesture selecting an item in a first application. The processor may be further configured to detect a touch release for dropping the item into a second application at the end of the drag gesture and, in response to the detected touch release, send a message to the second application, the message including information about a plurality of representations of the item. The processor may be further configured to receive, from the second application, a request for a representation of the item among the plurality of representations. The processor may be further configured to send the request for the representation of the item to the first application. The processor may also be configured to initiate a data transfer of the representation of the item from the first application to the second application.
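The abstract's "plurality of representations" has a close public-API analogue in UIKit's NSItemProvider, sketched below in Swift. This is an illustration only, not the patent's internal implementation; the UTType.png/UTType.jpeg choices and the placeholder Data values are assumptions made for the example.

    import UIKit
    import UniformTypeIdentifiers

    // Illustrative sketch: modeling "multiple representations of an item"
    // with NSItemProvider, as a public-API analogue of the claimed flow.
    let pngData = Data()   // hypothetical full-fidelity bytes
    let jpegData = Data()  // hypothetical lower-fidelity bytes

    let provider = NSItemProvider()
    // Representations are registered highest fidelity first.
    provider.registerDataRepresentation(forTypeIdentifier: UTType.png.identifier,
                                        visibility: .all) { completion in
        completion(pngData, nil)
        return nil  // no Progress object needed for in-memory data
    }
    provider.registerDataRepresentation(forTypeIdentifier: UTType.jpeg.identifier,
                                        visibility: .all) { completion in
        completion(jpegData, nil)
        return nil
    }

    // A drop target inspects the advertised representations and requests one,
    // which triggers the deferred data transfer from the source.
    let preferred = provider.registeredTypeIdentifiers.first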

1. An apparatus, comprising:

a processor;

a memory device containing instructions that, when executed by the processor, cause the processor to provide:

a drag-and-drop manager configured to:

manage a drag session corresponding to a drag event, the drag event including an initial touch input selecting an item in a first application, a drag gesture moving the item, and a touch release for dropping the item into a second application at an end of the drag gesture, wherein the drag session is assigned a drag session identifier;

receive, from the second application, a request for information corresponding to the item associated with the drag event after the touch release, the request including information corresponding to the drag session identifier; and

send a message to the second application that includes the information corresponding to the item; and

a touch event manager configured to:

receive a request from the drag-and-drop manager and, in response, generate a copy of the drag event for provision to the drag-and-drop manager in the form of a dedicated drag event that co-exists with the drag event, wherein the drag-and-drop manager is further configured to manage the dedicated drag event and to receive a new drag touch; and

determine whether the second application will receive the item from the drag event using a new touch input during the drag session.

2. The apparatus of claim 1, wherein the message comprising the information corresponding to the item comprises a list of representations of the item, each of the representations being provided by the first application, and the list of representations is ordered based at least in part on a quality level associated with each of the representations.

3. The apparatus of claim 2, wherein the touch event manager is further configured to forward the new touch input to the second application, the new touch input comprising an indication of the drag session identifier.

4. The apparatus of claim 1, wherein the drag-and-drop manager is further configured to prevent the item from being dropped into the second application when the drag-and-drop manager determines that the second application is not authorized to receive the item.

5. The apparatus of claim 1, wherein determining whether the second application will receive the item from the drag event using the new touch input during the drag session further comprises determining whether a location of the new touch input intersects a user interface corresponding to the second application.

6. The apparatus of claim 5, wherein determining whether the location of the new touch input intersects the user interface corresponding to the second application further comprises:

determining a centroid of respective locations of the new touch input and the previous touch input associated with the drag session identifier to determine whether the second application corresponds to a target application for the drag event.

7. The apparatus of claim 1, wherein the drag-and-drop manager is further configured to forward touch input associated with the drag session identifier to the touch event manager via inter-process communication.

8. The apparatus of claim 1, wherein the drag-and-drop manager is further configured to receive a notification indicating an end of the drag gesture.

9. The apparatus of claim 8, wherein the drag-and-drop manager is further configured to notify the second application to indicate that the drag gesture has ended and that the item is to be dropped into the second application.

10. A method, comprising:

managing a drag session corresponding to a drag event including an initial touch input selecting an item in a first application, a drag gesture moving the item, and a touch release for dropping the item into a second application at an end of the drag gesture, wherein the drag session is assigned a drag session identifier;

receiving, from the second application, a request for information corresponding to the item associated with the drag event after the touch release, the request including information corresponding to the drag session identifier; and

sending a message to the second application including the information corresponding to the item, the information received from the first application.

11. The method of claim 10, further comprising:

receiving a request to generate a copy of the drag event in the form of a dedicated drag event that co-exists with the drag event; and

determining whether the second application will receive the item from the drag event using a new touch input during the drag session.

12. The method of claim 10, wherein the message comprising the information corresponding to the item comprises a list of representations of the item, each of the representations being provided by the first application.

13. The method of claim 12, wherein the list of representations is ordered based at least in part on a level of quality associated with each of the representations.

14. The method of claim 10, further comprising:

preventing the item from being dropped into the second application when the drag-and-drop manager determines that the second application is not authorized to receive the item.

15. The method of claim 11, wherein determining whether the second application will receive the item from the drag event using the new touch input during the drag session further comprises determining whether a location of the new touch input intersects a user interface corresponding to the second application.

16. The method of claim 15, wherein determining whether the location of the new touch input intersects the user interface corresponding to the second application further comprises:

determining a centroid of respective locations of the new touch input and the previous touch input associated with the drag session identifier to determine whether the second application corresponds to a target application for the drag event.

17. The method of claim 10, further comprising:

receiving a notification indicating that the drag gesture is complete.

18. The method of claim 17, further comprising:

notifying the second application to indicate that the drag gesture has ended and that the item is to be dropped into the second application.

19. A computer program product comprising code stored in a non-transitory computer-readable storage medium, the code comprising:

code to manage a drag session corresponding to a drag event, the drag event including an initial touch input to select an item in a first application, a drag gesture to move the item, and a touch release to drop the item into a second application at an end of the drag gesture, wherein the drag session is assigned a drag session identifier;

code to receive, from the second application, a request for information corresponding to the item associated with the drag event after the touch release, the request including information corresponding to the drag session identifier; and

code to send a message to the second application including the information corresponding to the item, the information received from the first application.

20. The computer program product of claim 19, wherein the code further comprises:

code to receive a request to generate a copy of the drag event in the form of a dedicated drag event that co-exists with the drag event; and

code to determine whether the second application will receive the item from the drag event using a new touch input during the drag session.

21. A method, comprising:

detecting a drag gesture selecting an item in a first application;

detecting a touch release for dropping the item into a second application at the end of the drag gesture;

sending a message to the second application in response to the detected touch release, the message including information about a plurality of representations of the item;

receiving a request from the second application for a representation of the item in the plurality of representations;

sending the request for the representation of the item to the first application; and

initiating a data transfer of the representation of the item from the first application to the second application.

22. The method of claim 21, wherein each representation of the plurality of representations is associated with a Uniform Type Identifier (UTI).

23. The method of claim 21, wherein the request for the representation of the item indicates a particular representation of the plurality of representations having a highest fidelity.

24. The method of claim 21, wherein initiating the data transfer of the representation of the item from the first application to the second application comprises:

providing, to the second application, a connection with the first application for performing the data transfer of the representation of the item.

25. The method of claim 21, wherein initiating the data transfer of the representation of the item from the first application to the second application further comprises:

determining that the second application is not currently executing;

starting the second application; and

providing, to the second application, a connection with the first application to complete the data transfer.

26. The method of claim 21, further comprising:

determining whether the second application has access to data corresponding to the dropped item using a data access policy, wherein the data access policy is based at least in part on whether the first application and the second application are both managed applications; and

denying completion of the drag-and-drop event in response to determining that the second application does not have access to the data corresponding to the dropped item.

27. The method of claim 26, wherein determining whether the second application has access to data corresponding to the item being dropped using the data access policy is further based on determining whether the item being dropped is associated with the same type of account associated with the second application or whether the item being dropped is being dragged from a view associated with the same type of account associated with the second application.

28. The method of claim 26, further comprising:

in response to determining that the second application has access to the data corresponding to the dropped item, allowing completion of the drag-and-drop event.

29. An apparatus, comprising:

at least one memory; and

at least one processor configured to:

detect a drag gesture selecting an item in a first application;

detect a touch release for dropping the item into a second application at the end of the drag gesture;

send a message to the second application in response to the detected touch release, the message including information about a plurality of representations of the item;

receive a request from the second application for a representation of the item in the plurality of representations;

send the request for the representation of the item to the first application; and

initiate a data transfer of the representation of the item from the first application to the second application.

30. The apparatus of claim 29, wherein each representation of the plurality of representations is associated with a Uniform Type Identifier (UTI).

31. The apparatus of claim 29, wherein the request for the representation of the item indicates a particular representation of the plurality of representations having a highest fidelity.

32. The apparatus of claim 29, wherein the at least one processor is further configured to:

provide, to the second application, a connection with the first application for performing the data transfer of the representation of the item.

33. The apparatus of claim 29, wherein the at least one processor is further configured to:

determine that the second application is not currently executing;

start the second application; and

provide, to the second application, a connection with the first application to complete the data transfer.

34. The apparatus of claim 29, wherein the at least one processor is further configured to:

determine whether the second application has access to data corresponding to the dropped item using a data access policy, wherein the data access policy is based on whether the first application and the second application are both managed applications; and

deny completion of the drag-and-drop event in response to determining that the second application does not have access to the data corresponding to the dropped item.

35. The apparatus of claim 34, wherein determining whether the second application has access to data corresponding to the item being dropped using the data access policy is further based on determining whether the item being dropped is associated with the same type of account associated with the second application or whether the item being dropped is being dragged from a view associated with the same type of account associated with the second application.

36. The apparatus of claim 34, wherein the at least one processor is further configured to:

allow completion of the drag-and-drop event in response to determining that the second application has access to the data corresponding to the dropped item.

37. A computer program product comprising code stored in a non-transitory computer-readable storage medium, the code comprising:

code to detect a drag gesture to select an item in a first application;

code to detect a touch release at an end of the drag gesture for dropping the item into a second application;

code to send a message to the second application in response to the detected touch release, the message including information about a plurality of representations of the item;

code to receive a request from the second application for a representation of the item in the plurality of representations;

code to send the request for the representation of the item to the first application; and

code to initiate a data transfer of the representation of the item from the first application to the second application.

38. The computer-program product of claim 37, wherein the code further comprises:

code to determine whether the second application has access to data corresponding to the dropped item using a data access policy; and

code to deny completion of the drag-and-drop event in response to determining that the second application does not have access to the data corresponding to the dropped item.

39. The computer-program product of claim 37, wherein the code further comprises:

code to determine whether the second application has access to data corresponding to the dropped item using a data access policy, wherein the data access policy is based at least in part on whether the first application and the second application are both managed applications;

code to deny completion of the drag-and-drop event in response to determining that the second application does not have access to the data corresponding to the dropped item; and

code to allow completion of the drag-and-drop event in response to determining that the second application has access to the data corresponding to the dropped item.

40. The computer program product of claim 39, wherein the code to use the data access policy to determine whether the second application has access to data corresponding to the item being dropped is further based on determining whether the item being dropped is associated with a same type of account associated with the second application or whether the item being dropped is being dragged from a view associated with the same type of account associated with the second application.

41. A method, comprising:

identifying that an item is being dragged over a first application, the first application comprising a table of items, the items being arranged in the table in a first arrangement, and each item in the table corresponding to a respective graphical representation of a respective file;

in response to the identifying, copying the first arrangement to generate a second arrangement corresponding to an initial arrangement of the items in the table when the item is first identified as being dragged over the first application;

updating the first arrangement to reflect a changed position of an item in the table resulting from another item being inserted into the table while the item is being dragged over the first application;

updating the second arrangement to reflect a changed position of the item in the initial arrangement of the items, the changed position resulting from a position at which the item is released for insertion into the table, the second arrangement not including the other item; and

merging the updated second arrangement with the updated first arrangement to reconcile a changed position resulting from the item being inserted into the table at a position based on the initial arrangement with a changed position resulting from the other item being inserted into the table.

42. The method of claim 41, wherein the first application further comprises a set of second items, each item in the set of second items corresponding to another respective graphical representation of another respective file.

43. The method of claim 41, wherein the item is inserted into the table of items at a position of a different item in the initial arrangement.

44. The method of claim 43, wherein the different item is moved to a different location in the table of items in response to the item being inserted.

45. The method of claim 41, wherein the item corresponds to a file stored at a remote location.

46. The method of claim 45, further comprising:

providing for display, until the transfer of the file is complete, a placeholder image at a location in the table at which the item is released.

47. The method of claim 46, wherein the placeholder image comprises a progress bar indicating progress of the transfer of the file.

48. The method of claim 41, wherein the items are arranged in the table based on a particular sort order, the particular sort order comprising an alphabetical sort order.

49. The method of claim 48, further comprising:

determining that the item is out of order according to the particular sort order based on a position in the table where the item is inserted; and

in response to determining that the item is out of order:

determining a second position in the table to move the item to based on the particular sort order; and

moving the item to the second position in the table.

50. An apparatus, comprising:

at least one memory; and

at least one processor configured to:

identify that an item is being dragged over a first application, the first application comprising a table of items, the items being arranged in the table in a first arrangement, and each item in the table corresponding to a respective graphical representation of a respective file;

in response to the identifying, copy the first arrangement to generate a second arrangement corresponding to an initial arrangement of the items in the table when the item is first identified as being dragged over the first application;

update the first arrangement to reflect a changed position of an item in the table resulting from another item being inserted into the table while the item is being dragged over the first application;

update the second arrangement to reflect a changed position of the item in the initial arrangement of the items, the changed position resulting from a position at which the item is released for insertion into the table, the second arrangement not including the other item; and

merge the updated second arrangement with the updated first arrangement to reconcile a changed position resulting from the item being inserted into the table at a position based on the initial arrangement with a changed position resulting from the other item being inserted into the table.

51. The apparatus of claim 50, wherein the first application further comprises a set of second items, each item in the set of second items corresponding to another respective graphical representation of another respective file.

52. The apparatus of claim 50, wherein the item is inserted into the table of items at a position of a different item in the initial arrangement.

53. The apparatus of claim 52, wherein the different item is moved to a different location in the table of items in response to the item being inserted.

54. The apparatus of claim 50, wherein the item corresponds to a file stored at a remote location.

55. The apparatus of claim 54, wherein the at least one processor is further configured to:

provide for display, until the transfer of the file is complete, a placeholder image at a location in the table at which the item is released.

56. The apparatus of claim 55, wherein the placeholder image comprises a progress bar indicating progress of the transfer of the file.

57. The apparatus of claim 50, wherein the items are arranged in the table based on a particular sort order, the particular sort order comprising an alphabetical sort order.

58. The apparatus of claim 57, wherein the at least one processor is further configured to:

determine that the item is out of order according to the particular sort order based on a position in the table where the item is inserted; and

in response to determining that the item is out of order:

determine a second position in the table to move the item to based on the particular sort order; and

move the item to the second position in the table.

59. A computer program product comprising code stored in a non-transitory computer-readable storage medium, the code comprising:

code to identify that an item is dragged over a first application, the first application comprising a table of items, the items being arranged in the table in a first arrangement, and each item in the table corresponding to a respective graphical representation of a respective file;

code to copy the first arrangement in response to the identifying to generate a second arrangement corresponding to an initial arrangement of the items in the table when the item is first identified as being dragged over the first application;

code to update the first arrangement to reflect a changed position of an item in the table resulting from another item being inserted into the table while the item is being dragged over the first application;

code to update the second arrangement to reflect a changed position of the item in the initial arrangement of the items, the changed position resulting from a position at which the item is released for insertion into the table, the second arrangement not including the other item; and

code to merge the updated second arrangement with the updated first arrangement to reconcile a changed position resulting from the item being inserted into the table at a position based on the initial arrangement with a changed position resulting from the other item being inserted into the table.

60. The computer program product of claim 59, wherein the first application further comprises a set of second items, each item in the set of second items corresponding to another respective graphical representation of another respective file.

61. An apparatus, comprising:

a memory; and

at least one processor configured to:

receive a request for a representation of an item from a target application;

send the request for the representation of the item to a source application;

receive a link to a file provider that enables data transfer of the representation of the item; and

send the link to the file provider to the target application.

62. The apparatus of claim 61, wherein the file provider is an application separate from the target application and the source application, the file provider storing a representation of the item.

63. The apparatus of claim 61, wherein the data transfer is initiated between the file provider and the target application.

64. The apparatus of claim 61, wherein the file provider is included in a sandbox environment.

65. The apparatus of claim 61, wherein the request for the representation of the item is received from the target application in response to an end of a drag gesture of a drag session associated with the item.

66. A method, comprising:

receiving a request to add a first item of a source application to a drag session associated with at least a second item of the source application;

determining a first identifier of the first item; and

in response to receiving the request, associating the first identifier of the first item with the drag session.

67. The method of claim 66, wherein associating the first identifier of the first item with the drag session comprises:

adding the first identifier of the first item to a data structure that includes at least a second identifier of at least the second item associated with the drag session.

68. The method of claim 67, further comprising:

receiving another request to remove the first item from the drag session; and

in response to receiving the other request, disassociating the first identifier of the first item from the drag session.

69. The method of claim 68, wherein disassociating the first identifier of the first item from the drag session comprises:

removing the first identifier of the first item from the data structure.

70. The method of claim 68, further comprising:

in response to disassociating the first identifier of the first item from the drag session, providing a graphical animation indicating that the first item has been removed from the drag session.

71. The method of claim 66, wherein the drag session is associated with a drag session identifier.

72. The method of claim 66, further comprising:

detecting a first touch input corresponding to a drag gesture associated with the drag session, wherein the request to add the first item to the drag session corresponds to a second touch input comprising a selection of the first item, the second touch input being detected concurrently with the first touch input.

73. The method of claim 72, further comprising:

detecting a touch release indicating an end of the drag gesture associated with the drag session;

determining that the touch release is detected in the source application; and

in response to the determination, canceling the drag session.

74. The method of claim 66, wherein the drag session is associated with a third item of the source application upon receiving the request to add the first item to the drag session.

75. A computer program product comprising code stored in a non-transitory computer-readable storage medium, the code comprising:

code to receive a request for a representation of an item from a target application;

code to send the request for the representation of the item to a source application;

code to receive a link to a file provider that enables data transfer of the representation of the item; and

code to send the link to the file provider to the target application.

76. The computer program product of claim 75, wherein the file provider is an application independent of the target application and the source application, the file provider storing a representation of the item.

77. The computer program product of claim 75, wherein the data transfer is initiated between the file provider and the target application.

78. The computer program product of claim 75, wherein the file provider is included in a sandbox environment.

79. The computer program product of claim 75, wherein the request for the representation of the item is received from the target application in response to an end of a drag gesture of a drag session associated with the item.

80. The computer-program product of claim 79, wherein the code further comprises:

code to detect a first touch input corresponding to the drag gesture of the drag session, the first touch input comprising a selection of another item in the source application;

code to receive a request to add the item to the drag session, wherein the request corresponds to a second touch input detected concurrently with the first touch input;

code to determine an identifier of the item in response to the request to add the item; and

code to associate the identifier of the item with the drag session.
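Before turning to the specification, a hypothetical Swift sketch may help make claims 41-60 concrete: a live "first arrangement" keeps mutating while an item is dragged over the table, a snapshotted "second arrangement" records the table as it stood when the drag entered, and a merge reconciles the two. The names and data structures below are illustrative assumptions, not language from the patent.

    // Hypothetical sketch of the two-arrangement reconciliation in claims 41-60.
    // The snapshot records where the table stood when the drag entered; the live
    // arrangement absorbs concurrent insertions; the merge places the dropped
    // item relative to its snapshot neighbors without losing the other insert.
    func reconcile(liveArrangement: [String],   // updated first arrangement
                   snapshot: [String],          // updated second arrangement
                   dropped: String) -> [String] {
        var merged = liveArrangement
        guard let posInSnapshot = snapshot.firstIndex(of: dropped) else {
            return merged
        }
        if posInSnapshot == 0 {
            merged.insert(dropped, at: 0)       // dropped at the head of the table
        } else {
            // Anchor on the item that precedes the drop position in the snapshot.
            let predecessor = snapshot[posInSnapshot - 1]
            let anchor = merged.firstIndex(of: predecessor) ?? merged.count - 1
            merged.insert(dropped, at: anchor + 1)
        }
        return merged
    }

    // Example: the table held [A, B, C]; item X was concurrently inserted after A
    // while D was being dragged; D was released between B and C (per the snapshot).
    let live = ["A", "X", "B", "C"]             // first arrangement, updated
    let snap = ["A", "B", "D", "C"]             // second arrangement, updated
    let result = reconcile(liveArrangement: live, snapshot: snap, dropped: "D")
    // result == ["A", "X", "B", "D", "C"]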

Technical Field

This specification relates generally to implementing drag-and-drop functionality on a touch screen electronic device.

Background

Drag-and-drop gestures enable data to be moved or copied from a source application to a target application. For example, a user may drag a representation of a photograph from a first application and drop the representation of the photograph into a second application. Data corresponding to the photograph may then be copied or moved from the first application to the second application in response to the drop.

Drawings

Some of the features of the subject technology are set forth in the appended claims. However, for purposes of explanation, several embodiments of the subject technology are set forth in the following figures.

FIG. 1 illustrates an exemplary network environment including electronic devices in which the subject system may be implemented in accordance with one or more implementations.

FIGS. 2A-2C illustrate exemplary drag-and-drop operations performed on an electronic device including a touch screen according to one or more implementations.

FIGS. 2D-2F illustrate exemplary drag-and-drop operations involving multiple data items performed on an electronic device including a touch screen according to one or more implementations.

FIGS. 2G-2H illustrate exemplary multiple drag-and-drop operations involving different data items from different source applications performed on an electronic device including a touch screen according to one or more implementations.

FIGS. 2I-2J illustrate exemplary multiple drag-and-drop operations involving multiple data items from one source application to different target applications performed on an electronic device including a touch screen according to one or more implementations.

FIGS. 2K-2L illustrate exemplary multiple drag-and-drop operations involving data items from different source applications to different target applications performed on an electronic device including a touch screen according to one or more implementations.

FIG. 3 illustrates an exemplary drag-and-drop architecture that can be implemented on an electronic device including a touch screen in accordance with one or more implementations.

FIG. 4A illustrates a flow diagram of an exemplary process for performing a data transfer as part of a drag-and-drop operation on an electronic device that includes a touch screen in accordance with one or more implementations.

FIG. 4B illustrates a flow diagram of an exemplary process for completing a data transfer as part of a drag-and-drop operation on an electronic device including a touch screen using a file provider in accordance with one or more implementations.

FIG. 4C illustrates a flow diagram of an exemplary process for adding an item to an existing drag session on an electronic device that includes a touch screen in accordance with one or more implementations.

FIG. 4D illustrates a flow diagram of an exemplary process for removing an item from an existing drag session on an electronic device that includes a touch screen in accordance with one or more implementations.

FIG. 5 illustrates a flow diagram of an exemplary process for enforcing a security policy for a drag-and-drop operation on an electronic device including a touch screen in accordance with one or more implementations.

FIGS. 6A-6C illustrate exemplary drag-and-drop operations for text selection performed on an electronic device including a touch screen according to one or more implementations.

FIG. 6D illustrates a flow diagram of an exemplary process for performing a drag-and-drop operation for text selected on an electronic device including a touch screen in accordance with one or more implementations.

FIGS. 7A-7B illustrate exemplary drag-and-drop operations involving a table view performed on an electronic device including a touch screen according to one or more implementations.

FIGS. 8A-8B illustrate drag-and-drop operations involving a collection view performed on an electronic device including a touch screen according to one or more implementations.

FIG. 9 illustrates a flow diagram of an exemplary process for performing a drag-and-drop operation involving a table view or a collection view on an electronic device that includes a touch screen in accordance with one or more implementations.

FIG. 10 illustrates an electronic system that may be used to implement one or more implementations of the subject technology.

Detailed Description

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The accompanying drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. The subject technology is not limited to the specific details set forth herein, however, and may be practiced with one or more other implementations. In one or more implementations, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.

The subject system provides an architecture that enables drag-and-drop functionality with security features on a touch screen electronic device.

FIG. 1 illustrates an exemplary network environment 100 including an electronic device 110 in which the subject system may be implemented in accordance with one or more implementations. However, not all of the depicted components may be used in all implementations, and one or more implementations may include additional or different components than those shown in the figures. Variations in the arrangement and type of these components may be made without departing from the spirit or scope of the claims set forth herein. Additional components, different components, or fewer components may be provided.

Network environment 100 includes the electronic device 110, a server 120, and a server 122, where the server 120 and/or the server 122 may be included in a server farm 130. A network 106 communicatively (directly or indirectly) couples, for example, the electronic device 110 with the server 120 and/or the server 122 and/or the server farm 130. In one or more implementations, the network 106 may be an interconnected network of devices that may include, or may be communicatively coupled to, the internet. For purposes of explanation, the network environment 100 is shown in FIG. 1 to include an electronic device 110, a server 120, a server 122, and a server farm 130; however, the network environment 100 may include any number of electronic devices and any number of servers or data centers that include multiple servers.

The electronic device 110 may include a touch screen and may be, for example, a portable computing device such as a laptop computer including a touch screen, a smartphone including a touch screen, a peripheral device including a touch screen (e.g., a digital camera or headset), a tablet device including a touch screen, a wearable device including a touch screen (e.g., a watch, a wristband, etc.), any other suitable device that includes, for example, a touch screen, or any electronic device having a trackpad. In one or more implementations, the electronic device 110 may not include a touch screen but may support touch-screen-like gestures, such as in a virtual reality or augmented reality environment. In one or more implementations, the electronic device 110 may include a touchpad. In FIG. 1, by way of example, the electronic device 110 is depicted as a tablet device having a touch screen. In one or more implementations, the electronic device 110 may be and/or may include all or part of the electronic system discussed below with respect to FIG. 10.

Electronic device 110 can implement the subject system to provide drag-and-drop functionality via a touch screen. For example, the electronic device 110 may implement the exemplary drag-and-drop architecture discussed further below with respect to FIG. 3. Examples of drag-and-drop operations performed via a touch screen are discussed further below with respect to FIGS. 2A-2C, 2D-2F, 2G-2H, 2I-2J, 2K-2L, 6A-6C, 7A-7B, and 8A-8B.

Server 120 and/or server 122 may be part of a computer network or server farm 130, such as in a cloud computing or data center implementation. The server 120, the server 122, and the server farm 130 may store data accessible on the electronic device 110, such as photos, music, text, web pages, and/or content provided therein, and the like. In one or more implementations, the electronic device 110 may support drag-and-drop operations that involve dragging and dropping representations of data, such as image files, text, sound files, video files, applications, and so forth, that are physically stored on the server 120 or the server 122 or on one or more servers in the server farm 130.

FIGS. 2A-2C illustrate exemplary drag-and-drop operations performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations illustrated in FIGS. 2A-2C are described as being performed on the electronic device 110 of FIG. 1. However, the exemplary drag-and-drop operations illustrated in FIGS. 2A-2C may be performed on any electronic device that includes a touch screen or any electronic device having a trackpad.

As shown in FIG. 2A, the touch screen 210 of the electronic device 110 may simultaneously display two different applications, which may be referred to as a source application 230 (e.g., a photo library application or any application) and a target application 240 (e.g., a document editor application or any application). For purposes of explanation, the applications 230, 240 are shown side-by-side in FIG. 2A; however, the applications may be concurrently displayed in any manner and/or in any orientation. The electronic device 110 may detect an initial touch input based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, a touch input may be detected based on the user's finger 270 touching an image 220 displayed in an image library of the source application 230. The image 220 may be a representation of a data item, such as an image file, a sound file, a video file, etc., stored on the electronic device 110 and/or on the server 120 or the server 122. In at least one implementation, the touch input need not touch the image 220 itself and may instead be near the image 220 or near another user interface element. Further, the touch input need not be a static touch input and may, in some examples, be a gesture or a moving touch input. For purposes of explanation, the drag-and-drop operation is described with respect to the image 220; however, any type, form, or representation of data may be dragged from the source application 230 to the target application 240, and vice versa.
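As a purely illustrative counterpart to the source-application side described above, UIKit's public drag API lets a view vend drag items once the system recognizes the initiating gesture. The sketch below assumes a hypothetical imageView holding the image 220; it is not the patent's internal implementation.

    import UIKit

    // Illustrative sketch of a source application vending a draggable item.
    final class PhotoLibraryViewController: UIViewController, UIDragInteractionDelegate {
        let imageView = UIImageView()  // hypothetical view showing image 220

        override func viewDidLoad() {
            super.viewDidLoad()
            imageView.isUserInteractionEnabled = true
            imageView.addInteraction(UIDragInteraction(delegate: self))
        }

        // Called when the system recognizes the drag-initiating touch input.
        func dragInteraction(_ interaction: UIDragInteraction,
                             itemsForBeginning session: UIDragSession) -> [UIDragItem] {
            guard let image = imageView.image else { return [] }
            // The item provider advertises the representations the source can supply.
            return [UIDragItem(itemProvider: NSItemProvider(object: image))]
        }
    }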

Based on one or more factors, such as the duration of the touch input, the touch input in the source application 230 can be recognized by the electronic device 110 as initiating a drag session for a drag-and-drop operation. In one example, the touch input may correspond to a long touch gesture in which the touch input is maintained on the touch screen 210 of the electronic device 110 for at least a period of time. Upon detecting a long touch gesture associated with initiation of the drag session, the electronic device 110 may cancel (or forgo processing) other current touch inputs received by the electronic device 110 in the source application 230. In one example, a hierarchy of touch inputs may prioritize the long touch gesture that initiates a drag session over other types of touch gestures that may be received during the drag session, and the electronic device 110 may delay processing of these other touches until a touch release is detected that corresponds to dropping an item into the target application 240. In another example, the hierarchy of touch inputs may prioritize a long touch or press gesture such that the gesture takes precedence over another gesture. Further, although FIG. 2A illustrates an exemplary touch input involving a single finger, in other cases the electronic device 110 is configured to detect touch inputs from multiple fingers indicating initiation of a drag session. The electronic device 110 may support any other type of touch input for initiating a drag session. As described in further detail below with reference to FIGS. 2G-2H, 2I-2J, and 2K-2L, the electronic device 110 supports multiple drag sessions.
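The prioritization described above can be pictured with standard gesture-recognizer dependencies. This fragment is a loose illustration only; the 0.5-second threshold is an assumed value, not one stated in the text.

    import UIKit

    // Loose illustration of a touch-input hierarchy: a tap defers to the
    // long press that initiates a drag session.
    final class DragSourceView: UIView {
        override init(frame: CGRect) {
            super.init(frame: frame)
            let longPress = UILongPressGestureRecognizer(target: self,
                                                         action: #selector(beginDragSession(_:)))
            longPress.minimumPressDuration = 0.5  // assumed threshold
            let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
            tap.require(toFail: longPress)        // the long press takes precedence
            addGestureRecognizer(longPress)
            addGestureRecognizer(tap)
        }

        required init?(coder: NSCoder) { fatalError("init(coder:) is not used here") }

        @objc private func beginDragSession(_ sender: UILongPressGestureRecognizer) {
            // Initiate the drag session; other touches are deferred or canceled.
        }

        @objc private func handleTap(_ sender: UITapGestureRecognizer) {
            // Ordinary interaction, processed only if no drag session begins.
        }
    }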

In one or more implementations, only the source application 230 may be displayed when the drag session is initiated and/or for the duration of the drag session. For example, the image 220 may be dragged from the source application 230 to a shortcut or other representation of the target application 240. In one or more implementations, the target application 240 may not be executing when the drag session is initiated, and is executed or launched in response to dropping the image 220 onto the representation of the target application 240. In another example, the target application 240 may be launched by hovering the image 220 over a representation of the target application. In another example, the target application 240 may be launched by another selection or touch input to launch the target application 240.

As shown in FIG. 2B, the electronic device 110 detects a drag gesture caused by the user's finger 270 dragging on the touch screen 210 of the electronic device 110. The drag gesture drags the selected image 220 from the source application 230 to the target application 240. The example shown in FIG. 2B illustrates an ongoing drag session in which the dragged image 220 is positioned between the source application 230 and the target application 240.

While the drag session is occurring and active, the source application 230 and/or the target application 240 remain responsive and may still be interacted with by the user. In particular, the electronic device 110 supports multi-touch input by allowing other touch inputs to occur during the drag session. These other touch inputs may be received by a background process that is part of the drag-and-drop architecture, which is described in more detail below with reference to FIG. 3. In this example, the image 225 in the source application 230 may be moved by the user during the drag session. The image 225 may be a representation of a data item stored on the electronic device 110 and/or on the server 120 or the server 122, such as an image file, a video currently playing in an application (streamed locally or over the network 106), a sound file, and so forth. A representation may also correspond to an application, a data item, a file, a group of files, and the like. As shown in FIG. 2B, the electronic device 110 detects a separate touch input indicating that the user's finger 290 has selected the image 225 and is moving the image 225 within the source application 230. In another example, the image 220 may also be interacted with during the drag session.

As shown in FIG. 2C, the electronic device 110 detects completion of the drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220. An exemplary process of implementing a security policy for a drag-and-drop operation is discussed further below with respect to FIG. 5. Although the image 220 has been discussed with respect to the user's finger 270, the electronic device 110 is configured to support other types of touch inputs for dragging items as part of a drag gesture. For example, the electronic device 110 may detect touch input from a pen/pencil or electronic stylus device for dragging an item from the source application 230 to the target application 240 as part of a drag session.
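The patent does not expose an API for the policy check; the sketch below is an invented illustration of the managed-application rule recited in claims 26 and 34, with all names chosen for the example.

    // Invented illustration of a data access policy check (compare claims 26
    // and 34): data from a managed source may only be dropped into a managed
    // target; unmanaged-to-unmanaged drops are unrestricted.
    struct AppInfo {
        let bundleIdentifier: String
        let isManaged: Bool   // e.g., under mobile device management
    }

    func dropPermitted(source: AppInfo, target: AppInfo) -> Bool {
        if source.isManaged && !target.isManaged {
            return false      // deny completion of the drop
        }
        return true           // allow completion of the drop
    }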

As described above, during the drag session, the source application 230 and/or the target application 240 are responsive and may otherwise interact with the user based on multi-touch support provided by the electronic device 110. As shown in FIG. 2C, during the drag session, image 225 has been moved from its initial position in FIG. 2B as electronic device 110 detects that a touch input corresponding to user's finger 290 moved image 225 to a different location within source application 230.

If the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220, the image 220 is allowed to be dropped into the target application 240 (such as into the document 250), and the data item corresponding to the image 220 is transferred from the source application 230 to the target application 240. An exemplary process of performing a data transfer as part of a drag-and-drop operation is discussed further below with respect to FIG. 4A.
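On the target side, a public-API analogue of the transfer described above looks roughly like the following Swift sketch; the asynchronous loadObjects call is where the representation's data actually moves. This is an illustration under the same caveats as the earlier sketches, not the patent's implementation.

    import UIKit

    // Illustrative sketch of a target application accepting the drop and
    // requesting a representation, which triggers the deferred data transfer.
    final class DocumentViewController: UIViewController, UIDropInteractionDelegate {
        override func viewDidLoad() {
            super.viewDidLoad()
            view.addInteraction(UIDropInteraction(delegate: self))
        }

        func dropInteraction(_ interaction: UIDropInteraction,
                             canHandle session: UIDropSession) -> Bool {
            session.canLoadObjects(ofClass: UIImage.self)
        }

        func dropInteraction(_ interaction: UIDropInteraction,
                             sessionDidUpdate session: UIDropSession) -> UIDropProposal {
            UIDropProposal(operation: .copy)
        }

        // Invoked at touch release; the item's data arrives asynchronously.
        func dropInteraction(_ interaction: UIDropInteraction,
                             performDrop session: UIDropSession) {
            session.loadObjects(ofClass: UIImage.self) { objects in
                // Insert the delivered images into the document (e.g., document 250).
            }
        }
    }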

In one or more implementations, a user can drag and drop (or perform another gesture, such as a flick gesture) a representation of an application (e.g., an application shortcut) over or near a given portion of the display (e.g., a picture-in-picture (PIP) region) provided in the touch screen 210 of the electronic device 110, and the application will launch with an application window displayed in that portion of the display, for example in conjunction with the simultaneous display of one or more other applications in one or more other portions of the display.

In one or more implementations, while performing the aforementioned data transfer, source application 230 and/or target application 240 may interact with the user and/or be placed in the background without disrupting the data transfer. In addition, a different application may be launched or opened while the data transfer is being performed. Thus, normal operation of the electronic device 110 is supported while data transfer is performed without negatively impacting data transfer and/or system performance.

In one or more implementations, the source application 230 and/or the target application 240 may implement a table view or a collection view, and thus the image 220 may be inserted into the table view or collection view of the target application 240. Exemplary drag-and-drop operations involving a table view are discussed further below with respect to FIGS. 7A-7B, and exemplary drag-and-drop operations involving a collection view are discussed further below with respect to FIGS. 8A-8B. Additionally, an exemplary process for performing a drag-and-drop operation involving a table view or a collection view is discussed further below with respect to FIG. 9.
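For table views specifically, UIKit provides drop coordinators that compute the insertion position. The fragment below is illustrative and builds on the hypothetical DocumentViewController sketched earlier.

    import UIKit

    // Illustrative table-view drop handling (compare FIGS. 7A-7B and FIG. 9).
    extension DocumentViewController: UITableViewDropDelegate {
        func tableView(_ tableView: UITableView,
                       performDropWith coordinator: UITableViewDropCoordinator) {
            // Fall back to the end of the table if no destination was proposed.
            let destination = coordinator.destinationIndexPath
                ?? IndexPath(row: tableView.numberOfRows(inSection: 0), section: 0)
            for item in coordinator.items {
                // The discarded Progress could drive a placeholder row's progress
                // bar while a remotely stored file transfers (compare claims 45-47).
                _ = item.dragItem.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
                    // Insert `image` into the data source at `destination`, then
                    // update the table on the main queue.
                }
            }
        }
    }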

In one or more implementations, a user can drag a text selection from the source application 230 to the target application 240, rather than dragging an image 220 corresponding to a data item from the source application 230 to the target application 240. Examples of performing a drag-and-drop operation for text selection are discussed further below with respect to FIGS. 6A-6C. An exemplary process of performing a drag-and-drop operation for text selection is discussed further below with respect to FIG. 6D.

In one or more implementations, the electronic device 110 provides an animation indicating that the image 220 is now part of the drag session, which may be provided by the electronic device 110 after detecting a particular gesture. In one example, the gesture may be a long touch gesture in which the user presses a finger on the representation of the data item (e.g., the image 220) and holds the finger there for a predetermined period of time. After detecting the gesture, the electronic device 110 may, for example, display an animation of the image 220 lifting off of the user interface of the source application 230. The electronic device 110 can customize this animation to include any other type of animation indicating that the image 220 is part of a drag session. In one implementation, the source application 230 may implement, specify, or provide the animation.
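UIKit exposes a delegate hook through which the source can customize the lift preview, which parallels the "source application may provide the animation" note above. Illustrative only; imageView is the hypothetical view from the earlier sketch.

    import UIKit

    // Illustrative customization of the lift animation's preview.
    extension PhotoLibraryViewController {
        func dragInteraction(_ interaction: UIDragInteraction,
                             previewForLifting item: UIDragItem,
                             session: UIDragSession) -> UITargetedDragPreview? {
            let parameters = UIDragPreviewParameters()
            parameters.backgroundColor = .clear   // lift the item without a backdrop
            return UITargetedDragPreview(view: imageView, parameters: parameters)
        }
    }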

In one or more implementations, the electronic device 110 provides an animation indicating that the image 220 is being dropped into the target application 240, which may be provided after completion of the drag gesture is detected (e.g., when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110). For example, the electronic device 110 may display an animation of the image 220 zooming out and/or fading out over the user interface of the target application 240. The electronic device 110 can customize this animation to include any other type of animation.

In one or more implementations, the drag-and-drop operation can be canceled based on a detected input (or set of inputs) from the user indicating that the drag session should be canceled, and/or based on the electronic device 110 determining that the drop operation cannot be completed when the user lifts their finger to perform the drop (e.g., due to one or more security constraints discussed further below). For example, the electronic device 110 may detect that the user released the drag gesture within the source application 230 without moving the image 220 over the target application 240. In another example, the electronic device 110 may detect that the user moved the image 220 over the target application 240, but then moved the image 220 back to the source application 230 and released the drag gesture over the source application 230. Upon detecting the cancellation of the drag session, the electronic device 110 may provide an animation indicating that the drag session is canceled. For example, the electronic device 110 may display an animation in which the image 220 moves back into the user interface of the source application 230. The electronic device 110 can customize this animation to include any other type of animation.

In one or more implementations, the subject system allows multiple data items to be dragged simultaneously as part of the same drag session. The data items may be selected together when the drag session is initiated, and/or additional items may be added to the ongoing drag session. For example, while the user's finger 270 is dragging the image 220, the user may select the image 225 from the source application 230 for inclusion in the drag session using another gesture, such as a tap from another finger. Thus, the subject system supports multi-touch input, where multiple touch inputs, such as multiple fingers, can be detected simultaneously. In addition, the subject system allows one or more items to be removed from the same drag session. For example, the image 225 added to the drag session may also be removed from it. The subject system also supports gestures for interacting with data items included in the same drag session. For example, the electronic device 110 may detect that the user performed a pinch zoom-in gesture and may enlarge the plurality of items that have been included as part of the drag session. Further, the electronic device 110 may detect that the user performed a pinch zoom-out gesture and may shrink the plurality of items that have been included as part of the drag session. The electronic device 110 can support other types of touch inputs for interacting with items in a drag session. For example, the electronic device 110 may detect a touch input from two of the user's fingers that allows an item to be rotated and/or scaled based on the two-finger touch input.
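UIKit models the "tap to add another item to the drag session" interaction with a dedicated delegate callback. The following fragment is illustrative; imageAt(_:) is a hypothetical hit-testing helper, and the extension builds on the PhotoLibraryViewController sketched earlier.

    import UIKit

    // Illustrative handling of a tap that adds an item to an in-flight drag session.
    extension PhotoLibraryViewController {
        func dragInteraction(_ interaction: UIDragInteraction,
                             itemsForAddingTo session: UIDragSession,
                             withTouchAt point: CGPoint) -> [UIDragItem] {
            // imageAt(_:) is a hypothetical helper that hit-tests the tapped image.
            guard let image = imageAt(point) else { return [] }
            return [UIDragItem(itemProvider: NSItemProvider(object: image))]
        }

        private func imageAt(_ point: CGPoint) -> UIImage? {
            // Hypothetical: resolve the library image (e.g., image 225) at `point`.
            return nil
        }
    }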

The subject system supports the inclusion of multiple data items as part of a drag session. FIGS. 2D-2F illustrate exemplary drag-and-drop operations involving multiple data items performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations shown in FIGS. 2D-2F are described as being performed on the electronic device 110 of FIG. 1. However, the exemplary drag-and-drop operations illustrated in FIGS. 2D-2F may be performed on any electronic device that includes a touch screen.

As shown in FIG. 2D, the electronic device 110 may detect an initial touch input based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 270 remaining in contact with the image 220 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 220 as part of a new drag session.

To include another data item in the drag session, the electronic device 110 can detect a second initial touch input based on the user's finger 290 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a tap gesture detected based on the user's finger 290 touching the image 225 in the image library of the source application 230. In response to the second touch input, the electronic device 110 can add the image 225 to the drag session. Although a tap gesture is described as an example, the source application may utilize any gesture or type of touch input for adding a data item to an existing drag session. In this way, multiple data items may be added to the same drag session before the user drags the items to the target application 240. The multiple items included in the drag session may be arranged in different ways. In one example, the items may be arranged in a stack, organized in a fan pattern or a grid, or arranged substantially adjacent to each other.

In one or more implementations, the electronic device 110 can provide an animation to indicate that another data item has been added to the drag session. For example, the added items may be animated to move toward an existing drag session item or stack of items.

As shown in FIG. 2E, the electronic device 110 detects a drag gesture caused by the user's finger 270 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the selected images 220 and 225 from the source application 230 to the target application 240 as part of the same drag session. The example shown in FIG. 2E illustrates an ongoing drag session in which the dragged images 220 and 225 are positioned between the source application 230 and the target application 240. Although FIG. 2E shows the image 220 positioned over the image 225, other arrangements of the images are possible. In another example, the image 225 may be placed on the right side of the image 220 while being dragged, and vice versa.

As shown in FIG. 2F, the electronic device 110 detects completion of the drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data items corresponding to the images 220 and 225.

If the target application 240 satisfies any security policies and/or data access policies associated with the data items corresponding to the image 220 and the image 225, the image 220 and/or the image 225 is allowed to be placed into the target application 240 (such as into the document 250) and the data items corresponding to the image 220 and/or the image 225 are transmitted from the source application 230 to the target application 240. If the target application 240 does not satisfy the security policy or data access policy associated with the data item corresponding to the image 220 and/or the data item corresponding to the image 225, the placement operation may be partially or fully undone.

In one or more implementations, electronic device 110 may support multiple drag sessions that occur simultaneously while processing other touch inputs. For example, a user may launch and/or interact with another application during multiple drag sessions without negatively impacting system performance. Other examples of multiple drag sessions related to multiple drag and drop operations are described below.

Fig. 2G-2H illustrate exemplary multiple drag-and-drop operations involving different data items from different source applications performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations shown in fig. 2G-2H are described as being performed on the electronic device 110 of fig. 1. However, the exemplary drag-and-drop operations shown in fig. 2G-2H may be performed on any electronic device that includes a touch screen.

As shown in fig. 2G, to include the data item in the first drag session, the electronic device 110 can detect an initial touch input based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 270 remaining in contact with the image 220 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 220 as part of a new drag session associated with the source application 230.

In fig. 2G, a second source application 235 (e.g., a web browser application or any application) is provided below the source application 230. To include a data item corresponding to the image 282 in a second drag session associated with the different source application 235, the electronic device 110 may detect a second initial touch input based on the user's finger 290 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 290 remaining in contact with the image 282 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 282 as part of a second new drag session. The image 282 may be a representation of a data item, such as an image file, a sound file, etc., stored on the electronic device 110 and/or on the server 120 or the server 122.

The electronic device 110 may detect a first drag gesture caused by a user's finger 270 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the image 220 from the source application 230 to the target application 240 as part of the first drag session.

The electronic device 110 may detect a second drag gesture caused by the user's finger 290 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the representation of the image 282 from the source application 235 to the target application 240 as part of the second drag session.

As shown in FIG. 2H, the electronic device 110 detects completion of the first drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the first drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220.

Similarly, the electronic device 110 detects completion of the second drag gesture when the user's finger 290 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the second drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 282.

If the target application 240 satisfies any security policies and/or data access policies associated with the data items corresponding to the image 220 and the image 282, the image 220 and/or the image 282 is allowed to be placed into the target application 240 (such as into the document 252), and the data items corresponding to the image 220 and/or the image 282 are transferred from the source application 230 and/or the source application 235 to the target application 240. In the example of FIG. 2H, the data transfer of the image 220 may be a move operation, where the image 220 is moved from the source application 230 to the target application 240. In contrast, the data transfer of the image 282 may be a copy operation, where the image 282 is copied from the source application 235 to the target application 240 and inserted into the target application 240 as the copied image 284.

Fig. 2I-2J illustrate exemplary multiple drag-and-drop operations involving multiple data items from one source application to different target applications performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations illustrated in fig. 2I-2J are described as being performed on the electronic device 110 of fig. 1. However, the exemplary drag-and-drop operations illustrated in fig. 2I-2J may be performed on any electronic device that includes a touch screen.

As shown in fig. 2I, to include the data item in the first drag session, the electronic device 110 can detect an initial touch input based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 270 remaining in contact with the image 220 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 220 as part of a new drag session associated with the source application 230.

To include a data item corresponding to the image 225 in a second drag session associated with the same source application 230, the electronic device 110 can detect a second initial touch input based on the user's finger 290 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 290 remaining in contact with the image 225 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 225 as part of a second new drag session associated with the source application 230.

The electronic device 110 may detect a first drag gesture caused by a user's finger 270 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the image 220 from the source application 230 to the target application 240 as part of the first drag session.

The electronic device 110 may detect a second drag gesture caused by the user's finger 290 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the image 225 from the source application 230 to a different target application 245 (e.g., a rendering application or any application) as part of a second drag session.

As shown in FIG. 2J, the electronic device 110 detects completion of the first drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the first drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220.

Similarly, the electronic device 110 detects completion of the second drag gesture when the user's finger 290 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the second drag gesture, the electronic device 110 determines whether the target application 245 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 225.

If the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220, the image 220 is allowed to be placed into the target application 240 (such as into the document 255) and the data item corresponding to the image 220 is transferred from the source application 230 to the target application 240.

If the target application 245 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 225, the image 225 is allowed to be placed into the target application 245 and the data item corresponding to the image 225 is transferred from the source application 230 to the target application 245.

Fig. 2K-2L illustrate exemplary multiple drag-and-drop operations involving data items from different source applications to different target applications performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations shown in fig. 2K-2L are described as being performed on the electronic device 110 of fig. 1. However, the exemplary drag-and-drop operations shown in fig. 2K-2L may be performed on any electronic device that includes a touch screen.

As shown in fig. 2K, to include the data item in the first drag session, the electronic device 110 may detect an initial touch input based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 270 remaining in contact with the image 220 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 220 as part of a new drag session associated with the source application 230.

In fig. 2K, a second source application 235 (e.g., a web browser application or any application) is provided below the source application 230. To include a data item corresponding to the image 282 in a second drag session associated with the different source application 235, the electronic device 110 may detect a second initial touch input based on the user's finger 290 touching the touch screen 210 of the electronic device 110. For example, the touch input may be a long touch gesture detected based on the user's finger 290 remaining in contact with the image 282 for a predetermined period of time. In response to the touch input, the electronic device 110 can include the image 282 as part of a second new drag session.

The electronic device 110 may detect a first drag gesture caused by a user's finger 270 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the image 220 from the source application 230 to the target application 240 as part of the first drag session.

The electronic device 110 may detect a second drag gesture caused by the user's finger 290 dragging on the touch screen 210 of the electronic device 110. In this example, the drag gesture drags the representation of the image 282 from the source application 235 to the second target application 245 as part of a second drag session.

As shown in FIG. 2L, the electronic device 110 detects completion of the first drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the first drag gesture, the electronic device 110 determines whether the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220.

Similarly, the electronic device 110 detects completion of the second drag gesture when the user's finger 290 is lifted from the touch screen 210 of the electronic device 110. When the electronic device 110 detects completion of the second drag gesture, the electronic device 110 determines whether the second target application 245 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 282.

If the target application 240 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 220, the image 220 is allowed to be placed into the target application 240 (such as into the document 256) and the data item corresponding to the image 220 is transferred from the source application 230 to the target application 240.

If the target application 245 satisfies any security policies and/or data access policies associated with the data item corresponding to the image 282, the image 282 is allowed to be placed into the target application 245 and the data item corresponding to the image 282 is transmitted from the source application 235 to the target application 245.

In the example of FIG. 2L, the data transfer of the image 220 may be a move operation, where the image 220 is moved from the source application 230 to the target application 240. In contrast, the data transfer of the image 282 may be a copy operation, where the image 282 is copied from the source application 235 to the second target application 245.

Fig. 3 illustrates an exemplary drag-and-drop architecture 300 that can be implemented on an electronic device 110 that includes a touch screen in accordance with one or more implementations. For purposes of explanation, the drag-and-drop architecture 300 is described as being implemented by the electronic device 110 of fig. 1 and 2, such as by a processor and/or memory of the electronic device 110; however, the drag-and-drop architecture 300 may be implemented by any other electronic device that includes a touch screen. Not all of the depicted components may be used in all implementations, and one or more implementations may include additional or different components than those shown in the figure. Variations in the arrangement and type of these components may be made without departing from the spirit or scope of the claims set forth herein. Additional components, different components, or fewer components may be provided.

The drag-and-drop architecture 300 includes a drag-and-drop manager 310 configured to manage drag sessions corresponding to drag events between the source application 230 and the target application 240. The drag-and-drop architecture 300 supports multiple drag sessions. In particular, the drag-and-drop manager 310 is configured to manage independent drag sessions corresponding to drag events between the source application 235 and the target application 245. For purposes of explanation, the following discussion makes reference to a drag session associated with the source application 230 and the target application 240; however, the following discussion is also applicable to drag sessions associated with the source application 235 and the target application 245.

In one or more implementations, the drag-and-drop manager 310 can be implemented as a user interface (UI) process, such as an application or daemon running on the electronic device 110 with system-level permissions. The process includes a rendering context that enables it to draw or render over any user interface displayed on the touch screen, and that also allows it to create a drag session associated with a drag event. In one example, the rendering context associated with the application (e.g., the drag-and-drop manager 310) is a transparent full-screen layer located above any user interface displayed on the touch screen.

In one or more implementations, the drag-and-drop manager 310 manages drag item previews during a drag session. For example, while being dragged, an item from a source application can be shown as a preview corresponding to a graphical representation of the item. The preview of the item may be a thumbnail image, a video clip, or any other suitable graphical representation, depending on the type of item. When an item is dragged from a source application into a target application, the preview of the item presented in the target application may be a different graphical representation than the preview in the source application. The drag-and-drop manager 310 can provide an animation that transitions to a different representation when an item is dragged from a source application over a target application, and vice versa. Further, each preview in the source application 230 and/or the target application 240 can be dynamically generated such that, as the preview is rendered within the application, it transitions from a first type of graphical representation to a second type of graphical representation, and so on. In another example, the drag-and-drop manager 310 may not provide a preview of the item during the drag session, or may provide a preview only while the item is dragged within the source application or only while it is dragged within the target application.

In one implementation, the drag-and-drop manager 310 may use a portal to provide a preview of items in the target application 240. A portal refers to a pixel-by-pixel reference to a GUI object specified by the source application 230 that enables the drag-and-drop manager 310 to access and manipulate the specified GUI object for providing a preview of the item in the target application 240. Thus, a portal is not a copy of an application's specified GUI object. Instead, the portal "points to" or "references" the GUI object of the application, such as in a render tree. An example of a portal of this type is disclosed in more detail in the U.S. provisional patent application entitled "CoreAnimation Portals," filed on May 16, 2017 (attorney docket No. P34669US1), which is hereby incorporated by reference in its entirety for all purposes.

Referring to fig. 2A, the drag event includes an initial touch input selecting an item (e.g., the image 220) in the source application 230. The initial touch input may be a long touch or press gesture that indicates the start of a drag event and initiates the creation of a new drag session for the drag event. In at least one implementation, the source application 230 (or any source application) initiates the drag session only through the drag-and-drop manager 310 and has no direct communication channel with the target application 240 (or any target application). The drag event also includes a drag gesture to move the item, as shown in FIG. 2B, and a touch release to drop the item into the target application 240 at the end of the drag gesture, as shown in FIG. 2C. A drag session identifier is assigned to the drag session and, in some cases, is used to associate new drag touches with the session, as explained further herein. As used herein, an item (or data item) may refer to a file, content in a file, a group of files, text, an application, or another object that includes data or a data link (local or cloud-based). Such items may be selected as part of a drag-and-drop operation and included as part of an associated drag session. Further, each item (or data item) may be assigned its own unique item identifier, which may be used to identify the item during the drag session.
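For illustration only, the following Swift sketch models the session and item identifiers described above; the type names and the bundle-style application identifier are assumptions and do not appear in this disclosure.

```swift
import Foundation

// Hypothetical model of the identifiers described above.
struct DragItem {
    let itemID: UUID          // unique item identifier
    let sourceAppID: String   // identifier of the source application
}

struct DragSession {
    let sessionID: UUID               // drag session identifier
    private(set) var items: [DragItem] = []

    // Associate a newly selected item with this session.
    mutating func add(_ item: DragItem) {
        items.append(item)
    }
}

// A long touch on the image 220 would create a session containing one item.
var session = DragSession(sessionID: UUID())
session.add(DragItem(itemID: UUID(), sourceAppID: "com.example.photos"))
```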

As described above, multiple drag sessions are supported by drag-and-drop manager 310. As previously described with reference to fig. 2K and 2L, the drag events associated with the source application 235 and the target application 245 may be assigned with corresponding drag session identifiers for the independent drag sessions. The source application 235 (or any source application) initiates the drag session only through the drag-and-drop manager 310 and there is no direct communication channel with the target application 245 (or any target application). Referring to fig. 2K, the drag event associated with the source application 235 includes an initial touch input (e.g., a long touch gesture indicating that the drag event starts and initiates a new drag session) that selects an item (e.g., the image 282, which may be associated with a unique item identifier) in the source application 235. The drag event also includes a drag gesture to move the item, and a touch release to drop the item into the target application 245 at the end of the drag gesture, as shown in FIG. 2L.

The drag-and-drop architecture 300 includes a touch event manager 320 configured to manage touch events as they are received through the drag-and-drop architecture 300. The touch event manager 320 can be implemented as a background process (e.g., a daemon) that executes on the electronic device 110 and is configured to receive all touch input into the subject system. For example, the touch event manager 320 can detect an initial touch input (e.g., a long touch gesture in the source application 230 or 235) indicating the start of a drag event in a given source application and forward the touch input to the drag-and-drop manager 310 for processing and for creating a new drag session and its associated drag session identifier. Upon detecting a long touch gesture associated with the initiation of the drag session, the touch event manager 320 may cancel (or forgo processing) other current touch inputs received in the source application. In one example, a hierarchy of touch inputs may prioritize long touch gestures that initiate a drag session over other types of touch gestures that may be received during the drag session, and the touch event manager 320 may delay processing of these other touches until a touch release is detected that corresponds to dropping an item into the target application 240. In another example, the hierarchy of touch inputs may prioritize a long touch or press gesture such that the gesture overrides another gesture.

The touch event manager 320 may receive a request from the drag-and-drop manager 310 to generate a copy of the drag event in the form of a dedicated drag event coexisting with the drag event or a separate drag event. The dedicated drag event is provided to the drag-and-drop manager 310, which is further configured to manage the dedicated drag event and receive a new drag touch (e.g., corresponding to a new drag session) through the dedicated drag event. These new drag touches may be associated with a drag session identifier for the drag session. In one or more implementations, the touch event manager 320 provides an interface for the drag-and-drop manager 310 to provide touch input associated with a particular drag session identifier using inter-process communication (IPC).

During the drag session, when a touch input is received that has been tagged by the drag session identifier, the touch event manager 320 is further configured to perform a hit-test of the drag event to determine whether the target application 240 (or any target application) is configured and/or authorized to receive items from the drag event. As described herein, hit testing refers to an operation for determining whether a location of a current touch input on a touch screen of electronic device 110 (or any electronic device) intersects a corresponding application on the screen. In at least one implementation, a hit test is performed on each touch input to determine the corresponding application as a potential target application for the placed item. Hit testing may be rate limited to mitigate potential performance issues of the drag-and-drop architecture 300 and/or the electronic device 110. In one implementation, touch event manager 320 can determine respective locations for all touches of the drag session based on the drag session identifier, calculate a centroid of the touch locations, and perform a hit test on the location of the centroid to determine potential target applications for the dropped item.
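The centroid-based hit test described above can be sketched as follows. This is a minimal Swift illustration in which `frontmostApp(at:in:)` is a hypothetical stand-in for the system's actual hit-testing routine.

```swift
import CoreGraphics

// Illustrative window record; windows are assumed ordered front-to-back.
struct AppWindow {
    let bundleID: String
    let frame: CGRect
}

// Average the locations of all touches tagged with the same drag session.
func centroid(of touchLocations: [CGPoint]) -> CGPoint? {
    guard !touchLocations.isEmpty else { return nil }
    let sum = touchLocations.reduce(CGPoint.zero) {
        CGPoint(x: $0.x + $1.x, y: $0.y + $1.y)
    }
    let n = CGFloat(touchLocations.count)
    return CGPoint(x: sum.x / n, y: sum.y / n)
}

// Hit-test a single point: the first (frontmost) window containing it wins.
func frontmostApp(at point: CGPoint, in windows: [AppWindow]) -> AppWindow? {
    windows.first { $0.frame.contains(point) }
}

// Usage: two fingers dragging a stack of items.
let touches = [CGPoint(x: 100, y: 300), CGPoint(x: 140, y: 320)]
if let c = centroid(of: touches) {
    let target = frontmostApp(at: c, in: [
        AppWindow(bundleID: "com.example.notes",
                  frame: CGRect(x: 0, y: 0, width: 400, height: 800)),
    ])
    print(target?.bundleID ?? "no target under centroid")
}
```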

In one or more implementations, the touch event manager 320 associates a new unique identifier with the touch path of a given drag event and uses the new unique identifier to verify whether a newly received touch input that is part of the drag event matches an existing touch input known to the touch event manager 320. During the drag session, the touch event manager 320 may forward touch inputs tagged with the drag session identifier to the drag-and-drop manager 310, the source application 230, and/or the target application 240. In one or more implementations, when a touch input associated with a drag session identifier is received, the target application 240 requests an XPC connection (e.g., an interprocess communication mechanism with sandboxing features) from the drag-and-drop manager 310. The XPC connection may provide a sandboxed environment that limits the type of information the target application 240 can access when communicating with the drag-and-drop manager 310. For example, before the drag event ends, the drag-and-drop manager 310 will not release any data regarding one or more representations of the item.

Upon touch release by the user's finger 270 at the end of the drag gesture shown in FIG. 2C (indicating the end of the drag event), the touch event manager 320 notifies the drag-and-drop manager 310 and the target application 240.

Using the touch event manager 320 to deliver touch events to the drag-and-drop manager 310 and the target application 240 (or any target application) advantageously enables the drag-and-drop manager 310 and the target application 240 (or any target application) to receive touch events with minimal delay (e.g., nearly simultaneously). Further, synchronization with other touch events received by the target application 240 (or any target application) is provided to enable the target application 240 to operate normally on any additional touch inputs that are not part of a drag event.

The drag-and-drop manager 310 may be configured to notify the target application 240 that the drag event has ended and that the item may be dropped into the target application 240 once the target application 240 accepts receipt of the item. In one or more implementations, the target application 240 can indicate to the drag-and-drop manager 310 that it wishes to receive the dropped item. For example, the drag-and-drop manager 310 may receive a request from the target application 240 for information corresponding to an item associated with a drag event. The request may include, for example, the drag session identifier.

The drag-and-drop manager 310 can request additional information about the item from the source application 230, such as a list of available representations of the item. For example, each representation may be a different digital version of the item, and the list may be ordered by fidelity or quality level. The drag-and-drop manager 310 may receive additional information corresponding to the item from the source application 230 and may provide the additional information corresponding to the item to the target application 240. The target application 240 may then utilize the received additional information to initiate a data transfer for the particular representation of the item, which will be discussed further below with respect to FIG. 4A.

The drag-and-drop manager 310 controls the flow of information to any target application, and a request by the target application may not be fulfilled until the drag-and-drop manager 310 has determined that the request should be allowed.

In one or more implementations, as a security feature, the drag-and-drop architecture 300 can provide the target application 240 with minimal or no information about the dragged item until after the drag-and-drop architecture 300 verifies that the target application 240 is authorized/configured to receive the dragged item, and/or until after the target application 240 accepts the receipt of the dragged item. Thus, during the drag session, the target application 240 may know that a certain item is dragged over it, but the target application 240 may not have access to any specific information about the dragged item.

Fig. 4A illustrates a flow diagram of an exemplary process 400 for performing a data transfer as part of a drag-and-drop operation on an electronic device 110 that includes a touch screen in accordance with one or more implementations. For purposes of explanation, the process 400 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 400 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 400 may be performed by one or more other components of other suitable devices. For further explanation purposes, the blocks of process 400 are described herein as occurring sequentially or linearly. However, multiple blocks of process 400 may occur in parallel. Further, the blocks of process 400 need not be performed in the order shown, and/or one or more blocks of process 400 need not be performed and/or may be replaced by other operations.

As illustrated in FIG. 3, the drag-and-drop manager 310 provides a mechanism for restricting communication between the source application 230 and the target application 240 during a drag session until a data transfer is initiated for an item that is dropped into the target application 240. In this way, the source application 230 and the target application 240 communicate directly with the drag-and-drop manager 310 but not with each other until a data transfer is initiated.

As shown in FIG. 4A, the drag-and-drop manager 310 detects a drag gesture that selects an item in a first application (402). The selected item may be the image 220 in the source application 230 in fig. 2A. The drag-and-drop manager 310 detects a touch release for dropping the item into the second application at the end of the drag gesture (404). The second application may be a target application 240 into which the item is placed, as shown in FIG. 2C. In one implementation, as a security feature, the drag-and-drop manager 310 can check a process identifier (ID) of the target application to ensure that a touch release associated with a drag event of a drag session corresponds to the intended target application.

The drag-and-drop manager 310 sends a message to the second application in response to the detected touch release, the message including information describing a plurality of different representations of the item (406). For example, the source application 230 may provide a list including multiple representations of the image 220 (e.g., original image, PDF, PNG, JPG, plain text, etc.) with different fidelity or quality. Each representation in the list can be indicated in the form of a Uniform Type Identifier (UTI), a text string used to uniquely identify items of a given category or type. In one or more implementations, the target application (e.g., the target application 240) is responsible for providing UTI conformance information to indicate which representations of items are acceptable to the target application. The message including the information describing the representations of the item may be sent using an XPC connection established between the drag-and-drop manager 310 and the target application 240.
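A minimal Swift sketch of the message shape described above follows; the struct names are assumptions rather than the actual IPC types, while the UTI strings shown (e.g., "public.png") are standard uniform type identifiers.

```swift
import Foundation

// Illustrative shape of the message sent on touch release.
struct RepresentationDescriptor {
    let uti: String       // uniform type identifier, e.g., "public.png"
    let byteCount: Int?   // optional size hint
}

struct DropMessage {
    let sessionID: UUID
    let itemID: UUID
    let representations: [RepresentationDescriptor]  // ordered by fidelity, best first
}

let message = DropMessage(
    sessionID: UUID(),
    itemID: UUID(),
    representations: [
        RepresentationDescriptor(uti: "public.png", byteCount: 2_400_000),
        RepresentationDescriptor(uti: "public.jpeg", byteCount: 600_000),
        RepresentationDescriptor(uti: "public.plain-text", byteCount: 120),
    ]
)

// A target that accepts only images requests the first representation whose
// UTI it understands; here that is the highest-fidelity one, "public.png".
let acceptable: Set<String> = ["public.png", "public.jpeg"]
let chosen = message.representations.first { acceptable.contains($0.uti) }
print(chosen?.uti ?? "no acceptable representation")
```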

To provide additional flexibility, in one or more implementations, the drag-and-drop architecture 300 provides a service that maps files to the file types specified in corresponding UTIs. A given application may register a new UTI or file type with the service, and the application may also use the service to extend an existing UTI or file type (e.g., to associate it with one or more other files).

The drag-and-drop manager 310 receives, from the second application, a request for a representation of the item among the plurality of different representations (408). While a single requested representation is discussed, it should be understood that the target application 240 may also request multiple representations of the item. In one example, the target application 240 may select the particular representation with the highest fidelity provided by the source application 230. In another example, the requested representation of the item depends on context (e.g., a target application that will graphically render the received representation may select a different representation than another target application that will not).

The drag-and-drop manager 310 sends a request for a representation of the item to the first application (410). The drag-and-drop manager 310 initiates a data transfer of the representation of the item from the first application to the second application (412). In one implementation, the drag-and-drop manager 310 may have a connection with the source application 230 during the drag session. However, the connection is not provided to the target application 240 until after the drag session is completed, a security feature enabled by the drag-and-drop architecture 300 of FIG. 3 that prevents unauthorized or unintentional data transfers from occurring. In examples where the source application 230 is not executing (e.g., it has crashed or is no longer running), the drag-and-drop manager 310 can end the drag session and/or cancel the data transfer.

After the drag-and-drop manager 310 detects that a touch release has occurred and the target application 240 has requested a representation of the item, the drag-and-drop manager 310 can provide a connection (or extension to a connection endpoint) with the target application 240 to perform a data transfer of the representation of the item. In one implementation, as a security feature, the drag-and-drop manager 310 may also set a timeout to complete the data transfer and implement the timeout (e.g., by closing the connection and/or stopping the data transfer) to prevent long data transfers from occurring. The drag-and-drop manager 310 can detect when the data transfer is complete and can disconnect the connection between the source application 230 and the target application 240 at this point.
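The timeout behavior can be sketched as follows; this is an illustrative Swift sketch in which `TransferConnection` stands in for the real connection endpoint and the 30-second limit is an assumed value.

```swift
import Foundation

// Stand-in for the connection (or connection-endpoint extension) described above.
final class TransferConnection {
    private(set) var isClosed = false
    func close() {
        isClosed = true
        print("connection closed")
    }
}

// Arm the timeout: if the transfer has not finished in time, tear it down.
func armTimeout(for connection: TransferConnection,
                seconds: TimeInterval,
                queue: DispatchQueue = .global()) {
    queue.asyncAfter(deadline: .now() + seconds) {
        if !connection.isClosed {
            print("transfer exceeded \(seconds)s; stopping")
            connection.close()
        }
    }
}

let connection = TransferConnection()
armTimeout(for: connection, seconds: 30)  // illustrative limit
// On normal completion, the manager would call connection.close() itself and
// then disconnect the source and target applications.
```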

During data transfer, the drag-and-drop manager 310 can provide a placeholder preview, which is a graphical representation indicating or displaying the progress of the data transfer. After the data transfer is complete, a representation of the item can be displayed, replacing the placeholder preview. An animation can be implemented that changes the placeholder preview to a representation of the item.

In addition, one or more other implementations for performing data transfers may be provided. The source application 230 may be placed in the background while the data transfer is in progress and, over time, may lose access to resources provided by the drag-and-drop architecture 300, which would negatively impact the data transfer. In addition, data transfers for large files or files located on the network 106 (e.g., on the server 120) may take a long time to complete.

For example, to alleviate such problems, a file provider may be provided to handle data transfers. The file provider may be an extension (e.g., a non-UI background process or daemon) that provides files or data, and may be used to open documents from other containers (e.g., where files or data are stored locally or on server 120). In one implementation, the file provider may be included in a sandbox environment as a security feature.

Fig. 4B illustrates a flow diagram of an exemplary process 420 for completing a data transfer as part of a drag-and-drop operation on an electronic device 110 that includes a touch screen using a file provider, according to one or more implementations. For purposes of explanation, the process 420 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 420 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 420 may be performed by one or more other components of other suitable devices. For further explanation purposes, the blocks of process 420 are described herein as occurring sequentially or linearly. However, multiple blocks of process 420 may occur in parallel. Further, the blocks of process 420 need not be performed in the order shown, and/or one or more blocks of process 420 need not be performed and/or may be replaced by other operations.

The drag-and-drop manager 310 receives a request for a representation of an item (422). The drag-and-drop manager 310 sends the request for the representation of the item to the source application 230 (424). In response to the request for a particular representation of the item, the source application 230 may provide the drag-and-drop manager 310 with a URL or link to a file provider that will complete the data transfer of the representation of the item. The drag-and-drop manager 310 receives the URL or link to the file provider (426). The drag-and-drop manager 310 may then send the URL or link to the file provider to the target application 240 (428). Next, the target application 240 may request the file or data from the file provider via the URL. Advantageously, the drag-and-drop manager 310 may end the drag session at this point and rely on the file provider to complete the data transfer of the requested data. In response to the request from the target application 240, the file provider may initiate a data transfer of the requested representation of the item to the target application 240.
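For illustration, process 420 can be sketched as the following Swift functions; the names and the URL scheme are assumptions, since the disclosure does not name concrete APIs.

```swift
import Foundation

// Illustrative link handed from the source application to the manager.
struct FileProviderLink {
    let url: URL
}

// (424)/(426): the manager asks the source application for the representation
// and receives back a link to the file provider that will serve the data.
func requestRepresentation(from sourceApp: String, itemID: UUID) -> FileProviderLink {
    FileProviderLink(url: URL(string: "fileprovider://\(sourceApp)/\(itemID.uuidString)")!)
}

// (428): the manager forwards the link to the target application, which then
// fetches the file directly from the file provider; the drag session can end
// here and the file provider completes the transfer.
func forward(_ link: FileProviderLink, to targetApp: String) {
    print("forwarding \(link.url) to \(targetApp)")
}

let link = requestRepresentation(from: "com.example.photos", itemID: UUID())
forward(link, to: "com.example.notes")
```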

In one or more implementations, the drag-and-drop manager 310 can create an Access Control List (ACL) in a database of the file provider to indicate a particular pair of processes (e.g., the source application 230 and the target application 240) that share and/or can access one or more particular files. The file provider may then check the ACL after receiving a request from the target application 240 for a particular file to ensure that access to the file is allowed. In one implementation, a file coordination process (e.g., a daemon or background process) may be provided to monitor the status of the data transfer and notify the file provider when the transfer is complete or when one of the processes crashes. Once notified that the data transfer is complete, the file provider removes the ACL from the file provider database.
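A minimal Swift sketch of such an ACL follows; the entry layout and method names are assumptions for illustration.

```swift
import Foundation

// One grant: a specific file shared between a specific (source, target) pair.
struct ACLEntry: Hashable {
    let fileID: UUID
    let sourceAppID: String
    let targetAppID: String
}

final class FileProviderACL {
    private var entries = Set<ACLEntry>()

    // Created by the drag-and-drop manager before the transfer begins.
    func grant(file: UUID, from source: String, to target: String) {
        entries.insert(ACLEntry(fileID: file, sourceAppID: source, targetAppID: target))
    }

    // Checked by the file provider when the target application requests the file.
    func isAllowed(file: UUID, from source: String, to requester: String) -> Bool {
        entries.contains(ACLEntry(fileID: file, sourceAppID: source, targetAppID: requester))
    }

    // Removed once the file coordination process reports the transfer complete.
    func revoke(file: UUID, from source: String, to target: String) {
        entries.remove(ACLEntry(fileID: file, sourceAppID: source, targetAppID: target))
    }
}

// Usage: grant before the transfer, check on request, revoke when done.
let acl = FileProviderACL()
let fileID = UUID()
acl.grant(file: fileID, from: "com.example.photos", to: "com.example.notes")
print(acl.isAllowed(file: fileID, from: "com.example.photos", to: "com.example.notes")) // true
acl.revoke(file: fileID, from: "com.example.photos", to: "com.example.notes")
```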

In one or more implementations, support is provided for editing in place a remote (or cloud) file or document located on the network 106 (e.g., on the server 120 or otherwise not local to the electronic device 110). This example provides coordination with the file provider to upload changes back to the server 120, or to a network or cloud of computers that includes the server 120. The target application 240 may request to open the cloud file in place. The target application 240 may receive a reference to the requested cloud file and make changes via the reference to the cloud file. The file provider may collect these changes and upload them back to the cloud to update the cloud file with the changes.

In one or more implementations, the drag-and-drop architecture 300 supports policy creation for excluding target applications from receiving any placed items and/or from receiving one or more particular types of placed items. For example, data may be prevented from moving between a particular source application and a particular target application. Further, certain policies may be provided such that drag and drop is only allowed between managed applications (e.g., in an enterprise configuration). In one implementation, such policies may be enforced by drag-and-drop manager 310 based on information indicating that the source application and the target application are in the same managed configuration. Further, the source application may control how much and which metadata (e.g., information related to the representation of the item) is disclosed to other applications with respect to the drag event. For third party applications, authorization checks using authorization keys and code signatures may be provided to determine access rights to placed items.

The drag-and-drop architecture 300 provides support for adding items and/or removing items from an existing drag session. As described above, each drag session may be assigned a drag session identifier, and each item included in the drag session may also be assigned its own unique identifier. The following discussion relates to an exemplary process 430 for adding an item to an existing drag session, and an exemplary process 440 for removing an item from an existing drag session.

FIG. 4C illustrates a flow diagram of an exemplary process 430 for adding an item to an existing drag session on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the process 430 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 430 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 430 may be performed by one or more other components of other suitable devices. For further explanation purposes, the blocks of process 430 are described herein as occurring sequentially or linearly. However, multiple blocks of the process 430 may occur in parallel. Further, the blocks of the process 430 need not be performed in the order shown, and/or one or more blocks of the process 430 need not be performed and/or may be replaced by other operations.

During the drag session, drag-and-drop manager 310 receives a request to add an item to the drag session (432). The request to add the item to the drag session may come from the source application 230 in fig. 2D and may also include a drag session identifier associated with the drag session. Referring to FIG. 2D, the request may be made after the drag session has been initiated with respect to the image 220 and a touch input (e.g., a tap gesture) for selecting the image 225 is received by the source application 230, which interprets the touch input as a request to add the selected image 225 to the drag session.

The drag-and-drop manager 310 determines an identifier of the added item (434). In one implementation, each item provided by the source application 230 is associated with a unique identifier, and the unique identifier may be included as part of a request from the source application 230 to add the item to the drag session.

The drag-and-drop manager 310 associates an identifier of the added item with the drag session (436). In one implementation, the drag session may include a data structure (e.g., an array or list, etc.) corresponding to one or more items included in the drag session. The drag-and-drop manager 310 may insert or include an identifier of the added item in such a data structure.

FIG. 4D illustrates a flow diagram of an exemplary process 440 for removing an item from an existing drag session on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the process 440 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 440 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 440 may be performed by one or more other components of other suitable devices. Further for purposes of explanation, the blocks of process 440 are described herein as occurring sequentially or linearly. However, multiple blocks of process 440 may occur in parallel. Further, the blocks of process 440 need not be performed in the order shown, and/or one or more blocks of process 440 need not be performed and/or may be replaced by other operations.

During the drag session, drag-and-drop manager 310 receives a request to remove an item from the drag session (442). The request to remove the item from the drag session may come from the source application 230 in FIG. 2D and may also include a drag session identifier associated with the drag session. Referring to FIG. 2D, the request may be made after the drag session has been initiated with respect to the image 220 and the image 225 is added to the drag session, as shown in FIG. 2E. In one implementation, the source application 230 may receive a touch input (e.g., a gesture) for removing the added item corresponding to the image 225. The source application 230 may interpret the touch input as a request to remove the selected image 225 from the drag session.

The drag-and-drop manager 310 determines the identifier of the item requested to be removed (444). In one implementation, each item provided by the source application 230 is associated with a unique identifier, and the unique identifier can be included as part of a request from the source application 230 to remove the item from the drag session.

The drag-and-drop manager 310 disassociates the identifier of the item requested to be removed from the drag session (446). In one implementation, the drag session may include a data structure (e.g., an array or list, etc.) corresponding to the one or more items included in the drag session. The drag-and-drop manager 310 may remove the identifier of the item from such a data structure.
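Processes 430 and 440 can be sketched together as mutations of a session record like the one sketched earlier; the following Swift is illustrative only, and the class and method names are assumptions.

```swift
import Foundation

final class DragSessionRecord {
    let sessionID: UUID
    private var itemIDs: [UUID] = []   // the data structure holding item identifiers

    init(sessionID: UUID) {
        self.sessionID = sessionID
    }

    // (432)-(436): determine the added item's identifier and associate it
    // with the session.
    func addItem(withID itemID: UUID) {
        itemIDs.append(itemID)
    }

    // (442)-(446): determine the removed item's identifier and disassociate
    // it from the session.
    func removeItem(withID itemID: UUID) {
        itemIDs.removeAll { $0 == itemID }
    }
}

let record = DragSessionRecord(sessionID: UUID())
let image225ID = UUID()
record.addItem(withID: image225ID)     // image 225 joins the session (fig. 2D)
record.removeItem(withID: image225ID)  // and is later removed again
```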

Fig. 5 illustrates a flow diagram of an exemplary process 500 for enforcing a security policy for a drag-and-drop operation on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the process 500 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 500 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 500 may be performed by one or more other components of other suitable devices. Further for purposes of explanation, the blocks of process 500 are described herein as occurring sequentially or linearly. However, multiple blocks of process 500 may occur in parallel. Further, the blocks of process 500 need not be performed in the order shown, and/or one or more blocks of process 500 need not be performed and/or may be replaced by other operations.

For a drag-and-drop event, drag-and-drop manager 310 detects a touch release at the end of the drag gesture to drop an item from the first application into the second application, the item corresponding to the data (502). The first application may be a source application 230 and the second application may be a target application 240 into which the image 220 is placed, as shown in fig. 2C.

The drag-and-drop manager 310 uses a data access policy to determine whether the second application has access rights to the data corresponding to the item (504). The data corresponding to the item may be, for example, an image file, a video file, a sound file, and the like. For example, a data access policy may allow access to the data corresponding to the item only between managed applications. In that case, the drag-and-drop manager 310 determines that the source application 230 and the target application 240 are both managed applications (e.g., in the same enterprise configuration) and allows access to the item.

In one implementation, drag-and-drop operations may be enabled only between applications whose views are associated with the same account type. In particular, it is determined from the current view of the target application whether the view is associated with a managed account or an unmanaged account, and it is then determined whether the drag event is to be sent to the target application based on whether the dragged item is associated with, and/or is being dragged from, a view associated with the same type of account.

If the drag-and-drop manager 310 determines that the second application has access rights to the data corresponding to the item (504), the drag-and-drop manager 310 allows the drag-and-drop event to complete (506). For example, the drag-and-drop manager 310 can facilitate the data transfer of the data corresponding to the item, such as discussed above with respect to fig. 4A.

If the drag-and-drop manager 310 determines that the second application does not have access to the data corresponding to the item (504), the drag-and-drop manager 310 denies completion of the drag-and-drop event (508). Access may be denied when the source application 230 and the target application 240 are not both managed applications, when the dragged item is not associated with the same type of account, and/or when the item is being dragged from a view that is not associated with the same type of account. After declining to complete the drag-and-drop event, the drag-and-drop manager 310 may end the drag session associated with the drag-and-drop event.
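For illustration, the policy decision of blocks 504-508 can be sketched as follows; the `AppContext` type and the specific predicate (both applications managed and the same account type) are assumptions drawn from the description above.

```swift
enum AccountType { case managed, unmanaged }

struct AppContext {
    let bundleID: String
    let isManaged: Bool
    let viewAccountType: AccountType
}

func mayCompleteDrop(source: AppContext, target: AppContext) -> Bool {
    // Under an enterprise-style policy, both endpoints must be managed...
    guard source.isManaged && target.isManaged else { return false }
    // ...and the views involved must share the same account type.
    return source.viewAccountType == target.viewAccountType
}

let source = AppContext(bundleID: "com.corp.mail", isManaged: true, viewAccountType: .managed)
let target = AppContext(bundleID: "com.corp.notes", isManaged: true, viewAccountType: .managed)
print(mayCompleteDrop(source: source, target: target) ? "allow (506)" : "deny (508)")
```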

Fig. 6A-6C illustrate exemplary drag-and-drop operations for text selection performed on an electronic device 110 including a touch screen according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operations shown in fig. 6A-6C are described as being performed on the electronic device 110 of fig. 1, and with reference also to the drag-and-drop manager 310 described in fig. 3. However, the exemplary drag-and-drop operations illustrated in fig. 6A-6C may be performed on any electronic device that includes a touch screen.

As shown in fig. 6A, the touch screen 210 of the electronic device 110 may simultaneously display two different applications, which may be referred to as a source application 235 and a target application 240.

The electronic device 110 can detect an initial touch input (e.g., a long touch gesture) that initiates a drag session based on the user's finger 270 touching the touch screen 210 of the electronic device 110. For example, touch input may be detected based on a user's finger 270 touching text 234 displayed in the source application 235. While the user's finger 270 is still in contact with the touch screen 210, the electronic device 110 may detect another touch input (e.g., a swipe gesture) that selects the text 234 as part of the drag session. In some cases, a second touch input may not be needed to select a particular type of text. For example, a touch input corresponding to a long touch gesture may be used to select a hyperlink or URL provided in the source application 235.

In one or more implementations, the electronic device 110 may provide a tray graphical element ("tray") corresponding to a graphical representation of the selected text. When provided for display, the tray may obscure the background as it is dragged during the drag session. Different representations of the tray may be provided. For example, the tray may be presented with rounded corners, with or without shadows, without borders, and/or further customized in any way. Further, where the selected text includes a large number of characters, the tray may truncate the selection to a limited number of characters for presentation as part of the displayed tray.

As shown in fig. 6B, the electronic device 110 detects a drag gesture caused by the user's finger 270 dragging on the touch screen 210 of the electronic device 110. The drag gesture drags the selected text 236 (as shown in the tray representation) from the source application 235 to the target application 240. The example shown in FIG. 6B illustrates an ongoing drag session in which the selected text 236 is dragged into the document 242 of the target application 240. The target application 240 may request from the drag-and-drop manager 310 a precise mode that provides a cursor 238 indicating where the selected text will be inserted when dropped into the target application 240. Further, the drag-and-drop manager 310 can provide an offset so that the selected text 236 is displayed at a location away from the location of the current touch input of the user's finger 270 during the drag session. The offset may vary depending on the content provided in the target application 240 and/or the size of the selected text 236.

As shown in fig. 6C, the electronic device 110 detects completion of the drag gesture when the user's finger 270 is lifted from the touch screen 210 of the electronic device 110. In the example of FIG. 6C, the selected text has been inserted into the document 242 of the target application 240 at a location corresponding to the cursor 238 from FIG. 6B.

Fig. 6D illustrates a flow diagram of an exemplary process 600 for performing a drag-and-drop operation for text selected on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the process 600 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 600 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 600 may be performed by one or more other components of other suitable devices. For further explanation purposes, the blocks of process 600 are described herein as occurring sequentially or linearly. However, multiple blocks of process 600 may occur in parallel. Further, the blocks of the process 600 need not be performed in the order shown, and/or one or more blocks of the process 600 need not be performed and/or may be replaced by other operations.

The drag-and-drop manager 310 detects a text selection gesture for text provided in the first application (602). For example, the text selection gesture may select one or more text characters provided in the first application. The drag-and-drop manager 310 selects text characters for dragging based on the text selection gesture (604). In one or more implementations, if the selection includes only spaces and one or more images, the drag-and-drop manager 310 may filter out the spaces and select only the images for dragging; if the selection includes only spaces, the drag-and-drop manager 310 may decide not to select any text characters for dragging. Additionally, for selections that include text characters with markup information, the text characters may be selected with or without the markup, depending on the particular implementation.
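The filtering rule of block 604 can be sketched as follows; the `SelectedElement` type and helper functions are assumptions for illustration.

```swift
enum SelectedElement: Equatable {
    case space
    case text(String)
    case image(name: String)
}

func isImage(_ element: SelectedElement) -> Bool {
    if case .image = element { return true }
    return false
}

func isText(_ element: SelectedElement) -> Bool {
    if case .text = element { return true }
    return false
}

func elementsToDrag(from selection: [SelectedElement]) -> [SelectedElement] {
    let nonSpaces = selection.filter { $0 != .space }
    // Only spaces selected: nothing is dragged.
    guard !nonSpaces.isEmpty else { return [] }
    // Spaces plus images only: filter out the spaces and drag just the images.
    if !nonSpaces.contains(where: isText) {
        return nonSpaces.filter(isImage)
    }
    return nonSpaces
}

print(elementsToDrag(from: [.space, .image(name: "220"), .space]))
// prints [image(name: "220")]
```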

The drag-and-drop manager 310 detects a drag gesture of the selected text character from the first application to the second application (606). In one example, the first application may add or modify the dragged selected text so that the modified text will be dragged into the second application. For example, the first application may reverse text or apply some other conversion to the selected text (e.g., convert the selected text to an image).

The drag-and-drop manager 310 detects a touch release at the end of the drag gesture to drop the selected character at a location within the content displayed in the second application (608). The drag-and-drop manager 310 inserts the selected text character into the second application at a location within the displayed content (610).

In one or more implementations, the subject system may display a cursor over or under the text being dragged. The cursor may be used to indicate the precise location within the content of the second application where the text selection is to be placed. When the cursor is displayed over the dragged text selection and the text selection is dragged to the bottom of the screen, the system may adaptively flip the cursor to the bottom of the text selection. Similarly, when the cursor is displayed below the dragged text selection and the text selection is dragged to the top of the screen, the system may adaptively flip the cursor to the top of the text selection.
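The adaptive flip can be sketched as follows; this Swift sketch assumes UIKit-style coordinates (y grows downward) and an illustrative edge threshold.

```swift
import CoreGraphics

enum CursorSide { case above, below }

// Flip the cursor when the dragged selection approaches a screen edge,
// following the behavior described above.
func adaptedCursorSide(dragY: CGFloat,
                       screenHeight: CGFloat,
                       current: CursorSide,
                       edgeMargin: CGFloat = 44) -> CursorSide {
    switch current {
    case .above where dragY > screenHeight - edgeMargin:
        return .below   // selection near the bottom edge: flip the cursor below it
    case .below where dragY < edgeMargin:
        return .above   // selection near the top edge: flip the cursor above it
    default:
        return current
    }
}

// Dragging to the bottom of an 800-point screen flips an above-cursor down.
print(adaptedCursorSide(dragY: 790, screenHeight: 800, current: .above))  // below
```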

Fig. 7A-7B illustrate exemplary drag-and-drop operations involving a table view performed on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operation illustrated in fig. 7A-7B is described as being performed on the electronic device 110 of fig. 1. However, the exemplary drag-and-drop operation illustrated in fig. 7A-7B may be performed on any electronic device that includes a touch screen.

As shown in fig. 7A, the electronic device 110 may detect an initial touch input based on a finger 270 of a user touching the touch screen 210 of the electronic device 110. The electronic device 110 can detect that the user's finger 270 has touched an item 720 provided in a source application 730 (e.g., a productivity application or any other application), making the item 720 part of a drag session for a drag-and-drop operation. A target application 740 (e.g., a to-do list manager application or any other application) is provided in a split-screen view, with both the source application 730 and the target application 740 displayed side-by-side on the touch screen 210. As shown, the target application 740 provides a list of items in a table view that includes items 750, 752, 754, and 756.

The electronic device 110 detects the drag gesture based on the user's finger 270 moving across the touch screen 210 of the electronic device 110. The drag gesture may include moving the item 720 to a location within a table view of a list of items provided in the target application 740. As shown in fig. 7B, the electronic device 110 detects that the drag gesture is complete when the electronic device 110 detects that the user's finger 270 has been lifted from the touch screen 210 of the electronic device 110.

Upon completion of the drag gesture, the item 720 has been moved (or copied) from the source application 730 and inserted into the table view of the target application 740 alongside the items 750, 752, 754, and 756. In particular, the item 720 has been inserted between the items 754 and 756 in the list of items. The drag session is now complete. In one or more implementations, when inserting an item into the table view, the electronic device 110 may implement the state reconciliation process discussed below with respect to fig. 9 to account for any state changes between the time the item is first moved over the target application 740 and the time the item is dropped into the table view of the target application 740.

In one or more implementations, if an item corresponds to a large file, or if an item corresponds to a file transmitted from a remote location (such as the server 120), a placeholder image may be inserted within the table view at the location where the item was released, until the transfer or download of the file is completed. The placeholder image may indicate to the user that the file is still being transferred or downloaded. For example, the placeholder image may be and/or may include a progress bar indicating the progress of the transfer or download.
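For purposes of illustration only, the following sketch shows one way such a placeholder could be realized with UIKit's public drop-placeholder types; the "ProgressCell" reuse identifier (assumed to name a registered cell containing a progress bar), the fixed row height, and the NSString payload are all illustrative assumptions rather than details from this disclosure.

```swift
import UIKit

final class FileDropHandler: NSObject, UITableViewDropDelegate {
    func tableView(_ tableView: UITableView,
                   performDropWith coordinator: UITableViewDropCoordinator) {
        let destination = coordinator.destinationIndexPath
            ?? IndexPath(row: tableView.numberOfRows(inSection: 0), section: 0)
        for item in coordinator.items {
            // Insert a placeholder row at the drop location while the
            // (possibly large or remote) data is still loading.
            let placeholder = UITableViewDropPlaceholder(
                insertionIndexPath: destination,
                reuseIdentifier: "ProgressCell", // assumed progress-bar cell
                rowHeight: 44)
            let context = coordinator.drop(item.dragItem, to: placeholder)
            item.dragItem.itemProvider.loadObject(ofClass: NSString.self) { value, _ in
                DispatchQueue.main.async {
                    // Swap the placeholder for the real row once loaded.
                    context.commitInsertion { indexPath in
                        // Insert the loaded `value` into the data source
                        // at `indexPath` before the row is reloaded.
                    }
                }
            }
        }
    }
}
```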

In one or more implementations, the target application may determine that the location within the table view where the user dropped the item is not a suitable location for inserting the item. For example, the table view may be sorted by a particular criterion (such as alphabetically), and the user may have dropped the item at a location within the table view that is inconsistent with the alphabetical sorting. In this case, the target application may redirect the item to an appropriate location within the table view, and the item may be automatically moved and inserted at that location. In one implementation, during the redirection, a graphical representation of a "hole" may be displayed in the target application at the location within the table view into which the item is to be inserted, and when the drop occurs, the item is animated as moving into the hole at that location.
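For purposes of illustration only, the following is a minimal sketch of such a retargeting rule for an alphabetically sorted table; the function is a hypothetical illustration rather than code from this disclosure.

```swift
/// Return the index at which `newTitle` should be inserted so that an
/// alphabetically sorted list of titles stays sorted, regardless of
/// where the user actually dropped the item.
func retargetedIndex(for newTitle: String, inSorted titles: [String]) -> Int {
    return titles.firstIndex { $0 > newTitle } ?? titles.count
}

// Example: dropping "banana" anywhere in ["apple", "cherry"] retargets
// the insertion to index 1, between the two existing items.
```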

Fig. 8A-8B illustrate exemplary drag-and-drop operations involving a collection view performed on an electronic device 110 that includes a touch screen, according to one or more implementations. For purposes of explanation, the exemplary drag-and-drop operation illustrated in fig. 8A-8B is described as being performed on the electronic device 110 of fig. 1. However, the exemplary drag-and-drop operation illustrated in fig. 8A-8B may be performed on any electronic device that includes a touch screen.

As shown in fig. 8A, the electronic device 110 may detect an initial touch input based on a finger 270 of a user touching the touch screen 210 of the electronic device 110. The electronic device 110 can determine that the user's finger 270 has selected an item 820 (e.g., an image) provided in a source application 830 (e.g., a cloud storage application or any other application) to be part of a drag session for a drag-and-drop operation. A target application 840 (e.g., an image editor application or any other application) is provided in a split-screen view, where both the source application 830 and the target application 840 are displayed side-by-side on the touch screen 210. As shown, the target application 840 provides items 850, 852, 854, and 856 in a collection view, the items corresponding to different images.

The electronic device 110 detects the drag gesture based on the user's finger 270 moving across the touch screen 210 of the electronic device 110. The drag gesture may include moving the item 820 to a location within a collection view of items provided in the target application 840. As shown in fig. 8B, the electronic device 110 detects that the drag gesture is complete when the electronic device 110 detects that the user's finger 270 has been lifted from the touch screen 210 of the electronic device 110.

Upon completion of the drag gesture, the item 820 has been moved from the source application 830 and inserted into the collection view of the target application 840 alongside the items 850, 852, 854, and 856. In particular, the item 820 has been inserted at the location previously occupied by the item 854 in fig. 8A, and the item 854 has been moved to the location previously occupied by the item 856 in fig. 8A. The item 856 has been moved to a new position in fig. 8B, below the other items. The drag session is now complete. In one or more implementations, when inserting an item into the collection view, the electronic device 110 can implement the state reconciliation process discussed below with respect to fig. 9 to account for any state changes between the time the item is first moved over the target application 840 and the time the item is dropped into the collection view of the target application 840.
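For purposes of illustration only, the following sketch shows one way a target application could accept such a drop using UIKit's public collection view drop delegate; the images backing model and the insertion strategy are illustrative assumptions rather than the disclosure's implementation.

```swift
import UIKit

final class GalleryDropHandler: NSObject, UICollectionViewDropDelegate {
    var images: [UIImage] = [] // assumed backing model for the collection view

    func collectionView(_ collectionView: UICollectionView,
                        performDropWith coordinator: UICollectionViewDropCoordinator) {
        let destination = coordinator.destinationIndexPath
            ?? IndexPath(item: images.count, section: 0)
        for item in coordinator.items {
            item.dragItem.itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
                guard let image = object as? UIImage else { return }
                DispatchQueue.main.async {
                    // Inserting shifts the items at and after the drop
                    // location, as items 854 and 856 shift in fig. 8B.
                    let index = min(destination.item, self.images.count)
                    self.images.insert(image, at: index)
                    collectionView.insertItems(at: [IndexPath(item: index, section: 0)])
                }
            }
        }
    }
}
```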

Fig. 9 illustrates a flow diagram of an exemplary process 900 for performing a drag-and-drop operation involving a table view or collection view on an electronic device that includes a touch screen, according to one or more implementations. For purposes of explanation, the process 900 is described herein primarily with reference to the electronic device 110 of fig. 1 and 2, and in particular with reference to the drag-and-drop manager 310 described above in fig. 3. However, process 900 is not limited to electronic device 110 of fig. 1 and 2, and one or more blocks (or operations) of process 900 may be performed by one or more other components of other suitable devices. For further explanation purposes, the blocks of process 900 are described herein as occurring sequentially or linearly. However, multiple blocks of process 900 may occur in parallel. Further, the blocks of process 900 need not be performed in the order shown, and/or one or more blocks of process 900 need not be performed and/or may be replaced by other operations.

The drag-and-drop manager 310 identifies that an item is being dragged over a first application, the first application including a table (or collection) of items, the items being arranged in a first arrangement in the table, and each item in the table corresponding to a graphical representation of a file (902). In response to the identifying, the drag-and-drop manager 310 copies the first arrangement to generate a second arrangement corresponding to the initial arrangement of the items in the table at the time the item is first identified as being dragged over the first application (904).

The drag-and-drop manager 310 updates the first arrangement to reflect changed positions of the items in the table resulting from, for example, another item being inserted into the table while the item is dragged over the first application (906). For example, if a copy or download operation was previously initiated with respect to the table view, the additional item may be inserted at the time it finishes being copied or downloaded, such as from the server 120. In one or more implementations, the changed positions of the items may also result from, for example, an item being deleted from the table.

The drag-and-drop manager 310 updates the second arrangement, which does not include the other item, to reflect changed positions of the items in the initial arrangement resulting from the dragged item being released for insertion into the table (908). For example, when the item is released for insertion at a location in the table, the positions of the items around the insertion location may change to create a space for the inserted item.

The drag-and-drop manager 310 merges the updated second arrangement with the updated first arrangement to reconcile the changed positions resulting from the insertion of the dragged item into the table, at a position based on the initial arrangement, with the changed positions resulting from the insertion of the other item into the table (910). In this way, any updates to the table view that occur during the drag session can be reconciled with the location of the inserted item (e.g., based on the initial arrangement).
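For purposes of illustration only, the following is a minimal sketch of the reconciliation of blocks 902-910, assuming items are identified by unique string identifiers; the anchoring strategy shown is one possible realization of the merge rather than the exact algorithm of this disclosure.

```swift
/// Re-express a drop position chosen against the snapshot taken when the
/// drag began (`initial`, block 904) in terms of the live arrangement
/// (`current`, updated per block 906), then insert the dragged item.
func mergedArrangement(initial: [String],
                       current: [String],
                       inserting item: String,
                       atSnapshotIndex index: Int) -> [String] {
    var merged = current
    // Anchor on the neighbor the item was dropped in front of in the
    // snapshot; that neighbor may have shifted in the live arrangement.
    if index < initial.count, let anchor = merged.firstIndex(of: initial[index]) {
        merged.insert(item, at: anchor)
    } else {
        merged.append(item) // neighbor deleted mid-drag: fall back to the end
    }
    return merged
}

// Example: with initial = ["750", "752", "754", "756"] and a new item
// downloaded mid-drag giving current = ["750", "752", "X", "754", "756"],
// dropping "720" at snapshot index 3 (before "756") yields
// ["750", "752", "X", "754", "720", "756"].
```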

Fig. 10 illustrates an electronic system 1000 that may be used to implement one or more implementations of the subject technology. Electronic system 1000 may be and/or may be part of electronic device 110 and/or server 120 shown in fig. 1. Electronic system 1000 may include various types of computer-readable media and interfaces for various other types of computer-readable media. Electronic system 1000 includes a bus 1008, one or more processing units 1012, a system memory 1004 (and/or a cache), a ROM 1010, a persistent storage device 1002, an input device interface 1014, an output device interface 1006, and one or more network interfaces 1016, or subsets and variations thereof.

Bus 1008 generally represents all system, peripheral, and chipset buses that communicatively connect the many internal devices of electronic system 1000. In one or more implementations, the bus 1008 communicatively connects the one or more processing units 1012 with the ROM 1010, the system memory 1004, and the persistent storage device 1002. The one or more processing units 1012 retrieve instructions to execute and data to process from these various memory units in order to perform the processes of the subject disclosure. In different implementations, the one or more processing units 1012 may be a single processor or a multi-core processor.

The ROM 1010 stores static data and instructions that are needed by the one or more processing units 1012 and other modules of the electronic system 1000. The persistent storage device 1002, on the other hand, may be a read-write memory device. The persistent storage device 1002 may be a non-volatile memory unit that stores instructions and data even when the electronic system 1000 is turned off. In one or more implementations, a mass storage device (such as a magnetic or optical disk and its corresponding disk drive) may be used as the persistent storage device 1002.

In one or more implementations, a removable storage device (such as a floppy disk or a flash drive, and its corresponding disk drive) may be used as the persistent storage device 1002. Like the persistent storage device 1002, the system memory 1004 may be a read-write memory device. However, unlike the persistent storage device 1002, the system memory 1004 may be a volatile read-and-write memory, such as random access memory. The system memory 1004 may store any of the instructions and data that the one or more processing units 1012 may need at runtime. In one or more implementations, the processes of the subject disclosure are stored in the system memory 1004, the persistent storage device 1002, and/or the ROM 1010. The one or more processing units 1012 retrieve instructions to execute and data to process from these various memory units in order to perform the processes of one or more implementations.

The bus 1008 also connects to an input device interface 1014 and an output device interface 1006. Input device interface 1014 enables a user to communicate information and select commands to electronic system 1000. Input devices that can be used with the input device interface 1014 can include, for example, an alphanumeric keyboard and a pointing device (also referred to as a "cursor control device"). The output device interface 1006 may, for example, enable display of images generated by the electronic system 1000. Output devices that may be used with output device interface 1006 may include, for example, printers and display devices, such as Liquid Crystal Displays (LCDs), Light Emitting Diode (LED) displays, Organic Light Emitting Diode (OLED) displays, flexible displays, flat panel displays, solid state displays, projectors, or any other device for outputting information. One or more implementations may include a device that acts as both an input device and an output device, such as a touch screen. In these implementations, the feedback provided to the user can be any form of sensory feedback, such as visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.

Finally, as shown in fig. 10, bus 1008 also couples electronic system 1000 to one or more networks and/or to one or more network nodes, such as electronic device 110 shown in fig. 1, through the one or more network interfaces 1016. In this manner, electronic system 1000 may be part of a computer network, such as a local area network ("LAN"), a wide area network ("WAN"), or an intranet, or may be part of a network of networks, such as the Internet. Any or all of the components of electronic system 1000 may be used in conjunction with the subject disclosure.

Implementations within the scope of the present disclosure may be realized, in part or in whole, by a tangible computer-readable storage medium (or multiple tangible computer-readable storage media of one or more types) having one or more instructions written thereon. The tangible computer readable storage medium may also be non-transitory in nature.

A computer-readable storage medium may be any storage medium that can be read, written, or otherwise accessed by a general purpose or special purpose computing device and that includes any processing electronics and/or processing circuitry capable of executing instructions. For example, without limitation, the computer-readable medium may include any volatile semiconductor memory, such as RAM, DRAM, SRAM, T-RAM, Z-RAM, and TTRAM. The computer readable medium may also include any non-volatile semiconductor memory, such as ROM, PROM, EPROM, EEPROM, NVRAM, flash memory, nvSRAM, FeRAM, FeTRAM, MRAM, PRAM, CBRAM, SONOS, RRAM, NRAM, racetrack memory, FJG, and Millipede memory.

Further, the computer-readable storage medium may include any non-semiconductor memory, such as optical disk storage, magnetic tape, other magnetic storage devices, or any other medium capable of storing one or more instructions. In one or more implementations, the tangible computer-readable storage medium may be directly coupled to the computing device, while in other implementations, the tangible computer-readable storage medium may be indirectly coupled to the computing device, e.g., via one or more wired connections, one or more wireless connections, or any combination thereof.

The instructions may be directly executable or may be used to develop executable instructions. For example, the instructions may be implemented as executable or non-executable machine code, or may be implemented as high-level language instructions that may be compiled to produce executable or non-executable machine code. Further, instructions may also be implemented as, or may include, data. Computer-executable instructions may also be organized in any format, including routines, subroutines, programs, data structures, objects, modules, applications, applets, functions, and the like. As those skilled in the art will recognize, details including, but not limited to, number, structure, sequence, and organization of instructions may vary significantly without changing the underlying logic, function, processing, and output.

Although the above discussion primarily refers to microprocessors or multi-core processors that execute software, one or more implementations are performed by one or more integrated circuits, such as ASICs or FPGAs. In one or more implementations, such integrated circuits execute instructions that are stored on the circuit itself.

Those skilled in the art will appreciate that the various illustrative blocks, modules, elements, components, methods, and algorithms described herein may be implemented as electronic hardware, computer software, or combinations of both. To illustrate this interchangeability of hardware and software, various illustrative blocks, modules, elements, components, methods, and algorithms have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application. The various components and blocks may be arranged differently (e.g., arranged in a different order, or divided in a different manner) without departing from the scope of the subject technology.

It is understood that the specific order or hierarchy of blocks in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of blocks in the processes may be rearranged, or that all illustrated blocks be performed. Some of the blocks may be performed simultaneously. In one or more implementations, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

As used in this specification and any claims of this patent application, the terms "base station," "receiver," "computer," "server," "processor," and "memory" all refer to electronic or other technical devices. These terms exclude a person or group of persons. For the purposes of this specification, the term "display" or "displaying" means displaying on an electronic device.

As used herein, the phrase "at least one of" preceding a series of items, with the term "and" or "or" to separate any of the items, modifies the list as a whole, rather than each member of the list (i.e., each item). The phrase "at least one of" does not require selection of at least one of each item listed; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items. By way of example, the phrases "at least one of A, B, and C" or "at least one of A, B, or C" each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.

The predicate words "configured to", "operable to", and "programmed to" do not imply any particular tangible or intangible modification of a subject, but rather are intended to be used interchangeably. In one or more implementations, a processor configured to monitor and control an operation or a component may also mean the processor being programmed to monitor and control the operation, or the processor being operable to monitor and control the operation. Likewise, a processor configured to execute code can be construed as a processor programmed to execute code or operable to execute code.

Phrases such as an aspect, the aspect, another aspect, some aspects, one or more aspects, an implementation, the implementation, another implementation, some implementations, one or more implementations, an embodiment, the embodiment, another embodiment, some embodiments, one or more embodiments, a configuration, the configuration, another configuration, some configurations, one or more configurations, the subject technology, the disclosure, the present disclosure, and other variations thereof and the like are for convenience, and do not imply that a disclosure relating to such phrase(s) is essential to the subject technology or that such disclosure applies to all configurations of the subject technology. A disclosure relating to such phrase(s) may apply to all configurations, or to one or more configurations. A disclosure relating to such phrase(s) may provide one or more examples. A phrase such as an aspect or some aspects may refer to one or more aspects and vice versa, and this applies similarly to the other foregoing phrases.

The word "exemplary" is used herein to mean "serving as an example, instance, or illustration. Any embodiment described herein as "exemplary" or as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, to the extent that the terms "includes," has, "" having, "" has, "" with, "" has, "" having, "" contains, "" containing, "" contain.

All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public, regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. § 112, sixth paragraph, unless the element is expressly recited using the phrase "means for" or, in the case of a method claim, the element is recited using the phrase "step for."

The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean "one and only one" unless specifically so stated, but rather "one or more." Unless specifically stated otherwise, the term "some" refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the disclosure.
