Apparatus, method and system for manipulating a user interface

Document No.: 1957946; Publication date: 2021-12-10

This technology, "Apparatus, method and system for manipulating a user interface," was created by C·D·索利, B·W·格里芬, D·T·普雷斯顿, and T·S·乔恩 on 2020-04-10. Abstract: Various embodiments include systems, methods, and devices for manipulating a user interface. For example, in some implementations, a method includes moving a drawing palette based on a corresponding request. The movable drawing palette enables a larger portion of the display to be used for content modification operations. As another example, in some implementations, a method includes displaying a screenshot editing interface for editing a captured screenshot based on an input type of a detected input. By displaying the screenshot editing interface in response to performing a screenshot capture, the method provides an enhanced user experience with fewer system resources. As yet another example, in some embodiments, a method includes selectively erasing portions of an object using a multi-mode eraser tool. A pixel erase input deletes a portion of a displayed object, while an object erase input deletes the entire displayed object.

1. A method, comprising:

at an electronic device with one or more processors, a non-transitory memory, an input device, and a display device:

displaying, via the display device, a first drawing palette at a first location within a first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools;

detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface; and

in response to detecting the first input:

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, displaying the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, displaying the first drawing palette at the third location having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of a currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.

2. The method of claim 1, wherein the first location corresponds to a first location type, the method further comprising:

in accordance with a determination that the first input corresponds to a first input type, displaying the first drawing palette having the first appearance at the second location based on the first location type and the first input type; and

in accordance with a determination that the first input corresponds to a second input type that is different from the first input type, displaying the first drawing palette having the second appearance at the third location based on the first location type and the second input type.

3. The method of claim 1, wherein determining that the first input corresponds to the request to move the first drawing palette to the second location comprises determining that the second location corresponds to a first location type, and wherein determining that the first input corresponds to the request to move the first drawing palette to the third location comprises determining that the third location corresponds to a second location type that is different from the first location type.

4. The method of claim 1, further comprising:

when the first drawing palette having the first appearance and having a first orientation is displayed at the second location, wherein the second location corresponds to a first location type:

detecting, via the input device, a second input corresponding to a request to move the first drawing palette to a fourth location within the first application interface, wherein the fourth location corresponds to the first location type; and

in response to detecting the second input:

in accordance with a determination that the fourth location is on an opposite side of the display from the second location, displaying the first drawing palette having the first appearance and having the first orientation at the fourth location; and

in accordance with a determination that the fourth location is not on the opposite side of the display from the second location, displaying the first drawing palette having the first appearance and having a second orientation different from the first orientation at the fourth location.

5. The method of any of claims 1-4, further comprising, in response to detecting a second input corresponding to a drawing operation on the first application interface, ceasing to display the first drawing palette.

6. The method of any of claims 1-5, wherein the first appearance corresponds to the first drawing palette being in a first expanded state, and wherein the second appearance corresponds to the first drawing palette being in a compressed state.

7. The method of claim 6, further comprising:

when the first drawing palette in the compressed state is displayed at the third location:

detecting, via the input device, a touch input directed to the first drawing palette; and

in response to detecting the touch input, displaying the first drawing palette in a second expanded state different from the first expanded state, wherein the first drawing palette in the second expanded state includes more drawing tools than the first drawing palette in the compressed state.

8. The method of claim 7, further comprising:

detecting, via the input device, a second input directed to a particular content manipulation affordance within the first drawing palette that is in the second expanded state; and

in response to detecting the second input, setting a current editing setting associated with the first drawing palette in accordance with the particular content manipulation affordance.

9. The method of any of claims 1 to 8, further comprising:

when the first drawing palette is displayed at the second location or the third location:

detecting, via the input device, a tap input directed to the first drawing palette; and

in response to detecting the tap input, moving the first drawing palette to the first location.

10. The method of any of claims 1 to 9, further comprising:

in accordance with a determination that the first input satisfies a first distance threshold, determining that the first input corresponds to the request to move the first drawing palette to the second location, and displaying the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input does not satisfy the first and second distance thresholds, displaying the first drawing palette having the first appearance at the first location.

11. The method of any of claims 1 to 10, further comprising:

in accordance with a determination that the first input satisfies a speed threshold, determining that the first input corresponds to the request to move the first drawing palette to the second location and displaying the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input does not satisfy the speed threshold, displaying the first drawing palette having the first appearance at the first location.

12. The method of any of claims 1-11, further comprising displaying a currently selected drawing tool indicator based on the first input.

13. The method of any of claims 1 to 12, further comprising:

concurrently displaying, via the display device, a second application interface and the first application interface, wherein the second application interface comprises a second drawing palette;

in accordance with a determination that the first application interface has a respective size characteristic that does not satisfy a threshold size, setting the first drawing palette to be immovable within the first application interface and the second drawing palette to be movable within the second application interface;

detecting a second input via the input device; and

in response to detecting the second input:

in accordance with a determination that the second input corresponds to a request to move the first drawing palette to a fourth location within the first application interface, maintaining a current location of the first drawing palette; and

in accordance with a determination that the second input corresponds to a request to move the second drawing palette to a fifth location within the second application interface, moving the second drawing palette to the fifth location.

14. The method of claim 13, further comprising:

detecting, via the input device, a third input directed to the first application interface; and

in response to detecting the third input, deemphasizing the second drawing palette relative to content displayed on the canvas of the second application interface.

15. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:

displaying, via the display device, a first drawing palette at a first location within a first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools;

detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface; and

in response to detecting the first input:

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, displaying the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, displaying the first drawing palette at the third location having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of a currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.

16. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to:

displaying, via the display device, a first drawing palette at a first location within a first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools;

detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface; and

in response to detecting the first input:

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, displaying the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, displaying, at the third location, the first drawing palette having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of a currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.

17. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device;

means for displaying, via the display device, a first drawing palette at a first location within a first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools;

means for detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface; and

in response to detecting the first input:

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, display the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, display, at the third location, the first drawing palette having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of a currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.

18. An information processing apparatus for use in an electronic device with a non-transitory memory, an input device, and a display device, the information processing apparatus comprising:

means for displaying, via the display device, a first drawing palette at a first location within a first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools;

means for detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface; and

in response to detecting the first input:

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, display the first drawing palette having the first appearance at the second location; and

in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, display, at the third location, the first drawing palette having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of a currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.

19. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the method of any of claims 1-14.

20. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to perform the method of any of claims 1-14.

21. A graphical user interface on an electronic device with a non-transitory memory, an input device, a display device, and one or more processors to execute one or more programs stored in the non-transitory memory, the graphical user interface comprising user interfaces displayed in accordance with the method of any of claims 1-14.

22. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

means for performing the method of any of claims 1-14.

23. An information processing apparatus for use in an electronic device comprising one or more processors, non-transitory memory, an input device, a display device, and means for performing the method of any of claims 1-14.

24. A method, comprising:

at an electronic device with one or more processors, a non-transitory memory, an input device, and a display device:

detecting a screenshot capture input while displaying content via the display device; and

in response to detecting the screenshot capture input:

capturing a screenshot image of the content displayed via the display device;

in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and

in accordance with a determination that the screenshot capture input corresponds to a second input type that is different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content.

25. The method of claim 24, further comprising:

detecting a drag input on a touch-sensitive surface of the electronic device; and

in response to detecting the drag input, displaying, via the display device, a screenshot capture affordance;

wherein the first input type corresponds to a selection of the screenshot capture affordance.

26. The method of claim 25, wherein the drag input is directed to a first taskbar, and wherein displaying the screenshot capture affordance includes replacing the first taskbar with a second taskbar that includes the screenshot capture affordance.

27. The method of claim 25, wherein the drag input moves across the touch-sensitive surface away from a corresponding edge of the touch-sensitive surface, and wherein the screenshot capture affordance is displayed within a control interface.

28. The method of any of claims 24-27, wherein the first input type includes movement of a stylus on a touch-sensitive surface of the electronic device away from an edge of the touch-sensitive surface.

29. The method of claim 28, wherein determining that the screenshot capture input is the first input type is based at least in part on determining that a release point of the movement within the touch-sensitive surface is within a threshold distance of a target location on the touch-sensitive surface.

30. The method of any of claims 24-27, wherein the first input type corresponds to movement of a stylus on a touch-sensitive surface of the electronic device away from a corner of the touch-sensitive surface.

31. The method of any of claims 24 to 30, further comprising:

detecting movement of a stylus on a touch-sensitive surface of the electronic device, wherein the movement is away from a corresponding corner of the touch-sensitive surface and originates at a threshold distance from the corresponding corner; and

in response to detecting the release of the movement of the stylus, displaying, via the display device, a screenshot capture menu comprising a capture screenshot representation and an editing screenshot representation, wherein in response to detecting a first input directed at the capture screenshot representation, the screenshot image of the content is captured, and wherein in response to detecting a second input directed at the editing screenshot representation, the screenshot editing interface for editing the screenshot image is displayed via the display device.

32. The method of claim 24, further comprising:

while displaying the thumbnail representation of the screenshot image, detecting, via the input device, a first input directed to the thumbnail representation of the screenshot image; and

in response to detecting the first input, displaying, via the display device, the screenshot editing interface.

33. The method of any of claims 24 to 32, further comprising:

when displaying the screenshot editing interface that includes an opacity level affordance:

detecting, via the input device, a first input directed to the opacity level affordance, wherein the first input sets the opacity level affordance to a respective opacity value; and

in response to detecting the first input, changing an opacity of a filter layer overlaid on the screenshot image to the respective opacity value.

34. The method of claim 33, further comprising displaying, via the display device, the filter layer overlaid on annotations of the screenshot image in response to detecting the first input.

35. The method of claim 33, further comprising displaying, via the display device, an annotation of the screenshot image overlaid on the filter layer in response to detecting the first input.

36. The method of claim 33, further comprising:

in response to detecting, via the input device, a second input directed to a completion affordance included within the screenshot editing interface, displaying, via the display device, a save interface;

detecting, via the input device, a third input directed to the save interface; and

in response to detecting the third input, storing the screenshot image and the filter layer as a flattened image.

37. The method of claim 33, further comprising:

in response to detecting, via the input device, a second input directed to a sharing affordance included within the screenshot editing interface, displaying, via the display device, a sharing interface;

detecting, via the input device, a third input directed to the sharing interface; and

in response to detecting the third input, storing the screenshot image and the filter layer as an image file, wherein the screenshot image and the filter layer are separately editable.

38. The method of any of claims 24-37, wherein the screenshot editing interface further comprises a respective affordance, the method further comprising:

detecting, via the input device, a first input directed to the respective affordance; and

in response to detecting the first input, adding additional content to the screenshot editing interface that was not displayed on the display when the screenshot capture input was detected.

39. The method of any of claims 24-38, wherein the screenshot editing interface further comprises a first drawing palette at a first location within the screenshot editing interface, and wherein the first drawing palette is movable to a second location within the screenshot editing interface in response to a first input directed to the first drawing palette.

40. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:

detecting a screenshot capture input while displaying content via the display device; and

in response to detecting the screenshot capture input:

capturing a screenshot image of the content displayed via the display device;

in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and

in accordance with a determination that the screenshot capture input corresponds to a second input type that is different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content.

41. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to:

detecting a screenshot capture input while displaying content via the display device; and

in response to detecting the screenshot capture input:

capturing a screenshot image of the content displayed via the display device;

in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and

in accordance with a determination that the screenshot capture input corresponds to a second input type that is different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content.

42. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device;

means for detecting a screenshot capture input while displaying content via the display device; and

in response to detecting the screenshot capture input:

means for capturing a screenshot image of the content displayed via the display device;

in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and

in accordance with a determination that the screenshot capture input corresponds to a second input type different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content.

43. An information processing apparatus for use in an electronic device with a non-transitory memory, an input device, and a display device, the information processing apparatus comprising:

means for detecting a screenshot capture input while displaying content via the display device; and

in response to detecting the screenshot capture input:

means for capturing a screenshot image of the content displayed via the display device;

in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and

in accordance with a determination that the screenshot capture input corresponds to a second input type different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content.

44. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the method of any of claims 24-39.

45. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to perform the method of any of claims 24-39.

46. A graphical user interface on an electronic device with a non-transitory memory, an input device, a display device, and one or more processors to execute one or more programs stored in the non-transitory memory, the graphical user interface comprising user interfaces displayed in accordance with the method of any of claims 24-39.

47. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

means for performing the method of any of claims 24 to 39.

48. An information processing apparatus for use in an electronic device comprising one or more processors, non-transitory memory, an input device, a display device, and means for performing the method of any of claims 24-39.

49. A method, comprising:

at an electronic device with one or more processors, a non-transitory memory, an input device, and a display device:

displaying, via the display device, a drawing user interface;

while displaying the drawing user interface, detecting an object insertion input corresponding to a request to insert an object into the drawing user interface;

in response to detecting the object insertion input, inserting a respective object into the drawing user interface;

detecting a pixel erase input while the respective object is displayed in the drawing user interface; and

in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object;

detecting an object erase input directed to a portion of the respective object; and

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

50. The method of claim 49, further comprising:

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is connected to the third portion of the respective object, ceasing to display the second portion of the respective object and ceasing to display the third portion of the respective object.

51. The method of claim 49, further comprising:

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is connected to the second portion of the respective object, ceasing to display the third portion of the respective object and ceasing to display the second portion of the respective object.

52. The method of any of claims 49-51, further comprising:

while displaying the respective object in the drawing user interface and prior to detecting the pixel erase input:

displaying, within the drawing user interface, a drawing palette comprising a plurality of content manipulation affordances;

detecting, via the input device, a first input directed to an eraser affordance of the plurality of content manipulation affordances, wherein the eraser affordance is associated with an eraser tool;

in response to detecting the first input, displaying an eraser mode interface comprising a plurality of eraser mode affordances;

detecting, via the input device, a second input directed to a first of the plurality of eraser mode affordances; and

in response to detecting the second input, setting the eraser tool to a pixel erase mode of operation, wherein the pixel erase input is detected while the eraser tool is in the pixel erase mode of operation.

53. The method of claim 52, further comprising:

while the eraser tool is in the pixel erase mode of operation:

detecting, via the input device, a third input directed to a second of the plurality of eraser mode affordances; and

in response to detecting the third input, setting the eraser tool to an object erase mode of operation, wherein the object erase input is detected while the eraser tool is in the object erase mode of operation.

54. The method of claim 53, wherein the eraser affordance has a first appearance when the eraser tool is in the object erase mode of operation, and wherein the eraser affordance has a second appearance when the eraser tool is in the pixel erase mode of operation that is different from the first appearance.

55. The method of any one of claims 49-54, further comprising:

while displaying a drawing palette comprising a plurality of content manipulation affordances within the drawing user interface, detecting, via the input device, a first input directed to a drawing affordance of the plurality of content manipulation affordances;

in response to detecting the first input, changing a currently selected tool from an eraser tool to a drawing tool associated with the drawing affordance;

detecting, via the input device, a drawing input directed to a canvas of the drawing user interface; and

in response to detecting the drawing input, performing a drawing operation on the canvas.

56. The method of claim 55, further comprising:

after changing the currently selected tool from the eraser tool to the drawing tool, detecting, via the input device, a second input directed to an eraser affordance of the plurality of content manipulation affordances, wherein the eraser affordance is associated with the eraser tool; and

in response to detecting the second input, changing the currently selected tool from the drawing tool to the eraser tool.

57. The method of any of claims 49-56, wherein the first portion of the respective object is within a first path defined by the pixel erase input.

58. The method of claim 57, wherein the first path defined by the pixel erase input traverses the respective object, producing the second portion of the respective object that is unconnected to the third portion of the respective object.

59. The method of claim 49, wherein:

in accordance with a determination that the object erase input defines a first path that intersects the second portion of the respective object and does not intersect the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

in accordance with a determination that the object erase input defines a second path that intersects the third portion of the respective object and does not intersect the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

60. The method of claim 59, further comprising, in accordance with a determination that the object erase input defines a third path that intersects the second portion of the respective object and intersects the third portion of the respective object, ceasing to display the second portion of the respective object and ceasing to display the third portion of the respective object.

61. The method of any of claims 49-60, further comprising:

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is connected to the third portion of the respective object, ceasing to display the second portion of the respective object and ceasing to display the third portion of the respective object; and

in accordance with a determination that the object erase input is directed to the third portion of the respective object and the second portion of the respective object is connected to the third portion of the respective object, ceasing to display the second portion of the respective object and ceasing to display the third portion of the respective object.

62. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:

displaying, via the display device, a drawing user interface;

while displaying the drawing user interface, detecting an object insertion input corresponding to a request to insert an object into the drawing user interface;

in response to detecting the object insertion input, inserting a respective object into the drawing user interface;

detecting a pixel erase input while the respective object is displayed in the drawing user interface; and

in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object;

detecting an object erase input directed to a portion of the respective object; and

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

63. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to:

displaying, via the display device, a drawing user interface;

while displaying the drawing user interface, detecting an object insertion input corresponding to a request to insert an object into the drawing user interface;

in response to detecting the object insertion input, inserting a respective object into the drawing user interface;

detecting a pixel erase input while the respective object is displayed in the drawing user interface; and

in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object;

detecting an object erase input directed to a portion of the respective object; and

in response to detecting the object erase input:

in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

64. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

means for displaying a drawing user interface via the display device;

means for detecting an object insertion input corresponding to a request to insert an object into the drawing user interface while the drawing user interface is displayed;

means for inserting a respective object into the drawing user interface in response to detecting the object insertion input;

means for detecting a pixel erase input while the respective object is displayed in the drawing user interface; and

means for, in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object;

means for detecting an object erase input directed to a portion of the respective object; and

in response to detecting the object erase input:

means for, in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

means for, in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

65. An information processing apparatus for use in an electronic device with a non-transitory memory, an input device, and a display device, the information processing apparatus comprising:

means for displaying a drawing user interface via the display device;

means for detecting an object insertion input corresponding to a request to insert an object into the drawing user interface while the drawing user interface is displayed;

means for inserting a respective object into the drawing user interface in response to detecting the object insertion input;

means for detecting a pixel erase input while the respective object is displayed in the drawing user interface; and

means for, in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object;

means for detecting an object erase input directed to a portion of the respective object; and

in response to detecting the object erase input:

means for, in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and

means for, in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.

66. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

one or more programs, wherein the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the method of any of claims 49-61.

67. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to perform the method of any of claims 49-61.

68. A graphical user interface on an electronic device with a non-transitory memory, an input device, a display device, and one or more processors to execute one or more programs stored in the non-transitory memory, the graphical user interface comprising user interfaces displayed in accordance with the method of any of claims 49-61.

69. An electronic device, comprising:

one or more processors;

a non-transitory memory;

an input device;

a display device; and

means for performing the method of any of claims 49-61.

70. An information processing apparatus for use in an electronic device comprising one or more processors, non-transitory memory, an input device, a display device, and means for performing the method of any of claims 49-61.

Technical Field

The present disclosure relates generally to electronic devices having user interfaces, and in particular to electronic devices having one or more input devices that detect inputs for manipulating these user interfaces.

Background

The use of inputs for manipulating user interfaces of electronic devices has become a ubiquitous phenomenon. For example, in various embodiments, an electronic device uses a peripheral-type input device (e.g., a touch screen, mouse, or keyboard) to affect one or more displayed user interfaces.

However, most of these input devices provide limited and inefficient control for manipulating the user interface. Thus, manipulating a user interface for an electronic device to perform a particular operation may require repetitive, complex, and/or cumbersome inputs or types of inputs.

Disclosure of Invention

Accordingly, there is a need for a robust and efficient mechanism for manipulating a user interface of a display at an electronic device. In particular, there is a need for electronic devices having faster, more efficient methods and interfaces for manipulating user interfaces. Such methods and interfaces optionally complement or replace conventional methods for manipulating user interfaces. Such methods and interfaces reduce the number, extent, and/or nature of inputs from a user and result in a more efficient human-machine interface. Thus, for battery-driven devices, such methods and interfaces conserve power and increase the time between battery charges.

The above-described deficiencies and other problems associated with user interfaces for electronic devices having touch-sensitive surfaces may be reduced or eliminated with the disclosed devices, systems, and methods. In some embodiments, the electronic device is a desktop computer. In some embodiments, the electronic device is portable (e.g., a laptop, tablet, or handheld device). In some embodiments, the electronic device is a personal electronic device, such as a mobile phone or a wearable device (e.g., a smart watch). In some embodiments, the electronic device has a touch pad. In some embodiments, the electronic device has a touch-sensitive display (also referred to as a "touchscreen" or "touchscreen display"). In some embodiments, the electronic device has a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs or sets of instructions stored in the memory for performing a plurality of functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, the user interacts with the GUI primarily through user interaction with the stylus when the stylus is not in physical contact with the touch-sensitive surface. In some embodiments, when the user is holding a stylus, the user interacts with the GUI primarily through finger and/or hand contacts and gestures on the stylus. In some embodiments, the functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephone answering, video conferencing, e-mailing, instant messaging, fitness support, digital photography, digital video recording, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.

According to some embodiments, a method is performed at an electronic device having one or more processors, a non-transitory memory, an input device, and a display device. The method includes displaying, via the display device, a first drawing palette at a first location within a first application interface. The first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools. The method also includes detecting, via the input device, a first input corresponding to a request to move the first drawing palette within the first application interface. The method further includes, in response to detecting the first input: in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, displaying the first drawing palette having the first appearance at the second location; and in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, displaying, at the third location, the first drawing palette having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of the currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette.
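
As a non-limiting illustration of the branching just described, the following Swift sketch models the two outcomes of a palette-move request. The names (PaletteAppearance, PaletteLocationType, paletteAppearance(afterMoveTo:)) and the assumption that the "second location" is an edge destination that keeps the expanded appearance while the "third location" is a corner destination that collapses the palette are illustrative conventions, not part of the disclosure.

// Hypothetical model of the palette-move branching; the names and the
// edge-vs-corner mapping are assumptions for illustration only.
enum PaletteAppearance {
    case expanded    // selected tool shown alongside the other tools (first appearance)
    case compressed  // only the currently selected tool is shown (second appearance)
}

enum PaletteLocationType {
    case edge    // e.g., docked along a side of the application interface
    case corner  // e.g., tucked into a corner of the application interface
}

// Returns the appearance to use after a move request.
func paletteAppearance(afterMoveTo destination: PaletteLocationType) -> PaletteAppearance {
    switch destination {
    case .edge:
        return .expanded
    case .corner:
        return .compressed
    }
}

print(paletteAppearance(afterMoveTo: .edge))    // expanded
print(paletteAppearance(afterMoveTo: .corner))  // compressed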

According to some embodiments, a method is performed at an electronic device having one or more processors, a non-transitory memory, an input device, and a display device. The method includes detecting a screenshot capture input while displaying content via the display device. The method further includes, in response to detecting the screenshot capture input: capturing a screenshot image of the content displayed via the display device; in accordance with a determination that the screenshot capture input is a first input type, displaying, via the display device, a screenshot editing interface for editing the screenshot image, wherein the screenshot editing interface includes the screenshot image; and in accordance with a determination that the screenshot capture input corresponds to a second input type that is different from the first input type, displaying, via the display device, a thumbnail representation of the screenshot image overlaid on the content captured within the screenshot image.
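
A minimal Swift sketch of this input-type branching follows. The claims tie only the first input type to selection of a screenshot capture affordance; the hardwareButtons case standing in for the second input type, and all of the names, are assumptions made for illustration.

// Hypothetical model: the screenshot is always captured, and the input type
// decides whether the editing interface or a thumbnail overlay is shown.
enum ScreenshotInput {
    case affordanceTap    // first input type: selection of a screenshot capture affordance
    case hardwareButtons  // assumed stand-in for the second input type
}

enum ScreenshotResult {
    case editingInterface(image: String)  // screenshot editing interface containing the image
    case thumbnailOverlay(image: String)  // thumbnail representation overlaid on the content
}

func handleScreenshotCapture(_ input: ScreenshotInput, capture: () -> String) -> ScreenshotResult {
    let image = capture()  // capture the screenshot regardless of the input type
    switch input {
    case .affordanceTap:
        return .editingInterface(image: image)
    case .hardwareButtons:
        return .thumbnailOverlay(image: image)
    }
}

// Example with a stand-in capture routine.
let result = handleScreenshotCapture(.affordanceTap) { "screenshot-001" }
print(result)  // editingInterface(image: "screenshot-001")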

According to some embodiments, a method is performed at an electronic device having one or more processors, a non-transitory memory, an input device, and a display device. The method includes displaying a drawing user interface via the display device. The method also includes, while the drawing user interface is displayed, detecting an object insertion input corresponding to a request to insert an object into the drawing user interface. The method also includes, in response to detecting the object insertion input, inserting a respective object into the drawing user interface. The method also includes detecting a pixel erase input while the respective object is displayed in the drawing user interface. The method further includes, in response to detecting the pixel erase input, ceasing to display a first portion of the respective object without ceasing to display a second portion of the respective object and without ceasing to display a third portion of the respective object. The method also includes detecting an object erase input directed to a portion of the respective object. The method further includes, in response to detecting the object erase input: in accordance with a determination that the object erase input is directed to the second portion of the respective object and the second portion of the respective object is not connected to the third portion of the respective object, ceasing to display the second portion of the respective object without ceasing to display the third portion of the respective object; and in accordance with a determination that the object erase input is directed to the third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, ceasing to display the third portion of the respective object without ceasing to display the second portion of the respective object.
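
The following Swift sketch illustrates one way the two eraser behaviors could be modeled, assuming an inserted object is tracked as a set of fragments whose connectivity is precomputed. Fragment, DrawingObject, EraserMode, and applyErase are illustrative names, not taken from the disclosure.

// Hypothetical model: a pixel erase hides only the touched fragment, while an
// object erase also hides every fragment still connected to it.
struct Fragment {
    let id: Int
    var isDisplayed = true
}

struct DrawingObject {
    var fragments: [Fragment]
}

enum EraserMode {
    case pixel   // removes only the portion the erase path touches
    case object  // removes the touched portion and everything connected to it
}

func applyErase(to object: inout DrawingObject,
                touching touchedID: Int,
                connectedTo connected: Set<Int>,
                mode: EraserMode) {
    for index in object.fragments.indices {
        let id = object.fragments[index].id
        let shouldHide: Bool
        switch mode {
        case .pixel:
            shouldHide = (id == touchedID)
        case .object:
            shouldHide = (id == touchedID) || connected.contains(id)
        }
        if shouldHide {
            object.fragments[index].isDisplayed = false
        }
    }
}

// Example: fragments 2 and 3 are connected; an object erase touching fragment 2
// hides both, leaving fragment 1 displayed.
var stroke = DrawingObject(fragments: [Fragment(id: 1), Fragment(id: 2), Fragment(id: 3)])
applyErase(to: &stroke, touching: 2, connectedTo: [3], mode: .object)
print(stroke.fragments.map { ($0.id, $0.isDisplayed) })  // [(1, true), (2, false), (3, false)]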

According to some embodiments, an electronic device includes one or more processors, a non-transitory memory, an input device, a display device, and one or more programs. The one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing or causing performance of the operations of any of the methods described herein. According to some embodiments, a non-transitory computer readable storage medium has stored therein instructions that, when executed by an electronic device with one or more processors, an input device, and a display device, cause the electronic device to perform or cause performance of the operations of any of the methods described herein. According to some embodiments, a graphical user interface on an electronic device with a non-transitory memory, an input device, a display device, and one or more processors to execute one or more programs stored in the non-transitory memory includes one or more of the elements displayed in any of the methods described herein, which are updated in response to inputs, as described in any of the methods described herein. According to some embodiments, an electronic device includes one or more processors, a non-transitory memory, an input device, a display device, and means for performing or causing performance of the operations of any of the methods described herein. According to some embodiments, an information processing apparatus for use in an electronic device with one or more processors, a non-transitory memory, an input device, and a display device includes means for performing or causing performance of the operations of any of the methods described herein.

Accordingly, an electronic device having an input device and a display device utilizes various inputs detected via the input device, such as touch inputs, mouse inputs, keyboard inputs, and the like. Based on these inputs, the electronic device performs various operations, such as drawing palette manipulation operations (e.g., moving and resizing/reorienting a drawing palette), screenshot capture operations, and editing operations. In some embodiments, the electronic device performs certain operations in response to fewer inputs than previously available systems require, thereby improving the functionality of the electronic device. Examples of the improved functionality include longer battery life, less wear, and more efficient and accurate user interaction with the electronic device.

Drawings

For a better understanding of the various described embodiments, reference should be made to the following detailed description, taken in conjunction with the following drawings, wherein like reference numerals designate corresponding parts throughout the figures.

FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.

FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments.

FIG. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.

FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.

FIG. 4 is a block diagram of an exemplary electronic stylus, according to some embodiments.

FIGS. 5A-5B illustrate positional states of a stylus relative to a touch-sensitive surface, according to some embodiments.

FIG. 6A illustrates an exemplary user interface of an application menu on a portable multifunction device according to some embodiments.

FIG. 6B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface separate from a display in accordance with some embodiments.

FIGS. 7A-7CF are examples of user interfaces for repositioning a drawing palette, according to some embodiments.

FIGS. 8A-8AL are examples of user interfaces for invoking and utilizing a screenshot editing interface, according to some embodiments.

FIGS. 9A-9Z are examples of capturing screenshot images based on detected stylus input, according to some embodiments.

FIGS. 10A-10D are flow diagrams of methods for repositioning a drawing palette, according to some embodiments.

FIGS. 11A-11C are flow diagrams of methods for invoking and utilizing a screenshot editing interface, according to some embodiments.

FIGS. 12A-12AP are examples of user interfaces for selectively erasing portions of an object, according to some embodiments.

FIGS. 13A-13D are flow diagrams of methods for selectively erasing portions of an object, according to some embodiments.

Detailed Description

Many electronic devices manipulate a user interface based on detected input. However, existing methods for manipulating user interfaces can be slow, cumbersome, and inefficient.

For example, in various embodiments, the electronic device may display a drawing palette that enables changing the currently selected drawing tool and/or a property (e.g., color) of the currently selected drawing tool. However, the drawing palette may be fixed to a particular location (e.g., to a particular side of the display). Thus, drawing operations (e.g., drawing lines, entering text, pasting shapes) cannot be applied to the portion of the user interface where the drawing palette is located, which in turn limits the available display area of the user interface. Further, the electronic device does not change the appearance of the drawing palette (e.g., the size, orientation, or the number and types of drawing tools displayed in the drawing palette) in response to detecting an input requesting movement of the drawing palette. In contrast, various embodiments disclosed herein provide that, in response to detecting a request to move a drawing palette to a particular location within a user interface, the electronic device displays the drawing palette at that location and, in various instances, with a different appearance. By changing the position and orientation of the drawing palette, the electronic device provides a larger available portion of the display for drawing operations and other content modification operations.
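
To make this behavior concrete, the following Swift sketch shows one possible mapping from a requested palette destination to a palette appearance, in the spirit of the embodiments described above; the names PaletteLocation and PaletteAppearance and the particular edge/corner mapping are illustrative assumptions rather than the disclosed implementation.

```swift
/// Possible docking destinations and appearances for a movable drawing palette.
enum PaletteLocation { case leftEdge, rightEdge, bottomEdge, corner }

enum PaletteAppearance {
    case full        // selected tool shown alongside the other available tools
    case condensed   // only the currently selected tool is shown
}

/// One illustrative mapping: edge docking keeps the full tool row visible,
/// while corner docking collapses the palette to free up drawing area.
func appearance(for destination: PaletteLocation) -> PaletteAppearance {
    switch destination {
    case .leftEdge, .rightEdge, .bottomEdge:
        return .full
    case .corner:
        return .condensed
    }
}
```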

As another example, in various embodiments, the electronic device may provide screenshot capture functionality that results in a non-intuitive, non-user-friendly experience. For example, the screenshot capture function may provide only a limited mechanism for manipulating captured screenshot images. In addition, the screenshot images are typically saved to a background clipboard whose location may be unknown to an unfamiliar user. In contrast, various embodiments disclosed herein provide that the electronic device displays a screenshot editing interface or a thumbnail representation of the screenshot image based on the input type of the detected input. Further, in some embodiments, the screenshot editing interface includes a rich set of manipulation options for applying to the screenshot image (e.g., annotating, changing the opacity level, or showing additional related content). Accordingly, after performing the screenshot capture, the electronic device displays a screenshot editing interface, providing a seamless and intuitive user experience that requires less time and fewer user inputs to manipulate the screenshot image. This also reduces power usage and extends the battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
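
As a rough illustration of this behavior, the following Swift sketch routes a screenshot capture to either a full editing interface or a thumbnail representation based on the input type; the ScreenshotInput and ScreenshotResult types and the particular routing are hypothetical examples, not a system API.

```swift
/// Illustrative input types that could trigger a screenshot capture.
enum ScreenshotInput { case stylusDrag, hardwareButtonChord }

/// Illustrative presentation choices after capture.
enum ScreenshotResult {
    case editingInterface   // full-screen editor with annotation and opacity controls
    case thumbnail          // small preview that can be tapped to open the editor
}

/// One possible routing: stylus input suggests immediate markup intent, while a
/// hardware button capture defers editing until the user opts in.
func present(after input: ScreenshotInput) -> ScreenshotResult {
    switch input {
    case .stylusDrag:
        return .editingInterface
    case .hardwareButtonChord:
        return .thumbnail
    }
}
```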

As yet another example, in various embodiments, an electronic device provides an erase tool for erasing portions of content. However, such an erase tool may be limited in its function. For example, the erase tool cannot perform different erase operations on a portion of an object based on whether that portion was previously separated (e.g., split off) from other portions of the object. In contrast, various embodiments disclosed herein provide that, after the object has been split into a plurality of separate portions based on a pixel erase input, the electronic device, in response to an object erase input, ceases to display a particular separate portion while maintaining display of the other remaining portions. Thus, the electronic device provides more functionality and control over the erase operation. Further, the electronic device need not receive a drag erase input that is spatially coextensive with a separate portion of the object in order to erase that separate portion. By erasing the separate portions using an object erase input instead of a drag erase input, the electronic device reduces processing and battery usage and experiences less wear.

FIGS. 1A-1B, 2-4, 5A-5B, and 6A-6B provide a description of exemplary devices. FIGS. 7A-7CF are examples of user interfaces for repositioning a drawing palette, according to some embodiments. The user interfaces in FIGS. 7A-7CF are used to illustrate the processes in FIGS. 10A-10D. FIGS. 8A-8AL are examples of user interfaces for invoking and utilizing a screenshot editing interface, according to some embodiments. FIGS. 9A-9Z are examples of capturing screenshot images based on detected stylus input, according to some embodiments. The user interfaces in FIGS. 8A-8AL and 9A-9Z are used to illustrate the processes in FIGS. 11A-11C. FIGS. 12A-12AP are examples of user interfaces for selectively erasing portions of an object, according to some embodiments. The user interfaces in FIGS. 12A-12AP are used to illustrate the processes in FIGS. 13A-13D.

Exemplary device

Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of various described embodiments. However, it will be apparent to one of ordinary skill in the art that various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail as not to unnecessarily obscure aspects of the embodiments.

It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements in some cases, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact may be termed a second contact, and, similarly, a second contact may be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact unless the context clearly indicates otherwise.

The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

As used herein, the term "if" is optionally interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is optionally interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.

Embodiments of electronic devices, user interfaces for such devices, and related processes for using such devices are described herein. In some embodiments, the electronic device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, but are not limited to, devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are optionally used. It should also be understood that, in some embodiments, the electronic device is not a portable communication device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).

In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.

The electronic device typically supports various applications, such as one or more of the following: a note taking application, a drawing application, a rendering application, a word processing application, a website creation application, a disc editing application, a spreadsheet application, a gaming application, a telephony application, a video conferencing application, an email application, an instant messaging application, a fitness support application, a photo management application, a digital camera application, a digital video camcorder application, a web browsing application, a digital music player application, and/or a digital video player application.

Various applications executing on the electronic device optionally use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or varied for different applications and/or within respective applications. As such, a common physical architecture of the electronic device (such as a touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and clear to the user.

Attention is now directed to embodiments of portable devices having touch sensitive displays. FIG. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display system 112 is sometimes referred to as a "touch screen" for convenience and is sometimes simply referred to as a touch-sensitive display. Electronic device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), a memory controller 122, one or more processing units (CPUs) 120, a peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The electronic device 100 optionally includes one or more optical sensors 164. Electronic device 100 optionally includes one or more intensity sensors 165 for detecting intensity of contacts on electronic device 100 (e.g., a touch-sensitive surface, such as touch-sensitive display system 112 of electronic device 100). Electronic device 100 optionally includes one or more tactile output generators 163 for generating tactile outputs on electronic device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of electronic device 100 or trackpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.

As used in this specification and claims, the term "haptic output" refers to a physical displacement of an electronic device relative to a previous position of the electronic device, a physical displacement of a component of an electronic device (e.g., a touch-sensitive surface) relative to another component of the electronic device (e.g., a housing), or a displacement of the component relative to a center of mass of the electronic device that is to be detected by a user with his or her sense of touch. For example, where an electronic device or a component of the electronic device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other portion of a user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in a physical characteristic of the electronic device or the component of the electronic device. For example, movement of the touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is optionally interpreted by the user as a "down click" or "up click" of a physical actuation button. In some cases, the user will feel a tactile sensation, such as a "press click" or "release click," even when the physical actuation button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movement is not moving. As another example, even when there is no change in the smoothness of the touch sensitive surface, the movement of the touch sensitive surface is optionally interpreted or sensed by the user as "roughness" of the touch sensitive surface. While such interpretation of touch by a user will be limited by the user's individualized sensory perception, many sensory perceptions of touch are common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., "up click," "down click," "roughness"), unless otherwise stated, the generated haptic output corresponds to a physical displacement of the electronic device or a component thereof that would generate the sensory perception of a typical (or ordinary) user.

It should be understood that electronic device 100 is only one example of a portable multifunction device, and that electronic device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of these components. The various components shown in fig. 1A are implemented in hardware, software, firmware, or any combination thereof, including one or more signal processing circuits and/or application specific integrated circuits.

The memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of electronic device 100, such as one or more CPUs 120 and peripheral interface 118, is optionally controlled by memory controller 122.

Peripheral interface 118 may be used to couple input peripherals and output peripherals of an electronic device to one or more CPUs 120 and memory 102. The one or more CPUs run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions of electronic device 100 and process data.

In some embodiments, peripherals interface 118, one or more CPUs 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.

RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates via wireless communication with networks, such as the internet (also known as the World Wide Web (WWW)), intranets, and/or wireless networks (such as cellular telephone networks, wireless local area networks (LANs), and/or metropolitan area networks (MANs)), and with other devices. The wireless communication optionally uses any of a number of communication standards, protocols, and/or technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution, Data-Only (EV-DO), HSPA+, Dual-Cell HSPA (DC-HSPA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, email protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and electronic device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signals into sound waves audible to a human. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuit 110 converts the electrical signals to audio data and transmits the audio data to the peripheral interface 118 for processing. Audio data is optionally retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripheral interface 118. In some embodiments, the audio circuit 110 also includes a headset jack (e.g., the headset jack 212 in fig. 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals such as output-only headphones or a headset with both output (e.g., a monaural headphone or a binaural headphone) and input (e.g., a microphone).

The I/O subsystem 106 couples input/output peripheral devices on the electronic device 100, such as touch-sensitive display system 112 and other input or control devices 116, to a peripheral interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. One or more input controllers 160 receive/transmit electrical signals from/to other input or control devices 116. Other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and the like. In some alternative embodiments, one or more input controllers 160 are optionally coupled with (or not coupled with) any of: a keyboard, infrared port, USB port, stylus, and/or pointer device such as a mouse. The one or more buttons (e.g., button 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., push button 206 in fig. 2).

Touch-sensitive display system 112 provides an input interface and an output interface between the electronic device and a user. Display controller 156 receives electrical signals from and/or transmits electrical signals to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to a user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output corresponds to a user interface object. As used herein, the term "affordance" refers to a user-interactive graphical user interface object (e.g., a graphical user interface object configured to respond to input directed to the graphical user interface object). Examples of user interactive graphical user interface objects include, but are not limited to, buttons, sliders, icons, selectable menu items, switches, hyperlinks, or other user interface controls.

Touch-sensitive display system 112 has a touch-sensitive surface, sensor, or group of sensors that accept input from a user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch-sensitive display system 112. In an exemplary embodiment, the point of contact between touch-sensitive display system 112 and the user corresponds to a user's finger or a stylus.

Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive technologies, resistive technologies, infrared technologies, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In one exemplary embodiment, a projected mutual capacitance sensing technique is used, such as that found in devices from Apple Inc. of Cupertino, California.

Touch sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touchscreen video resolution exceeds 400dpi (e.g., 500dpi, 800dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which may not be as accurate as stylus-based input due to the larger contact area of the finger on the touch screen. In some embodiments, the electronic device translates the rough finger-based input into a precise pointer/cursor position or command for performing the action desired by the user.

In some embodiments, in addition to the touch screen, the electronic device 100 optionally includes a trackpad (not shown) for activating or deactivating particular functions. In some embodiments, a trackpad is a touch-sensitive area of an electronic device that, unlike a touchscreen, does not display visual output. The trackpad is optionally a touch-sensitive surface separate from touch-sensitive display system 112, or an extension of the touch-sensitive surface formed by the touch screen.

The electronic device 100 also includes a power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, Alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a Light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in a portable device.

The electronic device 100 optionally further includes one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The one or more optical sensors 164 optionally include Charge Coupled Devices (CCDs) or Complementary Metal Oxide Semiconductor (CMOS) phototransistors. The one or more optical sensors 164 receive light projected through the one or more lenses from the environment and convert the light into data representing an image. In conjunction with imaging module 143 (also referred to as a camera module), one or more optical sensors 164 optionally capture still images and/or video. In some embodiments, an optical sensor is located on the back of electronic device 100, opposite touch-sensitive display system 112 on the front of electronic device 100, enabling the touch screen to serve as a viewfinder for still and/or video image capture. In some embodiments, another optical sensor is located on the front of the electronic device 100 to capture an image of the user (e.g., for self-timer shooting, for video conferencing while the user is viewing other video conference participants on a touch screen, etc.).

Electronic device 100 optionally further includes one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The one or more contact intensity sensors 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors for measuring the force (or pressure) of a contact on a touch-sensitive surface). One or more contact intensity sensors 165 receive contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with or proximate to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of the electronic device 100, opposite the touch screen display system 112 located on the front of the electronic device 100.

The electronic device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is coupled with the input controller 160 in the I/O subsystem 106. In some embodiments, proximity sensor 166 turns off and disables touch-sensitive display system 112 when electronic device 100 is placed near the user's ear (e.g., when the user is making a phone call).

The electronic device 100 optionally further includes one or more tactile output generators 163. FIG. 1A shows a haptic output generator coupled to a haptic feedback controller 161 in I/O subsystem 106. The one or more tactile output generators 163 optionally include one or more electro-acoustic devices (such as speakers or other audio components), and/or electromechanical devices that convert energy into linear motion (such as motors, solenoids, electroactive polymers, piezoelectric actuators, or electrostatic actuators), or other tactile output generating components (e.g., components that convert electrical signals into tactile output on an electronic device). One or more tactile output generators 163 receive tactile feedback generation instructions from the tactile feedback module 133 and generate tactile outputs on the electronic device 100 that can be felt by a user of the electronic device 100. In some embodiments, at least one tactile output generator is collocated with or proximate to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates tactile output by moving the touch-sensitive surface vertically (e.g., into/out of the surface of electronic device 100) or laterally (e.g., back and forth in the same plane as the surface of electronic device 100). In some embodiments, at least one tactile output generator sensor is located on the back of electronic device 100, opposite touch-sensitive display system 112, which is located on the front of electronic device 100.

The electronic device 100 optionally also includes one or more accelerometers 167, gyroscopes 168, and/or magnetometers 169 (e.g., as part of an Inertial Measurement Unit (IMU)) for obtaining information regarding the position (e.g., pose) of the electronic device. FIG. 1A shows sensors 167, 168, and 169 coupled to peripheral interface 118. Alternatively, sensors 167, 168, and 169 are optionally coupled to input controller 160 in I/O subsystem 106. In some embodiments, information is displayed in a portrait view or a landscape view on the touch screen display based on analysis of data received from the one or more accelerometers. The electronic device 100 optionally includes a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information about the position and orientation (e.g., portrait or landscape) of the electronic device 100.
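
As one simple illustration of the portrait/landscape determination mentioned above, the following Swift sketch chooses an orientation from a single accelerometer sample; the decision rule is an assumption made for the example and is far simpler than a production implementation, which would typically filter readings over time.

```swift
import CoreMotion

/// A minimal sketch: treat the device as portrait when gravity dominates the
/// y axis of the accelerometer reading; thresholds and filtering are omitted.
func interfaceIsPortrait(for acceleration: CMAcceleration) -> Bool {
    return abs(acceleration.y) >= abs(acceleration.x)
}
```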

In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a haptic feedback module (or set of instructions) 133, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and an application (or set of instructions) 136. Further, in some embodiments, memory 102 stores device/global internal state 157, as shown in fig. 1A and 3. Device/global internal state 157 includes one or more of: an active application state indicating which applications (if any) are currently active; a display state indicating what applications, views, or other information occupy various areas of touch-sensitive display system 112; sensor states, which include information obtained from various sensors of the electronic device and other input or control devices 116; and position and/or orientation information regarding the position and/or pose of the electronic device.
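
The following Swift sketch illustrates the kind of device/global internal state described above as a simple value type; the field names and types are illustrative assumptions, not the actual data structures stored in memory 102.

```swift
import CoreGraphics

/// A minimal, illustrative sketch of device/global internal state 157.
struct DeviceGlobalState {
    var activeApplications: [String]                    // which applications, if any, are currently active
    var displayState: [String: CGRect]                  // which application or view occupies which display region
    var sensorState: [String: Double]                   // latest readings from device sensors and input devices
    var devicePose: (roll: Double, pitch: Double, yaw: Double)  // position and/or orientation information
}
```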

The operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OSX, WINDOWS, or embedded operating systems such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.

Communications module 128 facilitates communications with other devices through one or more external ports 124 and also includes various software components for processing data received by RF circuitry 108 and/or external ports 124. An external port 124 (e.g., Universal Serial Bus (USB), Firewire, etc.) is adapted to couple directly to other devices or indirectly through a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used in some devices from Apple Inc. In some embodiments, the external port is a Lightning connector that is the same as, similar to, and/or compatible with the Lightning connector used in some devices from Apple Inc.

Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a trackpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to contact detection (e.g., by a finger or stylus), such as determining whether contact has occurred (e.g., detecting a finger-down event), determining the intensity of the contact (e.g., the force or pressure of the contact, or a surrogate for the force or pressure of the contact), determining whether there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining whether contact has ceased (e.g., detecting a finger-lift-off event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single point contacts (e.g., single finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts and/or stylus contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on a trackpad.

The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, the gesture is optionally detected by detecting a particular contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then subsequently detecting a finger-up (lift-off) event. Similarly, taps, swipes, drags, and other gestures of the stylus are optionally detected by detecting a particular contact pattern of the stylus.
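
As an illustration of pattern-based gesture detection, the following Swift sketch classifies a finger-down/finger-up pair as a tap or a swipe using distance and duration thresholds; the ContactEvent type and the specific threshold values are assumptions chosen for the example.

```swift
import CoreGraphics
import Foundation

/// A contact event as assumed for this sketch (not an actual system type).
struct ContactEvent {
    let location: CGPoint
    let timestamp: TimeInterval
}

enum Gesture { case tap, swipe, none }

/// Classify a down/up pair: small movement and short duration reads as a tap;
/// larger movement reads as a swipe.
func classify(down: ContactEvent, up: ContactEvent,
              maxTapDistance: CGFloat = 10,
              maxTapDuration: TimeInterval = 0.3) -> Gesture {
    let distance = hypot(up.location.x - down.location.x,
                         up.location.y - down.location.y)
    let duration = up.timestamp - down.timestamp
    if distance <= maxTapDistance && duration <= maxTapDuration {
        return .tap     // finger-down then finger-up at substantially the same location
    }
    if distance > maxTapDistance {
        return .swipe   // finger-down, drag events, then lift-off at a different location
    }
    return .none
}
```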

In conjunction with accelerometer 167, gyroscope 168, and/or magnetometer 169, the position module 131 optionally detects position information about the electronic device, such as the pose (e.g., roll, pitch, and/or yaw) of the electronic device in a particular frame of reference. The location module 131 includes software components for performing various operations related to detecting the location of the electronic device and detecting a change in the location of the electronic device. In some embodiments, the position module 131 uses information received from a stylus used with the electronic device 100 to detect position information about the stylus, such as detecting a positional state of the stylus relative to the electronic device 100 and detecting a change in the positional state of the stylus.

Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual properties) of displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations and the like.

In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. The graphics module 132 receives one or more codes specifying graphics to be displayed, along with coordinate data and other graphics attribute data if necessary, from an application or the like, and then generates screen image data to output to the display controller 156.

Haptic feedback module 133 includes various software components for generating instructions for use by one or more haptic output generators 163 to produce haptic outputs at one or more locations on electronic device 100 in response to user interaction with electronic device 100.

Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application that requires text input).

The GPS module 135 determines the location of the electronic device 100 and provides this information for use in various applications (e.g., to the phone 138 for location-based dialing; to the camera 143 as picture/video metadata; and to applications that provide location-based services such as weather desktop applets, local yellow pages desktop applets, and map/navigation desktop applets).

The applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:

a contacts module 137 (sometimes referred to as an address book or contact list);

a phone module 138;

a video conferencing module 139;

an email client module 140;

an Instant Messaging (IM) module 141;

fitness support module 142;

a camera module 143 for still and/or video images;

an image management module 144;

a browser module 147;

a calendar module 148;

desktop applet module 149, optionally including one or more of: a weather desktop applet 149-1, a stock market desktop applet 149-2, a calculator desktop applet 149-3, an alarm desktop applet 149-4, a dictionary desktop applet 149-5, other desktop applets acquired by the user, and a user-created desktop applet 149-6;

A desktop applet creator module 150 for forming a user-created desktop applet 149-6;

a search module 151;

a video and music player module 152, optionally consisting of a video player module and a music player module;

a memo module 153;

a map module 154;

an online video module 155; and/or

An annotation application 195 for providing annotations to the user interface and optionally storing and/or accessing saved annotations 196 in memory 102.

Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, rendering applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions for managing an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding a name to the address book; deleting names from the address book; associating a telephone number, email address, physical address, or other information with a name; associating an image with a name; sorting and categorizing names; providing a telephone number or email address to initiate and/or facilitate communications through the telephone module 138, the video conference module 139, the email client module 140, or the IM module 141; and so on.

In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, phone module 138 includes executable instructions for: entering a sequence of characters corresponding to a telephone number, accessing one or more telephone numbers in address book 137, modifying an entered telephone number, dialing a corresponding telephone number, conducting a conversation, and disconnecting or hanging up a telephone when the conversation is completed. As noted above, the wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.

In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephony module 138, video conference module 139 includes executable instructions for initiating, conducting, and terminating video conferences between the user and one or more other participants according to user instructions.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send an email with a still image or a video image captured by the camera module 143.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant messaging module 141 includes executable instructions for: entering a sequence of characters corresponding to an instant message, modifying previously entered characters, sending a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Messaging Service (MMS) protocol for telephone-based instant messages or using XMPP, SIMPLE, Apple Push Notification Service (APNs) or IMPS for internet-based instant messages), receiving an instant message, and viewing the received instant message. In some embodiments, the transmitted and/or received instant messages optionally include graphics, photos, audio files, video files, and/or MMS and/or other attachments supported in an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, fitness support module 142 includes executable instructions for: creating workouts (e.g., with time, distance, and/or calorie-burning goals); communicating with workout sensors (in sports devices and smart watches); receiving workout sensor data; calibrating sensors used to monitor a workout; selecting and playing music for a workout; and displaying, storing, and transmitting workout data.

In conjunction with touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to: capturing still images or video (including video streams) and storing them in the memory 102, modifying characteristics of the still images or video, and/or deleting the still images or video from the memory 102.

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for arranging, modifying (e.g., editing) or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still and/or video images.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet according to user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do, etc.) according to user instructions.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, desktop applet module 149 is a mini-application (e.g., weather desktop applet 149-1, stock desktop applet 149-2, calculator desktop applet 149-3, alarm desktop applet 149-4, and dictionary desktop applet 149-5) or a mini-application created by a user (e.g., user-created desktop applet 149-6) that is optionally downloaded and used by the user. In some embodiments, the desktop applet includes an HTML (HyperText markup language) file, a CSS (cascading Style sheet) file, and a JavaScript file. In some embodiments, the desktop applet includes an XML (extensible markup language) file and a JavaScript file (e.g., Yahoo! desktop applet).

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, desktop applet creator module 150 includes executable instructions for creating a desktop applet (e.g., turning a user-specified portion of a web page into a desktop applet).

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching memory 102 for text, music, sound, images, videos, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, speakers 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions to allow a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch-sensitive display system 112 or on an external display wirelessly connected via external port 124). In some embodiments, the electronic device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple inc. of Cupertino, California).

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, memo module 153 includes executable instructions for creating and managing memos, backlogs, and the like according to user instructions.

In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions for receiving, displaying, modifying, and storing maps and data associated with maps (e.g., driving directions; data for stores and other points of interest at or near a particular location; and other location-based data) according to user instructions.

In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, audio circuit 110, speaker 111, RF circuit 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes executable instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on touch screen 112 or on an external display that is wirelessly connected or connected via external port 124), send emails with links to particular online videos, and otherwise manage online videos in one or more file formats, such as h.264. In some embodiments, the link to the particular online video is sent using instant messaging module 141 instead of email client module 140.

Each of the modules and applications identified above corresponds to a set of executable instructions for performing one or more of the functions described above as well as the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.

In some embodiments, electronic device 100 is an electronic device in which operation of a predefined set of functions on the electronic device is performed exclusively through a touch screen and/or a trackpad. By using a touchscreen and/or trackpad as the primary input control device for operating the electronic device 100, the number of physical input control devices (e.g., push buttons, dials, etc.) on the electronic device 100 is optionally reduced.

The predefined set of functions performed exclusively through the touchscreen and/or trackpad optionally includes navigation between user interfaces. In some embodiments, the trackpad, when touched by a user, navigates electronic device 100 from any user interface displayed on electronic device 100 to a main, home, or root menu. In such embodiments, a "menu button" is implemented using a touch pad. In some other embodiments, the menu button is a physical push button or other physical input control device, rather than a touchpad.

FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (in FIG. 1A) or memory 370 (in FIG. 3) includes event classifier 170 (e.g., in operating system 126) and corresponding application 136-1 (e.g., any of the aforementioned applications 136, 137-155, 380-390).

Event classifier 170 receives the event information and determines the application 136-1 to which the event information is to be delivered and the application view 191 of application 136-1. The event classifier 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates one or more current application views displayed on the touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event classifier 170 to determine which application(s) are currently active, and application internal state 192 is used by event classifier 170 to determine the application view 191 to which to deliver the event information.

In some embodiments, the application internal state 192 includes additional information, such as one or more of: resume information to be used when the application 136-1 resumes execution, user interface state information indicating information being displayed or ready for display by the application 136-1, a state queue for enabling a user to return to a previous state or view of the application 136-1, and a redo/undo queue of previous actions taken by the user.

Event monitor 171 receives event information from peripheral interface 118. The event information includes information about a sub-event (e.g., a user touch on touch-sensitive display system 112 as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or sensors such as proximity sensor 166, one or more accelerometers 167, one or more gyroscopes 168, one or more magnetometers 169, and/or microphone 113 (through audio circuitry 110). Information received by peripheral interface 118 from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.

In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, peripheral interface 118 transmits the event information. In other embodiments, peripheral interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).

In some embodiments, event classifier 170 further includes hit view determination module 172 and/or active event recognizer determination module 173. When touch-sensitive display system 112 displays more than one view, hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. The view consists of controls and other elements that the user can see on the display.

Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a programmatic level within a programmatic or view hierarchy of applications. For example, the lowest level view in which a touch is detected is optionally referred to as a hit view, and the set of events identified as correct inputs is optionally determined based at least in part on the hit view of the initial touch that initiated the touch-based gesture.

Hit view determination module 172 receives information related to sub-events of the touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should handle sub-events. In most cases, the hit view is the lowest level view in which the initiating sub-event (i.e., the first sub-event in the sequence of sub-events that form an event or potential event) occurs. Once a hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
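
The following Swift sketch illustrates the general idea of hit-view determination: walking a view hierarchy to find the lowest view whose bounds contain the initial touch. The View type here is a stand-in defined for the example, not UIKit's UIView, and the traversal details are assumptions.

```swift
import CoreGraphics

/// A stand-in view type for this sketch; frames are expressed in a single
/// shared coordinate space to keep the example short.
final class View {
    let frame: CGRect
    let subviews: [View]
    init(frame: CGRect, subviews: [View] = []) {
        self.frame = frame
        self.subviews = subviews
    }
}

/// Return the deepest view containing `point`, preferring later (topmost) subviews.
func hitView(in root: View, at point: CGPoint) -> View? {
    guard root.frame.contains(point) else { return nil }
    for subview in root.subviews.reversed() {
        if let deeper = hitView(in: subview, at: point) {
            return deeper
        }
    }
    return root
}
```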

The active event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some embodiments, the active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of the sub-event are actively participating views, and thus determines that all actively participating views should receive a particular sequence of sub-events. In other embodiments, even if the touch sub-event is completely confined to the area associated with one particular view, the higher views in the hierarchy will remain as actively participating views.

The event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers event information to event recognizers determined by active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue, which is retrieved by the respective event receiver module 182.

In some embodiments, the operating system 126 includes an event classifier 170. Alternatively, application 136-1 includes event classifier 170. In yet another embodiment, the event classifier 170 is a stand-alone module or is part of another module stored in the memory 102, such as the contact/motion module 130.

In some embodiments, the application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, the respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface toolkit (not shown) or a higher-level object from which application 136-1 inherits methods and other properties. In some embodiments, the respective event handlers 190 include one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event classifier 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Additionally, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.

The corresponding event recognizer 180 receives event information (e.g., event data 179) from the event classifier 170 and recognizes events according to the event information. The event recognizer 180 includes an event receiver module 182 and an event comparator 184. In some embodiments, event recognizer 180 also includes metadata 183 and at least a subset of event delivery instructions 188 (which optionally include sub-event delivery instructions).

The event receiver module 182 receives event information from the event classifier 170. The event information includes information about a sub-event such as a touch or touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When the sub-event relates to motion of a touch, the event information optionally also includes the velocity and direction of the sub-event. In some embodiments, the event includes rotation of the electronic device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation of the electronic device (also referred to as the device pose).

Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sub-event sequences), such as event 1 (187-1), event 2 (187-2), and other events. In some embodiments, sub-events in event 187 include, for example, touch start, touch end, touch move, touch cancel, and multi-touch. In one example, the definition of event 1 (187-1) is a double tap on a displayed object. For example, the double tap includes a first touch (touch start) on the displayed object for a predetermined length of time, a first lift-off (touch end) for a predetermined length of time, a second touch (touch start) on the displayed object for a predetermined length of time, and a second lift-off (touch end) for a predetermined length of time. In another example, the definition of event 2 (187-2) is a drag on a displayed object. For example, the drag includes a touch (or contact) on the displayed object for a predetermined length of time, movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
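
As a hedged illustration, a predefined sub-event sequence such as the double tap or drag described above might be expressed and matched as in the following sketch; the enumeration cases and matching rules are assumptions made for clarity and omit the timing constraints (e.g., the predetermined lengths of time) mentioned above.

    // Illustrative sub-event vocabulary and sequence matching; not the claimed event definitions 186.
    enum SubEvent: Equatable {
        case touchBegan, touchMoved, touchEnded, touchCancelled
    }

    // Event 1 (double tap): touch start, lift-off, touch start, lift-off on the same displayed object.
    func matchesDoubleTap(_ sequence: [SubEvent]) -> Bool {
        return sequence == [.touchBegan, .touchEnded, .touchBegan, .touchEnded]
    }

    // Event 2 (drag): touch start, one or more movements across the display, then lift-off.
    func matchesDrag(_ sequence: [SubEvent]) -> Bool {
        guard sequence.count >= 3,
              sequence.first == .touchBegan,
              sequence.last == .touchEnded else { return false }
        return sequence.dropFirst().dropLast().allSatisfy { $0 == .touchMoved }
    }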

In some embodiments, event definition 187 includes definitions of events for respective user interface objects. In some embodiments, event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects the event handler associated with the sub-event and the object triggering the hit test.

In some embodiments, the definition of the respective event 187 further includes delayed actions that delay the delivery of event information until it has been determined whether the sequence of sub-events does or does not correspond to the event type of the event recognizer.

When the respective event recognizer 180 determines that the sequence of sub-events does not match any event in the event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which subsequent sub-events of the touch-based gesture are ignored. In this case, other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.

In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable attributes, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively participating event recognizers. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.

In some embodiments, when one or more particular sub-events of an event are recognized, the respective event recognizer 180 activates an event handler 190 associated with the event. In some embodiments, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating the event handler 190 is distinct from sending (and deferred sending of) sub-events to the respective hit view. In some embodiments, the event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.

In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about sub-events without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the sequence of sub-events or to actively participating views. Event handlers associated with the sequence of sub-events or with actively participating views receive the event information and perform a predetermined process.

In some embodiments, data updater 176 creates and updates data for use in application 136-1. For example, data updater 176 updates a phone number used in contacts module 137 or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user interface object or updates the location of a user interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends the display information to graphics module 132 for display on the touch-sensitive display.

In some embodiments, event handler 190 includes, or has access to, data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.

It should be understood that the above discussion of event processing with respect to user touches on a touch-sensitive display also applies to other forms of user input utilizing an input device to operate multifunction device 100, not all of which are initiated on a touch screen. For example, mouse movements and mouse button presses, optionally in conjunction with single or multiple keyboard presses or holds; contact movements on the touchpad, such as taps, drags, scrolls, etc.; stylus inputs; movement of the electronic device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof, are optionally used as inputs corresponding to sub-events that define an event to be recognized.

FIG. 2 illustrates a portable multifunction device 100 with a touch screen (e.g., touch-sensitive display system 112 of FIG. 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within the User Interface (UI) 200. In this and other embodiments described below, a user can select one or more of these graphics by making gestures on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics will occur when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (left to right, right to left, up, and/or down), and/or a roll of a finger (right to left, left to right, up, and/or down) that has made contact with the electronic device 100. In some embodiments or in some cases, inadvertent contact with the graphic does not select the graphic. For example, when the gesture corresponding to the selection is a tap, a swipe gesture that swipes over the application icon optionally does not select the corresponding application.

Stylus 203 includes a first end 276 and a second end 277. In various embodiments, first end 276 corresponds to a tip of stylus 203 (e.g., a tip of a pencil) and second end 277 corresponds to an opposite or bottom end of stylus 203 (e.g., an eraser of a pencil).

Stylus 203 includes a touch-sensitive surface 275 to receive touch inputs from a user. In some embodiments, touch-sensitive surface 275 corresponds to a capacitive touch element. Stylus 203 includes a sensor or group of sensors that detect input from a user based on tactile and/or haptic contact with touch-sensitive surface 275. In some embodiments, stylus 203 includes any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive technologies, resistive technologies, infrared technologies, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive surface 275. Since stylus 203 includes various sensors and types of sensors, stylus 203 can detect various inputs from a user, including the gestures disclosed herein with respect to the touch screen of portable multifunction device 100. In some embodiments, the one or more sensors detect a single touch input or successive touch inputs in response to a user tapping once or multiple times on the touch-sensitive surface 275. In some embodiments, the one or more sensors detect a swipe input on stylus 203 in response to a user swiping one or more fingers along touch-sensitive surface 275. In some embodiments, if the speed with which the user swipes along the touch-sensitive surface 275 exceeds a threshold, the one or more sensors detect a flick input instead of a swipe input.
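
As a hedged sketch of the classification described above, the following distinguishes a tap, a swipe, and a flick from the distance and duration of contact movement along touch-sensitive surface 275; the numeric thresholds and the unit of distance are illustrative assumptions only.

    import Foundation

    // Illustrative classification of touch input received at touch-sensitive surface 275.
    enum BarrelInput { case tap, swipe, flick }

    func classifyBarrelInput(distanceTraveled: Double,       // assumed millimeters along the surface
                             duration: TimeInterval) -> BarrelInput {
        let movementThreshold = 2.0      // mm: little or no movement is treated as a tap
        let flickSpeedThreshold = 80.0   // mm/s: faster than this, a swipe is treated as a flick
        guard distanceTraveled > movementThreshold else { return .tap }
        let speed = distanceTraveled / max(duration, 0.001)
        return speed > flickSpeedThreshold ? .flick : .swipe
    }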

Stylus 203 also includes one or more sensors, such as accelerometers, magnetometers, gyroscopes, etc., that detect orientation (e.g., angular position relative to the electronic device) and/or movement of stylus 203. The one or more sensors may detect various rotational movements of stylus 203 by the user, including the type and direction of the rotation. For example, the one or more sensors may detect that the user rolled and/or turned stylus 203, and may detect the direction of the rolling/turning (e.g., clockwise or counterclockwise). In some embodiments, the detected input depends on the angular position of the first and second ends 276, 277 of the stylus 203 relative to the electronic device. For example, in some embodiments, if stylus 203 is substantially perpendicular to electronic device 100 and second end 277 (e.g., an eraser) is closer to the electronic device, contacting the surface of the electronic device with second end 277 results in an erase operation. On the other hand, if the stylus 203 is substantially perpendicular to the electronic device and the first end 276 (e.g., tip) is closer to the electronic device, contacting the surface of the electronic device with the first end 276 results in a marking operation.
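
The end-dependent behavior described above can be summarized by the following minimal sketch, assuming the electronic device already knows which stylus end is contacting its surface and whether the stylus is roughly perpendicular to it; other poses are simply left unhandled here, and the type names are assumptions.

    // Illustrative mapping of the contacting stylus end to an operation; not a claimed implementation.
    enum StylusEnd {
        case first    // first end 276, e.g., the tip of the stylus
        case second   // second end 277, e.g., the eraser of the stylus
    }

    enum SurfaceOperation { case marking, erasing }

    func operation(forContacting end: StylusEnd,
                   stylusRoughlyPerpendicular: Bool) -> SurfaceOperation? {
        guard stylusRoughlyPerpendicular else { return nil }   // other orientations handled elsewhere
        switch end {
        case .first:  return .marking
        case .second: return .erasing
        }
    }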

Electronic device 100 optionally also includes one or more physical buttons, such as a "home" or menu button 204. As previously described, the menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on the electronic device 100. Alternatively, in some embodiments, the menu button 204 is implemented as a soft key in a GUI displayed on a touch screen display.

In some embodiments, electronic device 100 includes a touch screen display, menu buttons 204, a push button 206 for powering the electronic device on/off and for locking electronic device 100, a volume adjustment button 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and docking/charging external port 124. The push button 206 is optionally used to turn the electronic device on/off by pressing the push button 206 and maintaining the push button 206 in a pressed state for a predefined time interval; to lock the electronic device 100 by pressing the push button 206 and releasing the push button 206 before the predefined time interval has elapsed; and/or to unlock electronic device 100 or initiate an unlocking process. In some embodiments, electronic device 100 also accepts voice input through microphone 113 for activating or deactivating certain functions. Electronic device 100 also optionally includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 163 for generating tactile outputs for a user of electronic device 100.
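
A hedged sketch of the press-duration behavior of push button 206 described above follows; the particular predefined time interval is an assumption made for illustration.

    import Foundation

    // Illustrative interpretation of push button 206 based on how long it is held.
    enum PushButtonAction { case togglePower, lockDevice }

    func pushButtonAction(holdDuration: TimeInterval) -> PushButtonAction {
        let predefinedInterval: TimeInterval = 2.0   // assumed predefined time interval
        return holdDuration >= predefinedInterval ? .togglePower : .lockDevice
    }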

FIG. 3 is a block diagram of an exemplary multifunction device 300 with a display and a touch-sensitive surface in accordance with some embodiments. The electronic device 300 need not be portable. In some embodiments, the electronic device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child learning toy), a gaming system, or a control device (e.g., a home controller or an industrial controller). Electronic device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. One or more communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communication between system components. Electronic device 300 includes an input/output (I/O) interface 330 with a display 340, which is typically a touch screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and a touchpad 355, a tactile output generator 357 (e.g., similar to tactile output generator 163 described above with reference to fig. 1A) for generating tactile outputs on electronic device 300, a sensor 359 (e.g., a touch-sensitive sensor, an optical sensor, a contact intensity sensor, a proximity sensor, an acceleration sensor, a gesture sensor, and/or a magnetic sensor similar to sensors 112, 164, 165, 166, 167, 168, and 169 described above with reference to fig. 1A). Memory 370 includes high speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Memory 370 optionally includes one or more storage devices remotely located from one or more CPUs 310. In some embodiments, memory 370 stores programs, modules, and data structures similar to or a subset of the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (FIG. 1A). Further, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk editing module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (FIG. 1A) optionally does not store these modules.

Each of the above identified elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.

Fig. 4 is a block diagram of an exemplary electronic stylus 203, according to some embodiments. The electronic stylus 203 is sometimes simply referred to as a stylus. Stylus 203 includes memory 402 (which optionally includes one or more computer-readable storage media), a memory controller 422, one or more processing units (CPUs) 420, a peripheral interface 418, RF circuitry 408, an input/output (I/O) subsystem 406, and other input or control devices 416. Stylus 203 optionally includes an external port 424 and one or more optical sensors 464. Stylus 203 optionally includes one or more intensity sensors 465 for detecting the intensity of contacts of stylus 203 on device 100 (e.g., when stylus 203 is used with a touch-sensitive surface such as touch-sensitive display system 112 of electronic device 100) or on other surfaces (e.g., a table surface). Stylus 203 optionally includes one or more tactile output generators 463 for generating tactile outputs on stylus 203. These components optionally communicate over one or more communication buses or signal lines 403.

In some embodiments, the term "haptic output" discussed above refers to a physical displacement of an accessory (e.g., stylus 203) of an electronic device (e.g., electronic device 100) relative to a previous position of the accessory, a physical displacement of a component of the accessory relative to another component of the accessory, or a displacement of the component relative to a center of mass of the accessory that is to be detected by a user with his or her sense of touch. For example, where the accessory or a component of the accessory is in contact with a surface of the user that is sensitive to touch (e.g., a finger, palm, or other portion of the user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in a physical characteristic of the accessory or component of the accessory. For example, movement of a component (e.g., the housing of stylus 203) is optionally interpreted by a user as a "click" of a physical actuation button. In some cases, the user will feel a tactile sensation, such as a "click," even when the physical actuation button associated with the stylus that is physically pressed (e.g., displaced) by the user's movement is not moved. While such interpretation of touch by a user will be limited by the user's individualized sensory perception, many sensory perceptions of touch are common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., "click"), unless otherwise stated, the generated haptic output corresponds to a physical displacement of the electronic device or a component thereof that would generate the sensory perception of a typical (or ordinary) user.

It should be understood that stylus 203 is merely one example of an electronic stylus, and that stylus 203 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of these components. The various components shown in fig. 4 are implemented in hardware, software, firmware, or any combination thereof, including one or more signal processing circuits and/or application specific integrated circuits.

The memory 402 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more flash memory devices or other non-volatile solid-state memory devices. Access to memory 402 by other components of stylus 203, such as one or more CPUs 420 and peripheral interface 418, is optionally controlled by memory controller 422.

Peripherals interface 418 can be used to couple stylus input and output peripherals to one or more CPUs 420 and memory 402. One or more CPUs 420 run or execute various software programs and/or sets of instructions stored in memory 402 to perform various functions of stylus 203 and process data.

In some embodiments, peripherals interface 418, one or more CPUs 420, and memory controller 422 are optionally implemented on a single chip, such as chip 404. In some other embodiments, they are optionally implemented on separate chips.

The RF (radio frequency) circuitry 408 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 408 converts electrical signals to/from electromagnetic signals and communicates with the electronic device 100 or electronic device 300, a communication network, and/or other communication devices via electromagnetic signals. RF circuitry 408 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and the like. RF circuitry 408 optionally communicates with networks such as the internet, also known as the World Wide Web (WWW), intranets, and/or wireless networks such as cellular telephone networks, wireless Local Area Networks (LANs), and/or Metropolitan Area Networks (MANs), and other devices via wireless communications. The wireless communication optionally uses any of a number of communication standards, protocols, and/or technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution-Data Only (EV-DO), HSPA+, Dual-Cell HSPA (DC-HSPA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, email protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

I/O subsystem 406 couples input/output peripherals on stylus 203, such as other input or control devices 416, to peripheral interface 418. The I/O subsystem 406 optionally includes an optical sensor controller 458, an intensity sensor controller 459, a haptic feedback controller 461, and one or more input controllers 460 for other input or control devices. One or more input controllers 460 receive/transmit electrical signals from/to other input or control devices 416. Other input or control devices 416 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slide switches, click wheels, and the like. In some alternative embodiments, one or more input controllers 460 are optionally coupled to (or not coupled to) any of the following: an infrared port and/or a USB port.

Stylus 203 also includes a power system 462 for powering the various components. Power system 462 optionally includes a power management system, one or more power sources (e.g., battery, Alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a Light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in portable devices and/or portable accessories.

Stylus 203 optionally also includes one or more optical sensors 464. FIG. 4 shows an optical sensor coupled to optical sensor controller 458 in I/O subsystem 406. The one or more optical sensors 464 optionally include Charge Coupled Devices (CCD) or Complementary Metal Oxide Semiconductor (CMOS) phototransistors. The one or more optical sensors 464 receive light projected through the one or more lenses from the environment and convert the light into data representing an image.

Stylus 203 optionally also includes one or more contact intensity sensors 465. Fig. 4 shows a contact intensity sensor coupled to an intensity sensor controller 459 in the I/O subsystem 406. The one or more contact intensity sensors 465 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors for measuring the force (or pressure) of a contact on a surface). One or more contact intensity sensors 465 receive contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is juxtaposed or adjacent to the tip of stylus 203.

Stylus 203 optionally also includes one or more proximity sensors 466. Fig. 4 shows a proximity sensor 466 coupled to the peripheral interface 418. Alternatively, the proximity sensor 466 is coupled with the input controller 460 in the I/O subsystem 406. In some embodiments, proximity sensor 466 determines the proximity of stylus 203 to an electronic device (e.g., electronic device 100).

Stylus 203 optionally also includes one or more tactile output generators 463. Fig. 4 shows a haptic output generator 463 coupled to a haptic feedback controller 461 in the I/O subsystem 406. The one or more tactile output generators 463 optionally include one or more electro-acoustic devices (such as speakers or other audio components), and/or electromechanical devices that convert energy into linear motion (such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators), or other tactile output generating components (e.g., components that convert electrical signals into tactile output on an electronic device). One or more tactile output generators 463 receive tactile feedback generation instructions from the tactile feedback module 433 and generate tactile outputs on the stylus 203 that can be felt by a user of the stylus 203. In some embodiments, at least one tactile output generator 463 is collocated with or adjacent to a length (e.g., a body or housing) of stylus 203, and optionally generates tactile output by moving stylus 203 vertically (e.g., in a direction parallel to the length of stylus 203) or laterally (e.g., in a direction perpendicular to the length of stylus 203).

Stylus 203 optionally also includes one or more accelerometers 467, gyroscopes 468, and/or magnetometers 469 (e.g., as part of an Inertial Measurement Unit (IMU)) for obtaining information regarding the position and positional state of stylus 203. Fig. 4 shows sensors 467, 468, and 469 coupled to peripheral interface 418. Alternatively, sensors 467, 468, and 469 are optionally coupled with input controller 460 in I/O subsystem 406. Stylus 203 optionally includes a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information about the position of stylus 203.

Stylus 203 includes a touch sensitive system 432. The touch sensitive system 432 detects input received at the touch sensitive surface 275. These inputs include those discussed herein with respect to touch-sensitive surface 275 of stylus 203. For example, the touch sensitive system 432 may detect tap, turn, roll, flick, and swipe inputs. The touch sensitive system 432 cooperates with the touch interpretation module 477 to decipher certain types of touch input (e.g., turn/roll/flick/swipe/etc.) received at the touch-sensitive surface 275.

In some embodiments, the software components stored in memory 402 include an operating system 426, a communication module (or set of instructions) 428, a contact/motion module (or set of instructions) 430, a location module (or set of instructions) 431, and a Global Positioning System (GPS) module (or set of instructions) 435. Further, in some embodiments, memory 402 stores device/global internal state 457, as shown in fig. 4. Further, although not shown, memory 402 includes a touch interpretation module 477. Device/global internal state 457 includes one or more of: sensor states, including information obtained from various sensors of the stylus and other input or control devices 416; a position state comprising information about the position (e.g., position, orientation, tilt, roll, and/or distance, as shown in fig. 5A and 5B) of the stylus relative to an electronic device (e.g., electronic device 100); and location information regarding the stylus location (e.g., determined by GPS module 435).

The operating system 426 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OSX, WINDOWS, or embedded operating systems such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, power management, etc.) and facilitates communication between various hardware and software components.

The communication module 428 optionally facilitates communication with other devices through one or more external ports 424, and also includes various software components for processing data received by the RF circuitry 408 and/or the external ports 424. External port 424 (e.g., Universal Serial Bus (USB), FireWire, etc.) is adapted to couple directly to other devices or indirectly through a network (e.g., the internet, wireless LAN, etc.). In some embodiments, external port 424 is a Lightning connector that is the same as, or similar to and/or compatible with, the Lightning connector used in some devices from Apple Inc. of Cupertino, California.

Contact/motion module 430 optionally detects contact with stylus 203 and other touch-sensitive devices of stylus 203 (e.g., buttons or other touch-sensitive components of stylus 203). Contact/motion module 430 includes software components for performing various operations related to the detection of contact (e.g., the detection of contact of the tip of stylus 203 with a touch-sensitive display, such as touch screen 112 of electronic device 100, or with another surface, such as a table surface), such as determining whether a contact has occurred (e.g., detecting a touch down event), determining the strength of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking the movement (e.g., across touch screen 112 of electronic device 100), and determining whether the contact has ceased (e.g., detecting a lift off event or an interruption of the contact). In some embodiments, the contact/motion module 430 receives contact data from the I/O subsystem 406. Determining movement of the point of contact optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the point of contact, the movement of the point of contact being represented by a series of contact data. As described above, in some embodiments, one or more of these operations related to the detection of contact are performed by electronic device 100 using contact/motion module 130 (in addition to or in place of stylus 203 using contact/motion module 430).

Contact/motion module 430 optionally detects gesture input through stylus 203. Different gestures made with stylus 203 have different contact patterns (e.g., different motion, timing, and/or intensity of detected contact). Thus, a gesture is optionally detected by detecting a particular contact pattern. For example, detecting a single tap gesture includes detecting a touch down event followed by detecting a lift off event at the same location (or substantially the same location) as the touch down event (e.g., at the location of an icon). As another example, detecting a swipe gesture includes detecting a touch down event, followed by detecting one or more stylus drag events, and followed by detecting a lift off event. As described above, in some embodiments, gesture detection is performed by electronic device 100 using contact/motion module 130 (in addition to or in place of stylus 203 using contact/motion module 430).
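
As a non-limiting sketch of the contact-pattern-based gesture detection described above, the following distinguishes a single tap gesture from a swipe gesture using the recorded sequence of touch down, drag, and lift off events; the event representation and the small displacement tolerance are assumptions.

    // Illustrative contact pattern and gesture detection; not the contact/motion module itself.
    enum ContactEvent {
        case touchDown(x: Double, y: Double)
        case drag(x: Double, y: Double)
        case liftOff(x: Double, y: Double)
    }

    enum DetectedGesture { case singleTap, swipe, unknown }

    func detectGesture(from events: [ContactEvent]) -> DetectedGesture {
        guard case .touchDown(let x0, let y0)? = events.first,
              case .liftOff(let x1, let y1)? = events.last else { return .unknown }
        let dragCount = events.filter { if case .drag = $0 { return true } else { return false } }.count
        let displacement = ((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0)).squareRoot()
        if dragCount == 0 && displacement < 10 { return .singleTap }  // lift off at (substantially) the same location
        if dragCount > 0 { return .swipe }                            // touch down, drag events, lift off
        return .unknown
    }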

In conjunction with accelerometer 467, gyroscope 468, and/or magnetometer 469, position module 431 optionally detects position information about stylus 203, such as the stylus's pose (roll, pitch, and/or yaw) in a particular frame of reference. In conjunction with accelerometer 467, gyroscope 468, and/or magnetometer 469, position module 431 optionally detects stylus movement gestures, such as flicking, tapping, and rotation of stylus 203. The position module 431 includes software components for performing various operations related to detecting the position of the stylus and detecting a change in the position of the stylus in a particular reference frame. In some embodiments, position module 431 detects a position state of stylus 203 relative to electronic device 100 and detects a change in the position state of stylus 203 relative to electronic device 100. As described above, in some embodiments, electronic device 100 or electronic device 300 uses position module 131 to determine the positional state of stylus 203 relative to electronic device 100 and the change in the positional state of stylus 203 (in addition to or in place of stylus 203 using position module 431).

The haptic feedback module 433 includes various software components for generating instructions for use by one or more haptic output generators 463 to produce haptic output at one or more locations on the stylus 203 in response to user interaction with the stylus 203.

GPS module 435 determines the location of stylus 203 and provides this information for use in various applications (e.g., to applications that provide location-based services, such as applications for finding lost devices and/or accessories).

Touch interpretation module 477 cooperates with touch-sensitive system 432 to determine (e.g., decipher or identify) the type of touch input received at touch-sensitive surface 275 of stylus 203. For example, if the user moves a contact a sufficient distance across the touch-sensitive surface 275 in a sufficiently short amount of time, the touch interpretation module 477 determines that the touch input corresponds to a swipe input (rather than a tap input). As another example, if the user swipes across the touch-sensitive surface 275 at a speed that exceeds the speed corresponding to a swipe input, the touch interpretation module 477 determines that the touch input corresponds to a flick input (rather than a swipe input). The threshold speed of the stroke may be preset and may be varied. In various embodiments, the pressure and/or force of a touch received at the touch-sensitive surface determines the type of input. For example, a light touch may correspond to a first type of input, while a more forceful touch may correspond to a second type of input.
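
The force-based distinction described above can be sketched as follows; the normalized force scale and the default threshold are assumptions made for illustration (the speed-based swipe/flick distinction is analogous to the barrel-input sketch given earlier).

    // Illustrative force-based classification of touch input on touch-sensitive surface 275.
    enum ForceInputType { case firstType, secondType }

    func forceInputType(normalizedForce: Double,             // assumed 0...1 scale
                        forceThreshold: Double = 0.5) -> ForceInputType {
        // A light touch maps to the first input type; a more forceful touch maps to the second.
        return normalizedForce < forceThreshold ? .firstType : .secondType
    }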

Each of the modules and applications identified above corresponds to a set of executable instructions for performing one or more of the functions described above as well as the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 402 optionally stores a subset of the modules and data structures described above. Further, memory 402 optionally stores additional modules and data structures not described above.

Fig. 5A-5B illustrate positional states of stylus 203 relative to a touch-sensitive surface (e.g., touch screen 112 of electronic device 100) in accordance with some embodiments. In some embodiments, the positional state of stylus 203 corresponds to (or indicates): a projected location of a tip (or other representative portion) of stylus 203 on the touch-sensitive surface (e.g., (x, y) location 504 in fig. 5A), an orientation of stylus 203 relative to the touch-sensitive surface (e.g., orientation 506 in fig. 5A), a tilt of stylus 203 relative to the touch-sensitive surface (e.g., tilt 512 in fig. 5B), and/or a distance of stylus 203 relative to the touch-sensitive surface (e.g., distance 514 in fig. 5B). In some embodiments, the positional state of stylus 203 corresponds to (or indicates) a pitch, yaw, and/or rotation of the stylus (e.g., a pose of stylus 203 relative to a particular frame of reference such as a touch-sensitive surface (e.g., touch screen 112) or the ground). In some embodiments, the location state includes a set of location parameters (e.g., one or more location parameters). In some embodiments, the position state is detected from one or more measurements from stylus 203 transmitted to the electronic device (e.g., electronic device 100). For example, stylus 203 measures the tilt (e.g., tilt 512 in fig. 5B) and/or orientation (e.g., orientation 506 in fig. 5A) of the stylus and transmits the measurements to electronic device 100. In some embodiments, instead of or in conjunction with the position state detected from one or more measurements from stylus 203, the position state is detected from raw output from one or more electrodes in the stylus as sensed by the touch-sensitive surface (e.g., touch screen 112 of electronic device 100). For example, the touch-sensitive surface receives raw output from one or more electrodes in stylus 203, and calculates the tilt and/or orientation of stylus 203 based on the raw output (optionally in conjunction with position state information provided by stylus 203 based on sensor measurements generated by stylus 203).

FIG. 5A illustrates stylus 203 with respect to a touch-sensitive surface (e.g., touch screen 112 of electronic device 100) viewed from a perspective directly above the touch-sensitive surface, in accordance with some embodiments. In FIG. 5A, z-axis 594 points out of the page (i.e., in a direction perpendicular to the plane of touch screen 112), x-axis 590 is parallel to a first edge (e.g., length) of touch screen 112, y-axis 592 is parallel to a second edge (e.g., width) of touch screen 112, and y-axis 592 is perpendicular to x-axis 590.

Fig. 5A shows the tip of stylus 203 at (x, y) position 504. In some embodiments, the tip of stylus 203 is the end of stylus 203 that is configured to determine the proximity of stylus 203 to a touch-sensitive surface (e.g., touch screen 112). In some embodiments, the projection of the tip of stylus 203 on the touch-sensitive surface is an orthogonal projection. In other words, the projection of the tip of stylus 203 on the touch-sensitive surface is a point (e.g., (x, y) location 504) at the end of a line from the stylus tip to the touch-sensitive surface that is perpendicular to the surface of the touch-sensitive surface, at which point the tip of stylus 203 would touch the touch-sensitive surface if stylus 203 were moved directly along a path perpendicular to the touch-sensitive surface. In some embodiments, the (x, y) location at the lower left corner of touch screen 112 is location (0,0) (e.g., (0,0) location 502) and the other (x, y) locations on touch screen 112 are relative to the lower left corner of touch screen 112. Alternatively, in some embodiments, the (0,0) location is located at another location of touch screen 112 (e.g., in the center of touch screen 112) and the other (x, y) locations are relative to the (0,0) location of touch screen 112.

Additionally, fig. 5A shows that stylus 203 has an orientation 506. In some embodiments, orientation 506 is an orientation of a projection of stylus 203 on touch screen 112 (e.g., an orthogonal projection of a length of stylus 203, or a line corresponding to the line between the projections of two different points of stylus 203 on touch screen 112). In some embodiments, orientation 506 is relative to at least one axis in a plane parallel to touch screen 112. In some embodiments, orientation 506 is relative to a single axis in a plane parallel to touch screen 112 (e.g., axis 508, with orientation 506 measured as a clockwise rotation angle from axis 508 ranging from 0 degrees to 360 degrees, as shown in FIG. 5A). Alternatively, in some embodiments, orientation 506 is relative to a pair of axes in a plane parallel to touch screen 112 (e.g., x-axis 590 and y-axis 592, as shown in fig. 5A, or a pair of axes associated with an application displayed on touch screen 112).

In some embodiments, the indication (e.g., indication 516) is displayed on a touch-sensitive display (e.g., touch screen 112 of electronic device 100). In some embodiments, indication 516 shows a location where stylus 203 will touch (or mark) the touch-sensitive display before stylus 203 touches the touch-sensitive display. In some embodiments, the indication 516 is part of a marker drawn on the touch-sensitive display. In some embodiments, the indication 516 is separate from the marker drawn on the touch-sensitive display and corresponds to a virtual "pen tip" or other element that indicates a location on the touch-sensitive display where the marker is to be drawn.

In some embodiments, indication 516 is displayed according to the position state of stylus 203. For example, in some cases, indication 516 is displaced from (x, y) position 504 (as shown in fig. 5A and 5B), and in other cases, indication 516 is not displaced from (x, y) position 504 (e.g., indication 516 is displayed at or near (x, y) position 504 when tilt 512 is zero degrees). In some embodiments, indication 516 is displayed with varying color, size (or radius or area), opacity, and/or other characteristics depending on the positional state of stylus 203. In some embodiments, the displayed indication takes into account the thickness of the glass layer on the touch sensitive display so as to carry the indication 516 through to the "pixels" of the touch sensitive display, rather than displaying the indication 516 on the "glass" covering the pixels.

FIG. 5B illustrates stylus 203 relative to a touch-sensitive surface (e.g., touch screen 112 of electronic device 100) viewed from a side perspective of the touch-sensitive surface, in accordance with some embodiments. In FIG. 5B, z-axis 594 points in a direction perpendicular to the plane of touch screen 112, x-axis 590 is parallel to a first edge (e.g., length) of touch screen 112, y-axis 592 is parallel to a second edge (e.g., width) of touch screen 112, and y-axis 592 is perpendicular to x-axis 590.

Fig. 5B shows stylus 203 having tilt 512. In some embodiments, the tilt 512 is an angle relative to a normal of the surface of the touch-sensitive surface (also referred to simply as the normal of the touch-sensitive surface) (e.g., the normal 510). As shown in fig. 5B, tilt 512 is zero when the stylus is normal/perpendicular to the touch-sensitive surface (e.g., when stylus 203 is parallel to normal 510), and tilt 512 increases as stylus 203 tilts closer to being parallel to the touch-sensitive surface.

Additionally, FIG. 5B shows a distance 514 of stylus 203 relative to the touch-sensitive surface. In some embodiments, distance 514 is the distance from the tip of stylus 203 to the touch-sensitive surface in a direction perpendicular to the touch-sensitive surface. For example, in FIG. 5B, distance 514 is the distance from the tip of stylus 203 to (x, y) location 504.
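
A hedged worked example follows that derives the positional-state parameters illustrated in FIGS. 5A and 5B from two 3D points on the stylus, assuming the touch-sensitive surface lies in the z = 0 plane with the z-axis as its normal; the names, angle conventions, and reference axis are illustrative assumptions rather than the claimed implementation.

    import Foundation

    struct Point3D { var x, y, z: Double }

    struct StylusPositionalState {
        var location: (x: Double, y: Double)   // (x, y) location 504: orthogonal projection of the tip
        var orientationDegrees: Double         // orientation 506: angle of the projected barrel from a reference axis
        var tiltDegrees: Double                // tilt 512: angle between the barrel and normal 510
        var distance: Double                   // distance 514: perpendicular distance from the tip to the surface
    }

    func positionalState(tip: Point3D, oppositeEnd: Point3D) -> StylusPositionalState {
        let dx = oppositeEnd.x - tip.x
        let dy = oppositeEnd.y - tip.y
        let dz = oppositeEnd.z - tip.z
        let barrelLength = (dx * dx + dy * dy + dz * dz).squareRoot()
        // Tilt is zero when the barrel is parallel to the normal and grows as the stylus
        // approaches being parallel to the touch-sensitive surface.
        let tilt = acos(abs(dz) / max(barrelLength, 1e-9)) * 180 / Double.pi
        // Orientation of the barrel's projection, measured clockwise from an assumed reference axis.
        var orientation = atan2(dx, dy) * 180 / Double.pi
        if orientation < 0 { orientation += 360 }
        return StylusPositionalState(location: (x: tip.x, y: tip.y),    // projection onto z = 0
                                     orientationDegrees: orientation,
                                     tiltDegrees: tilt,
                                     distance: max(tip.z, 0))           // height of the tip above the surface
    }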

Although the terms "x-axis," "y-axis," and "z-axis" are used herein to illustrate certain directions in particular drawings, it should be understood that these terms do not refer to absolute directions. In other words, the "x-axis" may be any corresponding axis, and the "y-axis" may be a particular axis other than the x-axis. Typically, the x-axis is perpendicular to the y-axis. Similarly, the "z-axis" is different from, and generally perpendicular to, both the "x-axis" and the "y-axis".

In addition, fig. 5B shows a rotation 518, which is a rotation about the length (long axis) of stylus 203.

Attention is now directed to embodiments of a user interface ("UI") optionally implemented on portable multifunction device 100.

FIG. 6A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. A similar user interface is optionally implemented on the electronic device 300. In some embodiments, the user interface 600 includes the following elements, or a subset or superset thereof:

one or more signal strength indicators 602 for one or more wireless communications, such as cellular signals and Wi-Fi signals;

time 604;

a bluetooth indicator 605;

battery status indicator 606;

a tray 608 with icons for common applications, such as:

an icon 616 of the phone module 138 labeled "phone", optionally including an indicator 614 of the number of missed calls or voice messages;

an icon 618 of the email client module 140, labeled "mail", optionally including an indicator 610 of the number of unread emails;

icon 620 of browser module 147, labeled "browser"; and

Icon 622 of video and music player module 152 (also referred to as iPod (trademark of Apple Inc.) module 152), labeled "iPod"; and

icons for other applications, such as:

Icon 624 of IM module 141 labeled "message";

Icon 626 of calendar module 148 labeled "calendar";

Icon 628 of image management module 144 labeled "photo";

Icon 630 of camera module 143 labeled "camera";

Icon 632 of online video module 155 labeled "online video";

Icon 634 of the stock market desktop applet 149-2 labeled "stock market";

Icon 636 of the map module 154 labeled "map";

Icon 638 of weather desktop applet 149-1 labeled "weather";

Icon 640 of alarm clock desktop applet 149-4 labeled "clock";

Icon 642 of fitness support module 142 labeled "fitness support";

Icon 644 labeled "notepad" of notepad module 153; and

Icon 646 for a settings application or module, which provides access to settings of electronic device 100 and its various applications 136.

It should be noted that the icon labels shown in fig. 6A are merely exemplary. For example, in some embodiments, icon 622 of video and music player module 152 is labeled "music" or "music player". Other tabs are optionally used for various application icons. In some embodiments, the label of the respective application icon includes a name of the application corresponding to the respective application icon. In some embodiments, the label of a particular application icon is different from the name of the application corresponding to the particular application icon.

Fig. 6B illustrates an exemplary user interface on an electronic device (e.g., device 300 in fig. 3) having a touch-sensitive surface 651 (e.g., tablet or trackpad 355 in fig. 3) separate from a display 650. Device 300 also optionally includes one or more contact intensity sensors (e.g., one or more sensors 359) for detecting the intensity of contacts on touch-sensitive surface 651 and/or one or more tactile output generators 357 for generating tactile outputs for a user of device 300.

Although many of the examples that follow will be given with reference to input on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, electronic device 100 detects input on a touch-sensitive surface that is separate from the display, as shown in FIG. 6B. In some embodiments, the touch-sensitive surface (e.g., touch-sensitive surface 651 in fig. 6B) has a major axis (e.g., major axis 652 in fig. 6B) that corresponds to a major axis (e.g., major axis 653 in fig. 6B) on the display (e.g., 650). In accordance with these embodiments, the electronic device 100 detects contact (e.g., contacts 660 and 662 in fig. 6B) with the touch-sensitive surface 651 at locations corresponding to respective locations on the display (e.g., in fig. 6B, contact 660 corresponds to location 668 and contact 662 corresponds to location 670). In this manner, user inputs (e.g., contacts 660 and 662 and their movements) detected by electronic device 100 on a touch-sensitive surface (e.g., touch-sensitive surface 651 in FIG. 6B) of electronic device 100 are used by electronic device 100 to manipulate a user interface on the display (e.g., display 650 in FIG. 6B) when the touch-sensitive surface is separate from the display. It should be understood that similar methods are optionally used for the other user interfaces described herein.
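
A minimal sketch of the location correspondence described above follows, assuming a simple proportional mapping between the touch-sensitive surface (e.g., 651) and the display (e.g., 650) with their primary axes aligned; it is an illustration only, not the claimed mapping.

    // Illustrative mapping from a contact location on a separate touch-sensitive surface
    // to the corresponding location on the display.
    struct Extent { var width: Double; var height: Double }
    struct Location { var x: Double; var y: Double }

    func displayLocation(forContactAt contact: Location,
                         surface: Extent,
                         display: Extent) -> Location {
        // Scale each coordinate proportionally along the corresponding (aligned) axis.
        return Location(x: contact.x / surface.width * display.width,
                        y: contact.y / surface.height * display.height)
    }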

Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contact, finger tap gesture, finger swipe gesture, etc.) and/or stylus inputs, it should be appreciated that in some embodiments one or more of these finger inputs are replaced by inputs from another input device (e.g., mouse-based inputs). For example, a swipe gesture is optionally replaced by a mouse click (e.g., rather than a contact), followed by movement of the cursor along the path of the swipe (e.g., rather than movement of the contact). As another example, a tap gesture is optionally replaced by a mouse click while the cursor is over the location of the tap gesture (e.g., instead of detecting a contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are detected simultaneously, it should be understood that multiple computer mice are optionally used simultaneously, or a mouse and finger contact (or stylus contact) are optionally used simultaneously.

User interfaces and associated processes

Attention is now directed to embodiments of a user interface ("UI") and associated processes that can be implemented on an electronic device, such as portable multifunction device 100 in fig. 1 or electronic device 300 in fig. 3, wherein one or more input devices are used to detect various inputs (e.g., touch inputs, stylus inputs, mouse inputs, keyboard inputs, etc.) and a display device is used to manipulate the user interface based on the various inputs.

Fig. 7A-7CF are examples of user interfaces for repositioning a drawing palette, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 10A-10D. Although some of the examples that follow will be given with reference to input on a touch screen display (where the touch-sensitive surface and the display are combined, e.g., on touch screen 112), in some embodiments, electronic device 100 detects input on a touch-sensitive surface 651 that is separate from display 650, as shown in FIG. 6B.

As shown in FIG. 7A, the electronic device 100 displays a first application interface 702, such as a drawing application interface or a word processing application interface. The first application interface 702 is defined by a first edge 701a, a second edge 701b, a third edge 701c, and a fourth edge 701d.

The electronic device 100 displays a first drawing palette 704 having a first appearance. The first drawing palette 704 is displayed along (e.g., near, anchored to, and/or substantially parallel to) a fourth edge 701d of the first application interface 702. The first drawing palette 704 may include various affordances (e.g., a drawing tool affordance, an editing function affordance, and/or a color can) to facilitate content manipulation operations. For example, as shown in fig. 7A, the first drawing palette 704 includes an undo affordance 704a, a resume affordance 704b, a set of drawing tool affordances 704c, a set of color cans 704d, a text tool affordance 704e, a shape tool affordance 704f, and an additional function affordance 704g. One of ordinary skill in the art will appreciate that the first drawing palette 704 may include any number and type of affordances arranged in any number of ways.

An input directed to the undo affordance 704a requests the electronic device 100 to undo a previous operation, such as erasing a previously drawn mark. An input directed to the resume affordance 704b requests the electronic device 100 to resume a previously undone operation, such as redisplaying a previously erased mark.

The set of drawing tool affordances 704c includes (from left to right) a pen affordance, a marker affordance (e.g., a highlighter affordance), a pencil affordance, a ruler affordance, and an eraser affordance. As shown in FIG. 7A, the pencil affordance indicates that a pencil is selected as the currently selected drawing tool. An input directed to the corresponding drawing tool affordance sets the corresponding drawing tool as the currently selected drawing tool.

The set of color cans 704d includes a top row of color affordances for setting a currently selected color and a bottom row of pattern affordances for setting a currently selected pattern associated with the color. As shown in fig. 7A, black and a solid pattern are currently selected. An input directed to the respective color affordance or the respective pattern affordance sets the respective color/pattern as the currently selected color/pattern.

The text tool affordance 704e enables creation of text content within the first application interface 702. For example, after selecting text tool affordance 704e, an input directed to first application interface 702 causes electronic device 100 to display a text box for receiving a text string and causes electronic device 100 to replace the text box with the text string entered into the text box.

The shape tool affordance 704f enables placement of a particular shape within the first application interface 702. In some embodiments, for example, input directed to the shape tool affordance 704f invokes a shape interface that includes various predetermined shapes (e.g., square, circle, triangle). Subsequently, electronic device 100 detects an input corresponding to dragging a particular shape from within the shape interface to a location within first application interface 702. In response, the electronic device 100 displays a particular shape at the location within the first application interface 702.

As shown in fig. 7B, the electronic device 100 detects a drag input 708 corresponding to a request to move the first drawing palette 704 within the first application interface 702. The drag input 708 corresponds to a request to move the first drawing palette 704 away from the fourth edge 701d of the first application interface 702 toward the first edge 701a of the first application interface 702. In some embodiments, the electronic device 100 detects a drag input 708, such as detecting a finger drag input or a stylus drag input, on the touch-sensitive surface of the electronic device 100. In some embodiments, the drag input 708 corresponds to a mouse drag input (e.g., a click and drag).

For illustration purposes only, fig. 7B also includes a first threshold line 706 that is a first distance 706a from the first edge 701a of the first application interface 702. Notably, as shown in FIG. 7B, the end point of the drag input 708 exceeds the first threshold line 706.

As shown in fig. 7C, the drag input 708 proceeds from the initial position in fig. 7B to a second position closer to the first edge 701a. Accordingly, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709. The drawing tool indicator 709 includes a black-tipped pencil 710a because, as shown in fig. 7A and 7B, the set of drawing tool affordances 704c indicates that the currently selected drawing tool is a pencil and the set of color cans 704d indicates that black is the currently selected color. Further, the drawing tool indicator 709 is oriented upward (e.g., in a north direction) because in fig. 7A and 7B, the set of drawing tool affordances 704c is also oriented upward.

As shown in fig. 7D, the drag input 708 proceeds still closer to the first edge 701a, crossing the first threshold line 706. The electronic device 100 maintains the orientation of the drawing tool indicator 709 because the set of drawing tool affordances 704c within the repositioned first drawing palette 704 continues to face upward, as shown in fig. 7E. In response to detecting completion of the drag input 708, the electronic device 100 replaces the drawing tool indicator 709 with the first drawing palette 704 having the first appearance along the first edge 701a, as shown in fig. 7E.

As shown in fig. 7F, the electronic device 100 detects a tap input 711 directed to the first drawing palette 704. In some embodiments, tap input 711 corresponds to a single tap input or a double tap input detected on a touch-sensitive surface of electronic device 100. In some embodiments, the tap input 711 corresponds to a single mouse click input or a dual mouse click input.

In response to detecting the tap input 711 (in fig. 7F), the electronic device 100 moves the first drawing palette 704 to a previous position along the fourth edge 701d, as shown in fig. 7G-7I. That is, as shown in fig. 7G, the electronic device 100 moves the first drawing palette 704 along the line 712 toward the fourth edge 701d. In some embodiments, the electronic device 100 moves the first drawing palette 704 according to an animation. During the transition from the first edge 701a back to the fourth edge 701d, the electronic device 100 replaces the first drawing palette 704 with the drawing tool indicator 709, as shown in fig. 7H. Likewise, the electronic device 100 maintains the drawing tool indicator 709 in an upward (e.g., northward) orientation because the set of drawing tool affordances 704c remains in an upward orientation, as shown in FIG. 7I. Fig. 7I shows the end of the transition, wherein the electronic device 100 displays the first drawing palette 704 with the first appearance along the fourth edge 701d.

As further shown in FIG. 7I, the electronic device 100 detects a drag input 713. However, unlike the previous drag input 708 shown in fig. 7B-7D, the drag input 713 does not cross the first threshold line 706. Instead, the drag input 713 ends at a reference line 714, which is shown for illustration purposes only. As shown in fig. 7J, the electronic device 100 replaces the first drawing palette 704 with the drawing tool indicator 709 as the drag input 713 travels upward away from the fourth edge 701d. The drawing tool indicator 709 includes a black-tipped pencil 710a. As shown in fig. 7K, in response to detecting release of the drag input 713 without it having crossed the first threshold line 706, the electronic device 100 moves the drawing tool indicator 709 back toward the fourth edge 701d, as indicated by the line path 715. Finally, as shown in fig. 7L, the electronic device 100 displays the first drawing palette 704 with the first appearance along the fourth edge 701d.
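
The threshold-based behavior of fig. 7B-7L can be summarized as: a drag that crosses the threshold line docks the palette at the destination edge, while a drag released short of the line snaps the palette back. The following sketch illustrates one way such a decision could be expressed; the type and member names (PaletteDragResolver, resolve(releasePoint:), and the enum cases) are hypothetical and not part of the disclosed embodiments.

```swift
import CoreGraphics

/// Hypothetical helper that decides whether a palette drag docks the palette
/// at the destination edge or snaps it back to its previous position.
struct PaletteDragResolver {
    /// Distance of the threshold line (e.g., line 706) from the destination edge.
    let thresholdDistance: CGFloat
    /// Bounds of the application interface (edges 701a-701d).
    let interfaceBounds: CGRect

    enum Outcome {
        case dockAtDestinationEdge   // e.g., FIGS. 7D-7E
        case snapBackToOriginalEdge  // e.g., FIGS. 7K-7L
    }

    /// Resolves a drag whose destination is the top edge of the interface.
    func resolve(releasePoint: CGPoint) -> Outcome {
        // Crossing the threshold line means releasing within
        // `thresholdDistance` of the destination edge.
        let distanceFromDestinationEdge = releasePoint.y - interfaceBounds.minY
        return distanceFromDestinationEdge <= thresholdDistance
            ? .dockAtDestinationEdge
            : .snapBackToOriginalEdge
    }
}
```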

As shown in FIG. 7M, in some embodiments, the electronic device 100 divides the first application interface 702 into seven regions 716a-716g, which are shown for illustration purposes only. According to various embodiments, in response to detecting a particular input directed to the first drawing palette 704, the electronic device 100 moves the first drawing palette 704 from an initial position (e.g., along the fourth edge 701d) to a terminal position in a particular region of the seven regions 716a-716g. In some embodiments, the electronic device 100 changes the appearance of the first drawing palette 704 from the first appearance to a second appearance when the first drawing palette is located in a particular one of the seven regions 716a-716g. Hereinafter, in various embodiments, a drawing palette is moved to a particular area of the first application interface 702 when a majority of the drawing palette is within the particular area or a center of the drawing palette is within the particular area. Thus, in various embodiments, a drawing palette may be moved to a particular area even if a portion of the drawing palette is outside of the particular area.

In particular, the first region 716a corresponds to a first corner of the first application interface 702 that intersects the first edge 701a and the second edge 701 b. The second area 716b corresponds to a portion of the first application interface 702 extending along the second edge 701 b. The third region 716c corresponds to a second corner of the first application interface 702 that intersects the second edge 701b and the fourth edge 701 d. The fourth region 716d corresponds to a third corner of the first application interface 702 that intersects the first edge 701a and the third edge 701 c. The fifth area 716e corresponds to a portion of the first application interface 702 extending along the third edge 701 c. The sixth area 716f corresponds to a fourth corner of the first application interface 702 that intersects the third edge 701c and the fourth edge 701 d. The seventh area 716g corresponds to a portion of the first application interface 702 extending along the first edge 701 a.

As shown in FIG. 7N, with continued reference to FIG. 7M, each of the seven regions 716a-716g is associated with one or more threshold lines located away from the corresponding edge. According to various embodiments, when a drag input directed to first drawing palette 704 crosses a corresponding threshold line, electronic device 100 moves first drawing palette 704 to a particular area of seven areas 716a-716 g. The threshold lines are shown for illustrative purposes only. One of ordinary skill in the art will appreciate that the location of the threshold line may vary according to various embodiments.

That is, the seventh region 716g is associated with the first threshold line 706, which is the first distance 706a from the first edge 701a, as described above with reference to fig. 7B-7D and 7I-7K. The first region 716a is associated with a second threshold line 718 at a second distance 718a from the first edge 701a and a third threshold line 717 at a third distance 717a from the second edge 701b. The second region 716b is associated with the third threshold line 717. The third area 716c is associated with the third threshold line 717 and a fourth threshold line 719 at a fourth distance 719a from the fourth edge 701d. The fourth region 716d is associated with a fifth threshold line 721 at a fifth distance 721a from the first edge 701a and a sixth threshold line 720 at a sixth distance from the third edge 701c. The fifth region 716e is associated with the sixth threshold line 720. The sixth area 716f is associated with the sixth threshold line 720 and a seventh threshold line 722 at a seventh distance 722a from the fourth edge 701d.

According to various embodiments, in response to detecting an input that satisfies one or more movement criteria, electronic device 100 moves first drawing palette 704 to a particular region of seven regions 716a-716 g. For example, in some embodiments, in response to detecting a drag input directed to first drawing palette 704 that crosses one or more threshold lines, electronic device 100 moves first drawing palette 704 to a particular region of seven regions 716a-716 g. As another example, in some embodiments, in response to detecting a flick input directed to first drawing palette 704 that satisfies a speed threshold (e.g., a direction threshold and a magnitude threshold), electronic device 100 moves first drawing palette 704 to a particular one of seven regions 716a-716 g.
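
One way to express the movement criteria of fig. 7M and 7N is to map a drag release point to one of the seven regions using per-edge and per-corner insets. The sketch below is illustrative only; the names (RegionResolver, PaletteRegion) and the use of just two inset values are simplifying assumptions, and which geometric side corresponds to which numbered region depends on the current device orientation.

```swift
import CoreGraphics

/// Hypothetical regions corresponding to 716a-716g; which geometric side maps
/// to which numbered region depends on device orientation.
enum PaletteRegion {
    case topLeftCorner, topRightCorner, bottomLeftCorner, bottomRightCorner
    case topEdge, leftEdge, rightEdge
}

/// Hypothetical resolver that maps a drag release point to a region using one
/// corner inset and one edge inset (a simplification of the per-region
/// threshold lines 706 and 717-722).
struct RegionResolver {
    let bounds: CGRect
    let cornerInset: CGFloat
    let edgeInset: CGFloat

    func region(for point: CGPoint) -> PaletteRegion? {
        let nearTop    = point.y - bounds.minY <= edgeInset
        let nearBottom = bounds.maxY - point.y <= cornerInset
        let nearLeft   = point.x - bounds.minX <= cornerInset
        let nearRight  = bounds.maxX - point.x <= cornerInset

        switch (nearTop, nearBottom, nearLeft, nearRight) {
        case (true, _, true, _):  return .topLeftCorner
        case (true, _, _, true):  return .topRightCorner
        case (_, true, true, _):  return .bottomLeftCorner
        case (_, true, _, true):  return .bottomRightCorner
        case (_, _, true, _):     return .leftEdge
        case (_, _, _, true):     return .rightEdge
        case (true, _, _, _):     return .topEdge
        default:                  return nil  // no region reached; palette snaps back
        }
    }
}
```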

As shown in fig. 7O, the electronic device 100 detects a drag input 723 requesting to move the first drawing palette 704 to the fifth area 716 e. As shown in fig. 7P, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 including an upward-facing, black-tipped pencil 710a before detecting that the drag input 723 crosses a sixth threshold line 720 associated with a fifth region 716 e. The black tipped pencil 710a is facing upward because the currently selected drawing implement corresponds to an upward tipped black pencil, as indicated by the set of drawing implement affordances 704c in fig. 7O.

As shown in fig. 7Q, in response to detecting that the drag input 723 crosses the sixth threshold line 720, the electronic device 100 rotates the black-tipped pencil 710a so as to be substantially perpendicular to and facing away from the third edge 701 c. The reason why electronic device 100 rotates black-tipped pencil 710a in this manner is that, as shown in fig. 7R, electronic device 100 reorients first drawing palette 704 and a set of drawing tool affordances 704c so as to cause the set of drawing tool affordances to be substantially perpendicular to and facing away from third edge 701 c.

As shown in fig. 7R, in response to detecting completion of the drag input 723, the electronic device 100 displays a first drawing palette 704 having a first appearance along the third edge 701c in an orientation different from, for example, the orientation in fig. 7O. Notably, the electronic device 100 displays the first drawing palette 704 along the third edge 701c and rotates the set of drawing tool affordances 704c so as to face away from the third edge 701 c.

As shown in fig. 7S, the electronic device 100 detects (e.g., via an Inertial Measurement Unit (IMU)) a first device rotation input 724 that rotates the electronic device 100 counterclockwise by 90 degrees. Thus, as shown in FIG. 7T, the positions of the edges 701a-701d of the electronic device 100 change relative to FIG. 7S. In response to detecting the first device rotation input 724 in fig. 7S, the electronic device 100 moves the first drawing palette 704 from along the third edge 701c (as shown in fig. 7S) to along the fourth edge 701 d. In this manner, the electronic device 100 secures (e.g., anchors) the first drawing palette 704 to a particular side (e.g., the right side) of the first application interface 702.

As shown in fig. 7U, the electronic device 100 detects a second device rotation input 725 that rotates the electronic device 100 counterclockwise an additional 90 degrees. Thus, as shown in FIG. 7V, the positions of the edges 701a-701d of the electronic device 100 change relative to FIG. 7U. In response to detecting the second device rotation input 725 (in fig. 7U), the electronic device 100 moves the first drawing palette 704 from along the fourth edge 701d in fig. 7U to along the second edge 701b in order to secure the first drawing palette 704 to the right side of the first application interface 702.
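
The behavior of fig. 7S-7V amounts to anchoring the palette to a logical side of the interface rather than to a fixed screen position, so that a rotation merely recomputes the palette frame for the new bounds. A minimal sketch follows, assuming hypothetical names (PaletteAnchorController, paletteFrame(in:paletteSize:)); the mention of viewWillTransition(to:with:) in the comment is only one possible call site.

```swift
import CoreGraphics

/// Hypothetical controller that keeps the palette anchored to a logical side
/// of the interface across device rotations (FIGS. 7S-7V).
final class PaletteAnchorController {
    enum Side { case top, bottom, leading, trailing }
    private(set) var anchoredSide: Side = .trailing

    func anchor(to side: Side) { anchoredSide = side }

    /// Recomputes the palette frame for the post-rotation bounds; one possible
    /// call site is UIViewController.viewWillTransition(to:with:).
    func paletteFrame(in bounds: CGRect, paletteSize: CGSize) -> CGRect {
        switch anchoredSide {
        case .trailing:
            return CGRect(x: bounds.maxX - paletteSize.width,
                          y: bounds.midY - paletteSize.height / 2,
                          width: paletteSize.width, height: paletteSize.height)
        case .leading:
            return CGRect(x: bounds.minX,
                          y: bounds.midY - paletteSize.height / 2,
                          width: paletteSize.width, height: paletteSize.height)
        case .top:
            return CGRect(x: bounds.midX - paletteSize.width / 2,
                          y: bounds.minY,
                          width: paletteSize.width, height: paletteSize.height)
        case .bottom:
            return CGRect(x: bounds.midX - paletteSize.width / 2,
                          y: bounds.maxY - paletteSize.height,
                          width: paletteSize.width, height: paletteSize.height)
        }
    }
}
```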

As shown in fig. 7W, the electronic device 100 detects a drag input 726 requesting that the first drawing palette 704 be moved to the fifth area 716 e. As shown in fig. 7X, prior to detecting that the drag input 726 crosses the sixth threshold line 720 associated with the fifth region 716e, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 that includes a black-tipped pencil 710 a. The electronic device 100 holds the black-tipped pencil 710a substantially perpendicular to and facing away from the second edge 701 b.

As shown in fig. 7Y, in response to detecting that the drag input 726 crosses the sixth threshold line 720, the electronic device 100 rotates the black-tipped pencil 710a so as to be substantially perpendicular to and facing away from the third edge 701 c. The reason electronic device 100 rotates black-tipped pencil 710a in this manner is that, as shown in FIG. 7Z, electronic device 100 reorients a set of drawing tool affordances 704c so that they are substantially perpendicular to and facing away from third edge 701 c.

As shown in fig. 7Z, in response to detecting completion of the drag input 726, the electronic device 100 displays a first drawing palette 704 having a first appearance along a third edge 701 c. Notably, the electronic device 100 displays the first drawing palette 704 along the third edge 701c and rotates the set of drawing tool affordances 704c so as to be substantially perpendicular to and facing away from the third edge 701 c. Further, in some embodiments, as shown in fig. 7Z, electronic device 100 displays first drawing palette 704 having a respective appearance that corresponds to a mirror image of a respective appearance of first drawing palette 704 in fig. 7W.

As shown in FIG. 7AA, electronic device 100 detects an input 728 setting the pen as the currently selected drawing tool. As shown in FIG. 7AB, electronic device 100 displays a first drawing palette 704, and in particular, a set of drawing tool affordances 704c indicating that the pen is the currently selected drawing tool.

As shown in fig. 7AC, the electronic device 100 detects a flick input 729 directed to the first drawing palette 704. Notably, due to the first device rotation input 724 shown in fig. 7S and the second device rotation input 725 shown in fig. 7U, the regions shown in fig. 7AC have different locations compared with the previous figures. As shown in fig. 7AC, the flick input 729 includes a horizontal component 729a and a vertical component 729b. Although the flick input 729 is in the direction 730 toward the seventh region 716g, the flick input 729 does not cross the corresponding first threshold line 706. However, in response to determining that the flick input 729 satisfies the speed threshold, the electronic device 100 moves the first drawing palette 704 to the seventh region 716g. For example, in some embodiments, the flick input 729 satisfies the speed threshold when the flick input 729 is associated with a sufficient magnitude of velocity and/or acceleration.
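
A flick can be treated as satisfying the speed threshold when the component of its velocity toward the candidate region is large enough. The sketch below makes that projection explicit; FlickEvaluator and the 800 points-per-second value are assumptions chosen for illustration.

```swift
import CoreGraphics

/// Hypothetical speed-threshold check for a flick directed at the palette.
struct FlickEvaluator {
    /// Minimum projected speed in points per second (illustrative value).
    let minimumSpeed: CGFloat = 800

    /// Returns true when the gesture velocity, projected onto the direction
    /// toward the candidate region, is large enough to relocate the palette
    /// even if the gesture never crosses that region's threshold line.
    func flickReachesRegion(velocity: CGVector, directionToRegion: CGVector) -> Bool {
        let length = (directionToRegion.dx * directionToRegion.dx
                    + directionToRegion.dy * directionToRegion.dy).squareRoot()
        guard length > 0 else { return false }
        let projectedSpeed = (velocity.dx * directionToRegion.dx
                            + velocity.dy * directionToRegion.dy) / length
        return projectedSpeed >= minimumSpeed
    }
}
```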

As shown in fig. 7AD, as the flick input 729 advances, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 including a black-tipped pen 710b. The electronic device 100 displays the black-tipped pen 710b because the black-tipped pen drawing tool is currently selected. Further, the electronic device 100 displays the black-tipped pen 710b substantially perpendicular to and facing away from the third edge 701c so as to match the orientation of the set of drawing tool affordances 704c in fig. 7AC.

As shown in fig. 7AE, the electronic device 100 partially rotates the black-tipped pen 710b so that it faces a direction (e.g., at a 45-degree angle) between its orientation in fig. 7AD and its orientation in fig. 7AF. Unlike the previous examples involving a drag input, in response to detecting the flick input 729, the electronic device 100 moves the first drawing palette 704 to the target area (the seventh area 716g) regardless of whether the flick input 729 crosses the first threshold line 706. Accordingly, the electronic device 100 begins to rotate the black-tipped pen 710b before the drawing tool indicator 709 reaches the first threshold line 706.

As shown in FIG. 7AF, as the electronic device 100 moves the drawing tool indicator 709 into the seventh area 716g, the electronic device 100 completes rotating the black-tipped pen 710b to match the orientation of the set of drawing tool affordances 704c in FIG. 7AG. As shown in fig. 7AG, the electronic device 100 displays the first drawing palette 704 having the first appearance, which includes the set of drawing tool affordances 704c that are substantially perpendicular to and facing away from the first edge 701a.

As shown in fig. 7AH, the electronic device 100 detects an input 732 that sets a pencil as the currently selected drawing tool. As shown in fig. 7AI, the electronic device 100 displays the first drawing palette 704, and in particular, the set of drawing tool affordances 704c indicating that a pencil is the currently selected drawing tool.

As shown in fig. 7AJ, the electronic apparatus 100 detects a drag input 733 requesting to move the first drawing palette 704 to the second area 716 b. As shown in fig. 7AK, prior to detecting the dragging input 733 crossing into the second region 716b, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 that includes a black-tipped pencil 710a, because the black-tipped pencil is the currently selected drawing tool. The black tipped pencil 710a is substantially perpendicular to and faces away from the first edge 701a so as to match the orientation of the set of drawing tool affordances 704c in fig. 7 AJ.

As shown in FIG. 7AL, in response to detecting the crossing of the drag input 733 into the second region 716b, the electronic device 100 rotates the black-tipped pencil 710a so as to be substantially perpendicular to and facing away from the second edge 701 b. The reason why electronic device 100 rotates black-tipped pencil 710a in this manner is that, as shown in FIG. 7AM, electronic device 100 reorients a set of drawing implement affordances 704c so as to be substantially perpendicular to and facing away from second edge 701 b.

As shown in fig. 7AM, in response to detecting completion of the drag input 733, the electronic device 100 displays a first drawing palette 704 having a first appearance along the second edge 701 b. Notably, the electronic device 100 displays the first drawing palette 704 along the second edge 701b and rotates the set of drawing tool affordances 704c so as to be substantially perpendicular to and facing away from the second edge 701 b.

As shown in fig. 7AN, the electronic device 100 detects a drag input 734 requesting that the first drawing palette 704 be moved to the first region 716a (corner region). As shown in fig. 7AO, after detecting the drag input 734 but before completion of the drag input 734, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 comprising a black-tipped pencil 710 a.

As shown in fig. 7AP, in response to detecting completion of the drag input 734, the electronic device 100 displays a first drawing palette 704 having a second appearance different from the first appearance. The second appearance includes a currently selected drawing tool indicator 735a that is substantially perpendicular to and faces away from the second edge 701b because the set of drawing tool affordances 704c associated with the first appearance in fig. 7AM are similarly oriented.

In some embodiments, the first appearance corresponds to the first drawing palette being in a first expanded state and the second appearance corresponds to the first drawing palette being in a compressed state. In some embodiments, the second appearance includes fewer content manipulation affordances than the first appearance. For example, as shown in fig. 7AP, the first drawing palette 704 having the second appearance includes a single affordance, e.g., a currently selected drawing tool indicator 735a corresponding to a currently selected drawing tool, while the first drawing palette 704 having the first appearance includes a plurality of content manipulation affordances 704a-704g in fig. 7 AM.
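
The relationship between destination region and palette appearance in fig. 7AM-7AP can be stated as a simple rule: corner regions yield the compressed second appearance, edge regions keep the expanded first appearance. The sketch below encodes that rule; the enum and function names are hypothetical.

```swift
/// Hypothetical rule relating a palette's destination region to its appearance.
enum DestinationRegion { case corner, edge }
enum PaletteAppearance { case expandedFirstAppearance, compressedSecondAppearance }

func appearance(forDestination region: DestinationRegion) -> PaletteAppearance {
    switch region {
    case .corner:
        return .compressedSecondAppearance  // e.g., regions 716a, 716c, 716d, 716f
    case .edge:
        return .expandedFirstAppearance     // e.g., regions 716b, 716e, 716g
    }
}
```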

As shown in fig. 7AQ to 7AU, the electronic device 100 transitions the first drawing palette 704 from the compressed state to a second expanded state different from the first expanded state. As shown in fig. 7AQ, the electronic device 100 detects a touch input 736 directed to the first drawing palette 704. In response to detecting the touch input 736 (in fig. 7AQ), the electronic device 100 enlarges (e.g., expands or inflates) the first drawing palette 704 (in fig. 7AR). In some embodiments, the touch input 736 corresponds to a touch input detected on the touch-sensitive surface for a first threshold amount of time. In some embodiments, the enlarged first drawing palette 704 in fig. 7AR is in the second expanded state relative to the first drawing palette 704 in fig. 7AP and fig. 7AQ. The enlarged first drawing palette 704 includes the currently selected drawing tool indicator 735a.

In some embodiments, as shown in fig. 7AS, in response to detecting the touch input 736 for a second threshold amount of time that is greater than the first threshold amount of time, the electronic device 100 replaces the first drawing palette 704 with a preview drawing palette 738. Notably, the preview drawing palette 738 has the same left-facing orientation as the enlarged first drawing palette 704 in fig. 7AR. Accordingly, the electronic device 100 expands the preview drawing palette 738 vertically (e.g., upward) relative to the enlarged first drawing palette 704 in fig. 7AR.

In some embodiments, the preview drawing palette 738 is in a second expanded state relative to the first drawing palette 704 in fig. 7AP and fig. 7 AQ. The preview drawing palette 738 includes selectable drawing tool affordances 738a-738d, with the black-tipped pencil affordance 738a having focus because it is the currently selected drawing tool. One of ordinary skill in the art will appreciate that the preview drawing palette 738 may include any number and type of content manipulation affordances. For example, in some embodiments, the preview drawing palette 738 includes respective affordances for different implements, such as pencils, pens, highlighters, and erasers.
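
The press-and-hold behavior of fig. 7AQ-7AS involves two successive hold durations: after the first, the compressed palette is enlarged, and after the second, it is replaced by the preview drawing palette. A minimal sketch of that state progression follows; the class name and the specific durations are illustrative assumptions.

```swift
import Foundation

/// Hypothetical state progression for pressing and holding the compressed palette.
final class CompressedPaletteHoldTracker {
    enum State { case compressed, enlarged, previewPalette }
    private(set) var state: State = .compressed

    /// Illustrative durations; the disclosure only requires the second
    /// threshold to be greater than the first.
    let firstThreshold: TimeInterval = 0.25
    let secondThreshold: TimeInterval = 0.75

    /// Call with the elapsed duration of the press (e.g., from a timer).
    func update(heldFor duration: TimeInterval) {
        switch duration {
        case ..<firstThreshold:
            state = .compressed                 // nothing has changed yet
        case firstThreshold..<secondThreshold:
            state = .enlarged                   // e.g., FIG. 7AR
        default:
            state = .previewPalette             // e.g., FIG. 7AS
        }
    }

    /// On release, the palette returns to (or remains in) its compressed form;
    /// a visible preview palette first commits the tool under the release point.
    func endPress() { state = .compressed }
}
```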

As shown in FIG. 7AT, the electronic device 100 detects a drag input 740 that selects a gray-tipped pencil tool affordance 738 c. In some embodiments, the drag input 740 originates from the point of the previous touch input 736. In response to detecting the drag input 740 in fig. 7AT, the electronic device changes the currently selected drawing tool indicator 735a from a black-tipped pencil to a gray-tipped pencil (in fig. 7 AU) within the first drawing palette 704.

As shown in fig. 7AV, the electronic device 100 detects a touch input 742 directed to the first drawing palette 704. In response to detecting the touch input 742 in fig. 7AV, the electronic device 100 enlarges the first drawing palette 704 (in fig. 7 AW).

As shown in fig. 7AX, the electronic device 100 detects a drag input 744 requesting to move the first drawing palette 704 to the sixth area 716f. In some embodiments, the drag input 744 originates from the location of the previous touch input 742. As shown in fig. 7AY, prior to detecting the drag input 744 crossing into the sixth area 716f, the electronic device 100 replaces the first drawing palette 704 with a drawing tool indicator 709 that includes a gray-tipped pencil 710c that is substantially perpendicular to and facing away from the second edge 701b. The gray-tipped pencil 710c is oriented in this manner because the currently selected drawing tool indicator 735a within the first drawing palette 704 in fig. 7AX is similarly oriented.

As shown in fig. 7AZ, in response to detecting the drag input 744 crossing into the sixth region 716f, the electronic device 100 rotates the gray-tipped pencil 710c so as to be substantially perpendicular to and facing away from the first edge 701a. The electronic device 100 rotates the gray-tipped pencil 710c in this manner because maintaining the orientation of the gray-tipped pencil 710c facing left toward the target third edge 701c would result in an unintuitive and undesirable user experience. As shown in fig. 7BA, in response to detecting completion of the drag input 744, the electronic device 100 replaces the drawing tool indicator 709 with the first drawing palette 704 having the second appearance, which includes the currently selected drawing tool indicator 735a facing up.

As shown in fig. 7BB, the electronic device 100 detects a touch input 746. In response to detecting the touch input 746 in fig. 7BB for a first threshold amount of time, the electronic device 100 enlarges the first drawing palette 704 (in fig. 7 BC).

As shown in fig. 7BD, in response to continuing to detect the touch input 746 for a second threshold amount of time that is greater than the first threshold amount of time, the electronic device 100 replaces the first drawing palette 704 in fig. 7BC with the preview drawing palette 738 in fig. 7BD. Notably, the preview drawing palette 738 has the same upward-facing orientation as the enlarged first drawing palette 704 in fig. 7BC. Accordingly, the electronic device 100 expands the preview drawing palette 738 horizontally (e.g., to the right) relative to the enlarged first drawing palette 704 in fig. 7BC. This is in contrast to fig. 7AS, where the electronic device 100 expands the preview drawing palette 738 vertically. The preview drawing palette 738 indicates that the gray-tipped pencil 710c is the currently selected drawing tool.

As shown in FIG. 7BE, while continuing to detect the touch input 746, the electronic device 100 detects a stylus tap input 750 by the stylus 203 held by the user's hand 748. The stylus tap input 750 is directed to the black-tipped pencil affordance 738a. In response to detecting the stylus tap input 750 in FIG. 7BE, the electronic device 100 replaces the preview drawing palette 738 with the first drawing palette 704 having the second appearance (in FIG. 7BF). The first drawing palette 704, and in particular the currently selected drawing tool indicator 735a, indicates that a black-tipped pencil is the currently selected drawing tool.

As shown in fig. 7BG, electronic device 100 detects a touch input 752 directed to first drawing palette 704. In response to detecting touch input 752 in fig. 7BG, electronic device 100 enlarges first drawing palette 704 (in fig. 7 BH). The enlarged first drawing palette 704 includes a currently selected drawing tool indicator 735a corresponding to the currently selected black-tipped pencil.

As shown in fig. 7BI, the electronic device 100 detects a drag input 754 requesting to move the first drawing palette 704 toward the bottom edge 701a. In response to detecting the drag input 754 and prior to detecting the drag input 754 crossing the first threshold line 706, the electronic device 100 replaces the first drawing palette 704 (in fig. 7BJ) with a drawing tool indicator 709. The drawing tool indicator 709 includes an upward-facing, black-tipped pencil 710a to match the orientation of the currently selected drawing tool indicator 735a displayed within the first drawing palette 704 in fig. 7BI.

As shown in fig. 7BK, in response to detecting the drag input 754 crossing the first threshold line 706, the electronic device 100 maintains the orientation of the black-tipped pencil 710a so as to match the orientation of the corresponding set of drawing tool affordances 704c within the first drawing palette 704 in fig. 7BL. As shown in fig. 7BL, in response to detecting completion of the drag input 754, the electronic device 100 replaces the drawing tool indicator 709 in fig. 7BK with the first drawing palette 704 having the first appearance. The first drawing palette 704 includes the set of drawing tool affordances 704c indicating that a black-tipped pencil is the currently selected drawing tool.

Fig. 7BM through 7CF illustrate the electronic device 100 displaying and manipulating multiple drawing palettes within corresponding application interfaces. As shown in fig. 7BM, the electronic device 100 simultaneously displays a first application interface 702 including the first drawing palette 704 having the first appearance and a second application interface 758 including a second drawing palette 762 having a first appearance. The first application interface 702 includes content 767 (e.g., a triangle). The second drawing palette 762 includes various content manipulation affordances 762a-762g, which in some embodiments correspond to the content manipulation affordances 704a-704g, respectively, within the first drawing palette 704. In some embodiments, the first application interface 702 and the second application interface 758 correspond to different application windows of the same application (e.g., a notepad application). In some embodiments, the first application interface 702 is associated with a first application that is different from a second application associated with the second application interface 758. As further shown in fig. 7BM, the first application interface 702 is associated with a first width 764 that is equal or substantially equal to a second width 766 associated with the second application interface 758. Given this spatial relationship, as shown in fig. 7BM to 7BU, both the first drawing palette 704 and the second drawing palette 762 are movable within the first application interface 702 and the second application interface 758, respectively.

As shown in fig. 7BN, the electronic device 100 detects a drag input 768 requesting to move the first drawing palette 704 to the top edge of the first application interface 702. As shown in fig. 7BO and 7BP, in response to detecting drag input 768 but before completion of drag input 768, electronic device 100 replaces first drawing palette 704 with drawing tool indicator 709. Details regarding this type of transition are provided above. In response to detecting completion of the drag input 768, the electronic device 100 displays a first drawing palette 704 (in fig. 7 BQ) having a first appearance along a top edge of the first application interface 702.

As shown in FIG. 7BR, electronic device 100 detects a drag input 770 requesting that the second drawing palette 762 be moved to the upper right corner of the second application interface 758. As shown in fig. 7BS and 7BT, in response to detecting drag input 770 but before completion of drag input 770, electronic device 100 replaces second drawing palette 762 with second drawing tool indicator 772. In fig. 7BU, in response to detecting completion of the drag input 770, the electronic device 100 displays a second drawing palette 762 in the upper right corner and facing upward. The second drawing palette 762 has a second appearance different from the first appearance of the second drawing palette 762 in fig. 7 BR. That is, the second appearance is smaller than the first appearance and includes fewer affordances than the first appearance.

As shown in FIG. 7BV, electronic device 100 detects a drawing input 774 within second application interface 758. In some embodiments, as shown in fig. 7BW, when electronic device 100 detects a drawing input 774 within second application interface 758, electronic device 100 deemphasizes (e.g., fades) first drawing palette 704 with respect to content 767 displayed within first application interface 702. In response to detecting drawing input 774 in fig. 7BV, device 100 displays corresponding marker 775. In some embodiments, as shown in fig. 7BX, in response to detecting completion of the drawing input 774, the electronic device 100 re-emphasizes the first drawing palette 704.

As shown in fig. 7BY, the electronic device 100 detects a drag input 778 requesting a decrease in a first width 764 associated with the first application interface 702 and an increase in a second width 766 associated with the second application interface 758. In response to detecting the drag input 778 in fig. 7BY, the electronic device 100 decreases the first width 764 and increases the second width 766, as shown in fig. 7 BZ.

Further, because the electronic device 100 reduces the first width 764 below a particular threshold, the electronic device 100 replaces the first drawing palette 704 within the first application interface 702 with a toolbar 777 in fig. 7BZ that includes various content manipulation affordances. Notably, unlike the first drawing palette 704, the toolbar 777 is fixed to the corresponding edge 701a of the first application interface 702. Thus, as shown in fig. 7CA and 7CB, in response to detecting a drag input 780 directed to the toolbar 777, the electronic device 100 does not move the toolbar 777 or the content manipulation affordances therein.
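
The window-width behavior of fig. 7BY-7BZ can be reduced to a single comparison against a threshold width. The sketch below illustrates it; the function name and the 500-point threshold are assumptions, not values taken from the figures.

```swift
import CoreGraphics

/// Hypothetical width rule: windows narrower than the threshold get a fixed
/// toolbar; wider windows keep the movable drawing palette.
enum PaletteChrome { case movablePalette, fixedToolbar }

func chrome(forWindowWidth width: CGFloat,
            minimumWidthForPalette: CGFloat = 500) -> PaletteChrome {
    return width < minimumWidthForPalette ? .fixedToolbar : .movablePalette
}

// Example: after the divider drag narrows the first window,
// chrome(forWindowWidth: 320) == .fixedToolbar, while the widened second
// window keeps its palette: chrome(forWindowWidth: 700) == .movablePalette.
```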

On the other hand, the second width 766 associated with the second application interface 758 has not been reduced to less than a particular threshold. Accordingly, the electronic device 100 maintains the second drawing palette 762, as shown in fig. 7 BZ. As shown in fig. 7CC through 7CF, electronic device 100 detects a drag input 782 requesting that second drawing palette 762 be moved to the bottom edge of second application interface 758. In response to detecting drag input 782 in fig. 7CC, electronic device 100 displays second drawing tool indicator 772 (in fig. 7CD and 7 CE), and ultimately displays second drawing palette 762 (in fig. 7 CF) having the first appearance along the bottom edge of second application interface 758.

Fig. 8A-8 AL are examples of user interfaces for invoking and utilizing a screenshot editing interface, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 11A-11C. Although some of the examples that follow will be given with reference to input on a touch screen display (where the touch-sensitive surface and the display are combined, e.g., on touch screen 112), in some embodiments, electronic device 100 detects input on a touch-sensitive surface 651 that is separate from display 650, as shown in FIG. 6B.

As shown in FIG. 8A, the electronic device 100 displays a user interface with content 802 including text 806, a return affordance 803a, a forward affordance 803b, a share affordance 804a, a bookmark affordance 804b, and a page-bookmark affordance 804 c. One of ordinary skill in the art will appreciate that the content 802 may include any type and/or kind of content, such as text, images, different affordances, and the like.

As shown in fig. 8B, the electronic device 100 detects the input 808. In some embodiments, input 808 corresponds to a drag input or a flick input detected on a touch-sensitive surface of electronic device 100. In some embodiments, the input 808 corresponds to a mouse drag input. In response to detecting input 808 in fig. 8B, electronic device 100 displays interface 810 (in fig. 8C). The interface 810 includes a plurality of application representations 810a corresponding to a plurality of applications. Interface 810 also includes screenshot capture affordance 810 b. For example, in some embodiments, interface 810 corresponds to a taskbar that includes respective affordances that, when selected, cause the device to display a user interface of an application corresponding to the selected affordance. As another example, in some embodiments, interface 810 corresponds to a control center that includes respective affordances for performing various functions, such as changing attributes of an electronic device (e.g., changing screen brightness levels, turning on/off flight modes, and invoking features of the electronic device (e.g., bringing up a camera application, initiating a screen image).

As shown in FIG. 8D, the electronic device 100 detects a first screenshot capture input 811 that is directed to a screenshot capture affordance 810 b. The first screenshot capture input 811 is a first input type. In response to detecting the first screenshot capture input 811, the electronic device 100 captures the content 802 as a screenshot image and displays a screenshot editing interface 812 (in FIG. 8E) that includes a screenshot image 813. The screenshot editing interface 812 also includes various affordances 816, 818, 820, 822a, 822b, 824, and 826 for editing (e.g., annotating) and manipulating the screenshot image 813. The screenshot editing interface 812 also includes a drawing tool indicator 828 indicating that the currently selected drawing tool is a black tipped pencil 830.
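
For the first input type, the flow is: capture the displayed content as an image, then present an editing interface for it. The sketch below shows one plausible shape of that flow using standard UIKit rendering calls; ScreenshotCoordinator and ScreenshotEditorViewController are hypothetical placeholder types, not classes from the disclosure.

```swift
import UIKit

/// Hypothetical flow for the first input type: capture the displayed content
/// and immediately present an editing interface.
final class ScreenshotCoordinator {
    func captureAndEdit(contentView: UIView, presenter: UIViewController) {
        // Render the currently displayed content into an image.
        let renderer = UIGraphicsImageRenderer(bounds: contentView.bounds)
        let screenshot = renderer.image { _ in
            _ = contentView.drawHierarchy(in: contentView.bounds,
                                          afterScreenUpdates: false)
        }
        // Present a placeholder editing interface with the captured image.
        let editor = ScreenshotEditorViewController(image: screenshot)
        presenter.present(editor, animated: true)
    }
}

/// Placeholder standing in for the screenshot editing interface 812.
final class ScreenshotEditorViewController: UIViewController {
    let image: UIImage
    init(image: UIImage) {
        self.image = image
        super.init(nibName: nil, bundle: nil)
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```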

According to some embodiments, in response to activation (e.g., selection with contact) of affordance 814 (e.g., a completion affordance), electronic device 100 displays a completion interface that enables saving or discarding (e.g., deleting) screenshot image 813, as shown in fig. 8P-8R.

According to some embodiments, in response to activation (e.g., selection with touch) of the affordance 815 (e.g., a crop affordance), the electronic device 100 enters a crop mode and enables cropping of the screenshot image 813. For example, the electronic device 100 displays crop handles that can be moved around the corners of the screenshot image 813.

In response to activation (e.g., selection with contact) of the affordance 816 (e.g., displaying all affordances), the electronic device 100 displays additional content within the screenshot image 813 that was not displayed on the display at the time the first screenshot capture input 811 was detected, according to some embodiments. Examples of this functionality are provided below with reference to fig. 8X, 8Y, and 8 AI-8 AL.

According to some embodiments, in response to activation (e.g., selection with touch or with dragging of opacity value indicator 818 a) of affordance 818 (e.g., opacity level affordance), electronic device 100 changes the opacity of the filter layer overlaid on screenshot image 813. Examples of this functionality are provided below with reference to fig. 8H through 8J, 8Z, and 8 AA.

In response to activation (e.g., selection with contact) of the affordance 820 (e.g., the share affordance), the electronic device 100 displays a transport interface overlaid on the screenshot editing interface 812, which is provided for sharing the screenshot image 813 via one or more communication means (such as email, SMS, etc.) and/or for performing one of a variety of operations on the webpage, such as a copy operation, a print operation, etc., according to some embodiments. Examples of this functionality are provided below with reference to fig. 8K through 8O.

According to some embodiments, in response to activation (e.g., selection with a touch) of the affordance 822a (e.g., an undo affordance), the electronic device 100 reverts (e.g., undoes) one or more previous modifications of the screenshot image 813. According to some embodiments, in response to activation (e.g., selection with touch) of the affordance 822b (e.g., a restore affordance), the electronic device 100 reapplies one or more previously undone modifications to the screenshot image 813.

According to some embodiments, in response to activation (e.g., selection with a touch) of one of a set of affordances 824 (e.g., the set of drawing tool affordances), electronic device 100 sets the currently selected drawing tool and, in various cases, changes the currently selected tool 830. According to some embodiments, in response to activation (e.g., selection with contact) of one of a set of affordances 826 (e.g., color cans), electronic device 100 sets the currently selected drawing tool color and, in various instances, changes the color of the currently selected tool 830 (e.g., changes the tip of the tool).

As shown in fig. 8F, electronic device 100 detects annotation input 832 that circles the word "Peleus" within screenshot image 813. In response to detecting annotation input 832 in fig. 8F, electronic device 100 displays corresponding annotation 834 (in fig. 8G). The characteristics of the annotation 834 are based on the currently selected drawing tool 824 (e.g., pencil), including its color 826 (e.g., black).

As shown in fig. 8H, electronic device 100 detects a drag input 836 directed to opacity value indicator 818a of opacity level affordance 818. Dragging input 836 moves opacity value indicator 818a to the corresponding opacity value, as shown in FIG. 8I. In some embodiments, in response to detecting a tap input directed to a corresponding opacity value, electronic device 100 moves opacity value indicator 818a to the corresponding opacity value.

In some embodiments, in response to detecting the drag input 836 in FIG. 8H, the electronic device changes the opacity of the first filter layer 838 overlaid on the screenshot image 813 to a corresponding opacity value, as shown in FIG. 8I. Notably, the electronic device 100 overlays the first filter layer 838 over the screenshot image 813 and overlays the annotation 834 over the first filter layer 838 as shown in fig. 8I.

In some embodiments, in response to detecting drag input 836 in fig. 8H, the electronic device changes the opacity of second filter layer 840 overlaid over screenshot image 813 and annotation 834 to corresponding opacity values, as shown in fig. 8J. Notably, the electronic device 100 overlays the second filter layer 840 over the screenshot image 813 and the annotation 834, as shown in FIG. 8J.
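
The two filter-layer variants differ only in stacking order: in one, the annotations remain above the filter layer; in the other, the filter layer covers both the screenshot and the annotations. The sketch below builds both orderings from the same inputs; the function name and the use of a plain white CALayer as the filter are assumptions.

```swift
import UIKit

/// Hypothetical layer stack for the opacity-level affordance: the filter
/// layer can sit either between the screenshot and the annotations or above
/// both, and its opacity tracks the indicator 818a.
func buildScreenshotLayers(screenshot: CGImage,
                           annotationLayer: CALayer,
                           filterOpacity: Float,
                           filterAboveAnnotations: Bool) -> CALayer {
    let root = CALayer()

    let imageLayer = CALayer()
    imageLayer.contents = screenshot

    let filterLayer = CALayer()
    filterLayer.backgroundColor = UIColor.white.cgColor
    filterLayer.opacity = filterOpacity

    // Frame layout is omitted for brevity; only the stacking order matters here.
    root.addSublayer(imageLayer)
    if filterAboveAnnotations {
        root.addSublayer(annotationLayer)  // filter covers screenshot and annotations
        root.addSublayer(filterLayer)
    } else {
        root.addSublayer(filterLayer)      // annotations stay on top of the filter
        root.addSublayer(annotationLayer)
    }
    return root
}
```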

As shown in FIG. 8K, the electronic device 100 detects an input 842 directed to the share affordance 820. In response to detecting the input 842 in fig. 8K, the electronic device 100 displays a transmission interface 844 (in fig. 8L) overlaid on the screenshot editing interface 812. The transmission interface 844 includes: a local sharing affordance 844a provided for sharing the screenshot image 813 via a local interface (e.g., Bluetooth, NFC, WiFi, etc.); remote sharing affordances 844b-844e provided for sharing the screenshot image 813 via corresponding communication means (e.g., SMS, email, cloud storage, etc.); and operation affordances 844f-844i provided for performing corresponding operations on the screenshot image 813.

As shown in FIG. 8M, electronic device 100 detects input 846 directed to affordance 844i (e.g., save to file affordance). In response to detecting input 846 in fig. 8M, the electronic device 100 displays the first save interface 848 (in fig. 8N). The first save interface 848 provides various save locations for the screenshot image 813, the second filter layer 840, and the annotations 834. As shown in FIG. 8O, the electronic device 100 detects a subsequent input 850 requesting that the screenshot image 813 be saved to the My images folder. Accordingly, the electronic device 100 saves the screenshot image 813 to the "My images" folder and stops displaying the first save interface 848 (in FIG. 8P). In some embodiments, in response to detecting the subsequent input 850, the electronic device 100 saves the screenshot image 813 and the second filter layer 840 as separately editable.

As shown in FIG. 8P, electronic device 100 detects input 852 directed to completion affordance 814. In response to detecting input 852 in fig. 8P, electronic device 100 displays second save interface 853 (in fig. 8Q). The second save interface 853 includes a "save to photo" affordance and a "delete screenshot" affordance. As shown in FIG. 8R, electronic device 100 detects a subsequent input 854 directed to the "save to photograph" affordance. Accordingly, the electronic device 100 saves the screenshot image 813 to the "photo" area and stops displaying the second save interface 853. In some embodiments, in response to detecting subsequent input 854, electronic device 100 stores screenshot image 813 and second filter layer 840 as a flattened image.

Fig. 8S-8Y illustrate updating the screenshot image to include additional content not displayed on the display at the time the screenshot image was captured by electronic device 100 in response to detecting the screenshot capture input of the second input type. In some embodiments, electronic device 100 updates the captured screenshot image in response to detecting a screenshot capture input of the first input type, such as shown in fig. 8D. Referring back to FIG. 8S, the electronic device 100 displays the content 802, which includes a partial list 856a of "Top Ten Cities to Visit in Europe". Notably, the remainder of the list 856b (e.g., cities 7-10) is not shown in fig. 8S, but will be visible in response to a scroll-down input on the scrollbar 857.

As shown in fig. 8T, the electronic device 100 detects second screenshot capture inputs 858 and 860 that are directed to the main button 204 and the down button 206, respectively. Together, the second screenshot capture inputs 858 and 860 correspond to a second input type that is different from the first input type associated with the first screenshot capture input 811 in fig. 8D. Thus, in some embodiments, the second input type corresponds to a hardware input, such as a press of hardware buttons (e.g., the main button 204 and the down button 206), while the first input type corresponds to a touch input or a mouse input directed to a particular affordance (e.g., the first screenshot capture input 811 in fig. 8D directed to the screenshot capture affordance 810b).

In response to detecting the second screenshot capture inputs 858 and 860 in fig. 8T, the electronic device 100 displays a thumbnail representation 862 (in fig. 8U) of a first screenshot image 866 overlaid on the content 802. In some embodiments, the electronic device 100 sizes the thumbnail representation 862 according to a size of the display of the electronic device 100. For example, the first screenshot image 866 is scaled down to generate the thumbnail representation 862 based on a predefined size, a predefined aspect ratio, and/or a predefined resolution.
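
Generating the thumbnail representation amounts to scaling the captured screenshot down to a fraction of the display while preserving its aspect ratio. The sketch below shows one such computation; the function name and the 0.25 fraction are illustrative assumptions.

```swift
import UIKit

/// Hypothetical thumbnail generation: scale the captured screenshot down to a
/// fraction of the display width while preserving its aspect ratio.
func thumbnail(for screenshot: UIImage,
               displayBounds: CGRect,
               fraction: CGFloat = 0.25) -> UIImage {
    let targetWidth = displayBounds.width * fraction
    let scale = targetWidth / screenshot.size.width
    let targetSize = CGSize(width: targetWidth,
                            height: screenshot.size.height * scale)
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        screenshot.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}
```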

As shown in fig. 8V, electronic device 100 detects a first input 864 directed to thumbnail representation 862. In response to detecting first input 864 in fig. 8V, electronic device 100 displays screenshot editing interface 812 (e.g., as described above with reference to fig. 8E) that includes first screenshot image 866 (in fig. 8W). Notably, as with the content 802 shown in fig. 8S, the first screenshot image 866 includes a portion of the list 856a, but does not include the remainder of the list 856 b. In some embodiments, as shown in fig. 8W, the electronic device 100 displays a first drawing palette 704 having a first appearance. The first drawing palette 704 is movable within the screenshot editing interface 812 as described below with reference to fig. 8AB through 8 AE.

As shown in FIG. 8X, electronic device 100 detects input 868 directed to affordance 816 (e.g., displays all affordances). In response to detecting input 868 in FIG. 8X, electronic device 100 displays second screenshot image 870 (in FIG. 8Y). The second screenshot image 870 includes additional content 856b corresponding to the remainder of the list as compared to the first screenshot image 866.

As shown in fig. 8Z, the electronic device 100 detects an input 872 directed to an opacity value indicator 818a of the opacity level affordance 818. Input 872 moves opacity value indicator 818a to the corresponding opacity value as shown in FIG. 8 AA. In response to detecting input 872 in FIG. 8Z, electronic device 100 changes the opacity of filter layer 874 overlaid on second screenshot image 870 to a corresponding opacity value (in FIG. 8 AA).

As shown in fig. 8AB, the electronic device 100 detects a drag input 876 requesting that the first drawing palette 704 be moved along the right edge of the screenshot editing interface 812. The drag input 876 proceeds beyond a threshold line 877 that is a corresponding distance 877a from the third edge 701c. The threshold line 877 and the corresponding distance 877a are shown for illustrative purposes only.

In response to detecting the drag input 876 in fig. 8AB, the electronic device 100 replaces the first drawing palette 704 (in fig. 8AC) with a drawing tool indicator 878. The drawing tool indicator 878 includes a black-tipped pencil 879 to indicate that the currently selected drawing tool is a black-tipped pencil. As shown in fig. 8AC, because the drag input 876 has not crossed the threshold line 877, the electronic device 100 displays the black-tipped pencil 879 facing upward to match the orientation of the set of drawing tool affordances 704c within the first drawing palette 704 in fig. 8AB.

As shown in fig. 8AD, in response to detecting that the drag input 876 crosses the threshold line 877, the electronic device 100 rotates the black-tipped pencil 879 to face to the left. The electronic device 100 rotates the black-tipped pencil 879 in this manner so as to match the orientation of the set of drawing tool affordances 704c within the first drawing palette 704 along the third edge 701c in fig. 8AE.

In response to detecting completion of the drag input 876, the electronic device 100 replaces the drawing tool indicator 878 with the first drawing palette 704 having the first appearance along the third edge 701c in fig. 8AE. In contrast to the first drawing palette 704 along the bottom edge in fig. 8AB, the first drawing palette 704 in fig. 8AE is rotated so as to be along (e.g., substantially parallel to) the third edge 701c.

As shown in fig. 8AF, the electronic device 100 detects an annotation input 880 that underlines the third city, "London, England." Thus, in fig. 8AG, the electronic device 100 displays a corresponding annotation 882 underlining "London, England," where the annotation 882 reflects that the currently selected tool is a black-tipped pencil.

Figures 8 AH-8 AL illustrate updating screenshot images to include additional content not displayed on the display when a corresponding screenshot capture input is detected, in accordance with some embodiments. As shown in fig. 8AH, the electronic device 100 displays a screenshot editing interface 812 that includes a first screenshot image 883 that includes a first list of a first group of "Cities in California" 884 a.

As shown in FIG. 8AI, the electronic device 100 detects an input 885 directed to the affordance 816 for displaying all content. In response to detecting the input 885 in fig. 8AI, in fig. 8AJ, the electronic device 100 displays a second screenshot image 886 that includes the first list of the first set of cities 884a displayed within the first screenshot image 883 and a second list of a second set of cities 884b not displayed within the first screenshot image 883. In addition, the electronic device 100 displays a swipe interface 887. The swipe interface 887 includes a selectable content volume indicator 887a that enables changing the amount of content (e.g., the number of cities listed) displayed by the electronic device 100 within the screenshot editing interface 812.

As shown in FIG. 8AK, the electronic device 100 detects an input 888 directed to the content volume indicator 887a. That is, the input 888 moves the content volume indicator 887a to the left (e.g., toward the "-" indicator) to request that additional content be displayed within the screenshot editing interface 812. In response to detecting the input 888 in fig. 8AK, the electronic device 100 displays a third screenshot image 890 (in fig. 8AL) that includes the first list of the first set of cities 884a, the second list of the second set of cities 884b, and a third list of a third set of cities 884c that is not displayed within the first screenshot image 883 in fig. 8AH or within the second screenshot image 886 in fig. 8AJ.

Fig. 9A-9Z are examples of capturing screenshot images based on detected stylus input, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 11A-11C. Although some of the examples that follow will be given with reference to input on a touch screen display (where the touch-sensitive surface and the display are combined, e.g., on touch screen 112), in some embodiments, electronic device 100 detects input on a touch-sensitive surface 651 that is separate from display 650, as shown in FIG. 6B.

As shown in fig. 9A, the electronic device 100 displays content 902 including a lion 906 (e.g., an image of the lion), as well as various affordances, including a back affordance 903a, a forward affordance 903b, a share affordance 904a, a bookmark affordance 904b, and a page bookmark affordance 904c.

As shown in fig. 9B, the electronic device 100 detects, on its touch-sensitive surface, a first stylus movement 908 of the stylus 203 held by the user's hand 748. The first stylus movement 908 proceeds away from the corresponding outer edge 701b of the touch-sensitive surface toward a release point within the touch-sensitive surface, as indicated by the end of the arrow. As shown in FIG. 9B, in some embodiments, the first stylus movement 908 originates outside of a first region 910 that is a first threshold distance 910a from the edges 701a-701d of the electronic device 100. As further shown in fig. 9B, in some embodiments, the first stylus movement 908 ends at a release point that is within (e.g., spans into) a second region 912 that is a second threshold distance 912a from the edges 701a-701d of the electronic device 100. The first region 910, the first threshold distance 910a, the second region 912, and the second threshold distance 912a are shown for illustrative purposes only.

As shown in fig. 9C, the first stylus movement 908 advances closer to the release point. In response to detecting that the first stylus movement 908 originates outside the first region 910 in fig. 9B, the electronic device 100 changes the display (in fig. 9C). That is, the electronic device 100 replaces the content 902 with a screenshot preview interface 913. The screenshot preview interface 913 includes an outer region 914 (e.g., a solid gray region) that is defined by the current position of the first stylus movement 908, and a screenshot preview region 916 that is associated with (e.g., continuously associated with) the outer region 914. The screenshot preview region 916 provides a preview of a screenshot image 918. On the other hand, in response to detecting that a stylus movement does not originate outside of the first region 910, the electronic device 100 does not replace the content 902 with the screenshot preview interface 913.

As the first stylus movement 908 advances towards the second region 912, the electronic device 100 shrinks the screenshot preview region 916. For example, as shown in FIG. 9D, when the first stylus movement 908 advances closer to the release point than in FIG. 9C, the electronic device 100 correspondingly shrinks the screenshot preview area 916 and enlarges the outer area 914 (in FIG. 9D).

In response to determining that the release point of the first stylus movement 908 is within the second region 912, the electronic device 100 replaces the screenshot preview interface 913 with the screenshot editing interface 917 (in fig. 9E). The screenshot editing interface 917 includes a screenshot image 918, affordances 814, 816, 818, 820, 822a, 822b, 824, and 826, and a drawing tool indicator 828.

As shown in FIG. 9F, while the electronic device 100 is displaying the content 902, the electronic device 100 detects a second stylus movement 920 that originates outside the first region 910. As shown in fig. 9G, as the second stylus movement 920 travels closer to the second region 912, the electronic device 100 replaces the content 902 with a screenshot preview interface 913 that includes an outer region 914 and a screenshot preview region 916. However, because the release point of the second stylus movement 920 is not within the second region 912, the electronic device 100 does not display the screenshot editing interface 917. In contrast, as shown in FIG. 9H, in response to detecting completion of the second stylus movement 920, the electronic device 100 displays the content 902 without capturing the screenshot image.

As shown in FIG. 9I, the electronic device 100 detects a first drag input 922 that is not a stylus drag input. For example, the electronic device 100 detects the first drag input 922 when a finger drags across the touch-sensitive surface of the electronic device 100. As shown in fig. 9I-9K, the first drag input 922 originates within the first region 910 in fig. 9I and completes within the second region 912. However, because the first drag input 922 is not a stylus drag input, the electronic device 100 does not display the screenshot preview interface 913 as the first drag input 922 progresses toward its end point, and does not display the screenshot editing interface 917 in response to detecting completion of the first drag input 922 within the second region 912. Instead, as shown in FIG. 9L, the electronic device 100 displays the content 902 in response to detecting completion of the first drag input 922.

As shown in fig. 9M, the electronic device 100 detects a second drag input 924 that is not a stylus drag input. In some embodiments, the second drag input 924 is directed away from the bottom edge (e.g., fourth edge 701d) of the touch-sensitive surface into the user interface.

In some embodiments, as shown in fig. 9M and 9N, in response to detecting that the second drag input 924 spans into the area between the first region 910 and the second region 912, the electronic device 100 displays the interface 810 (e.g., a taskbar or a control center). The interface 810 includes a plurality of application representations 810a corresponding to a plurality of applications. The interface 810 also includes a screenshot capture affordance 810b.

As shown in FIG. 9O, electronic device 100 detects an input 926 directed to the screenshot capture affordance 810b. In response to detecting the input 926 in fig. 9O, electronic device 100 displays the screenshot editing interface 917, as shown in fig. 9P.

As shown in fig. 9Q, electronic device 100 detects, on its touch-sensitive surface, a third stylus movement 928 of stylus 203 held by the user's hand 748. The third stylus movement 928 originates within a threshold distance 929 of a particular corner of the touch-sensitive surface and moves toward a release point within the touch-sensitive surface. As shown in fig. 9Q, the particular corner of the touch-sensitive surface corresponds to the location where the first edge 701a intersects the second edge 701b. Thus, the threshold distance 929 includes a vertical distance threshold 929a from the first edge 701a and a horizontal distance threshold 929b from the second edge 701b. One of ordinary skill in the art will appreciate that the third stylus movement 928 may originate within a threshold distance of any of the four corners. In some embodiments, the corner swipe gesture is limited to a subset of the corners. For example, in some embodiments, movement detected from one of the top two corners triggers a corresponding response, whereas movement detected from one of the bottom two corners does not. As another example, in some embodiments, movement detected from one of the bottom two corners triggers a corresponding response, whereas movement detected from one of the top two corners does not. In some embodiments, the corner swipe gesture is limited to gestures performed with a stylus rather than a finger (e.g., a corresponding movement of a finger detected from a corner of the touch screen display will not trigger a response). In some embodiments, the corner swipe gesture is limited to gestures performed with a finger rather than a stylus (e.g., a corresponding movement of the stylus detected from a corner of the touch screen display will not trigger a response).
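
Purely as an illustration of the corner-origin determination described above with reference to fig. 9Q, a minimal Swift sketch follows. The names, the representation of the thresholds 929a and 929b, and the configuration options for restricting the gesture to certain corners or to stylus input are assumptions, not a required implementation.

    import CoreGraphics

    struct CornerSwipeRecognizer {
        enum Corner: CaseIterable { case topLeft, topRight, bottomLeft, bottomRight }

        let bounds: CGRect
        let verticalThreshold: CGFloat    // 929a: allowed distance from the horizontal edge
        let horizontalThreshold: CGFloat  // 929b: allowed distance from the vertical edge
        let allowedCorners: Set<Corner>   // e.g., only the top corners or only the bottom corners
        let stylusOnly: Bool              // e.g., a finger movement from a corner triggers no response

        private func cornerPoint(_ c: Corner) -> CGPoint {
            switch c {
            case .topLeft:     return CGPoint(x: bounds.minX, y: bounds.minY)
            case .topRight:    return CGPoint(x: bounds.maxX, y: bounds.minY)
            case .bottomLeft:  return CGPoint(x: bounds.minX, y: bounds.maxY)
            case .bottomRight: return CGPoint(x: bounds.maxX, y: bounds.maxY)
            }
        }

        // Returns the corner the movement originates from, if any corner qualifies.
        func qualifyingCorner(origin: CGPoint, isStylus: Bool) -> Corner? {
            if stylusOnly && !isStylus { return nil }
            return allowedCorners.first { corner in
                let c = cornerPoint(corner)
                return abs(origin.x - c.x) <= horizontalThreshold
                    && abs(origin.y - c.y) <= verticalThreshold
            }
        }
    }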

As shown in fig. 9R, the electronic device 100 changes the display as the third stylus movement 928 advances closer to the release point. That is, the electronic device 100 replaces the content 902 with the screenshot preview interface 913, which includes the outer region 914 and the screenshot preview region 916.

As shown in fig. 9S, as the third stylus movement 928 advances into the second region 912, the electronic device 100 shrinks the screenshot preview region 916. In response to detecting the release of the third stylus movement 928 within the second region 912, the electronic device 100 replaces the screenshot preview interface 913 with the screenshot editing interface 917, as shown in fig. 9T.

In some implementations, as shown in fig. 9U-9X, instead of displaying the screenshot editing interface 917 based on stylus movement originating at a corner, the electronic device 100 displays a screenshot capture menu 938. As shown in FIG. 9U, electronic device 100 detects a fourth stylus movement 930 of stylus 203 held by the user's hand 748 on its touch-sensitive surface. The fourth stylus movement 930 originates within the threshold distance 929 of the particular corner of the touch-sensitive surface and moves toward a release point within the touch-sensitive surface.

As shown in fig. 9V, the electronic device 100 changes the display as the fourth stylus movement 930 advances closer to the release point. In contrast to the examples provided above with reference to fig. 9Q-9T, electronic device 100 forgoes displaying the screenshot preview interface 913 and instead displays a stylus movement indicator 932. The stylus movement indicator 932 indicates how far the fourth stylus movement 930 has advanced toward the second region 912. As described below, in response to detecting the fourth stylus movement 930 crossing into the second region 912, the electronic device 100 displays the screenshot capture menu 938. That is, as shown in FIG. 9V, the stylus movement indicator 932 includes an amount of color 934 corresponding to approximately one-half of the stylus movement indicator 932 because the fourth stylus movement 930 has traveled approximately halfway from its starting point to the second region 912. Those of ordinary skill in the art will appreciate that the stylus movement indicator 932 may indicate the distance traveled toward the second region 912 in different ways.

As shown in fig. 9W, the electronic device 100 detects that the fourth stylus movement 930 has crossed into the second region 912. Accordingly, the electronic device 100 replaces the stylus movement indicator 932 (in fig. 9W) with the screenshot capture menu 938. The screenshot capture menu 938 includes a capture screenshot representation 938a and an edit screenshot representation 938 b. In response to detecting the release of the fourth stylus movement 930 within the second region 912, the electronic device 100 maintains display of the screenshot capture menu 938, as shown in FIG. 9X.
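
A minimal Swift sketch of the progress indication described with reference to fig. 9V and 9W follows; it assumes (as the figures suggest) that the fill amount of the indicator 932 is proportional to the distance traveled toward the second region 912, and that the capture menu 938 replaces the indicator once that region is reached. The names are hypothetical.

    import CoreGraphics

    struct StylusMovementIndicatorModel {
        // Distance from the starting point of the movement to the boundary of region 912.
        let startDistanceToRegion: CGFloat

        // Fraction of the indicator to fill with color (934), clamped to [0, 1].
        func fillFraction(currentDistanceToRegion: CGFloat) -> CGFloat {
            guard startDistanceToRegion > 0 else { return 1 }
            let traveled = startDistanceToRegion - currentDistanceToRegion
            return max(0, min(1, traveled / startDistanceToRegion))
        }

        // The screenshot capture menu (938) is shown once the movement has crossed into region 912.
        func shouldShowCaptureMenu(currentDistanceToRegion: CGFloat) -> Bool {
            currentDistanceToRegion <= 0
        }
    }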

In response to detecting an input directed to the capture screenshot representation 938a, the electronic device 100 captures a screenshot image without displaying the screenshot image or a screenshot editing interface. In some implementations, the electronic device 100 pastes the captured screenshot in response to detecting a subsequent paste input.

On the other hand, as shown in fig. 9Y and 9Z, in response to detecting input 940 directed to the edit screenshot affordance 938b in fig. 9Y, the electronic device 100 displays a screenshot editing interface 917, as shown in fig. 9Z.

Fig. 10A-10D are flow diagrams of a method 1000 for repositioning a drawing palette, according to some embodiments. In some embodiments, method 1000 is performed at an electronic device (e.g., electronic device 300 in fig. 3 or portable multifunction device 100 in fig. 1A and 7A-7CF) having one or more processors, non-transitory memory, an input device, and a display device. Some operations in method 1000 are optionally combined, and/or the order of some operations is optionally changed.

An electronic device that repositions a drawing palette and changes the appearance of the drawing palette in various circumstances (e.g., by resizing it or reducing the number of content manipulation affordances) in response to receiving a request to move the drawing palette improves operability of the electronic device. For example, repositioning the drawing palette results in a larger available display area for drawing operations and other content modification operations. Further, in response to a single input, the electronic device is configured to perform one of a plurality of repositioning operations based on the target location requested by the input. Thus, by avoiding the display of a plurality of different affordances corresponding to the plurality of repositioning operations, clutter of the user interface is reduced.

With respect to fig. 10A, the electronic device displays (1002), via the display device, a first drawing palette at a first location within the first application interface, wherein the first drawing palette has a first appearance at the first location in which a representation of a currently selected drawing tool is displayed concurrently with one or more representations of other available drawing tools. For example, the first application interface corresponds to a drawing application. As another example, the first drawing palette is anchored to a corner or edge of the first application interface.

In some embodiments, the electronic device displays a first drawing palette having a first appearance and positioned along an edge of the first application interface (e.g., substantially parallel to the edge of the first application interface). For example, referring to fig. 7A, the electronic device 100 displays the first drawing palette 704 along a fourth edge 701d of the first application interface 702. Continuing with the example, the first appearance of the first drawing palette 704 includes more content manipulation affordances (e.g., 704a-704g) than the second appearance of the first drawing palette 704, and in various embodiments, is larger in size than the second appearance of the first drawing palette 704.

In some embodiments, the first appearance corresponds to (1004) a first drawing palette in a first expanded state. The electronic device displays the plurality of content manipulation affordances when the first drawing palette is in the first expanded state, thereby avoiding a need for input to invoke the plurality of content manipulation affordances. As a result, the electronic device experiences less wear and uses less battery and processing resources. As one example, referring to FIG. 7A, electronic device 100 displays a first drawing palette 704 in a first expanded state that includes a plurality of content manipulation affordances 704a-704 g.

The electronic device detects (1006), via the input device, a first input corresponding to a request to move a first drawing palette within the first application interface. For example, the first input corresponds to a drag input, such as drag input 708 pointing to the first drawing palette 704 in fig. 7B-7D. As another example, the first input corresponds to a flick input, such as flick input 729 in fig. 7AC and 7AD directed to the first drawing palette 704. In some embodiments, the first input corresponds to a touch input or a stylus input detected on a touch-sensitive surface of the electronic device. As another example, the first input corresponds to a mouse input, such as a click and drag input.

In some embodiments, the electronic device displays (1008) a currently selected drawing tool indicator based on the first input. Displaying a representation of the path of the first input provides feedback to the user, ensuring that the drawing palette moves as instructed by the user, thereby reducing the likelihood of further user interaction to provide a different movement. Reducing the amount of user interaction with the electronic device reduces wear on the electronic device and, for battery-powered devices, extends the battery life of the device. For example, the drawing tool indicator includes a visual representation of the currently selected drawing tool, and in various embodiments, a visual representation of the color of the currently selected drawing tool (e.g., a circle with a picture of a pencil inside). As another example, the electronic device displays the drawing tool indicator when the first input is within a threshold distance from a corresponding edge or corner of the first application interface. In some embodiments, the electronic device rotates the drawing tool indicator based on a comparison between a respective orientation of the first drawing palette at the starting position and a respective orientation of the first drawing palette at the ending position.

As an example, referring to fig. 7C and 7D, as the drag input 708 advances toward the first edge 701a, the electronic device 100 displays a drawing tool indicator 709. The drawing tool indicator 709 includes a black-tipped pencil 710a to indicate that the black-tipped pencil is the currently selected drawing tool. As another example, referring to fig. 7P and 7Q, when the corresponding drag input 723 crosses into the fifth region 716e, the electronic device 100 rotates the drawing tool indicator 709 to match the orientation of the first drawing palette 704 in the fifth region 716e. As yet another example, referring to fig. 7AD through 7AF, the electronic device 100 rotates the drawing tool indicator 709 to match the orientation of the first drawing palette 704 in the seventh region 716g. Because the first input 729 corresponds to the flick input type, the electronic device 100 gradually rotates the drawing tool indicator 709 in fig. 7AD to 7AF.

In response to detecting the first input: in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a second location within the first application interface, the electronic device displays (1010) the first drawing palette having a first appearance at the second location. For example, the electronic device moves a first drawing palette including a plurality of content manipulation affordances from one edge of the first application interface to another edge of the first application interface. In some embodiments, the first drawing palette at the second location has the same orientation as the first drawing palette at the first location, such as shown in fig. 7B-7E. In some embodiments, the first drawing palette at the second location has a different orientation than the first drawing palette at the first location, such as shown in fig. 7O-7R and 7 AC-7 AG.

In response to detecting the first input: in accordance with a determination that the first input corresponds to a request to move the first drawing palette to a third location within the first application interface that is different from the second location, the electronic device displays (1012) the first drawing palette at the third location having a second appearance that is different from the first appearance, wherein when the first drawing palette has the second appearance, a representation of the currently selected drawing tool is displayed in the first drawing palette without displaying representations of other drawing tools in the first drawing palette. For example, the electronic device moves a first drawing palette comprising a plurality of content manipulation affordances from one edge of the first application interface to a corner of the first application interface.

As one example, referring to fig. 7AN through 7AP, in response to determining that the drag input 734 corresponds to a request to move the first drawing palette 704 from the second region 716b to the first region 716a, the electronic device 100 displays the first drawing palette 704 (in fig. 7 AP) having the second appearance in the first region 716 a. The first drawing palette 704 having the second appearance includes the currently selected drawing tool indicator 735a but does not include representations of other drawing tools. In some embodiments, as shown in fig. 7 AN-7 AP, the orientation (e.g., facing to the left) of the first drawing palette 704 having the first appearance in fig. 7AN matches the orientation of the first drawing palette 704 having the second appearance in fig. 7 AP. In embodiments where the source and target locations do not share edges (e.g., the seventh and third regions 716g, 716c), the orientation of the first drawing palette having the first appearance may not match the orientation of the first drawing palette having the second appearance.

Referring to fig. 10B, in some embodiments, in response to detecting the second input corresponding to the drawing operation on the first application interface, the electronic device stops (1014) displaying the first drawing palette. Ceasing to display the first drawing palette increases the available display area of the first application interface.

In some embodiments, the electronic device simultaneously displays (1016), via the display device, a second application interface and the first application interface, wherein the second application interface includes a second drawing palette. In accordance with a determination that the first application interface has a respective size characteristic that does not satisfy the threshold size, the electronic device sets (1016) the first drawing palette to be immovable within the first application interface and sets the second drawing palette to be movable within the second application interface. Further, the electronic device detects (1016) a second input via the input device. In response to detecting the second input: in accordance with a determination that the second input corresponds to a request to move the first drawing palette to a fourth position within the first application interface, the electronic device maintains (1016) a current position of the first drawing palette; and in accordance with a determination that the second input corresponds to a request to move the second drawing palette to a fifth location within the second application interface, the electronic device moves (1016) the second drawing palette to the fifth location. Maintaining the current position of the first drawing palette instead of moving the first drawing palette reduces processing power and battery usage of the electronic device. For example, the electronic device simultaneously displays a first reduced-size representation of a first drawing palette within a first reduced-size representation of a first application interface and/or a second reduced-size representation of a second drawing palette within a second reduced-size representation of a second application interface. As another example, the respective size characteristic corresponds to a width of the corresponding application interface, and the respective size characteristic satisfies the threshold size when the width is sufficiently large (e.g., the corresponding application interface occupies more than 50% of the display area). In some embodiments, the first drawing application interface and the second drawing application interface are application windows of the same application. In some embodiments, the first drawing application interface and the second drawing application interface are application windows of different applications. In some embodiments, the first drawing application interface and the second drawing application interface are contiguous with (e.g., share a boundary line with) or adjacent to each other.

As one example, in response to detecting the drag input 778 in fig. 7BY, the electronic device 100 determines that the first application interface 702 has a respective dimensional characteristic (e.g., a first width 764) that does not satisfy the threshold size and determines that the second application interface 758 has a respective dimensional characteristic (e.g., a second width 766) that satisfies the threshold size. Thus, as shown in fig. 7BZ, electronic device 100 replaces first drawing palette 704 with toolbar 777 within the fixed toolbar area in first application interface 702 and maintains second drawing palette 762 in second application interface 758. Continuing the example, in response to detecting the drag input 780 in FIG. 7CA directed to the toolbar 777, the electronic device 100 maintains the current location of the toolbar 777 (in FIG. 7 CB). Continuing with the example, in response to detecting drag input 782 in fig. 7CC that points to second drawing palette 762, electronic device 100 moves second drawing palette 762 accordingly (in fig. 7 CD-7 CF).
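
As a non-limiting illustration of the size-based movability rule described above in connection with operation 1016, a minimal Swift sketch follows. The 50% width fraction is taken from the example above, and the type and parameter names are assumptions.

    import CoreGraphics

    struct PaletteMovabilityPolicy {
        let displayWidth: CGFloat
        let widthFraction: CGFloat = 0.5   // threshold size expressed as a fraction of the display width

        // A drawing palette is movable only when its hosting application
        // interface is wide enough (e.g., wider than half of the display).
        func isPaletteMovable(interfaceWidth: CGFloat) -> Bool {
            interfaceWidth > displayWidth * widthFraction
        }

        // A move request is honored only for a movable palette; otherwise the
        // current position is maintained (e.g., a fixed toolbar is shown instead).
        func resolvedPosition(requested: CGPoint, current: CGPoint, interfaceWidth: CGFloat) -> CGPoint {
            isPaletteMovable(interfaceWidth: interfaceWidth) ? requested : current
        }
    }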

In some embodiments, the electronic device detects (1018), via the input device, a third input directed to the first application interface, and de-emphasizes (1018) the second drawing palette relative to content displayed on a canvas of the second application interface in response to detecting the third input. De-emphasizing the second drawing palette indicates that the first drawing palette and the first application interface are currently active, thereby enhancing operability of the electronic device and making the electronic device more efficient. For example, the number of erroneous user inputs is reduced, which also reduces power usage and extends the battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.

For example, the third input corresponds to a drawing operation with respect to the first application interface, such as selection of a particular content manipulation affordance and/or drawing a markup on a canvas of the first application interface. As another example, de-emphasis corresponds to reducing focus, such as reducing brightness (e.g., darkening), increasing translucency, and so forth. In some embodiments, in response to detecting a fourth input directed to the second application interface, the electronic device resumes focus on the second drawing palette, and in various embodiments, reduces focus on the first drawing palette relative to content displayed on a canvas of the first application interface.

As one example, referring to fig. 7BV, the electronic device 100 detects a drawing input 774 directed to the second application interface 758. In response to detecting the drawing input 774, the electronic device 100 de-emphasizes the first drawing palette 704 relative to content 767 within the first application interface 702, as shown in fig. 7 BW. In response to detecting completion of the drawing input 774, the electronic device 100 restores (e.g., re-emphasizes) the first drawing palette 704.

In some embodiments, in accordance with a determination that the first input corresponds to the first input type, the electronic device displays (1020) a first drawing palette having a first appearance at the second location based on the first location type and the first input type. By displaying the first drawing palette at the second location based on the first input type, the electronic device avoids moving the first drawing palette to the second location in response to an input that does not correspond to the first input type. Thus, the electronic device utilizes less processing power and battery usage. For example, the first input type corresponds to a drag input that is completed within a particular predefined area of the first application interface (e.g., sufficiently close to an edge or corner of the first application interface). As another example, the first input type corresponds to a flick input in a direction of a particular predefined area (e.g., toward an edge), where the flick input satisfies a speed threshold.

As one example, referring to fig. 7W, the electronic device 100 detects a drag input 726 that is directed to the first drawing palette 704 at a first location along the second edge 701 b. In response to determining that the drag input 726 is a drag input type and that the drag input 726 spans into the fifth region 716e, the electronic device 100 displays the first drawing palette 704 at a second location along the third edge 701c in fig. 7Z. As another example, referring to fig. 7AC, the electronic device 100 detects a flick input 729 directed to the first drawing palette 704 at a first position along the third edge 701 c. In response to determining that the flick input 729 is of the flick input type and that the flick input 729 satisfies the speed threshold, the electronic device 100 displays the first drawing palette 704 at a second position along the first edge 701a in fig. 7 AG.

In some embodiments, determining that the first input corresponds to the request to move the first drawing palette to the second location includes (1022) determining that the second location corresponds to the first location type. By displaying the first drawing palette at the second location based on determining that the second location corresponds to the first location type, the electronic device avoids moving the first drawing palette to the second location in response to an input that does not request movement to the first location type. Thus, the electronic device conserves processing power and battery usage. For example, the first location type corresponds to a location within a threshold distance from a corresponding edge (e.g., top edge, bottom edge, side edge) of the display. As another example, the first location type corresponds to locations within a first threshold distance from a first edge of the display and within a second threshold distance from a second edge of the display, where the second edge intersects and is substantially perpendicular to the first edge. As one example, referring to fig. 7B, the electronic device 100 determines that the drag input 708 requests the first drawing palette 704 to be moved to the first location by determining that the target location is an edge type (e.g., the first edge 701a) instead of a corner type.
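
For illustration, the location-type determination described above could be sketched in Swift as follows, under the assumption that an edge-type location (the first location type) is within a threshold distance of exactly one edge and a corner-type location (the second location type, discussed below) is within threshold distances of two intersecting edges. The names and the single shared threshold are hypothetical.

    import CoreGraphics

    enum PaletteLocationType { case edge, corner, interior }

    func classifyLocation(_ p: CGPoint, in bounds: CGRect, edgeThreshold: CGFloat) -> PaletteLocationType {
        let nearVertical = (p.x - bounds.minX) <= edgeThreshold || (bounds.maxX - p.x) <= edgeThreshold
        let nearHorizontal = (p.y - bounds.minY) <= edgeThreshold || (bounds.maxY - p.y) <= edgeThreshold
        switch (nearVertical, nearHorizontal) {
        case (true, true):                 return .corner    // e.g., regions such as 716a or 716f
        case (true, false), (false, true): return .edge
        case (false, false):               return .interior  // no relocation; the palette returns to its prior position
        }
    }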

In some embodiments, in accordance with a determination that the first input satisfies a first distance threshold (e.g., relative to the second location), the electronic device determines (1024) that the first input corresponds to a request to move the first drawing palette to the second location and displays (1024) the first drawing palette having the first appearance at the second location. By moving the first drawing palette when the first input satisfies the first distance threshold, the electronic device avoids erroneously moving the first drawing palette in response to an input that does not intend to move the first drawing palette (such as an input that changes a currently selected drawing tool within the first drawing palette). Thus, the efficiency and operability of the electronic device is enhanced, thereby reducing processor utilization and battery usage. In some embodiments, the first input satisfies the first distance threshold when the release position of the first input is within a threshold distance from the corresponding edge. In some embodiments, the electronic device moves the first drawing palette to a position where the first drawing palette was located prior to detecting the first input when the released position of the first input is not within the threshold distance from the corresponding edge.

As one example, referring to fig. 7W, the electronic device 100 determines that the drag input 726 crosses a distance threshold 720 that is a corresponding distance 720a from the third edge 701 c. Accordingly, the electronic device 100 displays the first drawing palette 704 along the third edge 701c in fig. 7Z. As a counter example, in response to determining that the drag input 713 in fig. 7I does not cross the distance threshold 706 that is the corresponding distance 706a from the first edge 701a, the electronic device 100 does not move the first drawing palette 704 to the first edge 701a, but instead displays the first drawing palette 704 (in fig. 7L) at its previous location (e.g., along the fourth edge 701 d).

In some embodiments, in accordance with a determination that the first input satisfies the speed threshold, the electronic device determines (1026) that the first input corresponds to a request to move the first drawing palette to the second location and displays (1026) the first drawing palette having the first appearance at the second location. In some embodiments, the electronic device moves the first drawing palette based on a short-duration input (e.g., a flick input) that satisfies a speed threshold. Thus, the electronic device avoids utilizing the processing power and battery that would otherwise be used for a longer-duration input (e.g., a drag input). For example, the first input corresponds to a flick input that is associated with a direction toward a particular region of the application interface and is associated with a sufficiently high magnitude in that direction (e.g., a velocity or acceleration of the flick input). As one example, referring to fig. 7AC, the electronic device 100 detects the flick input 729 and determines that the flick input 729 has a horizontal component 729a and a vertical component 729b that are collectively associated with a direction toward the seventh region 716g, as indicated by the corresponding illustrated direction line 730. Further, the electronic device 100 determines that the flick input 729 is associated with a sufficiently high magnitude toward the seventh region 716g. Accordingly, as shown in fig. 7AD to 7AG, the electronic device 100 moves the first drawing palette 704 to the seventh region 716g.
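
A minimal Swift sketch of the flick determination described above follows; the velocity-based speed measure, the threshold value, and the names are assumptions rather than the claimed implementation.

    import CoreGraphics

    struct FlickEvaluator {
        let speedThreshold: CGFloat   // e.g., points per second; illustrative only

        // Combines the horizontal and vertical components (e.g., 729a and 729b)
        // into a direction, returned only when the overall speed satisfies the threshold.
        func flickDirection(horizontalVelocity vx: CGFloat, verticalVelocity vy: CGFloat) -> CGVector? {
            let speed = (vx * vx + vy * vy).squareRoot()
            guard speed >= speedThreshold else { return nil }
            return CGVector(dx: vx / speed, dy: vy / speed)
        }
    }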

Referring to fig. 10C, in some embodiments, when displaying the first drawing palette having the first appearance and the first orientation at a second location, wherein the second location corresponds to the first location type, the electronic device: a second input is detected (1028), via the input device, corresponding to a request to move the first drawing palette to a fourth location within the first application interface, wherein the fourth location corresponds to the first location type. In response to detecting the second input, in accordance with a determination that the fourth location is on an opposite side of the display as compared to the second location, the electronic device displays (1028), at the fourth location, a first drawing palette having a first appearance and a first orientation. In accordance with a determination that the fourth location is not on an opposite side of the display than the second location, the electronic device displays (1028), at the fourth location, a first drawing palette having a first appearance and a second orientation different from the first orientation. By changing or maintaining the orientation of the first drawing palette in accordance with the spatial relationship between the initial position and the target position of the first drawing palette, the electronic device increases the amount of available display area that would otherwise be covered (and thus rendered unusable) by the first drawing palette. For example, the second location corresponds to a first edge of the first application interface, and the second input corresponds to a request to move the first drawing palette to a second edge different from the first edge. As another example, the second orientation of the first drawing palette is rotated approximately 90 degrees compared to the first orientation. As yet another example, when the first drawing palette is located at a top edge or a bottom edge, the electronic device displays the first drawing palette according to a first orientation, and when the first drawing palette is located at a side edge (e.g., a left edge or a right edge), the electronic device displays the first drawing palette according to a second orientation. In some embodiments, the electronic device adjusts the size of the first drawing palette at the fourth location because the second location of the first drawing palette corresponds to a shorter edge of the first application interface and the fourth location of the first drawing palette corresponds to a longer edge of the first application interface.

As one example, referring to fig. 7B-7E, in response to determining that the fourth location (e.g., along the first edge 701a) is on an opposite side of the display than the second location having the first orientation (e.g., the fourth edge 701d), the electronic device 100 displays a first drawing palette 704 (in fig. 7E) having a first appearance and a first orientation at the fourth location. As one example, referring to fig. 7O-7R, in response to determining that the fourth location (e.g., along the third edge 701c) is not on an opposite side of the display than the second location having the first orientation (e.g., the fourth edge 701d), the electronic device 100 displays a first drawing palette 704 (in fig. 7R) having a first appearance and a second orientation different from the first orientation at the fourth location. That is, the second orientation is rotated about 90 degrees relative to the first orientation.
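
As an illustrative sketch of the orientation rule described above in connection with operation 1028 (and not as a definition of it), the following Swift fragment preserves the orientation when the palette moves between opposite edges and rotates it by roughly 90 degrees otherwise. The edge names and the fixed 90-degree increment are assumptions drawn from the examples.

    enum DisplayEdge {
        case top, bottom, left, right

        var opposite: DisplayEdge {
            switch self {
            case .top: return .bottom
            case .bottom: return .top
            case .left: return .right
            case .right: return .left
            }
        }
    }

    // Opposite (or identical) edges keep the first orientation; adjacent edges rotate the palette ~90 degrees.
    func paletteRotation(from source: DisplayEdge, to target: DisplayEdge, currentDegrees: Double) -> Double {
        (target == source.opposite || target == source) ? currentDegrees : currentDegrees + 90
    }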

In some embodiments, the second appearance corresponds to (1032) the first drawing palette being in a compressed state. The first drawing palette in the compressed state covers less display space, and thus the electronic device provides a larger available display area. For example, the compressed state corresponds to a collapsed drawing palette. As another example, the electronic device displays the first drawing palette in a compressed state when the first drawing palette is anchored to or adjacent to a corner of the first application interface. In some embodiments, the first drawing palette in the compressed state includes fewer content manipulation affordances than the first drawing palette in other states. In some embodiments, the first drawing palette in the compressed state is smaller than the first drawing palette in the other state.

As one example, the electronic device 100 displays the first drawing palette 704 (in fig. 7 AP) in a compressed state such that the first drawing palette 704 is displayed adjacent to a corresponding corner (e.g., lower right corner). As another example, the electronic device 100 displays the first drawing palette 704 (in fig. 7 BA) in a compressed state such that the first drawing palette 704 is displayed adjacent to a corresponding corner (e.g., upper left corner).

In some embodiments, in accordance with a determination that the first input corresponds to a second input type that is different from the first input type, the electronic device displays (1034) a first drawing palette having a second appearance at a third location based on the first location type and the second input type. By displaying the first drawing palette at the third location based on the second input type, the electronic device avoids moving the first drawing palette to the third location in response to an input that does not correspond to the second input type. Thus, the electronic device conserves processing power and battery usage. For example, the second input type corresponds to a flick or drag toward or to a corner.

As one example, in response to detecting that drag input 734 corresponds to a drag input type toward a respective corner (in fig. 7 AN), electronic device 100 displays first drawing palette 704 (in fig. 7 AP) adjacent the respective corner. As another example, in response to detecting that the drag input 744 corresponds to a drag input type toward a respective corner (in fig. 7 AX), the electronic device 100 displays the first drawing palette 704 (in fig. 7 BA) adjacent the respective corner.

In some embodiments, determining that the first input corresponds to the request to move the first drawing palette to the third location includes (1036) determining that the third location corresponds to a second location type different from the first location type. By displaying the first drawing palette at the third location based on determining that the third location corresponds to the second location type, the electronic device avoids moving the first drawing palette to the third location in response to an input that does not request movement to the second location type. Thus, the electronic device utilizes less processing power and battery usage. For example, the second location type corresponds to a corner.

As one example, in response to detecting that drag input 734 corresponds to a drag input type toward a respective corner (in fig. 7 AN), electronic device 100 displays first drawing palette 704 (in fig. 7 AP) adjacent the respective corner. As another example, in response to detecting that the drag input 744 corresponds to a drag input type toward a respective corner (in fig. 7 AX), the electronic device 100 displays the first drawing palette 704 (in fig. 7 BA) adjacent the respective corner.

In some embodiments, in accordance with a determination that the first input satisfies a second distance threshold that is different from the first distance threshold, the electronic device determines (1038) that the first input corresponds to a request to move the first drawing palette to a third location and displays (1038) the first drawing palette having a second appearance at the third location. By moving the first drawing palette when the first input satisfies the second distance threshold, the electronic device avoids erroneously moving the first drawing palette in response to an input that does not intend to move the first drawing palette (such as an input that changes a currently selected drawing tool within the first drawing palette). Thus, the efficiency and operability of the electronic device is enhanced, thereby reducing processor utilization and battery usage. For example, the first input satisfies the second distance threshold when the release position of the first input is within a threshold distance from the corner.

As one example, in response to detecting that the drag input 734 in fig. 7AN crosses the second distance threshold 718 that is a corresponding distance 718a from the first edge 701a (e.g., within the first region 716a), the electronic device 100 displays the first drawing palette 704 (in fig. 7AP) having the second appearance within the first region 716a. As another example, in response to detecting that the drag input 744 in fig. 7AX crosses a distance threshold 722 that is a corresponding distance 722a from the fourth edge 701d and crosses a distance threshold 720 that is a corresponding distance 720a from the third edge 701c (e.g., within the sixth region 716f), the electronic device 100 displays the first drawing palette 704 (in fig. 7BA) having the second appearance within the sixth region 716f.

Referring to fig. 10D, in some embodiments, when displaying the first drawing palette in a compressed state at the third location, the electronic device: detecting (1040), via an input device, a touch input directed to a first drawing palette; and in response to detecting the touch input, displaying (1040) the first drawing palette in a second expanded state different from the first expanded state, wherein the first drawing palette in the second expanded state includes more drawing tools than the first drawing palette in the compressed state. By including more drawing tools within a single user interface, the electronic device avoids detecting additional input corresponding to selection of a respective drawing tool. For example, the second expanded state corresponds to a preview drawing palette that provides a preview of selectable drawing tools and closes in response to selection of a particular drawing tool. In some embodiments, the first drawing palette in the second expanded state (e.g., preview drawing palette 738 in fig. 7 AS) includes fewer drawing tools than the first drawing palette in the first expanded state (e.g., full-size drawing palette 704 in fig. 7E). In some embodiments, the electronic device expands the first drawing palette in the compressed state according to a current orientation of the first drawing palette in the compressed state to display the first drawing palette in the second expanded state. In some embodiments, the electronic device displays the first drawing palette in the second expanded state in response to detecting the touch input for a threshold amount of time.

As one example, in response to detecting a touch input 736 (in fig. 7AR) directed to the first drawing palette 704 in the compressed state, the electronic device 100 displays the first drawing palette 704 (in fig. 7AS) in the second expanded state. Notably, in fig. 7AR, because the currently selected drawing tool indicator 735a is oriented (e.g., pointed) to the left, the electronic device 100 expands the first drawing palette 704 upward to maintain that orientation, as shown in fig. 7AS. On the other hand, the electronic device 100 expands the first drawing palette 704 to the right (in fig. 7BD) so as to maintain the upward-facing orientation of the currently selected drawing tool indicator 735a (in fig. 7BC).
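
The expansion-direction choice described in the preceding two paragraphs could be sketched as follows; the mapping (expand vertically when the indicator points sideways, horizontally when it points up or down, away from the nearest edge in either case) is an assumption generalized from the examples in fig. 7AR-7AS and fig. 7BC-7BD, and the names are hypothetical.

    enum IndicatorOrientation { case up, down, left, right }
    enum ExpansionDirection { case upward, downward, leftward, rightward }

    func expansionDirection(for orientation: IndicatorOrientation,
                            nearTopEdge: Bool,
                            nearLeftEdge: Bool) -> ExpansionDirection {
        switch orientation {
        case .left, .right:
            // Indicator points sideways: expand vertically, away from the nearest horizontal edge.
            return nearTopEdge ? .downward : .upward
        case .up, .down:
            // Indicator points vertically: expand horizontally, away from the nearest vertical edge.
            return nearLeftEdge ? .rightward : .leftward
        }
    }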

In some embodiments, the electronic device detects (1042), via the input device, a second input directed to a particular content manipulation affordance within the first drawing palette while the first drawing palette is in the second expanded state. In response to detecting the second input, the electronic device sets (1042) a current content manipulation setting associated with the first drawing palette according to the particular content manipulation affordance. By enabling selection of a particular content manipulation affordance within the first drawing palette, the electronic device reduces input related to opening and closing an additional user interface to select the content manipulation affordance. Thus, the electronic device consumes less processing and battery resources. In some embodiments, the second input corresponds to a drag input (e.g., a finger drag) along the first drawing palette in the second expanded state, wherein a release point of the drag input corresponds to the particular content manipulation affordance. In some embodiments, the second input corresponds to a stylus touch input directed to the particular content manipulation affordance. For example, a particular content manipulation affordance corresponds to a particular tool, color, operation (e.g., undo/redo), and so on.

As an example, while the electronic device 100 is displaying the preview palette 738 (in FIG. 7AT), the electronic device 100 detects a drag input 740 selecting a pencil tool affordance 738c having a gray tip. Accordingly, the electronic device 100 sets the pencil having the gray tip as the currently selected drawing tool, and updates the currently selected drawing tool indicator 735a to the pencil having the gray tip (in fig. 7AU). As another example, while electronic device 100 is displaying the preview palette 738 (in FIG. 7BE), electronic device 100 detects a stylus tap input 750 selecting a black-tipped pencil affordance 738a. Accordingly, the electronic device 100 sets the pencil having the black tip as the currently selected drawing tool, and updates the currently selected drawing tool indicator 735a to the pencil having the black tip (in fig. 7BF).

In some embodiments, when displaying the first drawing palette at the second location or the third location, the electronic device: a tap input directed to the first drawing palette is detected (1044) via the input device. In response to detecting the tap input, the electronic device moves (1044) the first drawing palette to a first position. Moving the first drawing palette back to the first position in response to a tap input instead of a drag input or a flick input reduces processor and battery utilization since tap inputs have a shorter duration. In some embodiments, the tap input corresponds to one of a single tap or a double tap. As an example, when the first drawing palette is in the first expanded state, the tap input is a single tap, such as tap input 711 in fig. 7F directed to the first drawing palette 704 in the second position. Accordingly, the electronic device 100 moves the first drawing palette 704 back to the first position, as shown in fig. 7G to 7I. As another example, in response to detecting a tap input directed to a first drawing palette (e.g., first drawing palette 704 in fig. 7 AP) that is in a compressed state, the electronic device moves the first drawing palette to its previous location.

Fig. 11A-11C are flow diagrams of a method 1100 for invoking and utilizing a screenshot editing interface, according to some embodiments. In some embodiments, method 1100 is performed at an electronic device (e.g., electronic device 300 in fig. 3 or portable multifunction device 100 in fig. 1A, 8A-8AL, and/or 9A-9Z) having one or more processors, non-transitory memory, an input device, and a display device. Some operations in method 1100 are optionally combined, and/or the order of some operations is optionally changed.

The method 1100 comprises: in response to detecting a single screenshot capture input, a screenshot editing interface is displayed for editing a screenshot image or a thumbnail representation of the screenshot image based on an input type of the screenshot capture input. Thus, the method 1100 provides additional control options without cluttering the user interface with additional displayed controls. Additionally, method 1100 provides an intuitive way to edit screenshot images. The method 1100 reduces the cognitive burden on the user when editing the screenshot image, thereby creating a more efficient human-machine interface. For battery-driven electronic devices, enabling a user to edit screenshot images faster and more efficiently conserves power and increases the time interval between battery charges.

With respect to FIG. 11A, in some embodiments, the electronic device detects (1102) a drag input on the touch-sensitive surface. In response to detecting the drag input, the electronic device displays (1102) a screenshot capture affordance via the display device. In some embodiments, the electronic device displays the screenshot capture affordance within a user interface that includes other affordances, such as icons corresponding to respective applications. Displaying a user interface with multiple affordances including a screenshot capture affordance reduces the number of inputs used to open different user interfaces including multiple affordances. As a result, the electronic device utilizes less processing resources, thereby extending the battery life of the electronic device. For example, the drag input corresponds to a drag from the bottom of the display area up or a drag from the top of the display area down. As one example, in response to detecting the drag input 808 in fig. 8B, electronic device 100 displays screenshot capture affordance 810b (in fig. 8C) within an interface that includes multiple application representations 810a.

In some embodiments, the drag input is directed to a first taskbar, and displaying the screenshot capture affordance includes (1104) replacing the first taskbar with a second taskbar that includes the screenshot capture affordance. By ceasing to display the first taskbar, the electronic device conserves processing resources and battery life while providing a screenshot capture affordance within the second taskbar for capturing screenshot images. For example, replacing the first taskbar with the second taskbar includes the electronic device expanding the first taskbar to display the second taskbar.

In some embodiments, the drag input moves across the touch-sensitive surface away from a corresponding edge of the touch-sensitive surface, and the electronic device displays (1106) a screenshot capture affordance within the control interface. In some embodiments, the electronic device displays the control interface and extends the control interface from the corresponding edge according to the drag input. By extending the control interface from the corresponding edge as the drag input advances, the electronic device provides feedback while avoiding obscuring (e.g., overlaying) content on the display with the control interface. As one example, in response to detecting the upward drag input 808 in fig. 8B, electronic device 100 extends an interface 810 upward (in fig. 8C) that includes a screenshot capture affordance 810b.

While displaying content via the display device, the electronic device detects (1108) a screenshot capture input. For example, the screenshot capture input is directed to a screenshot capture affordance, as shown in FIG. 8D. As another example, in fig. 8T, the screenshot capture input is a combination of hardware inputs, such as when electronic device 100 detects a press of the main button 204 via a hardware input 858 simultaneously with a press of the push button 206 via a hardware input 860.

In response to detecting the screenshot capture input, the electronic device captures (1110) a screenshot image of content displayed via the display device. In some embodiments, the electronic device captures a screenshot image and stores the screenshot image in a non-transitory memory (e.g., Random Access Memory (RAM) or cache). As one example, referring to fig. 8D, in response to detecting a first screenshot capture input 811, the electronic device captures screenshot images of content 803a, 803b, 804a-804c, and 806.

In response to detecting the screenshot capture input: in accordance with a determination that the screenshot capture input is of a first input type, the electronic device displays (1112) a screenshot editing interface for editing a screenshot image via a display device, wherein the screenshot editing interface includes the screenshot image. For example, the first input type corresponds to selecting a screenshot capture affordance displayed on the display. In some embodiments, the first input type corresponds to one of a standard tap input (e.g., a finger tap) or a stylus tap input. As one example, in response to detecting the first screenshot capture input 811 in fig. 8D, the electronic device 100 displays a screenshot editing interface 812 (in fig. 8E) that includes a screenshot image 813.

In response to detecting the screenshot capture input: in accordance with a determination that the screenshot capture input corresponds to a second input type that is different from the first input type, the electronic device displays (1114), via the display device, a thumbnail representation of the screenshot image overlaid on the content. For example, the second input type is a hardware-based input, such as a press of one or more hardware buttons on the electronic device. As one example, in response to detecting the respective hardware input 858 directed to the main button 204 and the respective hardware input 860 directed to the push button 206 simultaneously (in fig. 8T), the electronic device 100 displays a thumbnail representation 862 (in fig. 8U) of the screenshot image overlaid on the content.
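
For illustration, the input-type dispatch of operations 1112 and 1114 could be sketched as follows. The enumeration of input types (an affordance tap or qualifying stylus swipe as the first input type, a hardware-button combination as the second) reflects the examples above; the names are assumptions.

    enum ScreenshotCaptureInputType {
        case affordanceTap             // e.g., input 811 directed to affordance 810b
        case stylusEdgeSwipe           // e.g., first stylus movement 908
        case hardwareButtonCombination // e.g., simultaneous hardware inputs 858 and 860
    }

    enum ScreenshotPresentation { case editingInterface, thumbnailOverlay }

    func presentation(for input: ScreenshotCaptureInputType) -> ScreenshotPresentation {
        switch input {
        case .affordanceTap, .stylusEdgeSwipe:
            return .editingInterface      // e.g., screenshot editing interface 812 or 917
        case .hardwareButtonCombination:
            return .thumbnailOverlay      // e.g., thumbnail representation 862 overlaid on the content
        }
    }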

Referring to fig. 11B, in some embodiments, the first input type includes (1116) movement of a stylus across a touch-sensitive surface of the electronic device away from an edge of the touch-sensitive surface (e.g., criteria for determining that the input is a first type of input that triggers capturing a screenshot and optionally entering a screenshot editing interface includes a requirement that the input is a swipe gesture away from the edge of the touch-sensitive surface). By displaying the screenshot editing interface in response to determining that the screenshot capture input corresponds to a release of movement of the stylus, the electronic device avoids displaying the screenshot editing interface in response to non-stylus movement (such as finger movement). Thus, the electronic device conserves processing power and battery by not displaying the screenshot editing interface in response to detecting certain types of input. For example, in response to detecting the first stylus movement 908 in fig. 9B-9D, the electronic device 100 displays a screenshot editing interface 917 (in fig. 9E). By way of example, in response to detecting the first non-stylus drag input 922 (e.g., a finger drag input) in fig. 9I-9K, electronic device 100 forgoes displaying the screenshot editing interface (in fig. 9L). As another example, in response to detecting a second non-stylus drag input 924 (e.g., a finger drag input) originating from a bottom edge 701d of the display interface (in fig. 9M), the electronic device 100 forgoes displaying the screenshot editing interface and displays the interface 810 (in fig. 9N) including the screenshot capture affordance 810 b. In some embodiments, when the electronic device detects movement of the stylus, the electronic device displays an animation representing a corresponding contraction of the displayed content. For example, referring to fig. 9B-9D, the electronic device shrinks the screenshot preview area 916 according to the first stylus movement 908. In some embodiments, the animation includes an outer region, such as outer region 914 in fig. 9B-9D.

In some embodiments, the release point within the touch sensitive surface is a threshold distance from (1118) a target location on the touch sensitive surface (e.g., criteria for determining that the input is a first type of input that triggers capturing a screenshot and optionally entering a screenshot editing interface includes a requirement that the input is a swipe gesture that starts from an edge of the touch sensitive surface and includes a lift-off of a contact that performs the gesture at least a threshold distance from the target location on the touch sensitive surface). By displaying the screenshot editing interface in response to determining that the release point is a threshold distance from the target location, the electronic device avoids displaying the screenshot editing interface incorrectly. For example, the electronic device refrains from displaying the screenshot editing interface in response to a stylus movement intended to add content (e.g., draw markup) within the canvas of the current application interface. Avoiding displaying the screenshot editing interface incorrectly improves the operability of the electronic device. For example, the target location is at or near the center of the touch-sensitive surface.

As one example, in response to determining that the first stylus movement 908 spans into the second region 912, which is a second threshold distance 912a from the edges 701a-701d (in fig. 9D), the electronic device 100 displays the screenshot editing interface 917 (in fig. 9E). Conversely, in response to determining that the second stylus movement 920 does not cross into the second region 912 (in fig. 9G), the electronic device 100 forgoes displaying the screenshot editing interface (in fig. 9H).

In some embodiments, the first input type corresponds (1120) to movement of the stylus across the touch-sensitive surface of the electronic device away from a corner of the touch-sensitive surface (e.g., criteria for determining that the input is the first type of input that triggers capturing a screenshot and optionally entering a screenshot editing interface include a requirement that the input is a swipe gesture starting from a corner of the touch-sensitive surface, such as a lower corner of the touch-sensitive surface). By displaying the screenshot editing interface when movement of the stylus originates within a threshold distance of the corresponding corner, the electronic device avoids mistakenly displaying the screenshot editing interface in response to certain stylus movement inputs. Thus, the electronic device conserves processing power and battery by not displaying the screenshot editing interface in response to detecting certain types of input. For example, as shown in fig. 9Q, the electronic device 100 detects a third stylus movement 928 that originates within a threshold distance 929 of a corresponding corner. Continuing the example, in response to detecting that the release point of the third stylus movement 928 is within the second region 912 (in fig. 9S), the electronic device 100 displays a screenshot editing interface 917 (in fig. 9T).

In some embodiments, the screenshot editing interface also includes (1122) a first drawing palette at a first location within the screenshot editing interface, and, in response to a first input directed to the first drawing palette, the first drawing palette is movable to a second location within the screenshot editing interface. Providing a movable first drawing palette improves the operability of the electronic device. For example, moving the first drawing palette results in a larger available display area for drawing operations and other content modification operations. As one example, in response to detecting a drag input 876 (in fig. 8 AB-8 AD) requesting to move the first drawing palette 704 along the third edge 701c, the electronic device 100 moves the first drawing palette 704 accordingly, as shown in fig. 8 AE. Thus, the bottom area of the screenshot editing interface 812 in fig. 8AB that is overlaid by the first drawing palette 704 is available for editing operations in fig. 8 AE.
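
A minimal sketch of how such a repositioning decision might be made follows; the PaletteDockEdge enumeration and the nearest-edge heuristic are assumptions for illustration only, not the claimed behavior.

```swift
import CoreGraphics

enum PaletteDockEdge { case top, bottom, leading, trailing }

/// Choose the edge to which the drawing palette docks after a drag within
/// the screenshot editing interface: it snaps to whichever edge the drop
/// point is closest to, so the region it previously covered (e.g., the
/// bottom area) becomes available for editing operations.
func dockEdge(forDrop dropPoint: CGPoint, in bounds: CGRect) -> PaletteDockEdge {
    let candidates: [(PaletteDockEdge, CGFloat)] = [
        (.top, dropPoint.y - bounds.minY),
        (.bottom, bounds.maxY - dropPoint.y),
        (.leading, dropPoint.x - bounds.minX),
        (.trailing, bounds.maxX - dropPoint.x),
    ]
    return candidates.min { $0.1 < $1.1 }!.0
}
```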

In some embodiments, when displaying a screenshot editing interface that includes an opacity level affordance, the electronic device detects (1124), via an input device, a first input directed to the opacity level affordance, wherein the first input sets the opacity level affordance to a corresponding opacity value. In response to detecting the first input, the electronic device changes (1124) the opacity of a filter layer overlaid on the screenshot image to the corresponding opacity value. Changing the opacity of the filter layer in response to detecting the first input, without requiring further input, enhances the operability of the electronic device by reducing the utilization of the processor and the battery. For example, the filter layer corresponds to a translucent layer. In some embodiments, the screenshot editing interface includes three layers, where the screenshot image corresponds to a bottom layer, the filter layer corresponds to a middle layer, and the markup (e.g., annotation) layer corresponds to a top layer.

As one example, as shown in fig. 8H, electronic device 100 detects a drag input 836 directed to opacity value indicator 818a of opacity level affordance 818, which sets the opacity level affordance to a corresponding opacity value. In response to detecting the drag input 836 in FIG. 8H, the electronic device 100 changes the opacity of the first filter layer 838 overlaid on the screenshot image 813 to a corresponding opacity value (in FIG. 8I). As another example, the electronic device 100 detects a drag input 872 directed to the opacity value indicator 818a of the opacity level affordance 818 that sets the opacity level affordance to a corresponding opacity value (in fig. 8Z). In response to detecting drag input 872 in FIG. 8Z, electronic device 100 changes the opacity of filter layer 874 overlaid on second screenshot image 870 to a corresponding opacity value (in FIG. 8 AA).
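
The three-layer arrangement described above can be illustrated with a short UIKit sketch. This is only a sketch under assumptions: the view names and the white filter color are hypothetical, and the patent does not specify this implementation.

```swift
import UIKit

/// Illustrative canvas with the screenshot image on the bottom, a
/// translucent filter layer in the middle, and the markup (annotation)
/// layer on top.
final class ScreenshotEditingCanvas: UIView {
    let screenshotView = UIImageView()  // bottom layer: captured screenshot image
    let filterView = UIView()           // middle layer: translucent filter
    let markupView = UIImageView()      // top layer: annotations

    override init(frame: CGRect) {
        super.init(frame: frame)
        // The order of addSubview calls establishes the bottom-to-top stacking.
        addSubview(screenshotView)
        addSubview(filterView)
        addSubview(markupView)
        filterView.backgroundColor = .white
        filterView.alpha = 0  // fully transparent until the slider moves
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    /// Called when the opacity level affordance (e.g., a slider) changes.
    func setFilterOpacity(_ value: CGFloat) {
        // Changing only the middle layer's alpha dims the screenshot while
        // leaving the annotation layer above it untouched.
        filterView.alpha = max(0, min(1, value))
    }
}
```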

In some embodiments, in response to detecting the first input, the electronic device displays (1126), via the display device, the filter layer overlaid on the annotation of the screenshot image. Displaying the filter layer overlaid on the annotation of the screenshot image results in filtering both the screenshot image and the annotation simultaneously, thereby avoiding multiple filtering operations. Thus, the electronic device utilizes less processing power and battery usage. As one example, in response to detecting drag input 836 (in fig. 8H) that sets the opacity level affordance to a corresponding opacity value, electronic device 100 displays a second filter layer 840 having the corresponding opacity value overlaid on screenshot image 813 and annotation 834 (in fig. 8J).

In some embodiments, in response to detecting the first input, the electronic device displays (1128), via the display device, the annotation of the screenshot image as an overlay on the filter layer. Displaying the annotation overlaid on the filter layer causes the filter layer to obscure the annotation less (or not at all), resulting in greater visibility of the annotation. More visible annotations reduce the need for corrective filtering operations, resulting in less processing and battery resources consumed by the electronic device. As one example, in response to detecting drag input 836 (in fig. 8H) that sets the opacity level affordance to a corresponding opacity value, electronic device 100 displays annotation 834 overlaid on the first filter layer 838 having the corresponding opacity value (in fig. 8I).

In some embodiments, in response to detecting, via the input device, a second input directed to a completion affordance included within the screenshot editing interface, the electronic device displays (1130), via the display device, a save interface. Further, the electronic device detects (1130), via the input device, a third input directed to the save interface. In response to detecting the third input, the electronic device stores (1130) the screenshot image and the filter layer as a flattened image. The save interface provides an efficient mechanism for a user to manage storage, thereby reducing the amount of user interaction to perform storage management operations. The reduction in user interaction reduces wear on the electronic device. The reduction in user interaction also results in faster initiation of execution of the storage management operations and, therefore, reduced power consumption for performing the storage management operations, thereby extending the battery life of the electronic device. Furthermore, providing users with an efficient mechanism to manage storage increases the likelihood that users will perform such management and improves the performance of the electronic device. In some embodiments, the electronic device stores the flattened image to a pre-assigned memory location (e.g., a "photo" area). In some embodiments, the flattened image is editable.

As one example, in response to detecting input 852 (in fig. 8P) directed to completion affordance 814, electronic device 100 displays a save interface (e.g., second save interface 853) (in fig. 8Q). Continuing with the example, in response to detecting a subsequent input 854 (in FIG. 8R) directed to the "save to photograph" affordance within the second save interface 853, the electronic device 100 stores the screenshot image 813 and the second filter layer 840 as flattened images.
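
One way to produce the flattened image described above is to composite the layers once into a single bitmap. The Swift sketch below assumes hypothetical parameters (filterColor, filterOpacity, an optional annotations image) and is not the claimed implementation; saving to the photo library additionally requires the appropriate usage permission.

```swift
import UIKit

/// Composite the screenshot image, the filter layer, and any annotations
/// into one flattened image, bottom to top.
func flattenedImage(screenshot: UIImage,
                    filterColor: UIColor,
                    filterOpacity: CGFloat,
                    annotations: UIImage?) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: screenshot.size)
    return renderer.image { _ in
        let rect = CGRect(origin: .zero, size: screenshot.size)
        screenshot.draw(in: rect)                                // bottom layer
        filterColor.withAlphaComponent(filterOpacity).setFill()  // middle layer
        UIRectFillUsingBlendMode(rect, .normal)
        annotations?.draw(in: rect)                              // top layer
    }
}

// One option after the user confirms in the save interface is to store the
// flattened result in the photo library, e.g.:
// UIImageWriteToSavedPhotosAlbum(flattened, nil, nil, nil)
```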

Referring to fig. 11C, in some embodiments, the screenshot editing interface also includes a respective affordance, and the electronic device detects (1132), via the input device, a first input directed to the respective affordance. In response to detecting the first input, the electronic device adds (1132) additional content to the screenshot editing interface that was not displayed on the display when the screenshot capture input was detected. By adding additional content to the screenshot editing interface, the electronic device avoids additional input requesting display of the additional content. For example, the electronic device avoids the need to detect an input to close the screenshot capture interface, navigate to additional content, and/or re-open the screenshot capture interface. Thus, the electronic device conserves processing and battery resources. For example, for a paginated document, the electronic device captures all pages into a single screenshot. As another example, the electronic device captures all content, such as multiple presentation slides or a full web page, as a screenshot or PDF to accommodate the expanded content. As yet another example, the additional content corresponds to a portion of the document that is available by scrolling the content up or down on the display. In some embodiments, the additional content corresponds to a reduced size representation of the content displayed when the electronic device captures the screenshot image. The reduced size representation may include smaller text, pictures, etc. In some embodiments, the electronic device displays a swipe interface to facilitate viewing of the additional content.

As one example, in response to detecting input 868 (in FIG. 8X) directed to "display all" affordance 816, electronic device 100 displays additional content 856b (e.g., cities 7-10) (in FIG. 8Y) that is not displayed when the electronic device detects the screenshot capture input. That is, in fig. 8T, the electronic device 100 does not display the additional content 856 b. As another example, in response to detecting an input 885 (in fig. 8 AI) directed to the "display all" affordance 816, the electronic device 100 displays additional content 884b and also displays a swipe interface 887 (in fig. 8 AJ). Further, in response to detecting the input 888 (in fig. 8 AK) directed to the swipe interface 887, the electronic device displays more additional content 884c (in fig. 8 AL).
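
A common way to obtain such additional, off-screen content is to render the full content of a scrollable view into a single image. The sketch below is illustrative only: it assumes the content lives in a UIScrollView, temporarily resizes the view to render everything, and does not reflect the claimed implementation.

```swift
import UIKit

/// Render a scroll view's entire content (including the portions that are
/// currently off screen) into one image, so the screenshot editing
/// interface can offer it as additional content.
func captureFullContent(of scrollView: UIScrollView) -> UIImage {
    let fullSize = scrollView.contentSize
    let renderer = UIGraphicsImageRenderer(size: fullSize)
    return renderer.image { context in
        // Temporarily expand the scroll view's frame so that rendering its
        // layer captures the entire content, then restore the original state.
        let savedFrame = scrollView.frame
        let savedOffset = scrollView.contentOffset
        scrollView.contentOffset = .zero
        scrollView.frame = CGRect(origin: .zero, size: fullSize)
        scrollView.layer.render(in: context.cgContext)
        scrollView.frame = savedFrame
        scrollView.contentOffset = savedOffset
    }
}
```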

In some embodiments, the electronic device detects (1134) movement of the stylus on the touch-sensitive surface of the electronic device, wherein the movement is away from a corresponding corner of the touch-sensitive surface and originates within a threshold distance of the corresponding corner; and in response to detecting the release of the movement of the stylus, displaying (1134), via the display device, a screenshot capture menu comprising a capture screenshot affordance and an edit screenshot affordance. In response to detecting a first input directed to the capture screenshot affordance, a screenshot image of the content is captured. In response to detecting a second input directed to the edit screenshot affordance, a screenshot editing interface for editing the screenshot image is displayed via the display device. Because the screenshot capture menu includes multiple affordances, the electronic device need not detect multiple separate inputs, each invoking a single affordance. Thus, the electronic device conserves battery and processing resources. For example, referring to fig. 9U-9Z, the electronic device 100 displays the screenshot capture menu 938 based on a fourth stylus movement 930 that moves away from the corresponding corner of the touch-sensitive surface and that originates within a threshold distance 929 of the corresponding corner.

In some embodiments, while displaying the thumbnail representation of the screenshot image, the electronic device detects (1136) a first input directed to the thumbnail representation of the screenshot image via the input device. In response to detecting the first input, the electronic device displays (1136) a screenshot editing interface via a display device. By displaying the screenshot editing interface in response to a first input directed to a thumbnail representation, the electronic device avoids detecting other inputs for displaying additional interfaces that enable subsequent display of the screenshot editing interface. Thus, the electronic device reduces the utilization of the processor and the battery. In some embodiments, in response to detecting the first input, the electronic device stops displaying the thumbnail representation. As one example, in response to detecting the first input 864 (in fig. 8V) directed to the thumbnail representation 862, the electronic device 100 displays a screenshot editing interface 812 (in fig. 8W) that includes a first screenshot image 866.

In some embodiments, in response to detecting, via the input device, a second input directed to a sharing affordance included within the screenshot editing interface, the electronic device displays (1138) the sharing interface via the display device. Further, the electronic device detects (1138) a third input directed to the sharing interface via the input device. In response to detecting the third input, the electronic device stores (1138) the screenshot image and the filter layer as an image file, wherein the screenshot image and the filter layer are separately editable. The sharing interface provides an efficient mechanism for managing storage for the user, thereby reducing the amount of user interaction to perform storage management operations. The reduction in user interaction reduces wear on the electronic device. The reduction in user interaction also results in faster initiation of execution of the storage management operations and, therefore, reduced power consumption for performing the storage management operations, thereby extending the battery life of the electronic device. Furthermore, providing users with an efficient mechanism to manage storage increases the likelihood that users will perform such management and improves the performance of the electronic device. For example, in some embodiments, the image file is editable. As another example, the image file is not flattened.

As one example, in response to detecting input 842 (in FIG. 8K) directed to sharing affordance 820, electronic device 100 displays a transfer interface 844 (in FIG. 8L) that includes a "save to file" affordance 844 i. Continuing with the example, in response to detecting input 846 directed to "Save to File" affordance 844i (in FIG. 8M), electronic device 100 displays first save interface 848 (in FIG. 8N). Continuing with the example, in response to detecting the input 850 (in FIG. 8O) pointing to the My images folder location within the first save interface 848, the electronic device 100 stores the screenshot image 813 and the second filter layer 840 as image files, wherein the screenshot image 813 and the second filter layer 840 are separately editable.
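
Where the flattened-image path above merges everything into one bitmap, the sharing path keeps the screenshot image and the filter layer separately editable. The sketch below illustrates one possible container for that; the LayeredScreenshotDocument type and its JSON encoding are assumptions, not the file format used by the patent.

```swift
import Foundation
import UIKit

/// Illustrative container in which the screenshot image and the filter
/// layer remain separately editable (unlike the flattened image).
struct LayeredScreenshotDocument: Codable {
    var screenshotPNG: Data      // bottom layer, stored losslessly
    var filterOpacity: CGFloat   // middle layer is reconstructed from this value
    var annotationPNG: Data?     // top layer, kept separate so it stays editable
}

func saveLayeredDocument(screenshot: UIImage,
                         filterOpacity: CGFloat,
                         annotations: UIImage?,
                         to url: URL) throws {
    let document = LayeredScreenshotDocument(
        screenshotPNG: screenshot.pngData() ?? Data(),
        filterOpacity: filterOpacity,
        annotationPNG: annotations?.pngData())
    let data = try JSONEncoder().encode(document)
    try data.write(to: url)  // e.g., the folder location chosen in the save interface
}
```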

Fig. 12A-12AP are examples of user interfaces for selectively erasing portions of an object, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 13A-13D. Although some of the examples that follow will be given with reference to input on a touch screen display (where the touch-sensitive surface and the display are combined, e.g., on touch screen 112), in some embodiments, electronic device 100 detects input on a touch-sensitive surface 651 that is separate from display 650, as shown in FIG. 6B.

As shown in fig. 12A, the electronic device 100 displays a drawing user interface 1202, such as a drawing application interface or a word processing application interface. The electronic device 100 also displays a drawing palette 1204. The drawing palette 1204 may include various affordances (e.g., drawing tool affordances, editing function affordances, and/or color pots) to facilitate content manipulation operations. One of ordinary skill in the art will appreciate that drawing palette 1204 may include any number and type of affordances arranged in any number of ways.

Drawing palette 1204 includes an undo affordance 1204a and a redo affordance 1204b. An input directed to the undo affordance 1204a requests that the electronic device 100 undo a previous operation, such as erasing a previously drawn mark. An input directed to the redo affordance 1204b requests that the electronic device 100 redo a previously undone operation, such as redisplaying a previously erased mark.

Drawing palette 1204 includes a set of drawing tool affordances including a pen affordance 1204c, a marker affordance 1204d (e.g., a highlighter affordance), a pencil affordance 1204e, a ruler affordance 1204f, and an eraser affordance 1204g. Notably, the eraser affordance 1204g has a first appearance (e.g., an "x" within and near the top of the eraser affordance 1204g), indicating that an eraser tool associated with the eraser affordance 1204g is in an object erase mode of operation. The function of the eraser tool in the object erase mode of operation is described below.

Further, when a particular drawing tool affordance of the set of drawing tool affordances is selected, an input directed to the drawing user interface 1202 causes the electronic device 100 to perform a corresponding content manipulation operation within the drawing user interface 1202. As shown in FIG. 12A, the pencil affordance 1204e is currently selected, indicating that the pencil tool is the currently selected drawing tool. An input directed to a respective drawing tool affordance sets the respective drawing tool as the currently selected drawing tool.

Drawing palette 1204 includes a set of color pots 1204h that includes a top row of color affordances for setting a currently selected color and a bottom row of pattern affordances for setting a currently selected pattern associated with the color. As shown in fig. 12A, the black color and the solid pattern are currently selected. An input directed to a respective color affordance or a respective pattern affordance sets the respective color or pattern as the currently selected color or pattern.

Drawing palette 1204 includes a text tool affordance 1204i that enables text content to be created within drawing user interface 1202. For example, after selection of the text tool affordance 1204i, an input directed to the drawing user interface 1202 causes the electronic device 100 to display a text box for receiving a text string and, subsequently, to replace the text box with the text string entered into it.

Drawing palette 1204 includes a shape tool affordance 1204j that enables placement of a particular shape within drawing user interface 1202. In some embodiments, for example, an input directed to the shape tool affordance 1204j invokes a shape interface that includes various predetermined shapes (e.g., square, circle, triangle). Subsequently, the electronic device 100 detects an input corresponding to dragging a particular shape from within the shape interface to a location within the drawing user interface 1202. In response, the electronic device 100 displays a particular shape at the location within the drawing user interface 1202.
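
The palette described in the last several paragraphs can be summarized as a small selection model. The sketch below is illustrative only; the tool and color names follow the description above, but the types themselves are assumptions rather than the claimed implementation.

```swift
/// Illustrative model of the drawing palette's selectable state.
enum DrawingTool: CaseIterable {
    case pen, marker, pencil, ruler, eraser, text, shape
}

struct DrawingPaletteState {
    var selectedTool: DrawingTool = .pencil  // the pencil tool is selected in fig. 12A
    var selectedColor: String = "black"      // from the top row of color affordances
    var selectedPattern: String = "solid"    // from the bottom row of pattern affordances

    /// An input directed to a drawing tool affordance sets that tool as the
    /// currently selected drawing tool.
    mutating func select(_ tool: DrawingTool) {
        selectedTool = tool
    }
}
```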

As shown in fig. 12B, electronic device 100 detects an object insertion input 1206 corresponding to a request to insert an object 1208 into drawing user interface 1202. In some embodiments, object insertion input 1206 corresponds to movement of the contact in the shape shown in FIG. 12B. In response to detecting object insertion input 1206 in FIG. 12B, electronic device 100 inserts object 1208 into drawing user interface 1202, as shown in FIG. 12C.

As shown in FIG. 12D, the electronic device 100 detects an input 1210 directed to the eraser affordance 1204 g. In some embodiments, input 1210 corresponds to a tap input. In response to detecting the input 1210 in FIG. 12D, the electronic device 100 sets the eraser tool as the currently selected tool, as shown in FIG. 12E.

As shown in FIG. 12F, the electronic device 100 detects an input 1211 directed to the eraser affordance 1204g. In some embodiments, input 1211 corresponds to a long touch input (e.g., a detected touch input lasting more than a threshold amount of time) or a force-sensitive touch input (e.g., a touch input associated with a level of force greater than a threshold amount). In response to detecting the input 1211 in FIG. 12F, the electronic device 100 displays an eraser mode interface 1212, as shown in FIG. 12G. In some embodiments, the eraser mode interface 1212 is at least partially overlaid on the drawing palette 1204. The eraser mode interface 1212 includes a pixel eraser affordance 1212a associated with a pixel erase mode of operation and an object eraser affordance 1212b associated with an object erase mode of operation. As shown in FIG. 12G, the object eraser affordance 1212b is currently selected, indicating that the eraser tool is currently in the object erase mode of operation. The eraser mode interface 1212 also includes a plurality of eraser thickness affordances 1212c respectively associated with a plurality of thickness levels. An input directed to a particular eraser thickness affordance of the plurality of eraser thickness affordances 1212c sets a thickness associated with the eraser tool to a corresponding thickness level.
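
A compact way to picture the eraser tool's state, as introduced here, is shown below. The sketch is an assumption-laden illustration: the enumeration, the thickness value, and the appearance strings are hypothetical stand-ins for the affordance states described in the figures.

```swift
import CoreGraphics

/// Illustrative model of the eraser tool's two modes of operation and its
/// selectable thickness, mirroring the eraser mode interface 1212.
enum EraserMode {
    case object  // an erase input removes an entire connected mark
    case pixel   // an erase input removes only the pixels along its path
}

struct EraserTool {
    var mode: EraserMode = .object
    var thickness: CGFloat = 4  // one of the eraser thickness levels (1212c)

    /// The eraser affordance 1204g shows a first appearance ("x") in the
    /// object erase mode and a second (shaded) appearance in the pixel
    /// erase mode.
    var affordanceAppearance: String {
        mode == .object ? "first appearance" : "second appearance"
    }
}
```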

As shown in FIG. 12H, electronic device 100 detects input 1213 that points to pixel eraser affordance 1212 a. In some embodiments, input 1213 corresponds to a tap input. In response to detecting input 1213 in fig. 12H, electronic device 100 changes the eraser tool from the object erase mode of operation to the pixel erase mode of operation, as indicated by the selected pixel eraser affordance 1212a, as shown in fig. 12I. Further, as shown in fig. 12J, the electronic device 100 stops displaying the eraser mode interface 1212 and changes the appearance of the eraser affordance 1204g from the first appearance to a second appearance (e.g., a shaded area within and near the top of the eraser affordance 1204 g). In some embodiments, the electronic device 100 stops displaying the eraser mode interface 1212 after a threshold amount of time has elapsed since detecting the input 1213. The second appearance indicates that the eraser tool is in the pixel erase mode of operation as compared to the first appearance indicating that the eraser tool is in the object erase mode of operation.

As shown in fig. 12K, electronic device 100 detects first pixel erase input 1214. In some embodiments, the first pixel erase input 1214 corresponds to movement of the contact. First pixel erase input 1214 defines a first path through object 1208. Accordingly, as shown in fig. 12L, the electronic device 100 stops displaying (e.g., erasing) the first portion of the object 1208 while maintaining the display of the second portion 1208a of the object 1208 and the third portion 1208b of the object 1208. Notably, the second portion 1208a is not connected to the third portion 1208 b. In some embodiments, in response to detecting an object erase input defining the first path (e.g., while the eraser tool is in the object erase mode of operation), the electronic device 100 stops displaying (e.g., erasing) the entire object 1208 instead of stopping displaying pixels in the first path.

As shown in FIG. 12M, the electronic device 100 detects an input 1215 directed to the eraser affordance 1204 g. In some embodiments, input 1215 corresponds to a long touch input (e.g., a detected touch input that lasts greater than a threshold amount of time) or a force-sensitive touch input (e.g., a touch input associated with a level of force greater than a threshold amount). In response to detecting input 1215 in FIG. 12M, electronic device 100 displays an eraser mode interface 1212, as shown in FIG. 12N.

As shown in FIG. 12O, the electronic device 100 detects an input 1218 directed to the object eraser affordance 1212 b. In some embodiments, the input 1218 corresponds to a tap input. In response to detecting the input 1218 in fig. 12O, the electronic device 100 changes the eraser tool from the pixel erase mode of operation to the object erase mode of operation, as indicated by the selected object eraser affordance 1212b, as shown in fig. 12P. Further, as shown in fig. 12Q, the electronic device 100 stops displaying the eraser mode interface 1212 and changes the appearance of the eraser affordance 1204g from the second appearance to the first appearance (e.g., "x" at the top of the eraser affordance 1204 g), thereby indicating that the eraser tool is in the object erase mode of operation. In some embodiments, the electronic device 100 stops displaying the eraser mode interface 1212 after a threshold amount of time has elapsed since detecting the input 1218.

As shown in fig. 12R, the electronic device 100 detects a first object erase input 1220. In various implementations, the first object erase input 1220 corresponds to movement of the contact. The first object erase input 1220 defines a second path that intersects the third portion 1208b of the object 1208 and does not intersect the second portion 1208a of the object 1208. In response to detecting the first object erase input 1220 in fig. 12R, the electronic device 100 stops displaying the third portion 1208b of the object 1208 without stopping displaying the second portion 1208a of the object 1208, as shown in fig. 12S. In some embodiments, in response to detecting a pixel erase input defining the second path (e.g., while the eraser tool is in the pixel erase mode of operation), the electronic device 100 stops displaying pixels in the second path instead of stopping displaying the entire third portion 1208b of the object 1208.

As shown in fig. 12T, electronic device 100 detects input 1222 directed to the undo affordance 1204a. In some embodiments, input 1222 corresponds to a tap input. In response to detecting the input 1222 in fig. 12T, the electronic device 100 undoes the erasure of the third portion 1208b (e.g., redisplays the third portion 1208b), as shown in fig. 12U.

As shown in fig. 12V, the electronic device 100 detects a second object erase input 1224. In some embodiments, the second object erase input 1224 corresponds to movement of a contact. The second object erase input 1224 defines a third path that intersects the second portion 1208a of the object 1208 and does not intersect the third portion 1208b of the object 1208. In response to detecting the second object erase input 1224 in fig. 12V, the electronic device 100 stops displaying the second portion 1208a of the object 1208 without stopping displaying the third portion 1208b of the object 1208, as shown in fig. 12W. In some embodiments, in response to detecting a pixel erase input defining the third path (e.g., while the eraser tool is in the pixel erase mode of operation), the electronic device 100 stops displaying pixels in the third path instead of stopping displaying the entire second portion 1208a of the object 1208.
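
The contrast between the pixel erase behavior (figs. 12K-12L) and the object erase behavior (figs. 12R-12W) can be sketched as operations on connected portions of a mark. The Swift code below is a rough illustration under assumptions: the point-cloud representation, the distance threshold, and the connected-component grouping are hypothetical simplifications, not the claimed implementation.

```swift
import CoreGraphics

/// A drawn object modeled as connected portions of points (e.g., 1208a, 1208b).
struct DrawnObject {
    var portions: [[CGPoint]]
}

/// Pixel erase: remove only the points within `radius` of the erase path,
/// then re-split what remains; the path may divide the object into separate,
/// unconnected portions (as in figs. 12K-12L) or merely remove a segment
/// while leaving a single connected portion (as in figs. 12AH-12AI).
func applyPixelErase(to object: inout DrawnObject, path: [CGPoint], radius: CGFloat) {
    let survivors = object.portions.flatMap { portion in
        portion.filter { point in
            !path.contains { distance(point, $0) <= radius }
        }
    }
    object.portions = connectedComponents(of: survivors, gap: radius)
}

/// Object erase: remove every connected portion that the erase path touches,
/// while keeping the portions it does not intersect (as in figs. 12R-12W).
func applyObjectErase(to object: inout DrawnObject, path: [CGPoint], radius: CGFloat) {
    object.portions.removeAll { portion in
        portion.contains { point in
            path.contains { distance(point, $0) <= radius }
        }
    }
}

private func distance(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
    let dx = a.x - b.x, dy = a.y - b.y
    return (dx * dx + dy * dy).squareRoot()
}

/// Group points into connected portions: points closer than `gap` belong to
/// the same portion. A simple breadth-first grouping suffices for a sketch.
private func connectedComponents(of points: [CGPoint], gap: CGFloat) -> [[CGPoint]] {
    var unvisited = points
    var components: [[CGPoint]] = []
    while !unvisited.isEmpty {
        var queue = [unvisited.removeLast()]
        var component: [CGPoint] = []
        while let current = queue.popLast() {
            component.append(current)
            var stillUnvisited: [CGPoint] = []
            for candidate in unvisited {
                if distance(candidate, current) <= gap {
                    queue.append(candidate)  // same connected portion
                } else {
                    stillUnvisited.append(candidate)
                }
            }
            unvisited = stillUnvisited
        }
        components.append(component)
    }
    return components
}
```

In this framing, the choice between the two behaviors simply follows the eraser tool's current mode of operation, which is what the mode affordances in the eraser mode interface 1212 select.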

As shown in fig. 12X-12 AP, in some embodiments, electronic device 100 stops displaying (e.g., deleting) the entire object 1208 in response to detecting an object erase input directed to a fourth portion 1208c of the object 1208. Notably, in contrast to the examples described above with reference to fig. 12A-12W, the portion of the object 1208 that remains after the previous pixel erase input, the fourth portion 1208c, is a single connected portion.

As shown in fig. 12X, electronic device 100 displays object 1208 in response to detecting object insertion input 1206 in fig. 12B. In addition, the electronic device 100 detects an input 1226 directed to the eraser affordance 1204 g. In some embodiments, the input 1226 corresponds to a long touch input (e.g., a detected touch input that lasts greater than a threshold amount of time) or a force-sensitive touch input (e.g., a touch input associated with a level of force greater than a threshold amount). In response to detecting input 1226 in FIG. 12X, device 100 displays an eraser mode interface 1212, as shown in FIG. 12Y.

As shown in FIG. 12Z, electronic device 100 detects input 1228 pointing to pixel eraser affordance 1212 a. In some embodiments, input 1228 corresponds to a tap input. In response to detecting input 1228 in FIG. 12Z, the electronic device 100 changes the eraser tool from the object erase mode of operation to the pixel erase mode of operation, as indicated by the selected pixel eraser affordance 1212a, as shown in FIG. 12 AA. Further, as shown in fig. 12AB, the electronic device 100 stops displaying the eraser mode interface 1212 and changes the appearance of the eraser affordance 1204g from the first appearance to the second appearance, thereby indicating that the eraser tool is in the pixel erase mode of operation. In some embodiments, the electronic device 100 stops displaying the eraser mode interface 1212 after a threshold amount of time has elapsed since detecting the input 1228.

As shown in FIG. 12AC, the electronic device 100 detects an input 1230 directed to the eraser affordance 1204 g. In some embodiments, input 1230 corresponds to a long touch input (e.g., a detected touch input lasting more than a threshold amount of time) or a force-sensitive touch input (e.g., a touch input associated with a level of force greater than a threshold amount). In response to detecting the input 1230 in fig. 12AC, the electronic device 100 displays an eraser mode interface 1212 (in fig. 12 AD).

As shown in FIG. 12AE, the electronic device 100 detects an input 1232 directed to the thickest of the plurality of eraser thickness affordances 1212c. In some embodiments, the input 1232 corresponds to a tap input. In response to detecting input 1232 in fig. 12AE, electronic device 100 sets the thickness of the eraser tool to the thickest thickness level, as shown in fig. 12 AF. Thus, a subsequent erase input will erase a larger portion of the content than in the previous example. As shown in FIG. 12AG, the electronic device 100 stops displaying the eraser mode interface 1212. In some embodiments, the electronic device 100 stops displaying the eraser mode interface 1212 after a threshold amount of time has elapsed since detecting the input 1232.

As shown in fig. 12AH, the electronic device 100 detects the second pixel erase input 1234. In some embodiments, the second pixel erase input 1234 corresponds to movement of the contact. Notably, in contrast to the example provided above with reference to the first pixel erase input 1214, the second pixel erase input 1234 defines a fourth path that does not divide the object 1208 into two separate portions. Instead, as shown in fig. 12AI, the second pixel erase input 1234 removes the bottom block of the object 1208, leaving the fourth portion 1208c as a contiguous (e.g., continuous) drawing mark. In some embodiments, in response to detecting an object erase input defining the fourth path (e.g., while the eraser tool is in the object erase mode of operation), the electronic device 100 stops displaying (e.g., erasing) the entire object 1208 instead of stopping displaying pixels in the fourth path.

As shown in FIG. 12AJ, the electronic device 100 detects an input 1236 directed to the eraser affordance 1204 g. In some embodiments, input 1236 corresponds to a long touch input (e.g., a detected touch input that lasts greater than a threshold amount of time) or a force-sensitive touch input (e.g., a touch input associated with a level of force greater than a threshold amount). In response to detecting the input 1236 in FIG. 12AJ, the electronic device 100 displays the eraser mode interface 1212, as shown in FIG. 12 AK.

As shown in FIG. 12AL, the electronic device 100 detects an input 1238 directed to the object eraser affordance 1212 b. In some embodiments, the input 1238 corresponds to a tap input. In response to detecting the input 1238 in fig. 12AL, the electronic device 100 changes the eraser tool from the pixel erase mode of operation to the object erase mode of operation, as indicated by the selected object eraser affordance 1212b, as shown in fig. 12 AM. Further, as shown in fig. 12AN, the electronic device 100 stops displaying the eraser mode interface 1212 and changes the appearance of the eraser affordance 1204g from the second appearance to the first appearance, thereby indicating that the eraser tool is in the object erasing mode of operation. In some embodiments, the electronic device 100 stops displaying the eraser mode interface 1212 after a threshold amount of time has elapsed since detecting the input 1238.

As shown in fig. 12AO, the electronic device 100 detects a third object erase input 1240. In some embodiments, the third object erase input 1240 corresponds to movement of the contact. The third object erase input 1240 defines a fifth path that intersects the fourth portion 1208c of the object 1208. As shown in fig. 12AO, the third object erase input 1240 traverses the section of the fourth portion 1208c that is to the right of the previously removed bottom block of the object 1208. As shown in fig. 12AP, the third object erase input 1240 removes the entire object 1208 because the fifth path intersects the fourth portion 1208c of the object 1208, and the fourth portion 1208c of the object 1208 corresponds to a single (e.g., not divided into multiple portions) drawing mark. This is in contrast to the two object erase input examples provided above with reference to fig. 12R-12W, where the previous pixel erase input 1214 divided the object 1208 into two separate (e.g., unconnected) portions, namely the second portion 1208a and the third portion 1208b. In some embodiments, in response to detecting an object erase input through the section of the fourth portion 1208c that is to the left of the previously removed bottom block of the object 1208, the electronic device 100 likewise removes the entire object 1208. In some embodiments, in response to detecting a pixel erase input defining the fifth path (e.g., while the eraser tool is in the pixel erase mode of operation), the electronic device 100 stops displaying pixels in the fifth path instead of stopping displaying the entire object 1208.

Fig. 13A-13D are flow diagrams of a method 1300 for selectively erasing portions of an object, according to some embodiments. In some embodiments, the method 1300 is performed at an electronic device (e.g., electronic device 300 in fig. 3 or portable multifunction device 100 in fig. 12A-12 AP) having one or more processors, a non-transitory memory, an input device, and a display device. Some operations in the method 1300 are optionally combined, and/or the order of some operations is optionally changed.

According to the method 1300, after an object has been segmented into a plurality of separate portions based on a pixel erase input, the electronic device, in response to an object erase input, stops displaying a particular separate portion while keeping the other remaining portions displayed. Thus, the electronic device provides more functionality and control over the erase operation. Further, the electronic device need not receive a drag-erase input that is spatially coextensive with a separate portion of the object in order to erase the separate portion. By erasing the separate portions using an object erase input instead of a drag erase input, the electronic device reduces processing and battery usage and experiences less wear.

With respect to fig. 13A, the electronic device displays (1302) a drawing user interface via a display device. For example, the drawing user interface includes content (e.g., an annotated screenshot image) that is marked up with objects. As another example, the drawing user interface includes a toolbar region that includes a plurality of drawing tool selection affordances associated with a plurality of drawing tools, respectively. As an example, referring to fig. 12A, electronic device 100 displays drawing user interface 1202.

While displaying the drawing user interface, the electronic device detects (1304) an object insertion input corresponding to a request to insert an object into the drawing user interface. For example, the object is a line drawing input, a text input, or a request to insert a predetermined object such as a shape. As one example, referring to fig. 12B, electronic device 100 detects an object insertion input 1206 corresponding to a request to insert a closed object into drawing user interface 1202.

In response to detecting the object insertion input, the electronic device inserts (1306) the corresponding object into the drawing user interface. For example, the respective object corresponds to a stroke object defined by a continuous user input within the drawing user interface while a drawing tool associated with a stroke operation is selected. As another example, the respective object corresponds to one of a shape object, a stroke object, a magnifying glass object, or a text object. As yet another example, the respective object is generated as a result of movement of a contact across the touch-sensitive surface of the electronic device, which movement ends when the contact is released. As yet another example, the corresponding object is generated as a result of a mouse click followed by a drag movement of the cursor on the display, which ends when the mouse click is released. As one example, in response to detecting the object insertion input 1206 in fig. 12B, the electronic device 100 displays the corresponding object 1208 (in fig. 12C).

The electronic device detects (1308) a pixel erase input while displaying the corresponding object in the drawing user interface. As one example, referring to fig. 12K, the electronic device 100 detects a first pixel erase input 1214 that defines a respective path through an object 1208 (e.g., splits the object into two). As another example, referring to fig. 12AH, electronic device 100 detects a second pixel erase input 1234 that defines a respective path that does not traverse the object 1208 but instead removes a segment of the object 1208.

In response to detecting the pixel erase input, the electronic device stops (1310) displaying the first portion of the respective object without stopping displaying the second portion of the respective object and without stopping displaying the third portion of the respective object. For example, in some embodiments, the second portion and the third portion are separate (e.g., unconnected) with respect to each other. For example, in response to detecting the first pixel erase input 1214 in fig. 12K, the electronic device 100 maintains display of the second portion 1208a and the third portion 1208b (in fig. 12L), wherein the second portion 1208a and the third portion 1208b are unconnected to each other. As another example, in some embodiments, the second portion and the third portion are connected to (e.g., contiguous with) each other, as shown in fig. 12AH and 12 AI.

The electronic device detects (1312) an object erase input directed to a portion of the corresponding object. For example, the electronic device detects the object erase input after stopping displaying the first portion of the respective object without stopping displaying the second portion and the third portion of the respective object. As one example, electronic device 100 detects the first object erase input 1220 in fig. 12R. As another example, electronic device 100 detects the second object erase input 1224 in fig. 12V. As yet another example, the electronic device 100 detects the third object erase input 1240 in fig. 12 AO.

In response to detecting the object erase input: in accordance with a determination that the object erase input is directed to a second portion of the respective object and the second portion of the respective object is not connected to a third portion of the respective object, the electronic device stops (1314) displaying the second portion of the respective object without stopping displaying the third portion of the respective object. As one example, in response to detecting the first object erase input 1220 in fig. 12R, the electronic device 100 stops displaying the third portion 1208b of the object 1208 without stopping displaying the second portion 1208a of the object 1208 (in fig. 12S).

In response to detecting the object erase input: in accordance with a determination that the object erase input is directed to a third portion of the respective object and the third portion of the respective object is not connected to the second portion of the respective object, the electronic device stops (1316) displaying the third portion of the respective object without stopping displaying the second portion of the respective object. As one example, in response to detecting the second object erase input 1224 in fig. 12V, the electronic device 100 stops displaying the second portion 1208a of the object 1208 and does not stop displaying the third portion 1208b of the object 1208 (in fig. 12W).

Referring to fig. 13B, in some embodiments, while the corresponding object is displayed in the drawing user interface and prior to detecting the pixel erase input, the electronic device performs the following operations: displaying (1318) within the drawing user interface a drawing palette comprising a plurality of content manipulation affordances; detecting (1318), via an input device, a first input directed to an eraser affordance of the plurality of content manipulation affordances, wherein the eraser affordance is associated with an eraser tool; in response to detecting the first input, displaying (1318) an eraser mode interface comprising a plurality of eraser mode affordances; detecting (1318), via the input device, a second input directed to a first of the plurality of eraser mode affordances; and in response to detecting the second input, setting (1318) the eraser tool to a pixel erase mode of operation, wherein the pixel erase input is detected while the eraser tool is in the pixel erase mode of operation. By displaying the plurality of eraser mode affordances within the eraser mode interface, the electronic device need not detect multiple inputs for displaying the corresponding plurality of eraser mode affordances. Thus, processor usage, battery usage, and wear of the electronic device are reduced. For example, the content manipulation affordances include a pencil affordance, an eraser affordance, a lasso affordance, a highlighter affordance, an undo affordance, a redo affordance, color pots (e.g., hue and shade), and so forth. As another example, the first input corresponds to a long touch input or a force-sensitive touch input. In some embodiments, the eraser mode interface at least partially overlays the drawing palette. In some embodiments, the eraser mode interface includes different sized circles indicating the corresponding thickness of the erase operation. In some embodiments, the drawing palette is movable. In some embodiments, the drawing palette is fixed to the toolbar area. As one example, the electronic device 100 detects the input 1211 (in fig. 12F) and accordingly displays the eraser mode interface 1212 (in fig. 12G), and subsequently detects the input 1213 (in fig. 12H) directed to the pixel eraser affordance 1212a. In response to detecting input 1213 in fig. 12H, electronic device 100 sets the eraser tool to the pixel erase mode of operation. While the eraser tool is in the pixel erase mode of operation, the electronic device 100 detects the first pixel erase input 1214 (in fig. 12K).

In some embodiments, when the eraser tool is in the pixel erase mode of operation: the electronic device detects (1320), via the input device, a third input directed to a second of the plurality of eraser mode affordances; and in response to detecting the third input, the electronic device sets (1320) the eraser tool to an object erase mode of operation, wherein the object erase input is detected while the eraser tool is in the object erase mode of operation. For example, an eraser tool associated with the screenshot marking interface is in an object erase mode of operation. As another example, erasing an object in the object erase mode of operation corresponds to completely removing (e.g., deleting or stopping display of) the object. As yet another example, the electronic device erases the object when a rate of the erase input satisfies a rate threshold. The pixel erase mode is different from the object erase mode. As one example, the electronic device 100 detects input 1215 (in fig. 12M) and accordingly displays the eraser mode interface 1212 (in fig. 12N), and then detects input 1218 (in fig. 12O) directed to the object eraser affordance 1212b. In response to detecting the input 1218 in fig. 12O, the electronic device 100 sets the eraser tool to the object erase mode of operation. While the eraser tool is in the object erase mode of operation, the electronic device 100 detects the first object erase input 1220 (in fig. 12R).

In some embodiments, the eraser affordance has (1322) a first appearance when the eraser tool is in the object erase mode of operation, and the eraser affordance has (1322) a second appearance different from the first appearance when the eraser tool is in the pixel erase mode of operation. By indicating the current erase mode of operation, the electronic device detects fewer erroneous erase inputs directed into the drawing interface, thereby reducing processor utilization and wear of the electronic device. For example, the first appearance includes an "X" near the top of the eraser affordance, as shown in FIG. 12Q. As another example, the second appearance includes a shaded area near the top of the eraser affordance, as shown in FIG. 12J.

In some embodiments, the first portion of the respective object is within (1324) a first path defined by the pixel erase input. By stopping the display of the first portion based on the first path defined by the pixel erase input, the electronic device provides an accurate erase mechanism. For example, the length of the first path relative to the size of the object determines whether the erase merely deletes a portion of the object or deletes a portion of the object and segments the object. As another example, if the first path extends through the object, the electronic device deletes a portion of the object within the first path and splits the remaining portion of the object into the second portion and the third portion. As one example, in response to detecting a second pixel erase input 1234 (in fig. 12 AH) that defines a respective path that does not traverse the object 1208, the electronic device 100 erases a section of the object 1208 without partitioning the object 1208 into two separate portions (in fig. 12 AI), leaving the fourth portion 1208c of the object 1208.

In some embodiments, a first path defined by the pixel erase input traverses (1326) the respective object, resulting in a second portion of the respective object that is unconnected to a third portion of the respective object. For example, the second portion corresponds to the left side of the divided square, and the third portion corresponds to the right side of the divided square. As one example, in response to detecting the first pixel erase input 1214 (in fig. 12K) that defines a respective path through the object 1208, the electronic device 100 divides the object 1208 into two unconnected portions, a second portion 1208a and a third portion 1208b (in fig. 12L).

Referring to fig. 13C, in some embodiments, in response to detecting the object erase input: in accordance with a determination that the object erase input is directed to a second portion of the respective object and the second portion of the respective object is connected to a third portion of the respective object, the electronic device stops (1328) displaying the second portion of the respective object and stops (1328) displaying the third portion of the respective object. By ceasing to display both the second and third portions of the respective object in response to a single object erase input, the electronic device need not detect multiple erase inputs for erasing both portions, thereby reducing processor usage, battery usage, and wear of the electronic device. For example, removing a first portion of a respective object in response to a previous pixel erase input divides the respective object into a second portion and a third portion. As an example, in response to determining that the third object erase input 1240 is directed to the fourth portion 1208c and that the fourth portion 1208c corresponds to a connected drawing mark (in fig. 12 AO), the electronic device 100 removes the entire object 1208 (in fig. 12 AP).

In some embodiments, in response to detecting the object erase input: in accordance with a determination that the object erase input is directed to a third portion of a respective object and the third portion of the respective object is connected to a second portion of the respective object, the electronic device stops (1330) displaying the third portion of the respective object and stops (1330) displaying the second portion of the respective object. By ceasing to display both the second and third portions of the respective object in response to a single object erase input, the electronic device need not detect multiple erase inputs for erasing both portions, thereby reducing processor usage, battery usage, and wear of the electronic device. For example, removing a first portion of a respective object in response to a previous pixel erase input divides the respective object into a second portion and a third portion. As an example, in response to determining that the third object erase input 1240 is directed to the fourth portion 1208c and that the fourth portion 1208c corresponds to a connected drawing mark (in fig. 12 AO), the electronic device 100 removes the entire object 1208 (in fig. 12 AP).

In some embodiments, in accordance with a determination that the object erase input defines a third path that intersects the second portion of the respective object and that intersects the third portion of the respective object, the electronic device stops (1332) displaying the second portion of the respective object and stops (1332) displaying the third portion of the respective object. By ceasing to display both the second and third portions of the respective object in response to a single object erase input, the electronic device need not detect multiple erase inputs for erasing both portions, thereby reducing processor usage, battery usage, and wear of the electronic device.

In some embodiments, in response to detecting the object erase input: in accordance with a determination that the object erase input is directed to a second portion of a respective object and the second portion of the respective object is connected to a third portion of the respective object, the electronic device stops (1334) displaying the second portion of the respective object and stops (1334) displaying the third portion of the respective object; and in accordance with a determination that the object erase input is directed to the third portion of the respective object and the second portion of the respective object is connected to the third portion of the respective object, the electronic device stops (1334) displaying the second portion of the respective object and stops (1334) displaying the third portion of the respective object. By ceasing to display both the second and third portions of the respective object in response to a single object erase input, the electronic device need not detect multiple erase inputs for erasing both portions, thereby reducing processor usage, battery usage, and wear of the electronic device. As an example, in response to determining that the third object erase input 1240 is directed to the fourth portion 1208c and that the fourth portion 1208c corresponds to a connected drawing mark (in fig. 12 AO), the electronic device 100 removes the entire object 1208 (in fig. 12 AP).

Referring to FIG. 13D, in some embodiments, in accordance with a determination that the object erase input defines a first path that intersects the second portion of the respective object and does not intersect the third portion of the respective object, the electronic device stops (1336) displaying the second portion of the respective object without stopping displaying the third portion of the respective object. By ceasing to display a particular separated portion and maintaining the display of the other remaining portions in response to an object erase input, the electronic device provides more functionality and control over the erase operation. Further, the electronic device need not receive a drag-erase input that is spatially coextensive with a separate portion of the object in order to erase the separate portion. By erasing the separate portions using an object erase input instead of a drag erase input, the electronic device reduces processing and battery usage and experiences less wear. As one example, in response to detecting the first object erase input 1220 in fig. 12R, the electronic device 100 stops displaying the third portion 1208b of the object 1208 without stopping displaying the second portion 1208a of the object 1208 (in fig. 12S).

In some embodiments, in accordance with a determination that the object erase input defines a second path that intersects the third portion of the respective object and does not intersect the second portion of the respective object, the electronic device stops (1338) displaying the third portion of the respective object without stopping displaying the second portion of the respective object. By ceasing to display a particular separated portion and maintaining the display of the other remaining portions in response to an object erase input, the electronic device provides more functionality and control over the erase operation. Further, the electronic device need not receive a drag-erase input that is spatially coextensive with a separate portion of the object in order to erase the separate portion. By erasing the separate portions using an object erase input instead of a drag erase input, the electronic device reduces processing and battery usage and experiences less wear. As one example, in response to detecting the second object erase input 1224 in fig. 12V, the electronic device 100 stops displaying the second portion 1208a of the object 1208 without stopping displaying the third portion 1208b of the object 1208 (in fig. 12W).

In some embodiments, when displaying a drawing palette comprising a plurality of content manipulation affordances within the drawing user interface, the electronic device detects (1340), via the input device, a first input directed to a drawing affordance of the plurality of content manipulation affordances; in response to detecting the first input, the electronic device changes (1340) the currently selected tool from the eraser tool to a drawing tool associated with the drawing affordance; the electronic device detects (1340), via the input device, a drawing input directed to a canvas of the drawing user interface; and in response to detecting the drawing input, the electronic device performs (1340) a drawing operation on the canvas. By displaying the plurality of content manipulation affordances, the electronic device need not detect a plurality of inputs for displaying the plurality of content manipulation affordances. Thus, battery usage, processor usage, and wear of the electronic device are reduced. For example, the plurality of content manipulation affordances includes two or more of a pencil affordance, a pen affordance, a text insertion affordance, a highlighter affordance, and the like. As another example, the electronic device also changes an appearance of the drawing affordance to distinguish its appearance from the corresponding appearances of the remainder of the plurality of content manipulation affordances.

In some embodiments, after changing the currently selected tool from the eraser tool to the drawing tool, the electronic device detects (1342), via the input device, a second input directed to an eraser affordance of the plurality of content manipulation affordances, wherein the eraser affordance is associated with the eraser tool; and in response to detecting the second input, the electronic device changes (1342) the currently selected tool from the drawing tool to an eraser tool. As one example, in response to detecting input 1210 in fig. 12D, electronic device 100 changes the currently selected implement from a pencil implement to an eraser implement, as indicated by the selected eraser affordance 1204g (in fig. 12E).

The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and the various described embodiments, with various modifications, as are suited to the particular use contemplated.
