Method and device for data tracking and presentation

Document No.: 1358582  Publication date: 2020-07-24

Note: This technology, "Method and device for data tracking and presentation," was created by Hu Xun and Yu Xin on 2018-10-12. Its main content is as follows: Embodiments provide methods and apparatus for providing object information associated with a media program. The embodiments comprise the following steps: identifying an object of interest in a frame image of the media program; inserting an indicator into the frame image to identify the object of interest; generating a video clip associated with the identified object from a portion of the media program; constructing an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip; and sending the information page comprising the video clip to the user equipment.

1. A method for providing object information associated with a media program, comprising:

identifying an object of interest in a frame image of the media program;

inserting an indicator into the frame image to identify the object of interest;

generating a video clip associated with the identified object from a portion of the media program;

constructing an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip;

sending the media program to a user equipment;

sending a uniform resource locator (URL) of the information page to the user equipment;

receiving an information page request associated with the user equipment displaying the media program;

and sending the information page comprising the video clip to the user equipment.

2. The method of claim 1, wherein the indicator is adjacent to the object of interest.

3. The method of claim 1, wherein the indicator is located at a corner of the frame image.

4. The method of claim 1, further comprising:

identifying the object of interest in the frame when the object matches an object ID in a product library.

5. The method of claim 1, further comprising:

sending the media program to a first user equipment;

sending the URL of the information page to a second user equipment;

receiving an information page request associated with the first user equipment displaying the media program;

and sending the information page comprising the video clip to the second user equipment, wherein the first user equipment and the second user equipment are different user equipment.

6. The method of claim 1, wherein the sending the information page including the video clip to the user equipment further comprises:

wherein the video clip includes a latest timestamp of the media program displayed on the user equipment.

7. A server for providing object information associated with a media program, comprising:

a memory storing instructions;

one or more processors coupled with the memory, wherein the one or more processors execute the instructions to:

identify an object of interest in a frame image of the media program;

insert an indicator into the frame image to identify the object of interest;

generate a video clip associated with the identified object from a portion of the media program;

construct an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip;

send the media program to a user equipment;

send a uniform resource locator (URL) of the information page to the user equipment;

receive an information page request associated with the user equipment displaying the media program;

and send the information page comprising the video clip to the user equipment.

8. The server of claim 7, wherein the indicator is adjacent to the object of interest.

9. The server according to claim 7, wherein the indicator is located at a corner of the frame image.

10. The server of claim 7, further comprising:

identifying the object of interest in the frame when the object matches an object ID in a product library.

11. The server of claim 7, further comprising:

sending the media program to a first user equipment;

sending the URL of the information page to a second user equipment;

receiving an information page request associated with the first user equipment displaying the media program;

and sending the information page comprising the video clip to the second user equipment, wherein the first user equipment and the second user equipment are different user equipment.

12. The server according to claim 7, wherein the sending the information page including the video clip to the user equipment further comprises:

wherein the video clip includes a latest timestamp of the media program displayed on the user equipment.

13. A method for providing object information associated with a media program, comprising:

obtaining a uniform resource locator (URL) of an information page associated with an object of interest, wherein the object of interest is presented in a frame of the media program, the frame including an indicator identifying the object of interest;

requesting the information page at the URL;

receiving the information page including a window for displaying a video clip, wherein the video clip is based on a portion of the media program and is associated with the object of interest;

and displaying the information page and the video clip in the window.

14. The method of claim 13, further comprising:

when a user request is received, the URL of the information page is retrieved.

15. The method of claim 13, further comprising:

the URL is displayed by a User Interface (UI) image, which is displayed when an application is running on a mobile device.

16. The method of claim 13, further comprising:

updating the URL when an application program is running on the device, wherein the video clip includes the latest timestamp of the media program displayed in the window.

17. A mobile device for providing object information associated with a media program, comprising:

a memory storing instructions;

a processor coupled with the memory, wherein the processor executes the instructions to:

obtain a uniform resource locator (URL) of an information page associated with an object of interest, wherein the object of interest is presented in a frame of the media program, the frame including an indicator identifying the object of interest;

request the information page at the URL;

receive the information page including a window for displaying a video clip, wherein the video clip is based on a portion of the media program and is associated with the object of interest;

and display the information page and the video clip in the window.

18. The mobile device of claim 17, further comprising:

when a user request is received, the URL of the information page is retrieved.

19. The mobile device of claim 17, further comprising:

the URL is displayed by a User Interface (UI) image, which is displayed when an application is running on a mobile device.

20. The mobile device of claim 17, further comprising:

updating the URL when an application program is running on the device, wherein the video clip includes the latest timestamp of the media program displayed in the window.

Technical Field

The present invention relates generally to data editing, data tracking, and presentation of media programs, and more particularly, to a method, server, and mobile device for providing object information associated with a media program.

Background

Media programs such as movies, television programs, slideshows, and video presentations may contain rich information, such as images of actors, clothing, bags, kitchen utensils, and other merchandise. Information about the items seen and heard in a media program is not necessarily discernible while the program is playing. For example, the brand of a jacket displayed at a particular moment of a movie may not be readily identifiable at the time the movie is played. Some viewers of a media program may want to learn more about the images or music while watching the program.

Disclosure of Invention

An exemplary embodiment provides a method for providing object information associated with a media program, including identifying an object of interest in a frame image of the media program, inserting an indicator into the frame image to identify the object of interest, generating a video clip associated with the identified object from a portion of the media program, constructing an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip, transmitting a uniform resource locator (URL) of the information page to the user device, receiving an information page request related to the user device displaying the media program, and transmitting the information page including the video clip to the user device.

Optionally, in any preceding embodiment, the indicator is adjacent to the object of interest.

Optionally, in any preceding embodiment, the indicator is located at a corner of the frame image.

Optionally, in any preceding embodiment, the method further comprises: identifying the object of interest in the frame when the object matches an object ID in a product library.

Optionally, in any preceding embodiment, the method further includes sending the media program to a first user equipment, sending the URL of the information page to a second user equipment, receiving an information page request related to the first user equipment displaying the media program, and sending the information page including the video segment to the second user equipment, where the first user equipment and the second user equipment are different user equipment.

Optionally, in any preceding embodiment, in the sending of the information page including the video clip to the user equipment, the video clip includes a latest timestamp of the media program displayed on the user equipment.

An exemplary embodiment provides a server for providing object information associated with a media program, the server comprising a memory storing instructions, one or more processors coupled with the memory, wherein the one or more processors execute the instructions to identify an object of interest in a frame image of the media program, insert an indicator into the frame image to identify the object of interest, generate a video clip associated with the identified object from a portion of the media program, construct an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip, send a URL of the information page to the user device, receive an information page request related to the user device displaying the media program, and send the information page including the video clip to the user device.

Optionally, in any preceding embodiment, the indicator is adjacent to the object of interest.

Optionally, in any preceding embodiment, the indicator is located at a corner of the frame image.

Optionally, in any preceding embodiment, the server further identifies the object of interest in the frame when the object matches an object ID in a product library.

Optionally, in any preceding embodiment, in the sending of the information page including the video clip to the user equipment, the video clip includes a latest timestamp of the media program displayed on the user equipment.

An exemplary embodiment provides a method for providing object information associated with a media program, the method comprising obtaining a URL of an information page associated with an object of interest, wherein the object of interest is presented in a frame of the media program, the frame including an indicator identifying the object of interest, requesting the information page at the URL, receiving the information page including a window for displaying a video clip, wherein the video clip is based on a portion of the media program and is associated with the object of interest, and displaying the information page and the video clip in the window.

Optionally, in any preceding embodiment, the method further comprises obtaining the URL of the information page when a user request is received.

Optionally, in any of the preceding embodiments, the URL is displayed by a User Interface (UI) image, wherein the UI image is displayed when an application program is running on the mobile device.

Optionally, in any preceding embodiment, the method further comprises updating the URL when an application program is running on the device, wherein the video segment includes the latest timestamp of the media program displayed in the window.

An exemplary embodiment provides a mobile device for providing object information associated with a media program, the mobile device comprising a memory storing instructions and a processor coupled with the memory, wherein the processor executes the instructions to retrieve a URL of an information page associated with an object of interest, wherein the object of interest is presented in a frame of the media program, the frame comprising an indicator identifying the object of interest, request the information page at the URL, receive the information page comprising a window for displaying a video clip, wherein the video clip is based on a portion of the media program and is associated with the object of interest, and display the information page and the video clip in the window.

Optionally, in any of the preceding embodiments, the device further obtains the URL of the information page when a user request is received.

Optionally, in any of the preceding embodiments, the URL is displayed by a User Interface (UI) image, wherein the UI image is displayed when an application program is running on the mobile device.

Optionally, in any preceding embodiment, the device further updates the URL when an application is running on the device, wherein the video clip includes the latest timestamp of the media program displayed in the window.

Drawings

FIG. 1 is a block diagram of an exemplary process for data tracking and presentation in a media program according to an embodiment of the present invention;

FIG. 2 illustrates a diagram of managing one or more video segments of one or more media programs, according to an embodiment of the invention;

FIG. 3 illustrates a flow diagram associated with FIGS. 1 and 2 in accordance with an embodiment of the present invention;

FIG. 4 illustrates an example of object tracking and rendering in a media program on a first display 400, according to an embodiment of the present invention;

FIG. 5 illustrates an example of object tracking and rendering in a media program on a second display 500, according to an embodiment of the present invention;

FIG. 6 is a flow diagram for providing object information associated with a media program, according to an embodiment of the present invention;

FIG. 7 is a block diagram of components of a machine 700 according to some embodiments of the invention.

Detailed Description

Reference will now be made in detail to the present embodiments of the invention, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the various embodiments of the invention. However, it will be apparent to one of ordinary skill in the art that the embodiments may be practiced without these specific details or that various modifications or alterations may be made thereto without departing from the spirit and scope of the invention. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to obscure aspects of the embodiments.

Producers of media programs such as movies, television programs, direct digital storage media products, online videos, etc., often want to provide viewers with information about certain items in the program. Advertisement producers may also take the opportunity to place product advertisements in the program. Some advertisements interrupt the program to show merchandise or abruptly insert advertising content into the program, causing viewers who are not interested in the product to find the program uninteresting.

Similarly, some producers attempt to avoid interrupting the broadcast of a media program by superimposing pop-up advertisements on top of it, but many viewers still find such programs uninteresting.

Producers need to provide advertisements in a more comfortable and pleasant way.

In addition, viewers of some media programs may be interested in learning more about images or music while watching the program. Being able to obtain more information immediately about the images or sounds played in a media program would be an effective way to win customers.

To illustrate the scheme provided by the embodiments of the invention, consider an example that may happen in real life. A user subscribes to the service from a content provider and may leave a list of preferred interests during subscription. The user may install the application and service on a mobile device. In this example, the user is watching a media program on a television. From time to time, an indicator is displayed to signal that there is an object the user may want to learn more about; the object may be a bag, a kitchen utensil, or any item for sale. When the user is interested, the user can pause or continue playing the program and open the application on the mobile device. The application displays an image of the object that the user has just viewed in the media program, along with several lines of brief information about the object. The user can click the image or the lines of brief information to obtain additional information about the object.

In addition, when the user clicks the image, a video clip may be played on the screen of the mobile device. The video clip is specifically a clip of the media program that the user has just watched, focused on the scene in which the object appears. Optionally, the indicator is configured as a transparent icon. Alternatively, the indicator may be an animation or similar element on the display screen that informs the viewer that more information may be available. The indicator may be sized to attract the viewer's attention. However, if the user is not interested in the object, the indicator should be designed so that it is easy to ignore, disturbing the user's viewing of the program as little as possible while still informing the user that more information about the object is available in the application. The indicator may be an icon, pointer, symbol, animation, box, etc., and may have a specific shape, such as a rectangle, triangle, circle, or 3D shape.

FIG. 1 is a block diagram of an exemplary process for data tracking and presentation in a media program according to an embodiment of the present invention.

The user 100 views the media program on the first display 110. The user 100 may also obtain other information using the second display 120. The server 130 may process one or more frames of a media program, and when an object in one frame of the media program matches an object ID in a product library, the server 130 may identify the object and edit that frame of the media program. The server 130 may be a streaming media server that also provides media programs, or an advertising platform (not shown in FIG. 1) that provides advertisements for media programs. Optionally, the server 130 also identifies the position of the object in the one or more frames of the media program at the same time. The product library comprises a database of goods for sale, including brand names, prices, pictures, places of origin, instruction manuals, and other details of the goods. The identification process may be implemented automatically by an image processing algorithm when the media program is played, or by manually marking the object before the media program is played. This process is repeated for each frame of the media program to identify objects in the frame. The one or more frames of the media program may be received from an independent media program provider or from the same provider operating the server. If the server 130 receives the media program while the first display 110 plays it, a synchronization signal may be sent between the server 130 and the first display 110. When an object is identified, a timestamp of the object is recorded. The timestamp records a start time and an end time for displaying an identified object in the one or more frames of the media program. As the media program continues to play on the first display 110, a synchronization signal is periodically sent to the server 130. The first display 110 and the second display 120 may be separate windows of the same screen, separate screens of the same device, or screens of different devices.

The server 130 sends one or more frames of a media program having one or more identified objects to the media player/mixer 140. Optionally, the server 130 also sends information identifying the position of the object in the one or more frames of the media program to the media player/mixer 140. The media player/mixer 140 modifies or edits a limited amount of frame data by embedding an indicator at the location in the frame of the media program that includes the image of the identified object. The indicator may be designed in the manner described in the various embodiments of the invention.

The server 130 updates a User Interface (UI) and related information for the one or more identified objects in the application, either periodically, upon request by the application, or when the application is opened. The application may be installed and run on the user device. In the field of industrial design of human-computer interaction, the UI is the space in which interaction between a human and a machine occurs. The purpose of this interaction is to allow the human to operate and control the machine effectively, while the machine feeds back information that helps the operator make decisions. Examples of this broad concept of user interfaces include computer operating systems, hand tools, heavy machine operator controls, and interactive interfaces for process control. Herein, the UI generally refers to a graphical user interface.

In addition to the UI, several lines of information relating to the object are provided, giving brief information about the one or more identified objects. When an image or the lines of brief information are selected, more information is displayed so that further details of the object can be obtained. Optionally, when the image of the UI is selected, a video clip, specifically a segment of the media program, is played on the second display 120. The second display 120 may be different from the first display 110. Based on the ID of the user subscription or the synchronization signal between the first display 110 and the server 130, the server 130 may update the UI and related information for the one or more identified objects in the application to match the media program playing on the first display 110. Thus, when an image UI is selected, the timestamp of the video clip retrieved from the server 130 is closest to the current actual time of the media program on the first display 110. The video clip played on the second display 120 shows the identified object that was displayed on the first display 110 a short time earlier, where the short time is the difference between the timestamp of the video clip and the current actual time of the media program on the first display 110.

A plurality of segments of the media program may be played on the second display; these segments may be stored in a database or generated while the media program is played on the first display 110. The segments of the media program may include particular segments of one or more scenes of the media program, and may be stored or generated to display one or more scenes of the media program on a block-by-block basis. A segment of the media program may be generated from a portion of the media program associated with an identified object according to a media format or a media template. Preferably, the frames are modified or generated without changing the data that has already been embedded in the media program. The process of embedding one or more indicators may be repeated for various frames of the media program.

The UI and related information associated with the one or more identified objects in the application may be updated periodically, when a request is sent to the server 130, when the user opens the application, or when the user pauses playing the media program on the first display.

The UIs and related information identifying the object or objects in the application may be arranged in chronological order, with the most recently updated UIs and related information displayed at the top. Each image UI or brief-information row may represent a resource for a video segment of a media program that can be played by a device and accessed through a database. When an image UI is selected, the video segment associated with that image UI may be replayed on the second display 120 or on an available screen of choice. The UI may be used to link to various URLs, and these URLs point to the video segments associated with the identified objects. The content and process of making the video segments are further described below.

FIG. 2 shows a diagram of managing one or more video segments of one or more media programs according to an embodiment of the invention. As described above in FIG. 1, when an object in one or more frames of a media program matches an object ID in a product library, the server 130, labeled 230 in FIG. 2, identifies the object or objects. The server 130, 230 may check whether a record of the identified object is present in the cache 250.

Fig. 3 shows a flow diagram associated with fig. 1 and 2, according to an embodiment of the invention.

In step 301, the server 130, 230 identifies an object from a video frame and sends the relevant object ID and the position of the object in the frame of the media program to the clip manager 260.

In step 303, the clip manager 260 looks up the associated object ID in the cache 250.

In step 305, the clip manager 260 determines whether an associated or identified object ID exists in the cache 250; if not, step 308 is performed, and if so, step 307 is performed.

If no such associated or identified object ID exists in the cache 250, then in step 308 an object record is created in the cache 250. The object record may include items such as the object ID, the current timestamp, and the current position. Optionally, the object record includes a clip_start_time stamp and a clip_end_time stamp. For example, the object record may include the item information of step 310 (not shown in FIG. 3) as follows.

◆ Set object in cache:

◆ object_id: object ID

◆ movie_id

◆ current_clip_start_time: current timestamp

◆ current_clip_end_time: current timestamp

◆ current_positions: add the position to the list

The movie_id may indicate the movie from which the object is identified.

After creating the object record in step 308, the process may return to step 301 and repeat its steps.

If such an associated or identified object ID exists in the cache 250, then in step 307 the object record is read in the cache 250.

In step 309, the object record in the cache is compared with the identified object: it is determined whether the difference between the current_clip_end_time of the object record in the cache 250 and the clip_end_time of the identified object is greater than a certain time interval. If the comparison result is yes, step 311 is performed; if the comparison result is no, step 312 is performed.

If the comparison is yes, indicating that the difference is greater than the time interval, then an object record is generated in step 311, including the following:

◆ Calculate:

◆ time_range: (current_clip_start_time, current_clip_end_time)

◆ positions: current_positions

◆ avg_img_contribution: the average scale, calculated using the positions in the "positions" field

In step 313, the information of the object record is written into the database 240.

In step 315, the object record is deleted in the cache 250. The process then proceeds to step 317.

If the comparison result is negative, indicating that the difference is less than the time interval, then in step 312 the object record is updated in the cache 250. The update may, for example, modify the items "current_clip_end_time" and "clip_end_time" of the marked object, and add the object position to the "current_positions" item of the marked object. Examples of the updated object record may include the following:

◆ clip_end_time: timestamp

◆ current_positions

When the object record in the cache 250 is updated, the flow goes to step 317.

In step 317, the process of steps 301 to 317 may be repeated when the next time interval begins. The process may be repeated until the cache is full or all identified objects have been cached.

A video clip may be generated from the received media program and with reference to the associated object record.

The clip manager 260 stores one or more object records and uses them as the basic information for the corresponding video segments. In step 318, one or more video clips are generated using the clip-related information of the object records. The video segments include timestamp information and the associated scenes of the identified objects in the media program. In step 319, the media player/mixer 140 may modify or edit the media program. Optionally, in step 320, the video segment presented on the second display 120 of FIG. 1 may obtain its resources from the clip manager 260 or the server 230. The clip manager 260 may retrieve media program assets from the server 230 and generate the video segments associated with the object records.

Alternatively, the video clips presented on the second display 120 of FIG. 1 may be obtained from a third party. The clip manager 260 and the server 230 may be separate entities or may be combined in a single server.

Optionally, the video segments may be periodically pushed to the server 130, 230 as the media program continues to play on the first display 110.

The clip manager 260 may periodically review and generate video clips or object records. Alternatively, the clip manager 260 may be a passive module driven by the server 130, 230.

Thus, when the difference between the current_clip_end_time of the object record in the cache and the clip_end_time of the identified object is greater than a certain time interval, the clip_end_time items of the different object records are compared, and then steps 311, 313, 315, and 317 are repeatedly executed; otherwise, steps 312 and 317 are repeated.

The video clip manager 260 or the database 240 or the cache 250 may be located in the server 130 or separate from the server 130.

FIG. 4 illustrates an example of object tracking and rendering in a media program on a first display 400 according to an embodiment of the present invention. The first display may be part of a user device. Example scenes 410, 420, and 430 of the media program are displayed on the first display 400 at the corresponding time series t0, t1, t2, t3, and tn. The scene 410, playing at times t0 and t1, shows a person and some items in the media program, and the backpack 440 is specifically marked with an indicator 450. The backpack 440 is a bag product; preferably, it has shoulder straps to facilitate travel and hiking or to carry a laptop computer. Here, the backpack 440 represents a product object identified in the above embodiments, for which additional information may be available. As time passes, the media program continues to play the next scene. In the scene 420, the backpack has been moved from the position it occupied in the scene 410. When the position of the backpack changes, the indicator 450 moves so that the backpack 440 always remains indicated. In the scene 430, the backpack 440 keeps moving, and the indicator 450 moves with the backpack 440 wherever the backpack 440 is displayed on the first display 400.

According to the described embodiments of the invention, the example scenes 410, 420, and 430 displayed on the first display 400 may be video clips of the media program generated by the server 130, 230. The user equipment may retrieve the example scenes 410, 420, and 430 from the server 130, 230.

FIG. 5 illustrates an example of object tracking and rendering in a media program on the second display 500, according to an embodiment of the present invention. As described above, the user 100 views the media program on the first display 110, 400, and when the user 100 is interested in the object identified by the transparent indicator, the user 100 may pause or continue playing the media program and open the application in the tablet 501. The tablet 501 may also be a mobile device, a laptop, a personal computer, or the like. The tablet 501 may have a second display 500. When the application is started, the interface or menu of the application may have multiple UIs: UI 502, UI 503, UI 504, and UI 505. In addition to the UI 502, UI 503, UI 504, and UI 505, several lines of information are provided on the menu, each associated with one of the one or more identified objects and placed next to the corresponding UI. The several lines of information may provide brief information about the one or more identified objects. When an image or the lines of brief information are selected, more information is displayed so that further details of the object can be obtained. Optionally, when the image of the UI 504 is selected, a video clip is played on the second display 120, 500. The second display 120, 500 may be different from the first display 110, 400. Based on the ID of the user subscription or the synchronization signal communicated between the first display 110 and the server 130, the server 130 may update the UI and related information for the one or more identified objects in the application to match the media program playing on the first display 110. Thus, when the image UI is selected, the timestamp of the segment of the media program retrieved from the server 130 is closest to the current actual time of the media program on the first display 110, 400. The segment of the media program played on the second display 120, 500 shows the identified object that was displayed on the first display 110, 400 a short time earlier. For example, when the user opens the video clip linked to the selected image UI 504 at time t3, i.e., the time at which the first display 400 is displaying the media program, the video clip may preferably cover the scenes 541, 542, and 543 corresponding to the time series t0–t2 in FIG. 4. The scenes 541, 542, and 543 displayed on the second display 500 may have been edited to include information in addition to the original scenes of the media program. Preferably, a plurality of image UIs are listed in chronological order on the menu UI of the application, with the top UI linked to the video segment whose time series is closest to the current actual time of the media program displayed on the first display 110, 400, and the time series may be displayed with a timestamp of the media program.

FIG. 6 is a flow chart for providing object information associated with a media program according to an embodiment of the present invention. A server 600 communicates with a device 601. The server 600 may be a media program provider or a media player/mixer or server, and may include the server 130, 230 described in the figures and embodiments above. In step 610, the server identifies an object of interest in a frame image of the media program. In step 620, it inserts an indicator into the frame image to identify the object of interest. In step 630, it generates a video clip associated with the identified object from a portion of the media program. In step 640, it constructs an information page containing information about the object of interest, wherein the information page contains a window for displaying the video clip. In step 650, it sends the media program to the user device 601. In step 660, it sends a URL of the information page to the user device 601. In step 670, it receives an information page request associated with the user device 601 displaying the media program, and then sends the information page including the video clip to the user device 601. The steps may also be performed in a different order.

The device 601 may be a single device having multiple displays (a first display and a second display) or may be stand-alone devices (device 1 and device 2) each having its own display (the first display and the second display). In step 611, the device 601 obtains a URL of an information page associated with an object of interest, wherein the object of interest is presented in a frame of a media program and the frame includes an indicator identifying the object of interest. In step 612, the device 601 requests the information page at the URL and then receives the information page including a window for displaying a video clip, wherein the video clip is based on a portion of the media program and is associated with the object of interest. In step 614, the device 601 displays the information page and the video clip in the window.

Fig. 7 is a block diagram of components of a machine 700 according to some example embodiments of the invention. The machine 700 may be capable of reading instructions from a machine-readable medium (e.g., a machine-readable storage medium) and performing any one or more of the embodiments described herein.

In particular, FIG. 7 illustrates an example in the form of a computer system. In the computer system, instructions, such as software, a program, an application, an applet, or other executable code, may be executed to cause the machine 700 to perform any one or more of the embodiments described herein. For example, the instructions may cause the machine to perform the flows of FIGS. 1-6. The instructions transform the generic, unprogrammed machine into a specific machine that is programmed to perform the functions described and illustrated. In alternative embodiments, the machine 700 operates as a standalone device, such as a mobile device, or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 700 may operate in the capacity of a server machine or a client machine in a server-client network environment. The machine 700 may include, but is not limited to, a server computer, a client computer, a Personal Computer (PC), or any machine capable of executing, sequentially or otherwise, instructions that specify actions to be performed by the machine 700, such as the processes of FIGS. 1-6. Further, while only a single machine 700 is illustrated, the term "machine" shall also be taken to include a collection of machines 700 that individually or jointly execute the instructions of the flows of FIGS. 1-6 to perform any one or more of the methodologies discussed herein. For example, the machine 700 may take the place of the server 130 in FIG. 1 or the server 230 in FIG. 2, or of a device having the first display 110 or a device having the second display 120 as shown in FIG. 1.

The machine 700 includes a memory 701, a processor 702, a peripheral interface 703, a memory controller 704, a Graphics Processing Unit (GPU) 705, and one or more sensors 706. The processor may also be coupled to a function definition module 707, an activation module 708, and a Central Processing Unit (CPU) 709. The CPU may also be coupled to the peripheral interface 703. The apparatus may also include Radio Frequency (RF) circuitry 721, which may include a Wi-Fi interface and/or a Bluetooth interface. The device may also include one or more external ports 722, audio circuitry 723, and/or a microphone 726, etc.; the audio circuitry 723 may be further connected to one or more speakers 725. The device may further comprise a screen 724, which may be coupled with the peripheral interface 703. These components communicate over one or more communication buses or signal lines. The machine 700 may be, for example, a server, a palmtop computer, a tablet computer, a mobile electronic device, a mobile phone, a media player, or a Personal Digital Assistant (PDA). The server may not be equipped with the microphone 726, the screen 724, and the speaker 725. The various components shown in FIG. 7 may be implemented in hardware or a combination of hardware and software, including one or more signal processing circuits and/or application specific integrated circuits.

In other embodiments of the invention, the memory 701 may include storage remote from the machine 700, such as network-attached storage accessed through the RF circuitry 721 or the external port 722 and a communication network (not shown), including, for example, the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or the like, or any suitable combination thereof.

The peripherals interface 703 couples the input and output peripherals of the machine 700 to the CPU 709, the GPU 705, and the memory 701. The CPU 709 executes various software programs and/or sets of instructions stored in the memory 701 to perform various functions of the machine 700 and to process data. The GPU 705 handles the graphics processing functions of the screen 724. In some embodiments, the graphics processing functions are handled by the CPU 709, in which case the GPU 705 may be omitted.

The external port 722 is for coupling directly or remotely to other devices over a network. For example, the external port 722 may include a Universal Serial Bus (USB), a firewire, a memory slot for receiving an external storage medium, and the like.

The RF circuitry 721 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC, a Subscriber Identity Module (SIM) card, memory, and the like. The RF circuitry 721 may communicate via wireless communication with networks such as the Internet, an intranet, and/or a wireless network (e.g., a cellular telephone network, a wireless local area network (WLAN), and/or a Metropolitan Area Network (MAN)), as well as with other devices. The wireless communication may use any of a variety of communication standards, protocols, and technologies, including but not limited to the Global System for Mobile Communications (GSM), high-speed downlink and uplink packet access, IEEE 802.11 wireless networking, voice over Internet Protocol (VoIP), or any other suitable communication protocol.

The audio circuitry 723 is connected to one or more speakers 725 and a microphone 726; together, these components provide an audio interface between a user and the machine 700. The audio circuitry 723 receives audio data from the peripheral interface 703, converts the audio data to electrical signals, and sends the electrical signals to the speakers 725. The speaker 725 converts the electrical signals to sound waves audible to the human ear. The audio circuitry 723 also receives the electrical signals formed by the microphone 726 from sound waves, converts them to audio data, and sends the audio data to the peripheral interface 703 for processing.

The operating system 710 (e.g., Android, RTXC, Linux, UNIX, Apple OS X, Microsoft Windows, or an embedded operating system such as VxWorks) stored in the memory 701 includes various software components and/or drivers for controlling and managing conventional system tasks (e.g., memory management, storage device control, power management, etc.) and facilitating communication between various hardware and software components.

Although the present invention has been described with reference to particular features and embodiments, it should be understood that various changes and substitutions may be made in the embodiments of the invention without departing from the spirit and scope of the invention as defined by the appended claims.
