System and method for operating a physical entity based on a virtual representation of the physical entity

Document No.: 1286334    Publication date: 2020-08-28

This technology, "System and method for operating a physical entity based on a virtual representation of the physical entity," was created by S. Munir and S. Das on 2020-02-19. Its main content includes: Systems and methods for operating a physical entity based on a virtual representation of the physical entity are disclosed. A method of operating a selected electronic device of a plurality of electronic devices includes: generating image data of at least the selected electronic device using an imaging device; displaying a virtual representation of at least the selected electronic device on a display device based on the generated image data; and receiving a user input with an input device. The method further includes: associating, using a controller, the user input with the selected electronic device; transmitting device data to the selected electronic device using a transceiver; and performing an operation with the selected electronic device based on the transmitted device data.

1. A method of operating a selected electronic device of a plurality of electronic devices, comprising:

generating image data of at least the selected electronic device using an imaging device;

displaying a virtual representation of at least the selected electronic device on a display device based on the generated image data;

receiving a user input with an input device;

associating, using a controller, the user input with the selected electronic device;

transmitting device data to the selected electronic device using a transceiver; and

performing an operation with the selected electronic device based on the transmitted device data.

2. The method of claim 1, further comprising:

building a database based on the appearances of the plurality of electronic devices; and

associating the user input with the selected electronic device by comparing the image data of the selected electronic device with an appearance of the selected electronic device in the database.

3. The method of claim 1, further comprising:

establishing an electronic communication link with the selected electronic device; and

transmitting the device data to the selected electronic device over the electronic communication link using the transceiver.

4. The method of claim 1, wherein:

the selected electronic device is a light fixture,

the operation activates and/or deactivates the light fixture,

the light fixture emits light when activated, and

when the light fixture is deactivated, the light fixture is prevented from emitting light.

5. The method of claim 1, wherein:

the selected electronic device comprises a further display device, and

the operation includes displaying the device data on the further display device.

6. The method of claim 5, wherein:

the device data includes a price quote.

7. The method of claim 1, wherein:

the selected electronic device is a smartphone,

the operation includes displaying a notification on the smartphone, and

the notification includes data regarding a price quote.

8. An electronic control device for operating a selected electronic device of a plurality of electronic devices, the electronic control device comprising:

an imaging device configured to generate image data of at least the selected electronic device;

a memory operatively connected to the imaging device and configured to store image data;

a display device operatively connected to the memory and configured to display a virtual representation of at least the selected electronic device;

an input device configured to receive a user input;

a controller configured to associate the user input with the selected electronic device; and

a transceiver operatively connected to the selected electronic device and configured to transmit device data to the selected electronic device,

wherein the selected electronic device is configured to perform an operation in response to receiving the transmitted device data.

9. The electronic control device according to claim 8, further comprising:

a database stored in the memory, the database based on the appearance of the plurality of electronic devices,

wherein the controller is configured to associate the user input with the selected electronic device by comparing image data of the selected electronic device with an appearance of the selected electronic device in the database.

10. The electronic control device according to claim 8, wherein:

the controller is configured to establish an electronic communication link with the selected electronic device; and

the transceiver is configured to transmit the device data to the selected electronic device over the electronic communication link.

11. The electronic control device according to claim 8, wherein:

the selected electronic device is a light fixture,

the operation activates and/or deactivates the light fixture,

the light fixture is configured to emit light when activated, and

when the light fixture is deactivated, the light fixture is prevented from emitting light.

12. The electronic control device according to claim 8, wherein:

the selected electronic device comprises a further display device, and

the operation includes displaying the device data on the further display device.

13. The electronic control device according to claim 12, wherein:

the device data includes a price quote.

14. The electronic control device according to claim 8, wherein:

the selected electronic device is a smartphone,

the operation includes displaying a notification on the smartphone, and

the notification includes data regarding a price quote.

15. An electronic control system comprising:

a plurality of electronic devices, each electronic device comprising a first transceiver;

a control device for remotely operating a selected electronic device of the plurality of electronic devices, the control device comprising

an imaging device configured to generate image data of at least the selected electronic device,

a memory operatively connected to the imaging device and configured to store image data,

a display device operatively connected to the memory and configured to display a virtual representation of at least the selected electronic device,

an input device configured to receive a user input,

a controller configured to associate the user input with the selected electronic device, and

a second transceiver operatively connected to the selected electronic device and configured to transmit device data to the first transceiver of the selected electronic device,

wherein the selected electronic device is configured to perform an operation in response to receiving the transmitted device data.

16. The electronic control system of claim 15, further comprising:

a database stored in the memory, the database based on the appearance of the plurality of electronic devices,

wherein the controller is configured to associate the user input with the selected electronic device by comparing image data of the selected electronic device with an appearance of the selected electronic device in the database.

17. The electronic control system of claim 15, wherein:

the second transceiver is configured to establish an electronic communication link with the first transceiver of the selected electronic device; and

the second transceiver is configured to transmit the device data to the first transceiver over the electronic communication link.

18. The electronic control system of claim 15, wherein:

the selected electronic device is a light fixture,

the operation activates and/or deactivates the light fixture,

the light fixture is configured to emit light when activated, and

when the light fixture is deactivated, the light fixture is prevented from emitting light.

19. The electronic control system of claim 15, wherein:

the selected electronic device comprises a further display device,

the operation includes displaying the device data on the further display device, and

the device data includes a price quote.

20. The electronic control system of claim 15, wherein:

the selected electronic device is a smartphone,

the operation includes displaying a notification on the smartphone, and

the notification includes data regarding a price quote.

Technical Field

The present disclosure relates to the field of electronic control systems, and in particular to electronic control systems for controlling remotely located electronic devices.

Background

Some electronic devices are remotely controllable via dedicated control devices and interfaces. For example, some light fixtures may be activated and deactivated via an application (i.e., an "app") running on a smartphone. However, these applications may be confusing for some users, because it may not be clear to the user which light fixtures are controllable with the application. For example, a user may have installed five remotely controllable light fixtures in a room. In the application, the light fixtures are identified only by non-descriptive names, which makes it difficult to select a particular light fixture for control. Due to this difficulty in selecting a particular light fixture for remote control, the user may be dissuaded from using the application to control the light fixtures.

Based on the above, improvements in technology that enable a user to easily and efficiently remotely control an electronic device are desirable.

Disclosure of Invention

According to an exemplary embodiment of the present disclosure, a method of operating a selected electronic device of a plurality of electronic devices includes: generating image data of at least the selected electronic device using an imaging device; displaying a virtual representation of at least the selected electronic device on a display device based on the generated image data; and receiving a user input with an input device. The method further includes: associating, using a controller, the user input with the selected electronic device; transmitting device data to the selected electronic device using a transceiver; and performing an operation with the selected electronic device based on the transmitted device data.

According to another exemplary embodiment of the present disclosure, an electronic control device is provided for operating a selected electronic device of a plurality of electronic devices. The electronic control device includes an imaging device, a memory, a display device, an input device, a controller, and a transceiver. The imaging device is configured to generate image data of at least the selected electronic device. The memory is operatively connected to the imaging device and configured to store the image data. The display device is operatively connected to the memory and configured to display a virtual representation of at least the selected electronic device. The input device is configured to receive a user input. The controller is configured to associate the user input with the selected electronic device. The transceiver is operatively connected to the selected electronic device and configured to transmit device data to the selected electronic device. The selected electronic device is configured to perform an operation in response to receiving the transmitted device data.

According to a further exemplary embodiment of the present disclosure, an electronic control system includes: a plurality of electronic devices, each electronic device including a first transceiver; and a control device for remotely operating a selected electronic device of the plurality of electronic devices. The control device includes an imaging device, a memory, a display device, an input device, a controller, and a second transceiver. The imaging device is configured to generate image data of at least the selected electronic device. The memory is operatively connected to the imaging device and configured to store the image data. The display device is operatively connected to the memory and configured to display a virtual representation of at least the selected electronic device. The input device is configured to receive a user input. The controller is configured to associate the user input with the selected electronic device. The second transceiver is operatively connected to the selected electronic device and is configured to transmit device data to the first transceiver of the selected electronic device. The selected electronic device is configured to perform an operation in response to receiving the transmitted device data.

Drawings

The above described and other features and advantages will become more readily apparent to those of ordinary skill in the art by reference to the following detailed description and accompanying drawings, in which:

fig. 1 is a block diagram illustrating an electronic control system including a control device for remotely controlling a plurality of electronic devices;

FIG. 2 is a flow chart illustrating an exemplary method of operating the electronic control system of FIG. 1;

FIG. 3 is a block diagram illustrating an exemplary use of the electronic control system of FIG. 1; and

FIG. 4 is a block diagram illustrating another exemplary use of the electronic control system of FIG. 1.

Detailed Description

For the purposes of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiments illustrated in the drawings and described in the following written specification. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. It is further understood that the present disclosure includes any alterations and modifications to the illustrated embodiments and includes further applications of the principles of the disclosure as would normally occur to one skilled in the art to which the disclosure relates.

Aspects of the disclosure are disclosed in the accompanying description. Alternative embodiments of the present disclosure and equivalents thereof may be devised without departing from the spirit or scope of the present disclosure. It should be noted that any discussion herein of "one embodiment," "an embodiment," "example embodiment," and the like indicates that the embodiment described may include a particular feature, structure, or characteristic, and that such particular feature, structure, or characteristic may not necessarily be included in every embodiment. Furthermore, repeated references to the foregoing phrases do not necessarily refer to the same embodiment. Finally, a person of ordinary skill in the art will readily appreciate that a particular feature, structure, or characteristic of a given embodiment may be utilized in connection with, or in combination with, a particular feature, structure, or characteristic of any other embodiment discussed herein, whether or not it is explicitly described.

For the purposes of this disclosure, the term "A and/or B" means (A), (B), or (A and B). For the purposes of this disclosure, the term "A, B, and/or C" means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C).

The terms "comprising," "including," and "having," and the like, as used with respect to embodiments of the present disclosure, are synonymous.

As shown in fig. 1, the electronic control system 100 is configured to bridge the gap between the physical world and its virtual representation. The electronic control system 100 includes a control device 104 and at least one electronic device 108 (i.e., a physical entity). Each of the control device 104 and the electronic device 108 is operatively connected to an electronic network, such as the internet 112. The electronic control system 100 enables a user to actuate or operate the electronic device 108 by interacting with a virtual depiction of the electronic device 108 on the control device 104. For example, the control device 104 displays an image of a smart home interior, and the user clicks on the pixels of a light fixture (i.e., an exemplary electronic device 108) to activate or deactivate a light bulb of the light fixture. As another example, in a retail environment, when an operator selects a virtual depiction of a customer, the electronic control system 100 transmits a discount coupon or other special price offer to the selected customer. In this way, the electronic control system 100 links the virtual representation of an entity shown by the control device 104 to the corresponding physical-world entity. The electronic control system 100 is also configured to automatically disambiguate the target (i.e., the selected electronic device 108) and to provide a communication medium for sending commands to the intended target. Each element of the electronic control system 100 is described below.

In the exemplary embodiment, control device 104 is a personal electronic device, such as a smartphone, tablet computer, or desktop computer, that includes an imaging device 116, a display device 120, an input device 124, a transceiver 128, and a memory 132, each operatively connected to a controller 136. The imaging device 116, which is also referred to herein as a camera or digital camera, is configured to generate image data 140 corresponding to a virtual representation of a region. In particular, the image data 140 includes a virtual representation of physical elements and structures located in a field of view 142 (FIG. 3) of the imaging device 116. When the electronic device 108 is within the field of view 142 of the imaging device 116, the imaging device 116 generates image data 140 having data points corresponding to the virtual representation of the electronic device 108. As described below, in another embodiment, the imaging device 116 is a security camera in a retail establishment (see fig. 4).

As shown in fig. 1, in one embodiment, the display device 120 is a Liquid Crystal Display (LCD) panel configured to display text, images, and other visually understandable data. In another embodiment, the display device 120 is any display as would be appreciated by one of ordinary skill in the art including, but not limited to, an active matrix organic light emitting diode display. The display device 120 is configured to display a virtual representation of the area and the electronic device 108, as included in the image data 140. For example, when the electronic device 108 is in the field of view 142 of the imaging device 116, the display device 120 displays a virtual representation of the electronic device 108 and the area near the electronic device 108. As described below, in another embodiment, the display device 120 is a security monitor in a retail establishment (see fig. 4).

The display device 120 may also display a Graphical User Interface (GUI) for operating the control device 104 and/or the electronic device 108. GUI data 144 corresponding to the displayed GUI is saved to the memory 132.

In the exemplary embodiment of fig. 1, the input device 124 is a touch screen applied on the display device 120, which is configured to respond to a touch of a finger or a stylus. The input device 124 is touch-sensitive and transparent, and is, for example, a capacitive touch screen. The input device 124 generates input data 148 that is stored in the memory 132. The input device 124 is configured to enable a user to interact with the electronic device 108 by touching the portion of the display device 120 on which the virtual representation of the electronic device 108 is displayed. In yet another embodiment, the input device 124 is any device configured to generate an input signal and input data 148 as desired by one of ordinary skill in the art. For example, the input device 124 may include a plurality of buttons configured to enable an operator of the control device 104 to generate the input data 148.

The transceiver 128 of the control device 104, also referred to as a wireless transmitter and receiver, is configured to wirelessly transmit electronic data (i.e., device data 152) from the control device 104 to the electronic device 108 and to wirelessly receive electronic data from the electronic device 108 via the internet 112. Thus, the transceiver 128 operatively connects the control device 104 to the electronic device 108. In other embodiments, the transceiver 128 transmits and receives data using a cellular network, a wireless local area network ("Wi-Fi"), a personal area network, and/or any other wireless network. Accordingly, the transceiver 128 is compatible with any desired wireless communication standard or protocol, including but not limited to near field communication ("NFC"), IEEE 802.11, IEEE 802.15.1 ("Bluetooth"), Global System for Mobile Communications ("GSM"), and code division multiple access ("CDMA").

In one embodiment, memory 132 is configured to store image data 140, GUI data 144, input data 148, and device data 152. The memory 132 is also configured to store a database 156 generated based on the physical appearance of each of the plurality of electronic devices 108. In another embodiment, the database 156 is stored on a remote computer (not shown) connected to the control device 104 via the internet 112. The memory 132 is also referred to herein as a non-transitory computer-readable medium.

The controller 136 of the control device 104 is configured to execute program instructions to operate the imaging device 116, the display device 120, the input device 124, and the memory 132. The controller 136 is provided as at least one microcontroller and/or microprocessor. In one embodiment, the controller 136 processes the input data 148, the image data 140, and the database 156 to disambiguate the input from the user and identify the user-selected electronic device 108 from among the other electronic devices 108.

Referring again to fig. 1, the example electronic device 108 includes an output 160, a transceiver 164, an accelerometer 168, a gyroscope 166, and a memory 172, each operatively connected to a controller 176. The electronic device 108 represents any device that is remotely controllable by the control device 104. Examples of electronic devices 108 include, but are not limited to, any combination of light fixtures, light bulbs, smart phones, ceiling fans, televisions, desktop computers, tablet computers, digital signage, highway billboards, music players, and the like. The output 160 of the electronic device 108 is therefore dependent on the particular embodiment. In the exemplary embodiment described herein, the outputs 160 are the lighting elements of the light fixture 108a (FIG. 3), the motor of the ceiling fan 108b (FIG. 3), and the display of the smart phone or television 108c (FIG. 3). The output 160 may be configured to be in at least two states including an "off state" and an "on state". In an embodiment of the light fixture 108a, the output 160 does not emit light in the "off state" and the output 160 emits light in the "on state". In the embodiment of ceiling fan 108b, output 160 does not rotate the fan blade in the "off state" and output 160 rotates the fan blade in the "on state". In the smart phone and television 108c embodiment, output 160 does not display a screen in the "off state" and output 160 displays a screen in the "on state". Based on the above, in one embodiment, the electronic device 108 is a smartphone, or in another embodiment, the electronic device 108 is incorporated into a smartphone. In each embodiment, the smart phone may be carried by a person.

The transceiver 164 (also referred to as a wireless transmitter and receiver) of the electronic device 108 is configured to wirelessly transmit electronic data from the electronic device 108 to the control device 104 and to wirelessly receive electronic data from the control device 104 via the internet 112. Thus, the transceiver 164 operatively connects the electronic device 108 to the control device 104. In other embodiments, the transceiver 164 transmits and receives data using a cellular network, a wireless local area network ("Wi-Fi"), a personal area network, and/or any other wireless network. Accordingly, the transceiver 164 is compatible with any desired wireless communication standard or protocol, including but not limited to near field communication ("NFC"), IEEE 802.11, IEEE 802.15.1 ("Bluetooth"), Global System for Mobile Communications ("GSM"), and code division multiple access ("CDMA").

In one embodiment, the transceiver 164 receives the device data 152 from the control device 104. The device data 152 is configured to change a state of the output 160 of the electronic device 108. For example, if the electronic device 108 is a light fixture, the device data 152 may cause the electronic device 108 to change the lighting element from the "off state" to the "on state." Similarly, other device data 152 may cause the electronic device 108 to change the lighting element from the "on state" to the "off state."
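
By way of illustration only, the following Python sketch models how an electronic device 108 might update the state of its output 160 upon receiving device data 152; the class names, message fields, and command strings are illustrative assumptions and do not appear in the disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class OutputState(Enum):
    OFF = "off"   # e.g., the light fixture does not emit light
    ON = "on"     # e.g., the light fixture emits light


@dataclass
class DeviceData:
    """Illustrative command payload; field names are assumptions, not from the disclosure."""
    device_id: str
    command: str  # "power_on" or "power_off"


class ElectronicDevice:
    """Toy model of an electronic device 108 whose output 160 has two states."""

    def __init__(self, device_id: str) -> None:
        self.device_id = device_id
        self.state = OutputState.OFF

    def handle_device_data(self, data: DeviceData) -> None:
        # Ignore device data addressed to a different device.
        if data.device_id != self.device_id:
            return
        if data.command == "power_on":
            self.state = OutputState.ON
        elif data.command == "power_off":
            self.state = OutputState.OFF


if __name__ == "__main__":
    fixture = ElectronicDevice("light_fixture_108a")
    fixture.handle_device_data(DeviceData("light_fixture_108a", "power_on"))
    print(fixture.state)  # OutputState.ON
```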

The transceiver 164 is further configured to transmit the identification data 180 stored in the memory 172 to the control device 104 upon receiving a corresponding electronic request from the control device 104. The identification data 180 uniquely identifies the electronic device 108 to the control device 104 such that each electronic device 108 of the plurality of electronic devices is individually selectable and identifiable by the control device 104.

The gyroscope 166 and the accelerometer 168 are sensors configured to generate motion data 184 when the electronic device 108 is carried, worn, and/or utilized by a user, such as in embodiments in which the electronic device 108 is a smartphone or tablet computer. As the user moves, the motion data 184 may correspond to a step rate of the user, a velocity of the user, and/or an acceleration of the user. The motion data 184 thus enables the control device 104 to automatically identify the electronic device 108 from among the plurality of other electronic devices by matching the movement of the electronic device 108 as observed in the image data 140 to the movement of the electronic device 108 as indicated by the motion data 184. The gyroscope 166 is provided as any type of gyroscope, such as a single-axis or multi-axis microelectromechanical systems (MEMS) gyroscope. The accelerometer 168 is provided as any type of accelerometer, such as a single-axis or multi-axis MEMS accelerometer.

The memory 172 is configured to store identification data 180, motion data 184, and application data 188. The memory 172 is also referred to herein as a non-transitory computer-readable medium.

The application data 188 is included in some embodiments of the electronic device 108, such as when the electronic device 108 is a smartphone or tablet computer. The application data 188 enables the electronic device 108 to run an "app" that configures the electronic device 108 for electronic data transfer with the control device 104. The application data 188 may cause a GUI to be displayed on the output 160. The app may also leverage the gyroscope 166 and accelerometer 168 to adjust the generation of the motion data 184.

In operation, the electronic control system 100 is configured to implement the method 200 for controlling the electronic device 108 with the control device 104, illustrated by the flowchart of fig. 2. The method 200 is described in relation to an exemplary smart home embodiment of the electronic control system 100 illustrated in fig. 3.

As shown in fig. 3, the electronic control system 100 is located in an exemplary smart home and includes a control device 104 configured as a smartphone and three electronic devices 108 (i.e., a plurality of electronic devices) including a light fixture electronic device 108a, a ceiling fan electronic device 108b, and a television electronic device 108c. In the smart home, the control device 104 operates as a "remote control" for the electronic devices 108a, 108b, 108c, and is configured to activate and deactivate each of the outputs 160 of the electronic devices 108a, 108b, 108c.

According to the method 200, the user wants to remotely control a selected one of the electronic devices 108a, 108b, 108c. In this example, the selected electronic device is the light fixture electronic device 108a. In block 204, the method 200 includes generating, with the imaging device 116 of the control device 104, image data 140 of at least the selected electronic device 108a. To generate the image data 140, the user points the imaging device 116 at the selected electronic device 108a so that the selected electronic device 108a is in the field of view 142 of the imaging device 116, in the same manner that the user would use a smartphone to take a picture or video of the selected electronic device 108a. The imaging device 116 generates the image data 140, and at least a portion of the image data 140 is stored in the memory 132.

Next, in block 208 and as shown in fig. 3, the controller 136 of the control device 104 processes the image data 140 and shows a virtual representation of the selected electronic device 108a and an area around the selected electronic device 108a on the display device 120. As shown, in this example, the virtual representations include a virtual representation 240 of a light fixture, a virtual representation 244 of a ceiling fan, and a virtual representation 248 of a television. As the user moves the imaging device 116 of the control device 104, the image on the display device 120 changes and the display device 120 is continuously updated so that the image on the display device 120 corresponds to an area in the field of view 142 of the imaging device 116.

In block 212, the control device 104 determines whether the input device 124 has received an input. If no input has been received, the control device 104 continues to generate the image data 140 and display the virtual representations 240, 244, 248. However, if at block 212 the control device 104 determines that an input has been received with the input device 124, the control device 104 stores the corresponding input data 148 in the memory 132, and the method 200 continues to block 216. In one embodiment, the user provides the input to the control device 104 by touching the display device 120 (i.e., the touch-sensitive input device 124 overlaid on the display device 120) at a location corresponding to the virtual representation 240 of the selected electronic device 108a. For example, in fig. 3, the user touches the input device 124 at an input location 252 corresponding to the virtual representation 240 of the light fixture and generates corresponding input data 148.

At block 216 of the method 200, the control device 104 determines whether the input data 148 corresponds to one of the electronic devices 108, and the controller 136 associates the input data 148 with the selected electronic device 108a. In one example, the control device 104 compares the image data 140 in a local region 256 around the input location 252 to the database 156 to determine whether the input data 148 corresponds to one of the virtual representations 240, 244, 248 of the electronic devices 108a, 108b, 108c. The database 156 is constructed using image data of the electronic devices 108 captured from a plurality of angles and under a plurality of illumination types. In a particular embodiment, the database 156 is a neural network trained using image data of the electronic devices 108. The image data 140 from the local region 256 is input to the database 156 (i.e., the neural network), and the neural network generates a corresponding output that identifies the local region 256 either as including the virtual representation 240, 244, 248 of one of the electronic devices 108a, 108b, 108c or as not including the virtual representation 240, 244, 248 of any of the electronic devices 108a, 108b, 108c.
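
A minimal Python sketch of this classification step is given below, assuming the database 156 is exposed as a callable model that returns one score per label; the label names, crop size, and scoring interface are illustrative assumptions.

```python
import numpy as np

# Illustrative label set; the disclosure only requires that the database 156 can
# identify whether the local region contains one of the electronic devices 108.
LABELS = ["none", "light_fixture_108a", "ceiling_fan_108b", "television_108c"]


def crop_local_region(image: np.ndarray, input_location: tuple, half_size: int = 32) -> np.ndarray:
    """Crop the local region of the image data 140 around the input location 252."""
    row, col = input_location
    r0, r1 = max(0, row - half_size), min(image.shape[0], row + half_size)
    c0, c1 = max(0, col - half_size), min(image.shape[1], col + half_size)
    return image[r0:r1, c0:c1]


def classify_region(region: np.ndarray, model) -> str:
    """Feed the cropped region to the trained model and return the predicted label."""
    scores = model(region)  # 'model' stands in for the database 156 / neural network
    return LABELS[int(np.argmax(scores))]


if __name__ == "__main__":
    image = np.random.rand(480, 640, 3)                # stand-in for image data 140
    dummy_model = lambda region: [0.1, 0.7, 0.1, 0.1]  # stand-in for database 156
    region = crop_local_region(image, input_location=(200, 320))
    print(classify_region(region, dummy_model))        # -> "light_fixture_108a"
```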

At block 216, when the comparison of the image data 140 at the local region 256 with the database 156 indicates that the input location 252 does not correspond to the virtual representation 240, 244, 248 of one of the electronic devices 108a, 108b, 108c, the method 200 returns to block 204 and the imaging device 116 continues to generate the image data 140. This is because the input data 148 is not associated with a change in state of the output 160 of one of the electronic devices 108a, 108b, 108c. The control device 104 therefore takes no action to control one of the electronic devices 108a, 108b, 108c based on the input data 148.

At block 216, when the comparison of the image data 140 at the local region 256 with the database 156 indicates that the input location 252 does correspond to the virtual representation 240, 244, 248 of one of the electronic devices 108a, 108b, 108c, the controller 136 identifies the selected electronic device 108a. Thus, at block 216, the control device 104 has made a connection between the virtual representation 240 of the light fixture 108a and the physical light fixture 108a. The method 200 continues to block 220.

At block 220 of the method 200, the control device 104 transmits the device data 152 to the selected electronic device 108a, as identified by the user at the input location 252. In particular, the control device 104 establishes an electronic communication link with the selected electronic device 108a via the network 112 (i.e., the internet 112, a local area network connection, a Bluetooth connection, and/or a Wi-Fi connection), and transmits the device data 152 to the selected electronic device 108a over the electronic communication link using the transceiver 128. The selected electronic device 108a receives the device data 152 using the transceiver 164, and the controller 176 processes the device data 152 to determine the operation to be performed by the output 160. In an exemplary embodiment, the operation performed by the output 160 includes changing the state of the light fixture from the "on state" to the "off state" or from the "off state" to the "on state".
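
One possible way to carry out this transmission step, sketched in Python under the assumption that the device data 152 is serialized as JSON and delivered over a plain TCP connection (the transport, address, and message format are not specified in the disclosure):

```python
import json
import socket


def send_device_data(host: str, port: int, payload: dict) -> None:
    """Open a connection to the selected device's transceiver and send the device data 152."""
    with socket.create_connection((host, port), timeout=5.0) as conn:
        conn.sendall(json.dumps(payload).encode("utf-8"))


if __name__ == "__main__":
    # Hypothetical address obtained from the setup/mapping step described below.
    send_device_data("192.168.1.42", 8080,
                     {"device_id": "light_fixture_108a", "command": "power_on"})
```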

Depending on the embodiment, the operation performed by the electronic device 108 based on the transmitted device data 152 depends on the particular input data 148 received by the control device 104. For example, if the user touches the input location 252 with a single "tap" of a finger, the device data 152 is a "power on" signal that causes the electronic device 108 to activate the output 160, such as by turning on the light fixture 108a, turning on the ceiling fan 108b, or turning on the television 108c. However, if the user touches the input location 252 with a double "tap" of a finger, the device data 152 is a "power off" signal that causes the electronic device 108 to deactivate the output 160, such as by turning off the light fixture 108a, turning off the ceiling fan 108b, or turning off the television 108c. Any other input configuration may also be implemented by the control device 104 to enable remote control of any aspect of the electronic device 108, such as brightness/dimming control for the light fixture 108a, fan speed control for the ceiling fan 108b, and volume, channel, and input controls for the television 108c.
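
The mapping from input gestures to device data could be expressed as simply as the following Python sketch; the gesture set and command strings are illustrative assumptions.

```python
def input_to_command(tap_count: int) -> str | None:
    """Map the user input gesture to a device data command (illustrative only)."""
    if tap_count == 1:
        return "power_on"   # single tap: activate the output 160
    if tap_count == 2:
        return "power_off"  # double tap: deactivate the output 160
    return None             # other gestures could map to dimming, fan speed, volume, etc.
```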

The electronic control system 100 provides advantages over the prior art. For example, some light fixtures are remotely controllable with a dedicated device or through an app that displays only an icon of the light fixture; however, known light fixtures are not remotely controllable via an app that displays an actual image (i.e., the virtual representation 240) of the light fixture to be controlled. Thus, the electronic control system 100 simplifies remote control of the electronic device 108 by eliminating the need for the user to remember any details of the electronic device 108, such as an identification name of the electronic device 108. Instead, the user simply points the control device 104 at the selected electronic device 108 and touches the corresponding area of the input device 124 / display device 120 to control the selected electronic device 108. In a smart home with many electronic devices 108, the electronic control system 100 makes control of each electronic device 108 very simple and convenient for the user.

In one embodiment, a setup process is performed by the electronic control system 100 when an electronic device 108 is added to the smart home. For example, during installation of a light fixture, such as the light fixture 108a, the installer provides a mapping from image data corresponding to the light fixture 108a to an electronic identification of the light fixture 108a (such as the IP address of the light fixture). The electronic control system 100 may use any other setup and mapping method when additional electronic devices 108 are added to the system 100.

For example, in some embodiments, the control device 104 is configured to perform a trial-and-error method to determine the mapping and identification of the electronic device 108. For example, when a user selects a blue lamp as the selected electronic device 108, there may be two blue lamps in the smart home. The controller 136 identifies a first one of the blue lamps and transmits the device data 152 to the first blue lamp to illuminate the lamp. If an unselected lamp is illuminated instead (i.e., the illuminated lamp is a second blue lamp that does not correspond to the input data 148), the controller 136 detects this by processing the image data 140, which shows that the blue lamp corresponding to the input data 148 has not changed state. The control device 104 therefore sends further device data 152 corresponding to the input data 148 to the other blue lamp, thereby causing the selected lamp to change state. In some embodiments, a change in the state of the selected lamp (i.e., the selected electronic device 108) is identified by the controller 136 as a change in the appearance of the selected lamp, as virtually represented by the image data 140.
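
A simplified Python sketch of such a trial-and-error loop follows; the change-detection threshold and the callables for sending commands and capturing the local region are illustrative assumptions.

```python
import numpy as np


def region_changed(before: np.ndarray, after: np.ndarray, threshold: float = 25.0) -> bool:
    """Return True when the selected lamp's local region has visibly changed appearance."""
    return float(np.mean(np.abs(after.astype(float) - before.astype(float)))) > threshold


def disambiguate_by_trial(candidates, send_toggle, capture_region):
    """Toggle each candidate lamp until the region selected by the user changes state.

    candidates:     iterable of device identifiers (e.g., the two blue lamps)
    send_toggle:    callable that sends device data 152 toggling one candidate
    capture_region: callable returning the current local region of the image data 140
    """
    for device_id in candidates:
        before = capture_region()
        send_toggle(device_id)
        after = capture_region()
        if region_changed(before, after):
            return device_id       # this physical lamp matches the selected pixels
        send_toggle(device_id)     # wrong lamp: toggle it back and try the next one
    return None
```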

Further, when the electronic control system 100 performs visual recognition of an object, it captures various visual characteristics of the object. For example, when the controller 136 detects that the image data corresponds to a lamp (an exemplary electronic device 108), the controller 136 stores the detected lamp characteristics in a data file that includes the color of the lamp, the size of the lamp, and the location of the lamp relative to other detected electronic devices 108 in the smart home. According to the method, when the input data 148 is generated by a user, the controller 136 performs object detection in the local region 256 of the image data 140 to determine a category of the object in the local region 256. That is, the controller 136 determines whether, for example, a refrigerator or a lamp is present in the local region 256. Then, after detecting the object category (e.g., light fixture), the controller 136 extracts further information about the characteristics of the detected object, such as the color, size, placement, and location of the object in the image data 140. The controller 136 uses this information to disambiguate the lamp from other light fixtures and other electronic devices 108 in the smart home.
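
The kind of characteristic record and comparison described above could look like the following Python sketch; the fields and the scoring heuristic are illustrative assumptions rather than the patented method.

```python
from dataclasses import dataclass


@dataclass
class DetectedObject:
    """Characteristics the controller 136 might store after detecting an object."""
    category: str    # e.g., "lamp" or "refrigerator"
    color: str
    size_px: int
    location: tuple  # (row, col) position within the image data 140


def disambiguate(target: DetectedObject, known_devices: list):
    """Pick the known device whose stored characteristics best match the detection."""
    def score(dev: DetectedObject) -> float:
        dist = ((dev.location[0] - target.location[0]) ** 2 +
                (dev.location[1] - target.location[1]) ** 2) ** 0.5
        return 2.0 * (dev.color == target.color) \
            + 1.0 * (abs(dev.size_px - target.size_px) < 20) \
            - 0.01 * dist

    candidates = [d for d in known_devices if d.category == target.category]
    return max(candidates, key=score) if candidates else None
```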

Fig. 4 illustrates another exemplary use of the electronic control system 100 in a retail facility. The retail facility includes a first hanger 264 being viewed by a first customer 268 and a second hanger 272 being viewed by a second customer 276. The first customer 268 has a first smartphone 280, and the second customer 276 has a second smartphone 284. In this exemplary embodiment, the control device 104 is a two-part system that includes an imaging device 116, shown as a security camera, and a display device 120 provided as a monitor. The plurality of electronic devices 108 in this embodiment includes a first tablet computer 284, a second tablet computer 288, and the first smartphone 280. As explained below, the second smartphone 284 is not one of the electronic devices 108.

The first smartphone 280 is included as one of the electronic devices 108 in that the first smartphone includes an app (i.e., app data 188) that configures the smartphone 280 to operatively connect to the electronic control system 100. Accordingly, the control device 104 is configured to transmit the device data 152 to the first smartphone 280 to cause the first smartphone 280 to perform an operation, such as displaying a notification 292 that includes data regarding a price quote. The second smartphone 284 is not included as one of the electronic devices 108 because the second smartphone 284 does not include an app (i.e., app data 188) and is not operatively connected to the electronic control system 100. Thus, the control device 104 is not configured to transmit the device data 152 to the second smartphone 284.

The electronic control system 100 of fig. 4 is an improvement over existing retail facility security monitoring systems. In particular, in addition to providing typical benefits associated with security monitoring systems (such as real-time video monitoring), the electronic control system 100 also enables an operator of the control device 104 to provide specialized promotions, recommendations, and other information to selected ones of the plurality of customers.

For example, the virtual representation 296 of the first customer and the virtual representation 300 of the second customer are shown to an operator viewing the virtual representation of the retail establishment on the display device 120 (i.e., blocks 204 and 208 of the flowchart of fig. 2). Further, the display device 120 displays virtual representations 304, 308 of at least some of the electronic devices 108. The smartphone 280, which is one of the electronic devices 108, may or may not be displayed, because the smartphone 280 may be located in a pocket or purse of the customer 268.

By viewing the display device 120, the operator of the control device 104 determines that the first customer 268 is browsing the clothing selection of the first hanger 264 but has not yet selected an item for purchase. The first customer 268 may have viewed, for example, the tablet 284 displaying the normal price of the item and may not have been persuaded to make a selection. To persuade the first customer 268 to make a selection and ultimately a purchase, the operator uses the input device 124 to touch the area of the display device 120 corresponding to the virtual representation 296 of the first customer at an input location 312 (i.e., block 212 of fig. 2).

The control device 104 then determines the most efficient way to present, for example, special price quotes to the first customer 268. In one embodiment, the controller 136 processes the image data 140 to determine which of the electronic devices 280, 284, 288 are within the local region 316 near the input location 312. As shown in fig. 4, smartphone 280 and tablet 284 are within local area 316. The process for making this determination is described below.

The control device 104 determines that the tablet computer 284 is in the local region 316 by processing the image data 140 corresponding to the virtual representation 304 of the tablet computer and the database 156. Since the database 156 is constructed using the image and the location of the tablet computer 284, the control device 104 is able to identify the tablet computer 284 from the image data 140.

The control device 104 determines that the first smartphone 280, which includes the app data 188, is in the local region 316 based on the motion data 184 and the image data 140. The control device 104 receives the motion data 184, generated by the gyroscope 166 and the accelerometer 168, from the first smartphone 280 via the internet/local network 112. The control device 104 is able to electronically communicate with the first smartphone 280 and receive the motion data 184 because the smartphone 280 includes the app data 188, which establishes permissions for extracting the motion data 184 from the smartphone 280. The app data 188 enables the first smartphone 280 to send electronic data to the control device 104 and to receive electronic data from the control device 104. The motion data 184 is processed to determine fine-grained motion characteristics of the customer 268/smartphone 280 based on at least one of a movement speed of the customer 268, an acceleration of the customer 268, a movement time of the customer 268, and a resting time of the customer 268. The controller 136 then compares the motion data 184 to the image data 140 to identify the one of the virtual representations 296, 300 of the customers having a movement corresponding to the motion data 184.
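
A minimal Python sketch of this matching step, assuming the motion data 184 and each candidate's image-derived track have been reduced to sampled speed profiles (the signal representation and correlation measure are illustrative assumptions):

```python
import numpy as np


def best_matching_customer(phone_speed: np.ndarray, customer_tracks: dict) -> str:
    """Return the customer whose image-derived speed profile correlates best with
    the speed profile derived from the smartphone's motion data 184."""
    def correlation(track) -> float:
        track = np.asarray(track)
        n = min(len(track), len(phone_speed))
        a, b = phone_speed[:n], track[:n]
        if np.std(a) == 0 or np.std(b) == 0:
            return -1.0
        return float(np.corrcoef(a, b)[0, 1])

    return max(customer_tracks, key=lambda cid: correlation(customer_tracks[cid]))


if __name__ == "__main__":
    t = np.linspace(0, 10, 200)
    phone = np.abs(np.sin(t))                         # speed profile from motion data 184
    tracks = {"customer_268": np.abs(np.sin(t)) + 0.05 * np.random.rand(200),
              "customer_276": np.abs(np.cos(t))}      # speed profiles from image data 140
    print(best_matching_customer(phone, tracks))      # -> "customer_268"
```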

In addition to forming the fine-grained movement characteristics of the customer 268 based on the motion data 184, the control device 104 may also use additional data to identify the selected customer 268 from the virtual representations 296, 300. For example, Wi-Fi traffic data, Bluetooth packets, gyroscope data, magnetometer data, and/or light sensor data may be used to construct fine-grained motion features of the smartphone 280 as it moves through the image data 140. Using this data, the customers 268, 276 are separated, matched, and identified by performing multi-modal sensor fusion, in which the motion features are correlated across the different sensing modalities. As a customer 268, 276 walks through the store, subtle movements due to his or her gait (micro-movements) and longer-term walking trajectories (macro-movements) appear both in the image data 140 and in the data from the sensors (i.e., the accelerometer 168) of the smartphone 280. In addition, Wi-Fi traffic generated by the smartphone 280 is leveraged to estimate angle of arrival (AoA) and distance from the imaging device 116 in order to capture the micro- and macro-movements of the customers 268, 276. The control device 104 fuses these data together and captures the same motion characteristics from different perspectives, thereby enabling the control device 104 to detect, locate, and identify the customers 268, 276 shown as the virtual representations 296, 300 on the display device 120.
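
One simple way to fuse per-modality similarity scores (for example, camera trajectory, accelerometer gait, and Wi-Fi AoA/range estimates) into a single identification score is sketched below; the weights and score structure are illustrative assumptions.

```python
def fuse_similarity(scores_by_modality: dict, weights: dict) -> dict:
    """Combine per-customer similarity scores from several sensing modalities
    into one fused score per customer (a weighted sum, for illustration)."""
    fused = {}
    for modality, scores in scores_by_modality.items():
        w = weights.get(modality, 1.0)
        for customer_id, s in scores.items():
            fused[customer_id] = fused.get(customer_id, 0.0) + w * s
    return fused


if __name__ == "__main__":
    scores = {"camera":   {"customer_268": 0.9, "customer_276": 0.4},
              "imu":      {"customer_268": 0.8, "customer_276": 0.3},
              "wifi_aoa": {"customer_268": 0.6, "customer_276": 0.7}}
    fused = fuse_similarity(scores, weights={"camera": 1.0, "imu": 1.0, "wifi_aoa": 0.5})
    print(max(fused, key=fused.get))  # -> "customer_268"
```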

When a match occurs between the motion data 184 and the virtual representation 296 of the customer, the control device 104 transmits the device data 152 to the corresponding smartphone, which in this example is the first smartphone 280. The device data 152 causes the first smartphone 280 to perform an operation, such as displaying the notification 292 with a special discounted price offer, which is lower than the normal price displayed on the tablet 284. In this manner, the electronic control system 100 provides the customer 268 with a personalized discount, which may encourage the customer 268 to purchase the clothing.

When no match occurs between the motion data 184 and the virtual representation 296 of the customer, the control device 104 transmits the device data 152 to the nearest electronic device 108, such as the first tablet computer 284. The tablet 284 receives the device data 152 and performs an operation, such as displaying a discounted price offer instead of displaying the normal price of the item. Thus, even if the customer 268, 276 does not have a smartphone running the app data 188, the electronic control system 100 is configured to deliver a personalized discount to the selected customer 268, which may encourage the customer 268 to purchase the clothing.
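
The fallback logic described in the preceding two paragraphs could be sketched as follows; the distance-ranked device list is an illustrative assumption.

```python
def choose_delivery_target(matched_smartphone, nearby_devices):
    """Pick where to send the device data 152: the matched smartphone when the motion
    data 184 correlates with the selected customer, otherwise the nearest fixed device
    (e.g., the tablet 284) in the local region 316.

    nearby_devices: list of (device_id, distance) pairs.
    """
    if matched_smartphone is not None:
        return matched_smartphone
    if not nearby_devices:
        return None
    return min(nearby_devices, key=lambda item: item[1])[0]


if __name__ == "__main__":
    print(choose_delivery_target(None, [("tablet_284", 1.2), ("tablet_288", 4.7)]))  # -> "tablet_284"
```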

In this embodiment, the electronic control system 100 improves upon typical security camera systems by enabling its operator to deliver special offers, suggestions, promotions, discounts, recommendations, and notifications to selected customers 268, 276.

In addition, the electronic control system 100 may be applied to other fields to improve the operation of imaging systems. For example, the techniques and methods described herein may be generalized to airports, hospitals, stadiums, theaters, academic buildings, and convention centers. The electronic control system 100 is usable anywhere a link from a pixel of an image to a physical entity is beneficial.

While the disclosure has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character. It is understood that only the preferred embodiment has been presented and that all changes, modifications, and further applications that come within the spirit of the disclosure are desired to be protected.
