Visible light sensor configured to detect a glare condition

Document No.: 1850879    Publication date: 2021-11-16

Note: This invention, "Visible light sensor configured to detect a glare condition," was created by C. A. Casey and B. Protzman on 2020-02-19. Abstract: An apparatus may be configured to detect a glare condition and may include a photo-sensing circuit and a visible light sensing circuit. The photo-sensing circuit may be configured to periodically generate an illuminance signal indicative of an illuminance value. The visible light sensing circuit may be configured to periodically record an image of a space at an exposure time. The apparatus may receive the illuminance signal from the photo-sensing circuit and determine a current illuminance based on the illuminance signal. The apparatus may adjust a frequency at which the visible light sensing circuit records images based on the current illuminance. The exposure time may be determined based on the current illuminance and a glare condition type. An image recorded at a corresponding exposure time may cause pixels above a particular illuminance value to be washed out. The apparatus may detect a glare condition at the location of the washed-out pixels.

1. An apparatus for detecting a glare condition, the apparatus comprising:

a photo-sensing circuit configured to generate an illuminance signal indicative of a current illuminance value of light impinging on the photo-sensing circuit;

a visible light sensing circuit configured to record an image at an image processing rate; and

a control circuit configured to:

receive the illuminance signal from the photo-sensing circuit,

determine a current illuminance value based on the illuminance signal, and

adjust the image processing rate based on the current illuminance value determined from the illuminance signal.

2. The apparatus of claim 1, wherein the control circuitry is further configured to:

determine an illuminance value change based on the current illuminance value and a previous illuminance value; and

compare the illuminance value change to an illuminance change threshold, wherein the image processing rate is adjusted based on the illuminance value change when the illuminance value change is greater than or equal to the illuminance change threshold.

3. The apparatus of claim 1, wherein the control circuit is further configured to compare the current illuminance value to an illuminance threshold, wherein when the current illuminance value is less than the illuminance threshold, the control circuit adjusts the image processing rate to zero to cause the visible light sensing circuit to stop recording images.

4. The apparatus of claim 1, further comprising:

a communication circuit configured to transmit a signal;

wherein the control circuit is further configured to transmit control instructions via the communication circuit based on the image recorded by the visible light sensing circuit.

5. The apparatus of claim 4, wherein the control instructions are configured to control a daylight control device.

6. The apparatus of claim 5, wherein the daylight control device is a motorized window treatment, and wherein the control instructions are configured to control a shade position of the motorized window treatment.

7. The apparatus of claim 5, wherein the daylight control device is a controllable dynamic glass.

8. The apparatus of claim 1, wherein the control circuit is configured to determine the current illuminance value based on the illuminance signal at a photo-sensing rate.

9. The apparatus of claim 1, wherein the control circuit is further configured to process one or more images recorded by the visible light sensing circuit to determine whether the glare condition is detected in the images.

10. An apparatus for detecting a glare condition, the apparatus comprising:

a photo-sensing circuit configured to generate an illuminance signal indicative of a current illuminance value of light impinging on the photo-sensing circuit;

a visible light sensing circuit configured to record an image; and

a control circuit configured to:

sample the illuminance signal from the photo-sensing circuit,

determine the current illuminance value based on the illuminance signal,

determine an exposure time for detecting the glare condition based on the current illuminance value, record an image using the exposure time via the visible light sensing circuit, and

process the image to determine whether the glare condition is detected in the image.

11. The apparatus of claim 10, wherein the control circuit is configured to calculate a contrast-based exposure time using the current illuminance value.

12. The apparatus of claim 11, wherein the control circuit is configured to determine whether the contrast-based exposure time is greater than or equal to an absolute exposure time.

13. The apparatus of claim 12, wherein the control circuit is configured to record the image using the contrast-based exposure time when the contrast-based exposure time is greater than the absolute exposure time, and to record the image using the absolute exposure time when the contrast-based exposure time is less than the absolute exposure time.

14. The apparatus of claim 10, wherein the control circuit is configured to determine the exposure time for detecting the glare condition such that a washed-out pixel of the image indicates the glare condition.

15. The apparatus of claim 14, wherein the control circuit is configured to determine a location of a glare source in the image by determining a location of a lowest washed-out pixel of the image, wherein the lowest washed-out pixel of the image has a light intensity greater than or equal to a maximum light intensity.

16. The apparatus of claim 15, wherein the control circuit is configured to generate control instructions based on the location of the lowest pixel in the image having the maximum light intensity.

17. The apparatus of claim 14, wherein the control circuit is configured to process each pixel in the image from the bottom of the image to the top of the image until a pixel having a light intensity greater than or equal to the maximum light intensity is identified.

18. The apparatus of claim 10, further comprising a communication circuit, and wherein the control circuit is further configured to transmit control instructions via the communication circuit based on the image recorded by the visible light sensing circuit.

19. The apparatus of claim 18, wherein the control instructions are configured to control a daylight control device.

20. The apparatus of claim 19, wherein the daylight control device is a motorized window treatment, and wherein the control instructions indicate a shade position of the motorized window treatment.

21. The apparatus of claim 19, wherein the daylight control device is a controllable dynamic glass.

22. The apparatus of claim 10, wherein the control circuit is configured to periodically sample the illuminance signal from the photo-sensing circuit at a photo-sensing rate.

23. The apparatus of claim 10, wherein the visible light sensing circuit is configured to periodically record an image at an image processing rate.

24. An apparatus for detecting a glare condition, the apparatus comprising:

a photo-sensing circuit configured to generate an illuminance signal indicative of a current illuminance value of light impinging on the photo-sensing circuit;

a visible light sensing circuit configured to record an image; and

a control circuit configured to:

receive the illuminance signal from the photo-sensing circuit,

determine a current illuminance value based on the illuminance signal,

enable the visible light sensing circuit to record at least one image when an illuminance change exceeds a threshold value; and

process the at least one image recorded by the visible light sensing circuit to determine whether the glare condition is detected.

25. The apparatus of claim 24, wherein the control circuit is configured to calculate a difference between the current illuminance value and a previous illuminance value to determine the illuminance change.

Background

Various types of load control systems may be used to control electrical loads in a user environment, such as a home or an office building. For example, a lighting control system may be used to control the lighting loads that provide artificial light in the user environment. A motorized window treatment control system may be used to control the natural light provided to the user environment. An HVAC system may be used to control the temperature in the user environment.

Each load control system may include various control devices, including an input device and a load control device. The load control device may receive digital messages from one or more of the input devices for controlling the electrical load, which may include load control instructions. The load control device may be capable of directly controlling the electrical load. The input device may be capable of indirectly controlling the electrical load via the load control device.

Examples of load control devices may include lighting control devices (e.g., dimmer switches, electronic switches, ballasts, or Light Emitting Diode (LED) drivers), motorized window treatments, temperature control devices (e.g., thermostats), AC plug-in load control devices, and so forth. Examples of input devices may include remote controls, occupancy sensors, daylight sensors, glare sensors, color temperature sensors, and the like. The remote control device may receive user input for performing load control. The occupancy sensor may include an Infrared (IR) sensor for detecting occupancy/vacancy of the space based on movement of the user. The daylight sensor may detect a level of daylight received in the space. The color temperature sensor may determine a color temperature within the user environment based on the wavelength and/or frequency of the light. The temperature sensor may detect a current temperature of the space. A window sensor (e.g., a glare sensor) may be positioned facing an exterior of the building (e.g., on a window or outside of the building) to measure a total amount of natural light detected outside the building and/or to detect a glare condition.

Some load control systems control motorized window treatments to prevent glare conditions inside a building (e.g., glare conditions caused by direct sunlight shining into the building). The load control system may include a system controller for determining a position for controlling a shade fabric of a motorized window treatment to prevent a glare condition based on a predicted position of the sun (e.g., using the current time of year and day, the position and/or orientation of the building, etc.). The load control system may automatically control the motorized window treatments throughout the day based on the estimated position of the sun. The load control system may also include a window sensor configured to detect low light conditions (e.g., on cloudy days) and/or high light conditions (e.g., on very bright days) to enable the system controller to override automatic control of the motorized window treatments on cloudy and sunny days. However, such load control systems require complex configuration programs and advanced system controllers to operate properly. These systems also estimate daylight glare based on known conditions (e.g., the current time of year and day, the location and/or orientation of the building, etc.) and/or the total amount of daylight sensed at the location of a given sensor. An example of such a load control system is described in commonly assigned U.S. patent No. 8,288,981, entitled METHOD OF AUTOMATICALLY CONTROLLING A MOTORIZED WINDOW TREATMENT WHILE MINIMIZING OCCUPANT DISTRACTIONS, issued October 16, 2012, the entire disclosure of which is incorporated herein by reference.

In some cases, daylight glare may distract occupants but may not be detected by current systems inside the building. For example, daylight glare may be allowed into an occupant's space but go undetected by existing systems because the glare source is relatively small, even though its intensity may be high. This type of glare condition may be considered "noise" and may cause the load control system to unnecessarily and/or incorrectly control the motorized window treatments. Such daylight glare sources may be caused, for example, by reflections from small surfaces outside the window, waves on a body of water, and raindrops on the window. Thus, the load control system may filter this "noise" when detecting a glare condition and/or determining the position of the motorized window treatment.

Disclosure of Invention

An apparatus may be configured to detect a glare condition. The apparatus may include a photo-sensing circuit and a visible light sensing circuit. The photo-sensing circuit may be configured to periodically generate an illuminance signal indicative of the illuminance within a space. The visible light sensing circuit may be configured to periodically record an image of the space. The apparatus may receive the illuminance signal from the photo-sensing circuit and may determine a current illuminance based on the illuminance signal. Based on the current illuminance, the apparatus may adjust a frequency (e.g., an image processing (IP) rate) at which the visible light sensing circuit records and/or processes an image of the space to determine the presence of the glare condition.

The apparatus may track the current illuminance of the space and adjust the frequency at which the visible light sensing circuit records images of the space based on an illuminance change. For example, the apparatus may receive the illuminance signal from the photo-sensing circuit and determine a current illuminance value based on the illuminance signal. The apparatus may compare the current illuminance value to a previous illuminance value and determine an illuminance change in the space. The apparatus may compare the illuminance change to a threshold. When the illuminance change is greater than or equal to the threshold, the apparatus may adjust the frequency (e.g., the IP rate) at which the visible light sensing circuit records and/or processes images of the space (e.g., to determine the presence of a glare condition). Additionally or alternatively, the apparatus may compare the illuminance change of the space to a threshold and adjust the frequency at which the visible light sensing circuit records and/or processes images of the space (e.g., to determine the presence of the glare condition) when the illuminance change is less than the threshold.
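
The paragraph above describes a threshold-based adjustment of the image-processing rate. The following is a minimal sketch, in Python, of that logic; the class name, method name, and all threshold and period values are illustrative assumptions rather than values taken from this disclosure.

    ILLUMINANCE_CHANGE_THRESHOLD = 50.0   # lux change that triggers faster processing (assumed)
    FAST_IP_PERIOD_S = 10.0               # seconds between images when daylight is changing (assumed)
    SLOW_IP_PERIOD_S = 60.0               # seconds between images when daylight is steady (assumed)
    LOW_LIGHT_THRESHOLD = 10.0            # lux below which images are not recorded (assumed)

    class GlareSensorScheduler:
        def __init__(self):
            self.previous_illuminance = None
            self.ip_period_s = SLOW_IP_PERIOD_S

        def on_illuminance_sample(self, current_illuminance):
            # Called each time the photo-sensing circuit produces an illuminance value.
            if current_illuminance < LOW_LIGHT_THRESHOLD:
                # An image-processing rate of zero: stop recording images in low light.
                self.ip_period_s = None
            elif self.previous_illuminance is not None:
                change = abs(current_illuminance - self.previous_illuminance)
                if change >= ILLUMINANCE_CHANGE_THRESHOLD:
                    self.ip_period_s = FAST_IP_PERIOD_S
                else:
                    self.ip_period_s = SLOW_IP_PERIOD_S
            self.previous_illuminance = current_illuminance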

The apparatus may record an image of the space via the visible light sensing circuit during an exposure time. The exposure time may be determined based on the current illuminance and a glare condition type. The glare condition type may indicate the type of glare condition that the apparatus is detecting (e.g., a small glare condition, a large glare condition, an absolute glare condition, a relative glare condition, a contrast glare condition, and/or any combination thereof). The apparatus may receive the illuminance signal from the photo-sensing circuit and determine the current illuminance based on the illuminance signal. The apparatus may determine a contrast-based exposure time based on the current illuminance and the glare condition type. The apparatus may compare the contrast-based exposure time to an absolute exposure time to determine a capture exposure time. When the contrast-based exposure time is greater than or equal to the absolute exposure time, the capture exposure time may be the contrast-based exposure time. When the contrast-based exposure time is less than the absolute exposure time, the capture exposure time may be the absolute exposure time. The apparatus may record an image at the capture exposure time. An image recorded at the corresponding exposure time may cause pixels above a particular illuminance value to be washed out. The apparatus may detect a glare condition at the location of a washed-out pixel. Further, the apparatus may find the lowest washed-out pixel in the image and remove the glare condition at the location of the lowest washed-out pixel. For example, the apparatus may transmit a shade control command including control instructions to move the shade of a motorized window treatment to a position that covers the location of the lowest washed-out pixel and/or removes the glare condition.
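
As an illustration of the exposure-time selection and the bottom-up search for the lowest washed-out pixel described above, the following is a minimal sketch in Python with NumPy; the washout level, the absolute exposure time, and the contrast-based relation are assumed placeholders, not values or formulas from this disclosure.

    import numpy as np

    WASHOUT_LEVEL = 250          # pixel value treated as washed out (assumed, 8-bit image)
    ABSOLUTE_EXPOSURE_S = 0.001  # exposure time tied to the absolute glare threshold (assumed)

    def contrast_based_exposure(current_illuminance, contrast_factor=4.0):
        # Assumed relation: the brighter the background, the shorter the exposure,
        # so only sources much brighter than the background saturate the image.
        return contrast_factor / max(current_illuminance, 1.0)

    def capture_exposure(current_illuminance):
        # Use the contrast-based exposure time when it is greater than or equal to the
        # absolute exposure time; otherwise use the absolute exposure time.
        return max(contrast_based_exposure(current_illuminance), ABSOLUTE_EXPOSURE_S)

    def lowest_washed_out_row(image):
        # Scan rows from the bottom of the image toward the top and return the first
        # row that contains a washed-out pixel (the lowest potential glare source).
        for row in range(image.shape[0] - 1, -1, -1):
            if np.any(image[row] >= WASHOUT_LEVEL):
                return row
        return None  # no glare condition detected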

Drawings

FIG. 1 is a diagram of an exemplary load control system having a visible light sensor.

FIG. 2 is a side view of an exemplary space with a visible light sensor.

Fig. 3 is a block diagram of an exemplary visible light sensor.

Fig. 4 shows an exemplary flowchart of a procedure for dynamically determining an image processing rate that may be performed by the control circuit of the visible light sensor.

Fig. 5 shows an exemplary flowchart of an image processing routine that may be executed by the control circuit of the visible light sensor.

FIG. 6 is an example of a non-distorted image for glare detection.

Fig. 7 shows an exemplary flowchart of an image processing routine that may be executed by the control circuit of the visible light sensor.

Fig. 8 shows another exemplary flowchart of an image processing routine that may be executed by the control circuit of the visible light sensor.

FIG. 9A shows a sequence diagram of an exemplary glare detection routine that may be performed by the visible light sensor and the motorized window treatment.

FIG. 9B shows a sequence diagram of an exemplary glare detection routine that may be performed by the visible light sensor, the system controller, and the motorized window treatment.

FIG. 10 is a block diagram of an exemplary system controller.

Fig. 11 is a block diagram of an exemplary control-target device.

Detailed Description

Fig. 1 is a diagram of an exemplary load control system 100 for controlling the amount of power delivered from an Alternating Current (AC) power source (not shown) to one or more electrical loads. The load control system 100 may be installed in a room 102 of a building. The load control system 100 may include a plurality of control devices configured to communicate with one another via wireless signals, such as Radio Frequency (RF) signals 108. Alternatively or additionally, the load control system 100 may include a wired digital communication link coupled to one or more of the control devices to provide communication between the load control devices. The control devices of the load control system 100 may include a plurality of control-source devices (e.g., input devices operable to transmit digital messages in response to user input, occupancy/vacancy conditions, changes in measured light intensity, etc.) and a plurality of control-target devices (e.g., load control devices operable to receive digital messages and control respective electrical loads in response to received digital messages). A single control device of the load control system 100 may operate as both a control-source device and/or a control-target device.

The control-source devices may be configured to transmit digital messages directly to the control-target devices. Additionally, the load control system 100 may include a system controller 110 (e.g., a central processor or load controller) operable to communicate digital messages to and from the control devices (e.g., the control-source devices and/or the control-target devices). For example, the system controller 110 may be configured to receive a digital message from a control-source device and transmit a digital message to a control-target device in response to the digital message received from the control-source device. The control-source devices, the control-target devices, and/or the system controller 110 may be configured to transmit and receive the RF signals 108 using a proprietary RF protocol or another wireless protocol. Alternatively, the RF signals 108 may be transmitted between one or more devices using a different RF protocol, such as a standard protocol (e.g., one of WIFI, ZIGBEE, Z-WAVE, KNX-RF, or ENOCEAN radio protocols) or a different proprietary protocol.
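
As an illustration of the relaying role of the system controller 110 described above, the following is a minimal Python sketch; the association table, device identifiers, and message format are hypothetical and do not represent the actual protocol.

    # Hypothetical association table: control-source device -> control-target devices.
    ASSOCIATIONS = {
        "remote_control_170": ["dimmer_switch_120", "led_driver_130"],
        "visible_light_sensor_182": ["motorized_window_treatment_150"],
    }

    def relay_digital_message(source_id, message, transmit):
        # Forward a digital message received from a control-source device to every
        # control-target device associated with that source.
        for target_id in ASSOCIATIONS.get(source_id, []):
            transmit(target_id, message)

    # Example usage (printing in place of an RF transmission):
    relay_digital_message("visible_light_sensor_182", {"command": "lower_shade"},
                          lambda target, msg: print(target, msg))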

The load control system 100 may include one or more load control devices, such as a dimmer switch 120 for controlling a lighting load 122. The dimmer switch 120 may be adapted to be wall-mounted in a standard electrical wall box. The dimmer switch 120 may comprise a tabletop or plug-in load control device. The dimmer switch 120 may include a toggle actuator (e.g., a push button) and an intensity adjustment actuator (e.g., a rocker switch). Actuation (e.g., continuous actuation) of the toggle actuator may toggle (e.g., turn off and on) the lighting load 122. Actuation of the upper or lower portions of the intensity adjustment actuator may increase or decrease, respectively, the amount of power delivered to the lighting load 122, and thus increase or decrease the intensity of the respective lighting load between a minimum intensity (e.g., about 1%) and a maximum intensity (e.g., about 100%). The dimmer switch 120 may include a plurality of visual indicators, e.g., light-emitting diodes (LEDs), which may be arranged in a linear array and illuminated to provide feedback on the intensity of the lighting load 122. Examples of wall-mounted dimmer switches are described in more detail in U.S. patent No. 5,248,919, entitled LIGHTING CONTROL DEVICE, issued September 28, 1993, and U.S. patent No. 9,676,696, entitled WIRELESS LOAD CONTROL DEVICE, issued June 13, 2017, the entire disclosures of which are incorporated herein by reference.

The dimmer switch 120 may be configured to wirelessly receive digital messages via the RF signals 108 (e.g., from the system controller 110) and control the lighting load 122 in response to the received digital messages. An example of a dimmer switch operable to transmit and receive digital messages is described in more detail in commonly assigned U.S. patent application publication No. 2009/0206983, entitled DIMMER SWITCH FOR A RADIO-FREQUENCY LOAD CONTROL SYSTEM, published August 20, 2009, the entire disclosure of which is incorporated herein by reference.

The load control system 100 may include one or more remotely located load control devices, such as a light-emitting diode (LED) driver 130 for driving an LED light source 132 (e.g., an LED light engine). The LED driver 130 may be remotely located, for example, in or near the lighting fixture of the LED light source 132. The LED driver 130 may be configured to receive digital messages via the RF signals 108 (e.g., from the system controller 110) and to control the LED light source 132 in response to the received digital messages. The LED driver 130 may be configured to adjust the color temperature of the LED light source 132 in response to the received digital messages. An example of an LED driver configured to control the color temperature of an LED light source is described in more detail in commonly assigned U.S. patent No. 9,538,603, entitled SYSTEMS AND METHODS FOR CONTROLLING COLOR TEMPERATURE, issued January 3, 2017, the entire disclosure of which is incorporated herein by reference. The load control system 100 may also include other types of remotely located load control devices, such as electronic dimming ballasts for driving fluorescent lamps.

The load control system 100 may include a plug-in load control device 140 for controlling a plug-in electrical load, e.g., a plug-in lighting load such as a floor lamp 142 or a table lamp, and/or an appliance such as a television or a computer monitor. For example, the floor lamp 142 may be plugged into the plug-in load control device 140. The plug-in load control device 140 may plug into a standard electrical outlet 144 and may therefore be coupled in series between the AC power source and the plug-in lighting load. The plug-in load control device 140 may be configured to receive digital messages via the RF signals 108 (e.g., from the system controller 110) and to turn on and off or adjust the intensity of the floor lamp 142 in response to the received digital messages.

Alternatively or additionally, the load control system 100 may include a controllable electrical receptacle for controlling a plug-in electrical load plugged into the receptacle. The load control system 100 may include one or more load control devices or appliances that are capable of directly receiving the wireless signals 108 from the system controller 110, such as a speaker 146 (e.g., part of an audio/visual or intercom system) that is capable of generating audible sounds, such as alarms, music, intercom functions, etc.

The load control system 100 may include one or more daylight control devices, for example, a motorized window treatment 150, such as a motorized cellular shade, for controlling the amount of daylight entering the room 102. Each motorized window treatment 150 may include a window treatment fabric 152 suspended from a head rail 154 in front of the corresponding window 104. Each motorized window treatment 150 may also include a motor drive unit (not shown) located inside the head rail 154 for raising and lowering the window treatment fabric 152 to control the amount of sunlight entering the room 102. The motor drive units of the motorized window treatments 150 may be configured to receive digital messages via the RF signals 108 (e.g., from the system controller 110) and adjust the position of the corresponding window treatment fabric 152 in response to the received digital messages. The load control system 100 may include other types of daylight control devices, such as draperies, cellular shades, Roman shades, Venetian blinds, Persian blinds, pleated blinds, tensioned roller shade systems, electrochromic or smart windows, and/or other suitable daylight control devices. Examples of battery-powered motorized window treatments are described in more detail in U.S. patent No. 8,950,461, entitled MOTORIZED WINDOW TREATMENT, issued February 10, 2015, and U.S. patent No. 9,488,000, entitled INTEGRATED ACCESSIBLE BATTERY COMPARTMENT FOR MOTORIZED WINDOW TREATMENT, issued November 8, 2016, the entire disclosures of which are incorporated herein by reference. Furthermore, the daylight control devices may include controllable dynamic glass (e.g., smart glass and/or electrochromic glass) and/or indoor or outdoor controllable blinds.

The load control system 100 may include one or more temperature control devices, such as a thermostat 160 for controlling the room temperature in the room 102. The thermostat 160 may be coupled to a heating, ventilation, and air conditioning (HVAC) system 162 via a control link (e.g., an analog control link or a wired digital communication link). The thermostat 160 may be configured to wirelessly communicate digital messages with a controller of the HVAC system 162. The thermostat 160 may include a temperature sensor for measuring the room temperature of the room 102, and may control the HVAC system 162 to adjust the temperature in the room to a set point temperature. The load control system 100 may include one or more wireless temperature sensors (not shown) located in the room 102 for measuring room temperature. The HVAC system 162 may be configured to turn the compressor on and off to cool the room 102 and turn the heating source on and off to heat the room in response to control signals received from the thermostat 160. The HVAC system 162 may be configured to turn a fan of the HVAC system on and off in response to a control signal received from the thermostat 160. The thermostat 160 and/or the HVAC system 162 can be configured to control one or more controllable dampers to control the flow of air in the room 102. The thermostat 160 may be configured to receive digital messages via the RF signals 108 (e.g., from the system controller 110) and adjust heating, ventilation, and cooling in response to the received digital messages.

The load control system 100 may include one or more other types of load control devices, for example: a screw-in lighting fixture including a dimmer circuit and an incandescent or halogen lamp; a screw-in lighting fixture including a ballast and a compact fluorescent lamp; a screw-in lighting fixture including an LED driver and an LED light source; an electronic switch, controllable circuit breaker, or other switching device for turning an appliance on and off; a plug-in load control device, a controllable electrical receptacle, or a controllable power strip for controlling one or more plug-in loads; a motor control unit for controlling a motor load (such as a ceiling fan or a ventilator); a drive unit for controlling a motorized window treatment or a projection screen; electrically powered internal and/or external blinds; a thermostat for a heating and/or cooling system; a temperature control device for controlling a setpoint temperature of an HVAC system; an air conditioner; a compressor; an electric baseboard heater controller; a controllable damper; a variable air volume controller; a fresh air intake controller; a ventilation controller; hydraulic valves used in radiators and radiant heating systems; a humidity control unit; a humidifier; a dehumidifier; a water heater; a boiler controller; a pool pump; a refrigerator; a freezer; a television and/or computer monitor; a camera; an audio system or amplifier; an elevator; a power supply; a generator; a charger, such as an electric vehicle charger; and an alternative power controller.

The load control system 100 may include one or more input devices, such as a remote control device 170, a first visible light sensor 180 (e.g., a room sensor), and/or a second visible light sensor 182 (e.g., a window sensor). The input device may be a fixed or a movable input device. The system controller 110 may be configured to transmit one or more digital messages to the load control devices (e.g., the dimmer switch 120, the LED driver 130, the plug-in load control device 140, the motorized window treatments 150, and/or the thermostat 160) in response to digital messages received from the remote control 170 and/or the visible light sensors 180, 182. The remote control 170 and/or the visible light sensors 180, 182 may be configured to transmit digital messages directly to the dimmer switch 120, the LED driver 130, the plug-in load control device 140, the motorized window treatments 150, and/or the temperature control device 160.

The remote control 170 may be configured to transmit digital messages to the system controller 110 via the RF signals 108 (e.g., directly to the system controller) in response to actuation of one or more buttons of the remote control. For example, the remote control 170 may be battery-powered. The load control system 100 may include other types of input devices, such as, for example, a temperature sensor, a humidity sensor, a radiometer, a cloudy-day sensor, a shadow sensor, a pressure sensor, a smoke detector, a carbon monoxide detector, an air quality sensor, a motion sensor, a security sensor, a proximity sensor, a snap ring sensor, a zoned sensor, a keypad, a multi-zone control unit, a slider control unit, a kinetic or solar-powered remote control, a key fob, a mobile phone, a smartphone, a tablet computer, a personal digital assistant, a personal computer, a laptop computer, a time clock, an audiovisual control, a security device, a power monitoring device (e.g., a power meter, an electric energy meter, a utility meter, etc.), a central control transmitter, a residential controller, a commercial controller, an industrial controller, and/or any combination thereof.

The system controller 110 may be coupled to a network, such as a wireless or wired local area network (LAN), e.g., for access to the Internet. The system controller 110 may be wirelessly connected to the network, for example, using Wi-Fi technology. The system controller 110 may be coupled to the network via a network communication bus (e.g., an Ethernet communication link). The system controller 110 may be configured to communicate via the network with one or more network devices (e.g., a mobile device 190, such as a personal computing device and/or a wearable wireless device). The mobile device 190 may be located on an occupant 192, for example, may be attached to the occupant's body or clothing, or may be held by the occupant. The mobile device 190 may be characterized by a unique identifier (e.g., a serial number or address stored in memory) that uniquely identifies the mobile device 190 and, thus, the occupant 192. Examples of personal computing devices may include smart phones, laptop computers, and/or tablet devices. Examples of wearable wireless devices may include activity tracking devices (e.g., a Sony wearable device), smart watches, smart clothing, and/or smart glasses (such as Google Glass eyewear). The system controller 110 may be configured to communicate via the network with one or more other control systems (e.g., a building management system, a security system, etc.).

The mobile device 190 may be configured to transmit digital messages to the system controller 110, for example, in one or more Internet Protocol packets. For example, the mobile device 190 may be configured to transmit digital messages to the system controller 110 over the LAN and/or via the Internet. The mobile device 190 may be configured to transmit digital messages to an external service over the Internet, and the digital messages may then be received by the system controller 110. The mobile device 190 may transmit and receive the RF signals 109 via a Wi-Fi communication link, a Wi-MAX communication link, a Bluetooth communication link, a near field communication (NFC) link, a cellular communication link, a television white space (TVWS) communication link, another wireless communication link, or any combination thereof. The mobile device 190 may be configured to transmit RF signals according to a proprietary protocol. The load control system 100 may include other types of network devices coupled to the network, such as a desktop personal computer, a Wi-Fi or wireless-communication-capable television, or any other suitable Internet-Protocol-enabled device. An example of a load control system operable to communicate with mobile devices and/or network devices on a network is described in more detail in commonly assigned U.S. patent No. 10,271,407, entitled LOAD CONTROL DEVICE HAVING INTERNET CONNECTIVITY, issued April 23, 2019, the entire disclosure of which is incorporated herein by reference.

The system controller 110 may be configured to determine the location of the mobile device 190 and/or the occupant 192. For example, the location of the mobile device 190 and/or the occupant 192 may be determined using Global Positioning System (GPS) signals, beacon signals, and the like. The system controller 110 may be configured to control (e.g., automatically control) the load control devices (e.g., the dimmer switch 120, the LED driver 130, the plug-in load control device 140, the motorized window treatments 150, and/or the temperature control device 160) in response to determining the location of the mobile device 190 and/or the occupant 192. One or more of the control devices of the load control system 100 may transmit a beacon signal, e.g., an RF beacon signal transmitted using a short-range and/or low-power RF technology, such as Bluetooth technology. The load control system 100 may further comprise at least one beacon transmitting device 194 for transmitting a beacon signal. The mobile device 190 may be configured to receive a beacon signal when located in proximity to a control device that is currently transmitting the beacon signal. The beacon signal may include a unique identifier that identifies the location of the load control device that transmitted the beacon signal. Since the beacon signal may be transmitted using short-range and/or low-power technology, the unique identifier may indicate the approximate location of the mobile device 190. The mobile device 190 may be configured to transmit the unique identifier to the system controller 110, and the system controller 110 may be configured to use the unique identifier to determine the location of the mobile device 190 (e.g., using data stored in memory or retrieved via the Internet). An example of a load control system for controlling one or more electrical loads in response to the position of a mobile device and/or an occupant within a building is described in more detail in commonly assigned U.S. patent application publication No. 2016/0056629, entitled LOAD CONTROL SYSTEM RESPONSIVE TO LOCATION OF AN OCCUPANT AND MOBILE DEVICES, published February 25, 2016, the entire disclosure of which is incorporated herein by reference.

The visible light sensors 180, 182 may each include, for example, a camera and a fisheye lens. The camera of the first visible light sensor 180 may be directed into the room 102 and may be configured to record an image of the room 102. For example, the first visible light sensor 180 may be mounted to a ceiling of the room 102 (as shown in fig. 1), and/or may be mounted to a wall of the room. If the first visible light sensor 180 is mounted to the ceiling, the image recorded by the camera may be a top view of the room 102. The camera of the second visible light sensor 182 may be directed outside of the room 102 (e.g., outside of the window 104) and may be configured to record images from outside the building. For example, the second visible light sensor 182 may be mounted to one of the windows 104 (as shown in fig. 1) and/or may be mounted to the exterior of a building.

The visible light sensors 180, 182 may each be configured to process an image recorded by the camera and transmit one or more messages (e.g., digital messages) to the load control device in response to the processed image. Each visible light sensor 180, 182 may be configured to sense one or more environmental characteristics of a space (e.g., room 102 and/or room 200) from an image. For example, the first visible light sensor 180 may be configured to operate in one or more sensor modes (e.g., occupancy and/or vacancy sensor modes, daylight sensor modes, color sensor modes, glare detection sensor modes, occupant count modes, etc.). The second visible light sensor 182 may be configured to operate in one or more same or different sensor modes (e.g., a color sensor mode, a glare detection sensor mode, a weather sensor mode, etc.). Each visible light sensor 180, 182 may execute a different algorithm to process the image in each of the sensor modes to determine the data to be transmitted to the load control device. Visible light sensors 180, 182 may each transmit a digital message via RF signal 108 (e.g., using a proprietary protocol) in response to an image. The visible light sensors 180, 182 may each send a digital message directly to the load control device and/or the system controller 110, which may then transmit the message to the load control device. Each visible light sensor 180, 182 may include a first communication circuit for transmitting and receiving RF signals 108 using a proprietary protocol.

The visible light sensors 180, 182 may each be configured to perform a plurality of sensor events to sense various environmental characteristics inside and/or outside the room 102. For example, to perform a sensor event, each visible light sensor 180, 182 may be configured to operate in one of a plurality of sensor modes to execute one or more corresponding algorithms to sense environmental characteristics. Each visible light sensor 180, 182 may be configured to obtain from memory certain preconfigured operating characteristics (e.g., sensitivity, baseline values, thresholds, limit values, etc.) that may be used by the algorithm to sense environmental characteristics during a sensor event.

Further, each visible light sensor 180, 182 may be configured to focus on one or more regions of interest in the image recorded by the camera when processing the image to sense environmental characteristics during a sensor event. For example, certain areas of an image recorded by a camera of one of the visible light sensors 180, 182 may be masked (e.g., digitally masked) such that the respective visible light sensor may not process portions of the image in the masked areas. Each visible light sensor 180, 182 can be configured to apply a mask (e.g., a predetermined digital mask that can be stored in memory) to focus on a particular region of interest and process portions of the image in the region of interest. Each visible light sensor 180, 182 may be configured to focus on multiple regions of interest in an image simultaneously. A specific mask may be defined for each sensor event.
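
The following is a minimal sketch, in Python with NumPy, of mask-based processing of a region of interest as described above; the boolean-mask representation and the function name are illustrative assumptions.

    import numpy as np

    def process_region_of_interest(image, mask, algorithm):
        # 'mask' is a boolean array with the same shape as 'image'; True marks pixels
        # inside the region of interest, False marks masked (ignored) pixels.
        roi_pixels = image[mask]
        return algorithm(roi_pixels)

    # Example usage: average intensity over a rectangular region of interest.
    image = np.zeros((480, 640), dtype=np.uint8)
    mask = np.zeros_like(image, dtype=bool)
    mask[300:480, 200:400] = True
    mean_intensity = process_region_of_interest(image, mask, np.mean)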

The visible light sensors 180, 182 may each be configured to dynamically change between sensor modes, apply digital masks to the image, and/or adjust operating characteristics according to the current sensor event. Each visible light sensor 180, 182 may be configured to perform a plurality of different sensor events to sense a plurality of environmental characteristics of the space. For example, each visible light sensor 180, 182 may be configured to sequentially and/or periodically step through the sensor events to sense the plurality of environmental characteristics of the space. Each sensor event may be characterized by a sensor mode (e.g., specifying the algorithm to be used), one or more operating characteristics, and/or one or more digital masks. An example of a visible light sensor having multiple sensor modes is described in more detail in commonly assigned U.S. patent No. 10,264,651, entitled LOAD CONTROL SYSTEM HAVING A VISIBLE LIGHT SENSOR, issued April 16, 2019, the entire disclosure of which is incorporated herein by reference.

The first visible light sensor 180 may be configured to operate in an occupancy and/or vacancy sensor mode to determine occupancy and/or vacancy conditions in the room 102 in response to detecting movement within one or more regions of interest. The first visible light sensor 180 may be configured to determine that the room 102 is occupied using an occupancy and/or vacancy detection algorithm in response to the amount of movement and/or the speed of movement exceeding an occupancy threshold.

During a sensor event for detecting occupancy and/or vacancy, the first visible light sensor 180 may be configured to apply a predetermined mask to focus on one or more regions of interest in one or more images recorded by the camera, and determine occupancy or vacancy of the space based on detection or non-detection of motion in the regions of interest. The first visible light sensor 180 may respond to movement in the region of interest and not in the masked region. For example, the first visible light sensor 180 may be configured to apply a mask to an image of the room to exclude detection of motion in the doorway 108 and/or the window 104 of the room 102, and may focus on a region of interest that includes an interior space of the room. The first visible light sensor 180 can be configured to apply a first mask to focus on the first region of interest, apply a second mask to focus on the second region of interest, and determine occupancy or vacancy based on movement detected in either of the regions of interest. The first visible light sensor 180 may be configured to focus on multiple regions of interest in an image simultaneously by applying different masks to the image.

The first visible light sensor 180 may be configured to adjust certain operating characteristics (e.g., sensitivity) to be used by the occupancy and/or vacancy algorithms depending on the current sensor event. The occupancy threshold may depend on the sensitivity. For example, the first visible light sensor 180 may be configured to be more or less sensitive to movement in the first region of interest than in the second region of interest. For example, the first visible light sensor 180 may be configured to increase sensitivity and apply a mask to focus on a region of interest around the computer keyboard, thereby being more sensitive to movement around the keyboard. In other words, by using a mask that focuses on "smaller" and "larger" portions (e.g., the keyboard and the tabletop on which the keyboard may sit), the first visible light sensor 180 may be configured to increase and/or decrease the sensitivity of detected or undetected movement. The sensitivity level may be adjusted based on a size threshold of the region of interest, with relatively higher sensitivity to movement in smaller regions of interest. By using a mask, the first visible light sensor 180 may be configured not to simply detect movement in space, but to detect where the movement occurs.
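
The following minimal Python sketch illustrates a sensitivity-dependent occupancy decision of the kind described above; the particular relationship between sensitivity, region size, and the occupancy threshold is an assumption for illustration only.

    def occupied(motion_pixel_count, roi_pixel_count, sensitivity):
        # Higher sensitivity lowers the amount of detected motion needed to declare
        # occupancy; smaller regions of interest likewise require less absolute motion.
        occupancy_threshold = roi_pixel_count * (1.0 - sensitivity)
        return motion_pixel_count >= occupancy_threshold

    # Example: a small keyboard region of interest with high sensitivity.
    print(occupied(motion_pixel_count=120, roi_pixel_count=1000, sensitivity=0.9))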

The first visible light sensor 180 may transmit a digital message to the system controller 110 via the RF signals 108 (e.g., using a proprietary protocol) in response to detecting an occupancy or vacancy condition. The system controller 110 may be configured to turn the lighting loads (e.g., the lighting load 122 and/or the LED light sources 132) on and off in response to receiving an occupancy command and a vacancy command, respectively. Alternatively, the first visible light sensor 180 may transmit the digital message directly to the lighting loads. The first visible light sensor 180 may function as a vacancy sensor, such that the lighting loads are only turned off in response to detecting a vacancy condition (e.g., and are not turned on in response to detecting an occupancy condition). Examples of RF load control systems with occupancy and vacancy sensors are described in more detail in: commonly assigned U.S. patent No. 8,009,042, entitled RADIO-FREQUENCY LIGHTING CONTROL SYSTEM WITH OCCUPANCY SENSING, issued August 30, 2011; U.S. patent No. 8,199,010, entitled METHOD AND APPARATUS FOR CONFIGURING A WIRELESS SENSOR, issued June 12, 2012; and U.S. patent No. 8,228,184, entitled BATTERY-POWERED OCCUPANCY SENSOR, issued July 24, 2012; the entire disclosures of which are incorporated herein by reference.

The first visible light sensor 180 may be configured to operate in a daylight sensor mode to measure a light intensity at a location within the space. For example, the first visible light sensor 180 may apply a digital mask to focus on a particular location in the space (e.g., on a task surface, such as the table 106 shown in fig. 1), and may use a daylighting algorithm to measure the light intensity at that location. For example, the first visible light sensor 180 may be configured to apply a mask to focus on a region of interest that includes the tabletop. The first visible light sensor 180 may be configured to integrate light intensity values of the pixels of the image across the region of interest to determine the light intensity measured at the tabletop.
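
The following is a minimal sketch, in Python with NumPy, of integrating pixel intensities over a masked region of interest as described above; the calibration constant that converts pixel values to an illuminance estimate is an assumed placeholder.

    import numpy as np

    def measured_illuminance(image, mask, lux_per_count=0.5):
        # Average the pixel intensities inside the region of interest (e.g., the
        # tabletop) and scale them to an illuminance estimate.
        roi = image[mask]
        return float(roi.mean()) * lux_per_count if roi.size else 0.0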

The first visible light sensor 180 may transmit a digital message (e.g., including the measured light intensity) to the system controller 110 via the RF signals 108 to control the intensity of the lighting load 122 and/or the LED light sources 132 in response to the measured light intensity. The first visible light sensor 180 may be configured to focus on multiple regions of interest in an image recorded by the camera and measure the light intensity in each of the different regions of interest. Alternatively, the first visible light sensor 180 may transmit the digital message directly to the lighting load. The first visible light sensor 180 may be configured to adjust certain operating characteristics (e.g., gain) based on the region of interest in which the light intensity is currently being measured. Examples of RF load control systems with daylight sensors are described in more detail in commonly assigned U.S. patent No. 8,410,706, entitled METHOD OF CALIBRATING A DAYLIGHT SENSOR, issued April 2, 2013, and U.S. patent No. 8,451,116, entitled WIRELESS BATTERY-POWERED DAYLIGHT SENSOR, issued May 28, 2013, the entire disclosures of which are incorporated herein by reference.

The system controller 110 may be configured to determine a degradation in the light output of one or more of the lighting loads (e.g., the lighting load 122 and/or the LED light sources 132) in the space, and control the intensity of the lighting load to compensate for the degradation (e.g., lumen maintenance). For example, the system controller 110 may be configured to turn each lighting load on individually (e.g., when it is dark at night) and measure the magnitude of the light intensity at a location (e.g., on the table 106 or the table 220). For example, the system controller 110 may be configured to turn on the lighting load 122 at night and control the first visible light sensor 180 to record an image of the room, to apply a mask to focus on an area of interest illuminated by the lighting load 122 (e.g., the surface of the table 106 or table 220), to measure the light intensity in the area of interest, and to communicate this value to the system controller 110. The system controller 110 may store the value as a baseline value. At some time and/or date thereafter, the system controller 110 may repeat the measurement and compare the measurement to the baseline value. If the system controller 110 determines that there is degradation, such as by detecting that the degradation is greater than a threshold, the system controller may control the lighting load 122 to compensate for the degradation, alert maintenance personnel, and so forth.
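
The following minimal Python sketch illustrates the baseline comparison described above; the degradation threshold and the function name are assumptions for illustration.

    DEGRADATION_THRESHOLD = 0.15   # assumed: a 15% drop from baseline triggers action

    def lumen_maintenance_degradation(baseline_lux, measured_lux):
        # Compare a new night-time measurement with the stored baseline and return the
        # fractional degradation if it exceeds the threshold, otherwise None.
        if baseline_lux <= 0:
            return None
        degradation = (baseline_lux - measured_lux) / baseline_lux
        return degradation if degradation > DEGRADATION_THRESHOLD else None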

The first visible light sensor 180 may be configured to operate in a color sensor mode to sense a color (e.g., measure a color temperature) of the light emitted by one or more of the lighting loads in the space (e.g., to function as a color sensor and/or a color temperature sensor). For example, the first visible light sensor 180 may be configured to apply a mask to focus on a region of interest in the room 102, and a color sensing algorithm may be used to determine the measured color and/or color temperature in the room. For example, the first visible light sensor 180 may integrate color values of the pixels of the image across the region of interest to determine the measured color and/or color temperature in the room. The first visible light sensor 180 may transmit a digital message (e.g., including the measured color temperature) to the system controller 110 via the RF signals 108 to control the color (e.g., the color temperature) of the lighting load 122 and/or the LED light sources 132 in response to the measured color and/or color temperature (e.g., to adjust the color of the light in the space). Alternatively, the first visible light sensor 180 may transmit the digital message directly to the lighting load. An example of a load control system for controlling the color temperature of one or more lighting loads is described in more detail in commonly assigned U.S. patent No. 9,538,603, entitled SYSTEMS AND METHODS FOR CONTROLLING COLOR TEMPERATURE, issued January 3, 2017, the entire disclosure of which is incorporated herein by reference.

The first visible light sensor 180 may be configured to operate in a glare detection sensor mode. For example, the first visible light sensor 180 may be configured to execute a glare detection algorithm to determine, from the image recorded by the camera, the depth of penetration of direct sunlight into the space. For example, the first visible light sensor 180 may be configured to apply a mask to focus on a region of interest on the floor of the room 102 near the window 104 to sense the depth of penetration of direct sunlight into the room. Based on the detection and/or measurement of the depth of penetration of direct sunlight into the room, the first visible light sensor 180 may transmit a digital message to the system controller 110 via the RF signals 108 to limit the depth of penetration of direct sunlight into the space, e.g., to prevent direct sunlight from impinging on a surface (e.g., a table or a desk). The system controller 110 may be configured to lower the window treatment fabric 152 of each of the motorized window treatments 150 to prevent direct sunlight penetration beyond a maximum sunlight penetration depth. Alternatively, the first visible light sensor 180 may be configured to directly control the window treatments 150 to lower the window treatment fabric 152. An example of a method for limiting the depth of penetration of sunlight into a space is described in more detail in previously referenced U.S. patent No. 8,288,981.

The first visible light sensor 180 may be configured to focus only on daylight entering the space through, for example, one or both of the windows 104 (e.g., to act as a window sensor). The system controller 110 may be configured to control the lighting loads (e.g., the lighting load 122 and/or the LED light sources 132) in response to the amount of daylight entering the space. The system controller 110 may be configured to override automatic control of the motorized window treatments 150, for example, in response to determining a cloudy day or an extremely sunny day. Alternatively, the first visible light sensor 180 may be configured to directly control the window treatments 150 to lower the window treatment fabric 152. An example of a load control system with a window sensor is described in more detail in commonly assigned U.S. patent No. 9,933,761, entitled METHOD OF CONTROLLING A MOTORIZED WINDOW TREATMENT, issued April 3, 2018, the entire disclosure of which is incorporated herein by reference.

The first visible light sensor 180 may be configured to detect a source of glare (e.g., sunlight reflected from a surface) outside or inside the room 102 in response to images recorded by the camera. The system controller 110 may be configured to lower the window treatment fabric 152 of each of the motorized window treatments 150 to eliminate glare. Alternatively, the first visible light sensor 180 may be configured to directly control the window treatment 150 to lower the window treatment fabric 152 to eliminate glare.

The first visible light sensor 180 can also be configured to operate in an occupant counting mode and can execute an occupant counting algorithm to count the number of occupants in a particular region of interest and/or the number of occupants entering and/or exiting a region of interest. For example, the system controller 110 may be configured to control the HVAC system 162 in response to the number of occupants in the space. The system controller 110 may be configured to control one or more of the load control devices of the load control system 100 in response to the number of occupants in the space exceeding an occupancy threshold. Alternatively, the first visible light sensor 180 may be configured to directly control the HVAC system 162 and other load control devices.

The second visible light sensor 182 may be configured to operate in a glare detection sensor mode. For example, the second visible light sensor 182 may be configured to execute a glare detection algorithm to determine whether a glare condition may exist in the room 102 from one or more images recorded by the camera. The glare condition in the room 102 may be generated by a glare source outside the room, such as the sun, exterior lights (e.g., outdoor building lights or street lights), and/or reflections of sunlight or other bright light sources. The second visible light sensor 182 may be configured to analyze one or more images recorded by the camera to determine whether an absolute glare condition and/or a relative glare condition exists outside the room 102 when viewed from one of the windows 104. An absolute glare condition may occur when the light level (e.g., light intensity) of a potentially glare source is too high (e.g., exceeds an absolute glare threshold). A relative glare condition (e.g., a contrast glare condition) may occur when the difference between the light level of the potential glare source and the background light level (e.g., baseline) is too high (e.g., exceeds a relative glare threshold).
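
The following is a minimal Python sketch of the absolute and relative (contrast) glare checks described above; both threshold values are assumed examples, not values from this disclosure.

    ABSOLUTE_GLARE_THRESHOLD = 20000.0   # assumed upper limit on a source's light level
    RELATIVE_GLARE_THRESHOLD = 5000.0    # assumed allowed difference from the background level

    def classify_glare(source_level, background_level):
        # Absolute glare: the source itself is too bright.
        if source_level >= ABSOLUTE_GLARE_THRESHOLD:
            return "absolute"
        # Relative (contrast) glare: the source is too bright relative to the background.
        if source_level - background_level >= RELATIVE_GLARE_THRESHOLD:
            return "relative"
        return None   # no glare condition detected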

Based on the detection of the glare condition, the second visible light sensor 182 may transmit a digital message to the system controller 110 via the RF signals 108 to open, close, or adjust the position of the window treatment fabric 152 of each of the motorized window treatments 150. For example, the system controller 110 may be configured to lower the window treatment fabric 152 of each of the motorized window treatments 150 to prevent direct sunlight from penetrating onto a task surface (e.g., a table or desk) in the room 102. If the second visible light sensor 182 does not detect the glare condition, the system controller 110 may be configured to open the motorized window treatments 150 (e.g., to control the position of the window treatment fabric 152 to the fully-open position or the visor position). Alternatively, the second visible light sensor 182 may be configured to directly control the window treatments 150.

The operation of the load control system 100 may be programmed and configured using, for example, the mobile device 190 or another network device (e.g., when the mobile device is a personal computing device). The mobile device 190 may execute graphical user interface (GUI) configuration software for allowing a user to program how the load control system 100 will operate. For example, the configuration software may run as a PC application or a web interface. The configuration software and/or the system controller 110 (e.g., via instructions from the configuration software) may generate a load control database that defines the operation of the load control system 100. For example, the load control database may include information regarding the operational settings of different load control devices of the load control system (e.g., the dimmer switch 120, the LED driver 130, the plug-in load control device 140, the motorized window treatments 150, and/or the thermostat 160). The load control database may include information regarding associations between the load control devices and the input devices (e.g., the remote control device 170, the visible light sensors 180, 182, etc.). The load control database may include information about how the load control devices respond to inputs received from the input devices. Examples of configuration procedures for load control systems are described in more detail in: commonly assigned U.S. Patent No. 7,391,297, entitled HANDHELD PROGRAMMER FOR A LIGHTING CONTROL SYSTEM, issued June 24, 2008; U.S. Patent Application Publication No. 2008/0092075, entitled METHOD OF BUILDING A DATABASE OF A LIGHTING CONTROL SYSTEM, published April 17, 2008; and U.S. Patent No. 10,027,127, entitled COMMISSIONING LOAD CONTROL SYSTEMS, issued July 17, 2018, the entire disclosures of which are incorporated herein by reference.

The operation of the visible light sensors 180, 182 may be programmed and configured using the mobile device 190 or other network device. Each visible light sensor 180, 182 may include a second communication circuit for transmitting and receiving RF signals 109 (e.g., directly with network device 190 using a standard protocol such as Wi-Fi or bluetooth). During a configuration procedure of the load control system 100, the visible light sensors 180, 182 may each be configured to record an image of the space and transmit the image to the network device 190 (e.g., directly to the network device via the RF signal 109 using a standard protocol). The network device 190 may display an image on the visual display and the user may configure the operation of each visible light sensor 180, 182 to set one or more configuration parameters (e.g., configuration information) of the visible light sensor. For example, for different environmental characteristics to be sensed and controlled by the visible light sensors 180, 182 (e.g., movement of occupants, light levels inside the room, daylight levels outside the room, etc.), the user may indicate different regions of interest on the image by tracking (such as with a finger or stylus) the masked areas on the image displayed on the visual display. The visible light sensors 180, 182 may each be configured to establish different masks and/or operating characteristics depending on the environmental characteristics to be sensed (e.g., movement of occupants, light levels inside the room, daylight levels outside the room, color temperature, etc.).

After configuration of the visible light sensors 180, 182 is completed at the network device 190, the network device may transmit the configuration information to the visible light sensors (e.g., directly to the visible light sensors via the RF signals 109 using standard protocols). The visible light sensors 180, 182 may each store configuration information in memory so that the visible light sensors may operate properly during normal operation. For example, for each sensor event to be monitored by the visible light sensors 180, 182, the network device 190 may transmit a sensor pattern for the event, one or more masks defining a region of interest for the event, a possible indication of an algorithm for sensing an environmental characteristic of the event, and one or more operating characteristics of the event to the corresponding visible light sensor.

Although the load control system 100 of fig. 1 has been described above with reference to two visible light sensors 180, 182, the load control system 100 may also simply include either of the visible light sensors 180, 182. For example, the load control system 100 may not include the first visible light sensor 180, but may include a second visible light sensor 182 that may be mounted to the window 104 and may operate to prevent sun glare from occurring on task surfaces in the room 102. The load control system 100 may have more than two visible light sensors. Each window may have a corresponding visible light sensor, or a visible light sensor may receive images through a window representing a group of windows having motorized window treatments that are commonly controlled based on the images of a single visible light sensor.

Fig. 2 is a simplified side view of an exemplary space 200 having a visible light sensor 210 (e.g., such as the second visible light sensor 182 of the load control system 100 shown in fig. 1). The visible light sensor 210 may be mounted to a window 202, which may be located in a facade 204 of the building in which the space 200 is located and may allow light (e.g., sunlight) to enter the space. The visible light sensor 210 may be mounted to an inner surface of the window 202 (e.g., as shown in fig. 2) or to an outer surface of the window 202. The window 202 may be characterized by a height h_window-bottom of the bottom of the window and a height h_window-top of the top of the window. The space 200 may also include a work surface, such as a table 206, which may have a height h_work and may be located at a distance d_work from the window 202.

A motorized window treatment, such as a motorized roller shade 220, may be mounted over the window 202. The motorized roller shade 220 may include a roller shade tube 224, and a shade fabric 222 may be wound around the roller shade tube 224. The shade fabric 222 may have a hembar 226 at the lower edge of the shade fabric, which may be located at a height h_hembar above the floor. The motorized roller shade 220 may include a motor drive unit (not shown) that may be configured to rotate the roller shade tube 224 to position the shade fabric 222 between a fully-open position P_open (e.g., in which the window 202 is uncovered and the hembar 226 may be at the top of the window) and a fully-closed position P_closed (e.g., in which the window 202 is completely covered and the hembar 226 may be at the bottom of the window). Further, the motor drive unit may control the position of the shade fabric 222 to one of a plurality of preset positions between the fully-open position and the fully-closed position.

A glare condition for an occupant of the space 200 may be caused by a glare source that may be located outside of the window 202, such as the sun, an external light (e.g., an outdoor building light or a street light), or a reflection of the sun or another bright light source. For example, light from the glare source may shine into the space 200 through the window 202 and may extend into the space (e.g., onto the floor) from the window 202 and/or from the facade 204 for a penetration distance d_penetration. The penetration distance d_penetration of the light may be measured in a direction normal to the window 202 and/or the facade 204. The penetration distance d_penetration of light from the glare source may be a function of the height h_hembar of the hembar 226 of the motorized roller shade 220 and a profile angle θ_P of the glare source. The profile angle θ_P may represent the location of the glare source outside of the window 202. The location of the glare source may be defined by an elevation angle (e.g., a vertical angle) and an azimuth angle (e.g., a horizontal angle) measured from the visual center of the visible light sensor 210 (e.g., in a direction normal to the window 202 and/or the facade 204). The profile angle θ_P may be defined as the angle of a line from the glare source to the visible light sensor, as projected onto a vertical plane that is perpendicular to the window 202 and/or the facade 204. The penetration distance d_penetration of light from the glare source onto the floor of the space 200 (e.g., in a direction normal to the window 202 and/or the facade 204) may be determined from the height h_hembar of the hembar 226 and the profile angle θ_P, as shown in the side view of the window 202 in fig. 2, for example,

tan(θ_P) = h_hembar / d_penetration.    (Equation 1)

In response to the visible light sensor 210 detecting a glare source outside of the window 202, the visible light sensor 210 and/or a system controller (e.g., the system controller 110) may be configured to determine a position to which to control the shade fabric 222 (e.g., the hembar 226 of the shade fabric 222) of the motorized roller shade 220 to prevent a glare condition in the space. For example, the position of the hembar 226 of the motorized roller shade 220 may be adjusted to prevent the penetration distance d_penetration from exceeding a maximum penetration distance d_penetration-max. For example, if the sun is shining in through the window 202, the visible light sensor 210 may be configured to process the image to determine the profile angle θ_P that defines the location of the glare source. The visible light sensor 210 and/or the system controller may be configured to calculate the desired height h_hembar above the floor to which to control the hembar 226 to prevent light from the glare source from exceeding the maximum penetration distance d_penetration-max, for example,

h_hembar = tan(θ_P) · d_penetration-max.    (Equation 2)

The visible light sensor 210 and/or the system controller may be configured with the top height h_window-top and the bottom height h_window-bottom of the window 202, e.g., during configuration of the visible light sensor and/or the system controller. The visible light sensor 210 and/or the system controller may be configured to use the top height h_window-top, the bottom height h_window-bottom, and the calculated hembar height h_hembar to determine the desired position of the hembar 226 between the fully-open position P_open and the fully-closed position P_closed of the motorized roller shade 220.

The position of the hembar 226 of the motorized roller shade 220 may also be adjusted to prevent light from the glare source from shining on the table 206. For example, the visible light sensor 210 and/or the system controller may be configured to calculate the desired height h_hembar above the floor to which to control the hembar 226 to prevent light from the glare source from shining on the table 206, for example,

h_hembar = (tan(θ_P) · d_work) + h_work.    (Equation 3)

The position of the hembar 226 of the motorized roller shade 220 may also be adjusted to prevent light from the glare source from shining in the eyes of an occupant of the space 200. For example, the visible light sensor 210 and/or the system controller may be configured to calculate the desired height h_hembar above the floor to which to control the hembar 226 based on an estimated height of the occupant's eyes and/or an estimated distance of the occupant from the window. For example, if the space 200 includes a visible light sensor located within the room (e.g., such as the visible light sensor 180 of the load control system 100 of fig. 1), that visible light sensor may be configured to process an image of the room to determine values for the height of the occupant's eyes and/or the occupant's distance from the window.

The visible light sensor 210 and/or the system controller may store values for the maximum penetration distance d_penetration-max, the height h_work of the table 206, and the distance d_work of the table 206 from the window 202. For example, the visible light sensor 210 and/or the system controller may be configured with these values (e.g., using the mobile device 190 or another network device) during configuration of the visible light sensor 210 and/or the system controller. Additionally or alternatively, the visible light sensor 210 and/or the system controller may be configured with default values for the maximum penetration distance d_penetration-max, the height h_work of the table 206, and the distance d_work of the table 206 from the window 202. For example, if the space 200 includes a visible light sensor located within the room (e.g., the visible light sensor 180 of the load control system 100 of fig. 1), that visible light sensor may be configured to process an image of the room to determine the maximum penetration distance d_penetration-max, the height h_work of the table 206, and the distance d_work of the table 206 from the window 202, and to transmit those values to the visible light sensor 210 on the window 202 and/or to the system controller.
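For illustration only, the calculations of Equations 1 through 3 might be sketched in code as follows; this is a minimal sketch, and the function names, parameter names, and example values are assumptions rather than part of the disclosed load control system.

```python
import math

def hembar_height_for_penetration(profile_angle_deg, d_penetration_max):
    """Equation 2: hembar height (above the floor) that limits light
    penetration onto the floor to d_penetration_max."""
    return math.tan(math.radians(profile_angle_deg)) * d_penetration_max

def hembar_height_for_task_surface(profile_angle_deg, d_work, h_work):
    """Equation 3: hembar height that keeps light from the glare source off
    a task surface of height h_work located a distance d_work from the window."""
    return math.tan(math.radians(profile_angle_deg)) * d_work + h_work

def shade_position_percent(h_hembar, h_window_bottom, h_window_top):
    """Map the desired hembar height to a position between the bottom (0%,
    fully closed) and the top (100%, fully open) of the window."""
    h = min(max(h_hembar, h_window_bottom), h_window_top)
    return 100.0 * (h - h_window_bottom) / (h_window_top - h_window_bottom)

# Example: glare source at a 35-degree profile angle, 1.5 m maximum penetration,
# window spanning 0.9 m to 2.4 m above the floor (all values hypothetical).
h = hembar_height_for_penetration(35.0, 1.5)        # approx. 1.05 m above the floor
position = shade_position_percent(h, 0.9, 2.4)      # approx. 10% open
```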

Fig. 3 is a simplified block diagram of an example visible light sensor 300 that may be deployed as one or both of the visible light sensors 180, 182 of the load control system 100 shown in fig. 1 and/or the visible light sensor 210 of fig. 2. Visible light sensor 300 may include control circuitry 310, such as a microprocessor, Programmable Logic Device (PLD), microcontroller, Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA), or any suitable processing device. The control circuit 310 may be coupled to a memory 312 for storing sensor events, masks, operating characteristics, etc. of the visible light sensor 300. The memory 312 may be implemented as an external Integrated Circuit (IC) or as internal circuitry of the control circuit 310.

The visible light sensor 300 may include a visible light sensing circuit 320 having an image recording circuit, such as a camera 322, and an image processing circuit, such as an image processor 324. The image processor 324 may include a Digital Signal Processor (DSP), a microprocessor, a Programmable Logic Device (PLD), a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or any suitable processing device. The camera 322 may be positioned toward a space where one or more environmental characteristics in the space (e.g., in the room 102) are to be sensed. The camera 322 may be configured to capture or record images. For example, the image may be a Low Dynamic Range (LDR) image. LDR images may be characterized by a particular exposure time (e.g., shutter speed, i.e., the length of time the camera's shutter is open to record an image). Further, the image may be a High Dynamic Range (HDR) image, which may be a composite of a plurality of LDR images (e.g., six LDR images) recorded at different exposure times by the camera 322 and combined together by the image processor 324. The control circuit 310 may also receive a plurality of LDR images from the visible light sensing circuit 320 and combine the LDR images together to form an HDR image. Recording and/or generating HDR images may require more processing resources and/or may result in increased power consumption compared to recording LDR images.

For example, the camera 322 may be configured to capture images at a particular sampling rate, where a single image may be referred to as a frame acquisition. One exemplary frame acquisition rate is approximately ten frames per second. The frame acquisition rate may be limited to reduce the required processing power of the visible light sensor 300. Each image may be composed of an array of pixels, where each pixel has one or more values associated with it. A raw RGB image may have three values for each pixel: one value for each of the red, green, and blue intensities. One implementation may use the existing RGB system for pixel colors, where each intensity component has a value of 0 to 255. For example, a red pixel would have an RGB value of (255, 0, 0), whereas a blue pixel would have an RGB value of (0, 0, 255). Any given pixel that is detected as a combination of red, green, and/or blue may have some combination of values (0-255, 0-255, 0-255). Other representations of the image may be used.

The camera 322 may provide captured images (e.g., raw images) to an image processor 324. The image processor 324 may be configured to process the images and provide one or more sensing signals to the control circuit 310 indicative of the sensed environmental characteristic (e.g., occurrence of movement, amount of movement, direction of movement, speed of movement, count of occupants, light intensity, light color, direct sunlight penetration, etc.). For example, one or more sensing signals provided to control circuitry 310 may be indicative of movement in space and/or a measured light level in space.

The image processor 324 may provide the raw image or the processed (e.g., pre-processed) image to the control circuitry 310, which may be configured to process the image to determine the sensed environmental characteristic. Regardless, the control circuit 310 may then use the sensed environmental characteristic to transmit a control command to the load device (e.g., directly or through the system controller 110).

As is known in the art, one example of a processed image is the luminance of a pixel, which may be computed from the RGB values of the image by adding weighted R, G, and B intensity values according to the following formula:

Luminance (perceived) = 0.299·R + 0.587·G + 0.114·B.    (Equation 4)

Exemplary weighting coefficients may take into account the non-uniform response of the human eye to light of different wavelengths. However, other coefficients may be used instead.
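As a brief illustration of Equation 4 (a sketch only; the array layout and names are assumptions), the per-pixel luminance of an RGB image may be computed as a weighted sum:

```python
import numpy as np

def perceived_luminance(rgb_image):
    """Equation 4: weighted sum of the R, G, and B intensities, reflecting the
    eye's non-uniform response; expects an array of shape (rows, cols, 3)."""
    r, g, b = rgb_image[..., 0], rgb_image[..., 1], rgb_image[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# Example: luminance of a 2 x 2 test image containing one fully red pixel.
image = np.array([[[255, 0, 0], [0, 0, 255]],
                  [[0, 255, 0], [255, 255, 255]]], dtype=float)
luminance = perceived_luminance(image)   # e.g., approx. 76.2 for the red pixel
```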

As previously described, if the visible light sensor 300 has a fisheye lens, an image captured by the camera 322 may be distorted. The image processor 324 may be configured to pre-process the image to correct the distortion and generate a non-distorted image.

Another image processing technique may include mapping the RGB sensor responses to CIE tristimulus values to obtain chromaticity coordinates and, from those, a correlated color temperature (CCT). One exemplary method is described by Joe Smith in Calculating Color Temperature and Illuminance using the TAOS TCS3414CS Digital Color Sensor, Intelligent Opto Sensor Designer's Notebook, February 27, 2009. Another example of a processed image may be an image to which a digital filter or digital mask has been applied. A digital mask may be used to eliminate areas within an image that may not have value for further analysis and processing. Alternatively, the complement of the digital mask may be a region of interest (e.g., a region within the image that has been identified for further processing or analysis). A processed image may also be created via a technique known as background subtraction. For example, using background subtraction, a background image, which may incorporate a history of image changes over time (e.g., previous states of the room), may be subtracted from a current image (e.g., the current state of the room). This technique may identify differences in the images. Background subtraction may be useful for detecting movement in an image as well as for occupancy and vacancy detection. Various algorithms may be used for background maintenance to determine how to efficiently combine pixels into the background image over time. Some exemplary background maintenance algorithms may include: adjusted frame differences, mean and threshold, mean and covariance, mixture of Gaussians, and/or normalized block correlation. These and other similar details inherent to image processing will be familiar to those skilled in the art.

The control circuit 310 and/or the image processor 324 may be configured to apply one or more masks to focus on one or more regions of interest in an image (e.g., a raw image and/or a pre-processed image) to sense one or more environmental characteristics of the space. As used herein, a mask may be any definition used to define a region of interest of an image. For example, assuming that an image can be defined as an N × M array of pixels, where each pixel has a defined coordinate/location in the array, a mask may be defined as a series of pixel coordinates that define the periphery of a region of interest within the image. As another example, a mask may be defined as an N × M array corresponding to the N × M array of pixels of the image. Each entry of the mask may be a 1 or a 0, whereby, for example, the entries with a 1 define the region of interest. Such a representation may allow the image array and the mask array to be logically ANDed, which may cancel, or zero out, each pixel of the image that is not of interest. Instead of defining the region of interest of the image, the mask may define a region of no interest. These are merely examples, and other representations may be used.
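The AND-style masking described above might be sketched as follows; this is illustrative only, and the array shapes and helper names are assumptions.

```python
import numpy as np

def apply_mask(image, mask):
    """Zero out (cancel) every pixel that is not of interest. `mask` is an
    N x M array of 1s (region of interest) and 0s, matching the image size."""
    return image * mask            # element-wise AND for a 0/1 mask

def complement(mask):
    """Complement of a mask: the region of no interest becomes the region of
    interest, and vice versa."""
    return 1 - mask

# Example: keep only the lower half of a 4 x 6 luminance image.
image = np.random.randint(0, 256, size=(4, 6))
mask = np.zeros((4, 6), dtype=int)
mask[2:, :] = 1                    # region of interest: bottom two rows
roi_only = apply_mask(image, mask)
```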

The visible light sensor 300 may include a first communication circuit 330 configured to transmit and receive digital messages via a first communication link using a first protocol. For example, the first communication link may comprise a wireless communication link, and the first communication circuit 330 may comprise an RF transceiver coupled to an antenna. Additionally, the first communication link may comprise a wired digital communication link, and the first communication circuit 330 may comprise a wired communication circuit. The first protocol may comprise a proprietary protocol, such as the ClearConnect protocol, or another wireless protocol. The control circuit 310 may be configured to transmit and receive digital messages via the first communication link during normal operation of the visible light sensor 300. The control circuit 310 may be configured to transmit an indication of the sensed environmental characteristic via the first communication link during normal operation of the visible light sensor 300. For example, the control circuit 310 may be configured to transmit an indication of a detected state (e.g., an occupancy or vacancy condition) and/or a measured environmental characteristic (e.g., a measured light level or illuminance) via the first communication link during normal operation of the visible light sensor 300.

Visible light sensor 300 may include a second communication circuit 332 configured to transmit and receive digital messages via a second communication link using a second protocol. For example, the second communication link may comprise a wireless communication link and the second communication circuit 332 may comprise an RF transceiver coupled to an antenna. Additionally, the second communication link may comprise a wired digital communication link and the second communication circuit 332 may comprise a wired communication circuit. The second protocol may include standard protocols such as Wi-Fi protocol, bluetooth protocol, Zigbee protocol, and the like. The control circuit 310 may be configured to transmit and receive digital messages via the second communication link during configuration of the visible light sensor 300. For example, the control circuitry 310 may be configured to transmit an image recorded by the camera 322 via the second communication link during configuration of the visible light sensor 300.

The visible light sensor 300 may include a power supply 340 for generating a DC supply voltage V_CC to provide power to the control circuit 310, the memory 312, the image processor 324, the first communication circuit 330, the second communication circuit 332, and/or other low-voltage circuitry of the visible light sensor 300. The power supply 340 may be configured to receive an external supply voltage from an external power source (e.g., an AC mains voltage supply and/or an external DC power supply). The power supply 340 may also include a battery for powering the circuitry of the visible light sensor 300.

The visible light sensor 300 may also include a low-power occupancy sensing circuit, such as a passive infrared (PIR) detector circuit 350. The PIR detector circuit 350 may generate a PIR detection signal V_PIR (e.g., a low-power occupancy signal) in response to passive infrared energy detected in the space, the signal being representative of an occupancy condition and/or a vacancy condition in the space. The PIR detector circuit 350 may consume less power than the visible light sensing circuit 320 (e.g., while detecting occupancy and/or vacancy conditions in the space). However, the visible light sensing circuit 320 may be more accurate than the PIR detector circuit 350. For example, when the power supply 340 is a battery, the control circuit 310 may be configured to disable the visible light sensing circuit 320 and detect occupancy conditions using the PIR detector circuit 350. For example, the control circuit 310 may disable the visible light sensing circuit 320 when the space is vacant. The control circuit 310 may detect an occupancy condition in the space in response to the PIR detection signal V_PIR, and may then enable the visible light sensing circuit 320 to detect a continued occupancy condition and/or a vacancy condition. The control circuit 310 may enable the visible light sensing circuit 320 immediately after detecting an occupancy condition in the space in response to the PIR detection signal V_PIR. Alternatively, the control circuit 310 may keep the visible light sensing circuit 320 disabled after detecting an occupancy condition in the space (in response to the PIR detection signal V_PIR), and may keep the visible light sensing circuit 320 disabled until the PIR detection signal V_PIR indicates that the space is vacant. In that case, the control circuit 310 may not determine that the space is vacant until the visible light sensing circuit 320 subsequently indicates that the space is vacant.
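The hand-off between the low-power PIR detector circuit and the visible light sensing circuit described above might be sketched as the following state update; the function, its arguments, and the polling structure are illustrative assumptions only.

```python
def update_occupancy(pir_occupied, vls_occupied, vls_enabled, battery_powered=True):
    """Return (occupied, vls_enabled) after evaluating the low-power PIR
    detection signal V_PIR and, when enabled, the visible light sensing circuit."""
    if not battery_powered:
        return vls_occupied, True          # no need to conserve power
    if not vls_enabled:
        # Space assumed vacant: rely on the low-power PIR detector only.
        if pir_occupied:
            return True, True              # wake the visible light sensing circuit
        return False, False
    # Visible light sensing circuit confirms continued occupancy or vacancy.
    if vls_occupied:
        return True, True
    return False, False                    # vacancy confirmed; back to PIR-only
```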

When the visible light sensor 300 is mounted to a window (e.g., as the second visible light sensor 182 of the load control system of fig. 1), the control circuit 310 may be configured to record one or more images of the space outside of the window via the camera 322 and to process the one or more images to determine whether a glare condition exists. The visible light sensor 300 may include a fisheye lens (not shown) that may distort the images recorded by the camera 322. The control circuit 310 and/or the image processor 324 may be configured to correct the distortion of an image recorded by the camera 322 to produce a non-distorted image, which may be characterized by a plurality of rows of constant profile angle.

The control circuit 310 may be configured to process each pixel of the non-distorted image to determine whether a glare condition exists at that pixel. For example, the control circuit 310 may determine the luminance L_PI of each pixel of the non-distorted image to determine whether a glare condition exists at that pixel. The control circuit 310 may begin processing the image at a portion of the image that corresponds to a location on the window, or on the set of windows, from which the image was taken. For example, the portion of the image may represent the bottom of the window, and the control circuit may begin processing the non-distorted image at the bottom. The bottom portion may include a predetermined number of rows of pixels from the bottom of the image (e.g., the bottom row of pixels of the non-distorted image). The control circuit may also, or alternatively, begin processing the image from the top of the image (e.g., the top row of pixels). The portion of the image that is processed first may depend on the direction in which the motorized window treatment moves the covering material to close, and/or on the current position of the covering material, in order to reduce the processing resources used to identify a glare condition in the image.

The control circuit 310 may be configured to start with the bottom row of pixels of the non-distorted image (e.g., on the left or right side). The control circuit 310 may step through each pixel in the bottom row and process each pixel to determine whether a glare condition exists before moving on to the next row. After the control circuit 310 determines that a glare condition exists, the control circuit 310 may stop processing the non-distorted image and may operate to control one or more motorized window treatments (e.g., such as the motorized window treatments 150 of fig. 1 and/or the motorized roller shade 220 of fig. 2) to remove the glare condition (e.g., as will be described in greater detail below). This may avoid processing the rest of the image to detect the glare condition. If the control circuit 310 determines that a glare condition exists, the control circuit 310 may transmit control instructions to move the covering material of the one or more motorized window treatments to a position that removes the glare condition. However, if the control circuit 310 processes the entire image without detecting a glare condition, the control circuit may conclude that a glare condition does not exist and may control the motorized window treatments to open. Because the control circuit 310 processes the pixels of the non-distorted image starting from the bottom row, the control circuit 310 may find the lowest pixel indicating a glare source before any higher glare sources are detected. The lowest pixel indicating a glare source is an important parameter for determining the shade position to which the motorized window treatment is controlled to prevent glare on the task surface. This allows the control circuit 310 to minimize the amount of processing required to determine the shade control commands for preventing glare in the room.

When processing the non-distorted image to determine whether a glare condition exists, the control circuit 310 may be configured to determine whether an absolute glare condition and/or a relative glare condition (e.g., a contrast glare condition) exists. The control circuit 310 may be configured to determine that an absolute glare condition exists if the absolute light level (e.g., absolute intensity or luminance) of a pixel exceeds an absolute glare threshold (e.g., approximately 10,000 cd/m²). The control circuit 310 may be configured to determine that a relative glare condition exists if the light level relative to the background light level (e.g., the difference between the absolute light level of the pixel and the background light level) exceeds a relative glare threshold (e.g., approximately 4,000 cd/m²). If the control circuit 310 detects that an absolute glare condition or a relative glare condition exists, the control circuit may stop processing the non-distorted image and move on to controlling the motorized window treatments to remove the glare condition. For example, the motorized window treatment may remove the glare condition by being controlled to a shade position that is determined based on the location of the glare condition. The thresholds may be adjustable to adjust the sensitivity of the visible light sensor 300. For example, the thresholds may be adjusted by a user during configuration of the visible light sensor 300.

To determine whether a relative glare condition exists, the control circuit 310 may determine a background light level (e.g., a baseline) from the non-distorted image. The background light level may be a value representing the luminance of the background of the non-distorted image. For example, the background light level may be a percentile luminance of the non-distorted image (e.g., the 25th-percentile luminance L_25). The 25th-percentile luminance L_25 may be the luminance for which 25% of the pixels of the non-distorted image are darker than that luminance. The control circuit 310 may calculate a contrast ratio C_PI for a pixel of the recorded image based on the luminance L_PI of the pixel and the 25th-percentile luminance L_25 (e.g., C_PI = L_PI / L_25). If the contrast ratio C_PI is greater than a contrast threshold C_TH (e.g., approximately 15), the control circuit 310 may determine that a glare condition (e.g., a relative glare condition) exists.
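A minimal sketch of the absolute and relative (contrast) glare tests described above is shown below; the array format, the scan order, and the example threshold values are assumptions for illustration.

```python
import numpy as np

ABSOLUTE_GLARE_THRESHOLD = 10_000    # cd/m^2, example value from the text
CONTRAST_THRESHOLD = 15              # C_TH, example value from the text

def detect_glare(luminance_image):
    """Return (glare_found, row, col). `luminance_image` holds the per-pixel
    luminance L_PI of the non-distorted image in cd/m^2."""
    l_25 = np.percentile(luminance_image, 25)        # background (baseline) level
    rows, cols = luminance_image.shape
    for row in range(rows - 1, -1, -1):              # start with the bottom row
        for col in range(cols):
            l_pi = luminance_image[row, col]
            if l_pi >= ABSOLUTE_GLARE_THRESHOLD:     # absolute glare condition
                return True, row, col
            if l_25 > 0 and (l_pi / l_25) >= CONTRAST_THRESHOLD:
                return True, row, col                # relative (contrast) glare
    return False, None, None
```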

When control circuit 310 has determined that a glare condition exists, control circuit 310 may process the pixels to determine a profile angle of the glare source. For example, each pixel of the image may be characterized by a value of a profile angle. The values of the profile angles may be stored in memory 312. Control circuit 310 may retrieve the appropriate profile angle based on the pixel being processed. In addition, profile angles may be determined and/or calculated from data of the image. The control circuit 310 may use the profile angle (e.g., as shown in equations 2 and/or 3 above) to determine the position to which to control the arrival of the motorized window treatment. The control circuit 310 may transmit the profile angle to another device (e.g., the system controller 110), which may determine a position to control the arrival of the motorized window treatments to avoid a glare condition in the room.

The visible light sensor 300 may also include a low-power photo-sensing circuit, such as a photosensor circuit 360. The photosensor circuit 360 may include a photodiode (not shown). The visible light sensor 300 may include a lens (not shown) for directing light (e.g., daylight or sunlight) from outside of the visible light sensor 300 onto the photodiode. For example, the photosensor circuit 360 may be configured to determine an average illuminance (e.g., an average light level) of the light impinging on the lens of the visible light sensor 300. The photosensor circuit 360 may consume less power than the visible light sensing circuit 320 (e.g., while measuring the average illuminance of the light impinging on the visible light sensor). The photosensor circuit 360 may be configured to generate an illuminance signal V_E (e.g., a low-power daylight signal) that may indicate the average illuminance of the light impinging on the photodiode. The control circuit 310 may be configured to periodically sample the illuminance signal V_E at a photosensor (PS) rate. The PS rate may be a heartbeat rate that recurs periodically.

As described herein, the visible light sensor 300 may be powered by a limited power source (e.g., the power supply 340 may be a battery) and may have limited power storage. Further, the visible light sensor 300 may have limited memory resources and/or processing resources. The image processing performed by the visible light sensing circuit 320 and/or the control circuit 310 may cause the visible light sensor 300 to consume a greater amount of power storage, memory resources, and/or processing resources than other computer processing techniques performed on the visible light sensor. Reducing the amount of image processing performed by the visible light sensing circuit 320 and/or the control circuit 310 may therefore reduce the amount of power and/or resources used by the visible light sensor 300.

When the power supply 340 is a battery, the control circuit 310 may be configured to disable the visible light sensing circuit 320 and measure the average illuminance of the light outside of the room using the photosensor circuit 360. For example, the control circuit 310 may disable the visible light sensing circuit 320 when the average illuminance measured by the photosensor circuit 360 is below an illuminance threshold E_TH and/or when the average illuminance is not changing much (e.g., when the illuminance change ΔE is less than an illuminance change threshold ΔE_TH). For example, the illuminance change ΔE may be the difference between the current illuminance E_current measured by the photosensor circuit 360 and a previous illuminance E_previous. In response to detecting that the average illuminance measured by the photosensor circuit 360 is above the illuminance threshold E_TH and/or that the illuminance change ΔE (e.g., an increase in illuminance) is greater than the illuminance change threshold ΔE_TH, the control circuit 310 may be configured to enable (e.g., wake up) the visible light sensing circuit 320, so that the control circuit can determine the location (e.g., the profile angle) of a potential glare source from an image captured by the visible light sensing circuit and subsequently control the motorized window treatments using the profile angle.

The control circuit 310 may be configured to periodically enable the visible light sensing circuit 320 at an image processing (IP) rate. The IP rate may be a heartbeat rate that recurs periodically. For example, the control circuit 310 may be configured to adjust the IP rate in response to the current illuminance E_current and/or the illuminance change ΔE (e.g., as determined from the illuminance signal V_E) to conserve power, processing resources, and/or memory resources. For example, the control circuit 310 may adjust (e.g., decrease) the IP rate when the position of the sun is such that a glare condition is unlikely to occur. For example, the control circuit may decrease the IP rate when the daylight intensity level is low (e.g., when the sun is behind a cloud, or at night, when a glare condition is unlikely to be detected), which may reduce the amount of image processing performed on the visible light sensor 300. The control circuit 310 may increase the IP rate when the daylight intensity level is high (e.g., on sunny days, when a glare condition is more likely to be detected). If the IP rate is decreased when a glare condition is unlikely to be detected (e.g., when the sun is behind a cloud or at night), the visible light sensor 300 may reduce the amount of image processing performed during these times.

The control circuit 310 may be configured to adjust the operation of the visible light sensing circuit 320 in response to the current illuminance E_current. For example, the control circuit 310 may be configured to adjust the exposure time used by the camera 322 to record an image (e.g., an LDR image). The control circuit 310 may be configured to use the current illuminance E_current to determine an appropriate exposure time for recording a single image that may indicate the location (e.g., the profile angle) of a glare source (e.g., such that the visible light sensing circuit 320 and/or the control circuit 310 need not generate an HDR image). The control circuit 310 may record images at different exposure times to detect the location (e.g., the profile angle) of glare sources due to different types of glare conditions (e.g., smaller glare conditions, larger glare conditions, absolute glare conditions, and/or relative glare conditions). For example, if the glare condition is an absolute glare condition, the control circuit 310 may be configured to use an absolute exposure time T_exposure-absolute to detect the location of the glare source. The absolute exposure time T_exposure-absolute may be a fixed exposure time (e.g., a minimum exposure time) at which the location of a glare source due to an absolute glare condition may be detected in an LDR image (e.g., a single LDR image). If the glare condition is a relative glare condition (e.g., a contrast glare condition), the control circuit 310 may be configured to use a contrast-based exposure time T_exposure-contrast to detect the location of the glare source. The contrast-based exposure time T_exposure-contrast may be a variable exposure time at which the location of a glare source due to a contrast glare condition may be detected in an LDR image (e.g., a single LDR image). The contrast-based exposure time T_exposure-contrast may have a value that depends on the current illuminance E_current (e.g., as determined from the illuminance signal V_E).

Fig. 4 is an example of a procedure 400 that may be executed by the control circuitry of a visible light sensor (e.g., the control circuit 310 and/or the image processor 324 of the visible light sensor 300) to detect a glare condition. For example, the procedure 400 may be triggered periodically by an image processing (IP) function at the IP rate. The IP rate may be dynamically adjusted (e.g., as will be described in more detail below with reference to fig. 5), which may decrease and/or increase the frequency at which the procedure 400 is triggered by the IP function. The procedure 400 may also be executed by the control circuitry of one or more other devices, such as a system controller (e.g., the system controller 110 shown in fig. 1). For example, the visible light sensor and/or the system controller may include a visible light sensing circuit (e.g., the visible light sensing circuit 320) and a photosensor circuit (e.g., the photosensor circuit 360) that may be capable of measuring the current illuminance E_current of the light impinging on the visible light sensor.

As shown in fig. 4, at 410, the procedure 400 may be triggered by the IP function (e.g., at the IP rate). At 412, the control circuit may sample the illuminance signal V_E, which may be generated by the photosensor circuit. At 414, the control circuit may determine the current illuminance E_current based on the illuminance signal V_E. At 416, the control circuit may process an image. For example, the control circuit may process the image to detect a glare condition (e.g., as will be described in more detail below with reference to fig. 7 and/or fig. 8). For example, if a glare condition is detected, the control circuit may also act to remove the glare condition. For example, as described herein, the control circuit may determine a shade position for a motorized window treatment (e.g., the motorized roller shade 220) to remove the glare condition. Further, the control circuit may transmit control instructions to the motorized window treatment to move the shade fabric of the motorized window treatment (e.g., the shade fabric 222) to the determined shade position to remove the detected glare condition.

Fig. 5 is an example of a procedure 500 that may be executed by the control circuitry of a visible light sensor (e.g., the control circuit 310 and/or the image processor 324 of the visible light sensor 300) to dynamically adjust the IP rate of the IP function. As described herein, the IP function may trigger the procedure 400 shown in fig. 4, which may include image processing for performing glare detection. For example, glare detection may be performed by waking up the visible light sensing circuit at the IP rate to perform image processing on one or more images. The procedure 500 may also be executed by the control circuitry of one or more other devices, such as a system controller (e.g., the system controller 110 shown in fig. 1). For example, the procedure 500 may be executed at the visible light sensor, at the system controller, or may be distributed across multiple devices (such as the visible light sensor and the system controller). The visible light sensor may include a visible light sensing circuit (e.g., the visible light sensing circuit 320) and a photosensor circuit (e.g., the photosensor circuit 360) that may be capable of measuring the current illuminance E_current of the light impinging on the visible light sensor.

As shown in fig. 5, at 510, the procedure 500 may be triggered by a photo-sensing (PS) function (e.g., at the PS rate). For example, and as described herein, the PS function may be triggered periodically (e.g., at the PS rate) to determine the likelihood of detecting a glare condition and/or to adjust the IP rate accordingly (e.g., based on the illuminance signal). The PS rate may be a higher rate than the IP rate (e.g., such that the procedure 500 is triggered more frequently than the procedure 400 shown in fig. 4). Furthermore, the processing that is triggered periodically at the PS rate may be less intensive (e.g., consume fewer processing and/or power resources) than the processing that is triggered periodically at the IP rate. At 512, the control circuit may sample the illuminance signal V_E, which may be generated by the photosensor circuit. The illuminance signal V_E may be used to determine the likelihood of detecting a glare condition.

At 514, the control circuit may determine the current illuminance E_current based on the illuminance signal V_E. The current illuminance E_current may be used to indicate the likelihood of detecting a glare condition. For example, a value of the current illuminance E_current above an illuminance threshold E_TH may indicate that the position of the sun is such that a glare condition is more likely to exist (e.g., it is daytime and the sun is not blocked by clouds, buildings, etc.). A value of the current illuminance E_current below the illuminance threshold E_TH may indicate that the position of the sun is such that a glare condition is unlikely to exist (e.g., it is nighttime, or the sun is blocked by clouds, buildings, etc.). At 516, the control circuit may compare the current illuminance E_current to the illuminance threshold E_TH (e.g., to determine whether it is nighttime and/or a glare condition is unlikely to be detected). If the current illuminance E_current is less than the illuminance threshold E_TH, the control circuit may turn off the IP rate (e.g., adjust the IP rate to zero) at 518. As described herein, the IP rate may be turned off when a glare condition is unlikely to exist. As shown in fig. 5, and as further described herein, the control circuit may stop recording and/or processing images when the IP rate is off, which may also reduce power consumption.

As described herein, the procedure 500 may dynamically adjust the IP rate when a glare condition is unlikely to exist. Accordingly, if at 516 the current illuminance E_current is not less than the illuminance threshold E_TH (e.g., indicating that the position of the sun is such that a glare condition may be detected), the control circuit may determine at 520 whether the IP rate is on (e.g., whether the IP rate is greater than zero). If the IP rate is off, the control circuit may turn on the IP rate at 522. At 524, after determining that the IP rate is on at 520 and/or turning on the IP rate at 522, the control circuit may determine (e.g., calculate) the illuminance change ΔE. For example, the illuminance change ΔE may include the difference between the current illuminance E_current and the previous illuminance E_previous. The previous illuminance may include the illuminance determined at 514 and/or stored at 536 during a previous invocation of the procedure 500.

The illuminance change ΔE calculated at 524 may be used to predict the presence of a glare condition or the likelihood of detecting a glare condition. For example, a value of the illuminance change ΔE above an illuminance change threshold may indicate that the position of the sun is such that a glare condition is more likely to exist (e.g., because the sun may have moved out from behind a building or a cloud). At 526, the device may compare the illuminance change ΔE to a first illuminance change threshold ΔE_TH1. For example, the first illuminance change threshold ΔE_TH1 may be a fixed value, or may be a variable value that is determined as a function of the current illuminance E_current, e.g., ΔE_TH1 = α · E_current, where α is a predetermined constant, such as 0.10 or 10%. If the illuminance change ΔE is greater than or equal to the first illuminance change threshold ΔE_TH1 (e.g., indicating that the position of the sun makes it more likely that a glare condition will be detected), the control circuit may adjust (e.g., increase) the IP rate at 528. The control circuit may adjust the IP rate based on, for example, the current illuminance E_current, the illuminance change ΔE, and/or the likelihood of detecting a glare condition. For example, the control circuit may increase the IP rate when the current illuminance E_current and/or the illuminance change ΔE is large, and may decrease the IP rate when the current illuminance E_current and/or the illuminance change ΔE is small. At 530, the control circuit may process images, for example, to capture one or more images and/or detect a glare condition. For example, if a glare condition is detected, the control circuit may also act to remove the glare condition. For example, as described herein, the control circuit may determine a shade position for a motorized window treatment (e.g., the motorized roller shade 220) to remove the glare condition. In addition, the control circuit may transmit control instructions to the motorized window treatment to move the shade fabric of the motorized window treatment (e.g., the shade fabric 222) to the determined shade position to remove the detected glare condition.

At 532, the device may compare the illuminance change ΔE to a second illuminance change threshold -ΔE_TH2. If the illuminance change ΔE is less than or equal to the second illuminance change threshold -ΔE_TH2 (e.g., indicating that the position of the sun is such that a glare condition is unlikely to be detected), the control circuit may adjust (e.g., decrease) the IP rate at 534. For example, the second illuminance change threshold -ΔE_TH2 may be a fixed value, or may be a variable value that is determined as a function of the current illuminance E_current, e.g., -ΔE_TH2 = -β · E_current, where β is a predetermined constant, such as 0.10 or 10%.

After comparing the illuminance change ΔE to the second illuminance change threshold -ΔE_TH2 at 532, performing the image processing at 530, and/or adjusting the IP rate at 534, the control circuit may set the previous illuminance E_previous equal to the current illuminance E_current at 536. As described herein, the previous illuminance E_previous may be used to predict the likelihood of detecting a glare condition. For example, the previous illuminance E_previous may be used during a later invocation of the procedure 500 to determine the illuminance change ΔE (e.g., at 524).
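Purely as an illustrative sketch of the procedure 500 (the names, the units of the IP rate, and the step sizes are assumptions), one invocation at the PS rate might look like the following:

```python
ALPHA = 0.10   # first illuminance change threshold as a fraction of E_current
BETA = 0.10    # second illuminance change threshold as a fraction of E_current

def photosensor_tick(e_current, e_previous, e_threshold, ip_rate, process_image):
    """Compare E_current to E_TH, turn the IP rate on or off, and raise or lower
    it based on the illuminance change. Returns (ip_rate, e_previous)."""
    if e_current < e_threshold:
        return 0, e_previous                  # night or heavy overcast: stop imaging
    if ip_rate == 0:
        ip_rate = 1                           # turn the IP rate back on (baseline)
    delta_e = e_current - e_previous
    if delta_e >= ALPHA * e_current:          # illuminance rising quickly
        ip_rate += 1                          # glare more likely: image more often
        process_image()                       # capture and process an image now
    elif delta_e <= -BETA * e_current:        # illuminance falling quickly
        ip_rate = max(1, ip_rate - 1)         # glare less likely: image less often
    return ip_rate, e_current                 # store E_previous for the next call
```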

Fig. 6 is an example of a non-distorted image 600 that may be used to detect a glare condition. The non-distorted image 600 may include one or more pixels (e.g., pixel 608, pixel 610) that indicate glare. For example, the glare indicated by the pixels 608 and 610 may be caused by a reflection of the sun on a small surface, ripples in a body of water, and/or raindrops on a window. As described herein, the pixels 608 and 610 may be referred to as whitened (e.g., overexposed) pixels. The whitened pixels may be used to indicate the location of a glare condition. The visible light sensor and/or the system controller may perform image processing on the image 600 to detect the glare condition. For example, the image processing may include searching for whitened pixels to detect the location of the glare condition.

Fig. 7 is an exemplary procedure 700 that may be executed by the control circuitry of a visible light sensor (e.g., the control circuit 310 and/or the image processor 324 of the visible light sensor 300) to detect a glare condition and/or determine the location of a glare source using image processing. The procedure 700 may be executed periodically (e.g., at the IP rate of the procedure 400 shown in fig. 4 and/or the PS rate of the procedure 500 shown in fig. 5). The procedure 700 may also be executed by the control circuitry of one or more other devices, such as a system controller (e.g., the system controller 110 shown in fig. 1). For example, the procedure 700 may be executed on a single device (such as the visible light sensor) or distributed across multiple devices (such as the image processor and the system controller). As described herein, the visible light sensor may include a visible light sensing circuit (e.g., the visible light sensing circuit 320) and a photosensor circuit (e.g., the photosensor circuit 360) capable of detecting the illuminance of the light impinging on the visible light sensor. The procedure 700 may be executed in conjunction with the procedure 400 shown in fig. 4 and/or the procedure 500 shown in fig. 5 (e.g., at step 416 of the procedure 400 and/or step 530 of the procedure 500).

As shown in fig. 7, the procedure 700 may begin at 710 (e.g., at 416 of the procedure 400 shown in fig. 4 and/or at 530 of the procedure 500 shown in fig. 5). A glare condition may be detected by capturing an image at a particular exposure time (e.g., shutter speed). For example, capturing an image at a particular exposure time may cause pixels above a particular luminance value to be whitened. After capturing the image at the respective exposure time, the visible light sensor may detect a glare condition and/or determine the location (e.g., the profile angle) of the glare source based on the location of the whitened pixels. Images may be captured at different exposure times to determine the location of glare sources due to different types of glare conditions (e.g., larger glare conditions, smaller glare conditions, absolute glare conditions, and/or contrast glare conditions). For example, a contrast-based exposure time T_exposure-contrast may be used to detect the location of a glare source due to a contrast glare condition (e.g., a relative glare condition), and an absolute exposure time T_exposure-absolute may be used to detect the location of a glare source due to an absolute glare condition. The absolute exposure time T_exposure-absolute may be fixed, and the contrast-based exposure time T_exposure-contrast may be variable. The absolute exposure time T_exposure-absolute may be adjusted, for example, using configuration software running on a programming device (e.g., the mobile device 190). Determining the exposure time of the captured image before capturing the image may allow the visible light sensor to detect a glare condition and/or detect the location of the glare source by processing (e.g., only processing) a single image of the room. This may reduce the amount of image processing performed by the visible light sensor, which may reduce the amount of power and/or resources used on the visible light sensor. Furthermore, capturing the image at the determined exposure time may allow the control circuit to process the image in the same manner regardless of the type of glare condition that is present.

At 712, the control circuit may use the current illuminance E_current to calculate the contrast-based exposure time T_exposure-contrast. As described herein, the current illuminance E_current may be determined based on the illuminance signal V_E (e.g., as determined at 514 of the procedure 500 shown in fig. 5 and/or at 414 of the procedure 400 shown in fig. 4), which may be generated by the photosensor circuit. The contrast-based exposure time T_exposure-contrast calculated at 712 may be used to capture an image that may be used to detect a contrast glare condition and/or determine the location of a glare source due to a contrast glare condition (e.g., by whitening the pixels where the glare source is located). For example, capturing an image at the contrast-based exposure time T_exposure-contrast may cause pixels within the image that have luminance values greater than or equal to a threshold to be whitened. Further, the whitened pixels may indicate the location of the contrast glare condition. For example, the control circuit may calculate the contrast-based exposure time T_exposure-contrast as a function of the current illuminance E_current (e.g., T_exposure-contrast = C · E_current + C_0, where C and C_0 are constants). The contrast-based exposure time T_exposure-contrast may be proportional to the current illuminance E_current (e.g., the contrast-based exposure time T_exposure-contrast may increase as the current illuminance E_current increases). Additionally or alternatively, the contrast-based exposure time T_exposure-contrast may be inversely proportional to the brightness at which pixels are whitened (e.g., the longer the exposure time, the lower the luminance level at which pixels whiten).

At 714, the calculated contrast-based exposure time T_exposure-contrast may be compared to the absolute exposure time T_exposure-absolute in order to determine the exposure time of the image to be recorded (e.g., a single LDR image) for detecting a glare condition and/or determining the location of a glare source. If the contrast-based exposure time T_exposure-contrast is used to capture the image, the whitened pixels of the image may identify the location of a glare source due to a contrast glare condition. If the absolute exposure time T_exposure-absolute is used to capture the image, the whitened pixels of the image may identify the location of a glare source due to an absolute glare condition. By determining the appropriate exposure time (e.g., the absolute exposure time T_exposure-absolute or the contrast-based exposure time T_exposure-contrast) before recording the image, the visible light sensor may capture an image (e.g., a single LDR image) at a single exposure time to detect the location of a glare source due to either an absolute glare condition or a relative glare condition. If the contrast-based exposure time T_exposure-contrast is less than the absolute exposure time T_exposure-absolute at 714, an absolute glare condition may be occurring, and the control circuit may record the image using the absolute exposure time T_exposure-absolute at 716. If the contrast-based exposure time T_exposure-contrast is greater than or equal to the absolute exposure time T_exposure-absolute at 714, a contrast glare condition may be occurring, and the control circuit may record the image using the calculated contrast-based exposure time T_exposure-contrast at 716.
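The exposure-time selection at 712 through 716 might be sketched as follows; the constants are hypothetical placeholders rather than values disclosed by the specification.

```python
C = 2e-6                    # hypothetical slope relating illuminance to exposure time
C0 = 1e-4                   # hypothetical offset (seconds)
T_EXPOSURE_ABSOLUTE = 5e-4  # hypothetical fixed exposure time for absolute glare (s)

def select_exposure_time(e_current):
    """Return the single exposure time used to record the LDR image: the
    contrast-based time T = C * E_current + C0, unless it falls below the
    fixed absolute-glare exposure time."""
    t_contrast = C * e_current + C0
    if t_contrast < T_EXPOSURE_ABSOLUTE:
        return T_EXPOSURE_ABSOLUTE      # an absolute glare condition may be occurring
    return t_contrast                   # a contrast (relative) glare condition
```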

After capturing the image at the appropriate exposure time, the control circuit may process the image to detect a glare condition and/or determine the position of the glare source. The control circuit may begin processing pixels at a position that corresponds to the fully-closed position of the motorized window treatment. For example, if the motorized window treatment is positioned at the top of the window and the shade fabric is lowered toward the bottom of the window (e.g., toward the fully-closed position), the control circuit may process the image starting at the bottom of the image. At 720, the control circuit may start at a pixel at the bottom of the image. At 722, the control circuit may process the ith pixel of the image, which may be the first pixel in the bottom row of pixels in the image, to determine whether a glare condition exists. At 724, the control circuit may determine whether the current pixel (e.g., the ith pixel) is whitened. For example, the control circuit may determine whether the luminance value of the ith pixel is equal to 100 and/or whether the luminance values (e.g., RGB values) of the red, green, and blue content are all at a maximum value (e.g., the maximum luminance value of a pixel in the image, such as 255). If the control circuit determines that the pixel is not whitened at 724, the control circuit may determine whether the image includes more unprocessed pixels at 726. If the image includes more unprocessed pixels, the control circuit may move to the next pixel at 728 and then process that pixel at 722. The control circuit may continue to process the remaining pixels in the image to determine the lowest whitened pixel in the image (e.g., a pixel having the maximum luminance value of the pixels in the image). If the image does not include more unprocessed pixels, the control circuit may determine at 730 that a glare condition is not detected in the image and may transmit a command to open the motorized window treatment at 732 before the procedure 700 exits.
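A sketch of the bottom-up scan of steps 720-732, assuming the image is available as rows of (R, G, B) tuples with row 0 at the top; a real implementation would operate on whatever pixel buffer the visible light sensing circuit provides:

```python
def find_lowest_whitened_pixel(image, max_value=255):
    """Return the (row, column) of the lowest whitened (fully saturated)
    pixel in the image, or None if no pixel is whitened.

    The scan starts at the bottom row of the image because that is the row
    that corresponds to the fully-closed position of the window treatment.
    """
    for row in range(len(image) - 1, -1, -1):            # bottom row first
        for col, (r, g, b) in enumerate(image[row]):
            if r == max_value and g == max_value and b == max_value:
                return (row, col)                         # glare condition found
    return None                                           # no glare detected
```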

If the control circuit determines at 724 that the pixel is whitened, the control circuit may determine at 734 that a glare condition exists. As described herein, the pixel determined to be whitened at 724 may be the lowest whitened pixel in the image (e.g., a pixel having the maximum luminance value of the pixels in the image). The control circuit may then calculate the profile angle for the ith pixel at 736. As described herein, the profile angle may indicate the position of the detected glare source outside the room. At 738, the control circuit may determine, based on the profile angle, a shade position to which the motorized window treatment should be controlled. For example, determining the shade position based on the profile angle may allow the motorized window treatment to block the glare source from the field of view of a room occupant, thereby preventing glare from occurring inside the room. After determining the shade position at 738, the control circuit may transmit a shade control command to the motorized window treatment. For example, the shade control command may include control instructions to move the motorized window treatment so as to block the glare source (e.g., as indicated by the position of the ith pixel in the image) from the view of the room occupant. Additionally, or alternatively, after determining the shade position at 738, the control circuit may transmit an open command or a close command.
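The response to a detected glare condition (steps 734-738) might then look like the sketch below; `profile_angle_for_row` and `shade_position_for_angle` are hypothetical helpers standing in for the lookup table and geometry an implementation would use:

```python
def respond_to_glare(pixel, profile_angle_for_row, shade_position_for_angle, transmit):
    """Transmit an open command when no glare is detected, or a shade control
    command that blocks the glare source otherwise."""
    if pixel is None:
        transmit("open")                                      # steps 730/732
        return
    row, _col = pixel
    profile_angle = profile_angle_for_row(row)                # step 736
    shade_position = shade_position_for_angle(profile_angle)  # step 738
    transmit(("move_to", shade_position))                     # shade control command
```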

Although FIG. 7 is described as using a motorized window treatment as the daylight control device, other daylight control devices, such as controllable dynamic glass, may also be used. The dynamic glass may include one or more horizontal bands (e.g., zones) that may each be controlled between a high-transmittance state and a low-transmittance state, and the dynamic glass may be controlled to the low-transmittance state to remove (e.g., block) the glare condition. After calculating the profile angle at 736, the control circuit may determine the band associated with the determined profile angle and transmit control instructions to control all bands above the determined band to the low-transmittance state to remove the glare condition.
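For dynamic glass, the same profile angle could instead select which bands to tint. A sketch, assuming band 0 is the topmost band and that the band containing the glare source is tinted along with every band above it (whether the determined band itself is included is an implementation choice):

```python
def tint_bands_for_glare(profile_angle, band_for_angle, set_band_state):
    """Drive the band associated with the profile angle, and all bands above
    it, to the low-transmittance state.  `band_for_angle` and
    `set_band_state` are hypothetical helpers."""
    glare_band = band_for_angle(profile_angle)
    for band in range(glare_band + 1):     # band 0 (top) down to the glare band
        set_band_state(band, "low_transmittance")
```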

Fig. 8 is an exemplary procedure 800 that may be performed by control circuitry of a visible light sensor (e.g., control circuitry 310 and/or image processor 322 of visible light sensor 300) to detect a glare condition and/or determine a position of a glare source using image processing. Procedure 800 may be performed periodically (e.g., at the IP rate for procedure 400 shown in fig. 4 and/or the PS rate for procedure 500 shown in fig. 5). The routine 800 may also be performed by control circuitry of one or more other devices, such as a system controller (e.g., the system controller 110 shown in fig. 1). For example, the routine 800 may be executed on a single device (such as a visible light sensor) or distributed across multiple devices (such as an image processor and a system controller). As described herein, the visible light sensor may include a visible light sensing circuit (e.g., visible light sensing circuit 320) and a photosensor circuit (e.g., photosensor circuit 360) capable of detecting the illuminance of light impinging on the visible light sensor. Procedure 800 may be performed in conjunction with procedure 400 shown in fig. 4 and/or procedure 500 shown in fig. 5 (e.g., at step 530 of procedure 500 and/or at step 416 of procedure 400).

As shown in fig. 8, the procedure 800 may begin at 810 (e.g., at 416 of the procedure 400 shown in fig. 4 and/or at 530 of the procedure 500 shown in fig. 5). The glare condition may be detected by capturing an image at a particular exposure time (e.g., shutter speed). For example, capturing an image at a particular exposure time may cause pixels above a particular luminance value to be whitened. After capturing the image at the respective exposure time, the visible light sensor may detect the glare condition and/or determine a position (e.g., a profile angle) of the glare source based on the position of the whitened pixels. Images may be captured at different exposure times to determine the position of glare sources due to different types of glare conditions (e.g., absolute glare conditions and/or contrast glare conditions). For example, the contrast-based exposure time T_EXP-CONTRAST may be used to detect the position of a glare source due to a contrast glare condition (e.g., a relative glare condition), and the absolute exposure time T_EXP-ABSOLUTE may be used to detect the position of a glare source due to an absolute glare condition. The absolute exposure time T_EXP-ABSOLUTE may be fixed and/or may depend on the resolution of the image being processed. The contrast-based exposure time T_EXP-CONTRAST may be variable. Determining the exposure time of the image prior to capturing the image may allow the visible light sensor to detect the glare condition and/or the position of the glare source by processing (e.g., processing only) a single image and/or only a few images. This may reduce the amount of image processing performed by the visible light sensor, which may reduce the amount of power and/or resources used by the visible light sensor.

At 812, the control circuit may use the current illuminance E_CURRENT to calculate the contrast-based exposure time T_EXP-CONTRAST. As described herein, the current illuminance E_CURRENT may be determined based on the illuminance signal V_E (e.g., as determined at 514 of the procedure 500 shown in fig. 5 and/or at 414 of the procedure 400 shown in fig. 4), and the illuminance signal may be generated by the photosensor circuit. The contrast-based exposure time T_EXP-CONTRAST calculated at 812 may be used to capture an image that may be used to detect a contrast glare condition and/or to determine the location of a glare source due to a contrast glare condition (e.g., by whitening the pixels where the glare source is located). For example, capturing the image at the contrast-based exposure time T_EXP-CONTRAST may cause pixels within the image having luminance values greater than or equal to a threshold to be whitened. Further, the whitened pixels may indicate the location of the contrast glare condition. For example, the control circuit may calculate the contrast-based exposure time T_EXP-CONTRAST as a linear function of the current illuminance E_CURRENT (e.g., T_EXP-CONTRAST = C · E_CURRENT + C0, where C and C0 are constants). Additionally, or alternatively, the control circuit may calculate the contrast-based exposure time T_EXP-CONTRAST based on the resolution of the image and/or the type of glare condition that the control circuit is attempting to detect (e.g., the constants C and C0 may depend on the resolution of the image and/or the type of glare condition to be detected). As described herein, the contrast-based exposure time T_EXP-CONTRAST may be proportional to the current illuminance E_CURRENT (e.g., the exposure time T_EXP-CONTRAST may increase as the current illuminance E_CURRENT increases). Additionally, or alternatively, the contrast-based exposure time T_EXP-CONTRAST may be inversely proportional to the luminance level at which pixels are whitened (e.g., the longer the exposure time, the lower the luminance level at which pixels whiten).
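One way to picture the resolution-dependent constants is a small table recalled from memory at 812; the values below are placeholders only:

```python
# Hypothetical per-resolution exposure constants (placeholder values).
EXPOSURE_CONSTANTS = {
    "low":  {"C": 0.08, "C0": 2.0},   # low resolution, larger glare sources
    "high": {"C": 0.03, "C0": 0.5},   # high resolution, smaller glare sources
}

def contrast_exposure_time_for(e_current, resolution):
    """T_EXP-CONTRAST = C * E_CURRENT + C0, with C and C0 recalled for the
    desired image resolution."""
    k = EXPOSURE_CONSTANTS[resolution]
    return k["C"] * e_current + k["C0"]
```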

At 814, the calculated contrast-based exposure time T_EXP-CONTRAST may be compared with the absolute exposure time T_EXP-ABSOLUTE to determine the exposure time of the image to be recorded (e.g., a single LDR image) to detect the glare condition and/or determine the location of the glare source at the current resolution of the image. If the contrast-based exposure time T_EXP-CONTRAST is used to capture the image, the whitened pixels of the image may identify the location of a glare source due to a contrast glare condition. If the absolute exposure time T_EXP-ABSOLUTE is used to capture the image, the whitened pixels of the image may identify the location of a glare source due to an absolute glare condition. As described herein, the constants C and C0 may be used to determine T_EXP-CONTRAST.

If, at 814, the contrast-based exposure time T_EXP-CONTRAST is less than the absolute exposure time T_EXP-ABSOLUTE, an absolute glare condition may be occurring, and the control circuit may record the image using the absolute exposure time T_EXP-ABSOLUTE at 816. If the contrast-based exposure time T_EXP-CONTRAST is greater than or equal to the absolute exposure time T_EXP-ABSOLUTE, a contrast glare condition may be occurring, and the control circuit may record the image using the calculated contrast-based exposure time T_EXP-CONTRAST at 818. The control circuit may record the image at 816 or 818 using a desired resolution. For example, the control circuit may record the image at 816 or 818 using a low resolution in order to detect large glare sources. In addition, the control circuit may record the image at 816 or 818 using a high resolution in order to detect small glare sources.

After capturing the image at the appropriate exposure time, the control circuit may process the image to detect a glare condition and/or determine the position of the glare source. The control circuit may begin processing pixels (e.g., groups of pixels) at a position that corresponds to the fully-closed position of the motorized window treatment. For example, if the motorized window treatment is positioned at the top of the window and the shade fabric is lowered toward the bottom of the window (e.g., toward the fully-closed position), the control circuit may process the image starting at the bottom of the image. At 820, the control circuit may start at a pixel at the bottom of the image. At 822, the control circuit may process the ith pixel of the image, which may be the first pixel in the bottom row of pixels, to determine whether a glare condition exists. At 824, the control circuit may determine whether the current pixel (e.g., the ith pixel) is whitened. For example, the control circuit may determine whether the luminance value of the ith pixel is equal to 100 and/or whether the luminance values (e.g., RGB values) of the red, green, and blue content are all at a maximum value (e.g., 255). If the control circuit determines that the pixel is not whitened at 824, the control circuit may determine whether the image includes more unprocessed pixels at 826. If there are more unprocessed pixels in the image, the control circuit may move to the next pixel at 828. If the image does not include more unprocessed pixels, the control circuit may determine that a glare condition is not detected in the image. If the control circuit determines that the pixel is whitened at 824, the control circuit may determine that a glare condition exists at 830, which may include storing (e.g., in memory) the location of the glare condition (e.g., the location of the ith pixel).
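Procedure 800 differs from procedure 700 in that a detected glare location is stored rather than acted on immediately; a sketch of that variant follows (the module-level list stands in for the memory referenced at 830):

```python
detected_glare_locations = []    # stands in for memory (step 830)

def scan_and_store_glare(image, resolution, max_value=255):
    """Scan the image from the bottom row upward; when a whitened pixel is
    found, store its location together with the resolution it was found at
    and report that a glare condition exists."""
    for row in range(len(image) - 1, -1, -1):
        for col, (r, g, b) in enumerate(image[row]):
            if r == max_value and g == max_value and b == max_value:
                detected_glare_locations.append((row, col, resolution))
                return True
    return False
```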

As shown in fig. 8, and as described further herein, the procedure 800 may capture and/or process images at multiple resolutions. For example, different resolutions may be used to detect different types of glare conditions (e.g., a greater glare condition and/or a lesser glare condition). For example, the control circuit may record images using a low resolution (e.g., a minimum resolution) to detect larger glare sources, record images using a high resolution (e.g., a maximum resolution) to detect smaller glare sources, and/or record images using a resolution between the low resolution and the high resolution to detect glare sources of other sizes. The absolute exposure time T_EXP-ABSOLUTE and the contrast-based exposure time T_EXP-CONTRAST may be different and/or may be adjusted based on the resolution of the image to be processed to detect the glare condition and/or determine the position of the glare source. For example, the constants used to calculate the contrast-based exposure time T_EXP-CONTRAST (e.g., the constants C and C0) may depend on the resolution of the image to be processed.

At 834, the control circuit may determine whether the procedure 800 is complete. For example, at 834, the control circuit may determine that the procedure 800 is not complete in order to detect glare conditions of a different size. The control circuit may then calculate the contrast-based exposure time T_EXP-CONTRAST at 812 using values of the constants C and C0 that correspond to the desired resolution of the image to be processed (e.g., processed at 822). For example, depending on the desired resolution of the image to be processed, the control circuit may recall appropriate values of the constants C and C0 from memory at 812. At 814, the control circuit may compare the contrast-based exposure time T_EXP-CONTRAST with the absolute exposure time T_EXP-ABSOLUTE, where the absolute exposure time T_EXP-ABSOLUTE may be recalled from memory and may depend on the resolution of the image to be processed. At 816 or 818, the control circuit may record an image using the appropriate exposure time. For example, the control circuit may record the image at 816 or 818 using the desired resolution (e.g., the control circuit may record an image at a high resolution when performing 816 or 818 for the first time and then at a low resolution when performing 816 or 818 for the second time).
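The overall multi-resolution loop of procedure 800 might be organized as below; every helper is passed in as a callable because the disclosure leaves their exact form open:

```python
def run_glare_detection(e_current, resolutions, calc_t_contrast, t_absolute_for,
                        record_image, scan_image):
    """Sketch of steps 812-834 repeated once per desired resolution.

    calc_t_contrast(e, res) -> contrast-based exposure time for a resolution,
    t_absolute_for(res)     -> absolute exposure time recalled from memory,
    record_image(t, res)    -> image recorded at that exposure and resolution,
    scan_image(img)         -> location of the lowest whitened pixel or None.
    """
    detections = []
    for resolution in resolutions:                       # e.g., ("high", "low")
        t_contrast = calc_t_contrast(e_current, resolution)               # 812
        t_absolute = t_absolute_for(resolution)
        exposure = t_absolute if t_contrast < t_absolute else t_contrast  # 814
        image = record_image(exposure, resolution)                        # 816/818
        pixel = scan_image(image)                                         # 820-828
        if pixel is not None:
            detections.append((pixel, resolution))                        # 830
    return detections
```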

If the control circuit determines that the procedure 800 is complete at 834, the control circuit may determine whether any glare conditions were detected at 836 (e.g., whether a whitened pixel was found during any execution of 824). For example, the control circuit may determine whether any glare condition was detected by querying the memory (e.g., retrieving from the memory the locations of one or more pixels that may indicate a glare condition). At 838, the control circuit may determine the lowest of the pixels retrieved from memory at 836. At 840, the control circuit may process the lowest pixel indicative of the glare condition to remove the glare condition. For example, the control circuit may process the pixel to remove the glare condition by performing one or more of the following: calculating the profile angle for the pixel having the glare condition, determining a shade position based on the profile angle, transmitting a shade control command including a control instruction indicating the shade position to the motorized window treatment, transmitting an open command to the motorized window treatment, and/or transmitting a close command to the motorized window treatment. Further, processing the pixel to remove the glare condition may include steps similar to steps 734, 736, and/or 738 of the procedure 700. Additionally, or alternatively, processing the pixel to remove the glare condition may include controlling each of the bands of the controllable dynamic glass above the detected glare condition to the low-transmittance state.
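Steps 836-840 could then pick the lowest stored location and act on it; the sketch below assumes row indices increase toward the bottom of the image and reuses hypothetical helpers for the profile angle and shade position:

```python
def remove_lowest_glare(detections, profile_angle_for, shade_position_for, transmit):
    """If any glare locations were stored, find the lowest one and transmit a
    shade control command that blocks it; otherwise do nothing."""
    if not detections:
        return                                             # no glare detected (836)
    (row, col), _resolution = max(detections, key=lambda d: d[0][0])       # 838
    profile_angle = profile_angle_for((row, col))                          # 840
    transmit(("move_to", shade_position_for(profile_angle)))
```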

Fig. 9A is a sequence diagram 900 illustrating communication between control devices during an exemplary anti-glare procedure. As seen in fig. 9A, the anti-glare procedure may be performed by a visible light sensor 902 (e.g., visible light sensors 182, 300) and a motorized window treatment 904 (e.g., motorized roller shade 220). At 910, the visible light sensor 902 may record an image of the exterior of the room and/or building. At 912, the visible light sensor may process the image to detect a glare condition. For example, detection of a glare condition may include one or more steps from procedures 400, 500, 700, and/or 800.

If a glare condition is detected, visible light sensor 902 may determine a profile angle for the glare condition at 914. As described herein, the profile angle may define a location of a glare source outside of a window (e.g., window 202 in fig. 2). The profile angle may be determined based on the position of the detected glare source (e.g., a pixel in the image recorded at 910). Visible light sensor 902 may include a look-up table to determine the profile angle. For example, the lookup table may provide an indication of the profile angle based on the location of the detected glare source (e.g., the pixel in the image recorded at 910).
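The lookup table itself might be as simple as a per-row mapping derived offline from the sensor's mounting geometry; the angles below are fabricated placeholders used only to show the shape of the data:

```python
# Hypothetical profile-angle lookup table: image row -> profile angle (degrees).
PROFILE_ANGLE_BY_ROW = {row: 5.0 + 0.5 * row for row in range(240)}

def profile_angle_for_pixel(pixel):
    """Return the profile angle of the glare source represented by a pixel."""
    row, _col = pixel
    return PROFILE_ANGLE_BY_ROW[row]
```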

At 916, the visible light sensor 902 may determine a shade position for the motorized window treatment 904. The shade position may prevent the glare condition from affecting the room (e.g., room 102 and/or space 200). For example, the shade fabric may be positioned such that it blocks light from the glare source represented by the pixel where the glare was detected. At 918, the shade position may be transmitted to the motorized window treatment 904. Upon receiving the shade position, the motorized window treatment may move the shade fabric to the indicated position at 920.

Fig. 9B is a sequence diagram 950 illustrating communication between control devices during an exemplary anti-glare procedure. As seen in fig. 9B, the anti-glare procedure may be performed by the visible light sensor 952 (e.g., visible light sensors 182, 300), the system controller 954 (e.g., system controller 110), and the motorized window treatments 956 (e.g., motorized roller shades 220). At 958, visible light sensor 952 may record an image of the exterior of the room and/or building. At 960, the visible light sensor may process the image to detect a glare condition. For example, detection of a glare condition may include one or more steps from procedures 400, 500, 600, and/or 700.

If a glare condition is detected, visible light sensor 952 may determine a profile angle for the glare condition at 962. As described herein, the profile angle may define a location of a glare source outside of a window (e.g., window 202 in fig. 2). The profile angle may be determined based on the position of the detected glare source (e.g., a pixel in the image recorded at 958). Visible light sensor 952 may include a look-up table to determine the profile angle. For example, the lookup table may provide an indication of the profile angle based on the location of the detected glare source (e.g., the pixel in the image recorded at 958).

At 964, the visible light sensor 952 may transmit the profile angle to the system controller 954. At 966, the system controller 954 may determine a shade position for the motorized window treatments 956. For example, the shade fabric may be positioned such that it blocks light from the glare source represented by the pixel where the glare was detected. At 968, the system controller 954 may transmit the shade position to the motorized window treatments 956. Upon receiving the shade position, the motorized window treatment may move the shade fabric to the indicated position at 970. Although the visible light sensor 952 is shown as processing the image, the system controller 954 may also or alternatively perform the image processing after the visible light sensor 952 generates the image.

Fig. 10 is a block diagram illustrating an example system controller 1000, such as the system controller 110 described herein. The system controller 1000 may include a control circuit 1002 for controlling the functions of the system controller 1000. The control circuitry 1002 may include one or more general-purpose processors, special-purpose processors, conventional processors, Digital Signal Processors (DSPs), microprocessors, integrated circuits, Programmable Logic Devices (PLDs), Application Specific Integrated Circuits (ASICs), and the like. The control circuitry 1002 may perform signal coding, data processing, image processing, power control, input/output processing, or any other function that enables the system controller 1000 to perform as described herein. The control circuitry 1002 may store information in and/or retrieve information from the memory 1004. The memory 1004 may include non-removable memory and/or removable memory. The non-removable memory may include Random Access Memory (RAM), Read Only Memory (ROM), a hard disk, or any other type of non-removable memory storage device. The removable memory may include a Subscriber Identity Module (SIM) card, a memory stick, a memory card, or any other type of removable memory.

The system controller 1000 may include a communication circuit 1006 for transmitting and/or receiving information. The communication circuit 1006 may perform wireless and/or wired communication. The system controller 1000 may also or alternatively include a communication circuit 1008 for transmitting and/or receiving information, and the communication circuit 1008 may likewise perform wireless and/or wired communication. The communication circuits 1006 and 1008 may be in communication with the control circuit 1002. The communication circuits 1006 and 1008 may include an RF transceiver or other communication module capable of performing wireless communication via an antenna. The communication circuit 1006 and the communication circuit 1008 may be capable of performing communication via the same communication channel or different communication channels. For example, the communication circuit 1006 may be capable of communicating via a wireless communication channel (e.g., Near Field Communication (NFC), cellular, etc.), for example with a network device over a network, and the communication circuit 1008 may be capable of communicating via another wireless communication channel (e.g., a proprietary communication channel, such as Clear Connect) with the control devices and/or other devices in the load control system.

The control circuit 1002 may communicate with an LED indicator 1012 to provide an indication to a user. The control circuitry 1002 may be in communication with an actuator 1014 (e.g., one or more buttons) that may be actuated by a user to communicate user selections to the control circuitry 1002. For example, the actuator 1014 may be actuated to place the control circuitry 1002 in an association mode and/or to transmit an association message from the system controller 1000.

Each of the modules within the system controller 1000 may be powered by a power supply 1010. The power supply 1010 may include, for example, an AC power supply or a DC power supply. The power supply 1010 may generate a supply voltage V_CC for powering the modules within the system controller 1000.

Fig. 11 is a block diagram illustrating an exemplary control-target device (e.g., load control device 1100, as described herein). The load control device 1100 may be a dimmer switch, an electronic ballast for a lamp, an LED driver for an LED light source, an AC plug-in load control device, a temperature control device (e.g., a thermostat), a motor drive unit for a motorized window treatment, or other load control device. The load control device 1100 may include a communication circuit 1102. The communication circuit 1102 may include a receiver, RF transceiver, or other communication module capable of performing wired and/or wireless communication via a communication link 1110. The communication circuit 1102 may be in communication with a control circuit 1104. The control circuitry 1104 may include one or more general-purpose processors, special-purpose processors, conventional processors, Digital Signal Processors (DSPs), microprocessors, integrated circuits, Programmable Logic Devices (PLDs), Application Specific Integrated Circuits (ASICs), and the like. The control circuit 1104 may perform signal coding, data processing, power control, input/output processing, and/or any other function that enables the load control device 1100 to perform as described herein.

The control circuit 1104 may store information in and/or retrieve information from the memory 1106. For example, the memory 1106 may maintain a registry of associated control devices and/or control instructions. The memory 1106 may include non-removable memory and/or removable memory. The load control circuit 1108 may receive instructions from the control circuit 1104 and may control the electrical load 1116 based on the received instructions. For example, the electrical load 1116 may include the motor of a motorized window treatment (e.g., the motorized window treatment 150). The load control circuit 1108 may send status feedback to the control circuit 1104 regarding the status of the electrical load 1116. The load control circuit 1108 may receive power via the hot connection 1112 and the neutral connection 1114 and may provide an amount of power to the electrical load 1116. The electrical load 1116 may include any type of electrical load.

The control circuit 1104 may be in communication with an actuator 1118 (e.g., one or more buttons) that may be actuated by a user to communicate user selections to the control circuit 1104. For example, the actuator 1118 may be actuated to place the control circuit 1104 in the association mode and/or to transmit an association message from the load control device 1100.

Although features and elements are described herein in particular combinations, each feature or element can be used alone or in any combination with the other features and elements. For example, the functions described herein may be described as being performed by a control device (such as a remote control device or a lighting device), but may similarly be performed by a hub device or a network device. The methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer readable media include electronic signals (transmitted over a wired or wireless connection) and computer readable storage media. Examples of computer readable storage media include, but are not limited to, read-only memory (ROM), random-access memory (RAM), removable disks, and optical media such as CD-ROM disks and Digital Versatile Disks (DVDs).

Although the methods described herein are described with reference to controlling a motorized window treatment (e.g., motorized window treatment 150 and/or motorized roller shade 220) to prevent a glare condition, the methods may be used to control other types of daylight control devices to prevent and/or mitigate glare conditions. For example, the methods described herein may be used to control the transmittance of controllable dynamic glass (e.g., smart glass and/or electrochromic glass) and/or to adjust the position of indoor or outdoor controllable louvers to prevent and/or mitigate glare conditions. For example, the dynamic glass may include one or more horizontal bands (e.g., zones) that may each be controlled individually between a high-transmittance state and a low-transmittance state. The dynamic glass may be controlled to the low-transmittance state to remove (e.g., block) the glare condition. The bands of the dynamic glass may be controlled in a manner similar to the shade positions described herein for controlling motorized window treatments. The bands of dynamic glass to be controlled to the low-transmittance state may be determined based on the determined profile angle of the glare source. For example, each band higher than the band determined from the profile angle of the glare source may be controlled to the low-transmittance state.
