Sensor control apparatus

Document No.: 1432573    Publication date: 2020-03-17

Note: The present technology, "Sensor control apparatus", was created by A.V. Pandharipande and D.R. Caicedo Fernandez on 2018-07-10. Abstract: Some embodiments are directed to a sensor control device (200) for a connected lighting system (100) comprising a plurality of luminaires (122) arranged in an area. The sensor control device obtains, from received sensor data, occupant positions in the sub-area associated with the sensor data and/or a plurality of illuminance levels at a plurality of points in the sub-area, and defines a virtual sensor at a virtual sensor position. The occupancy state and/or illuminance level from the virtual sensor is used to control a luminaire of the plurality of luminaires, which luminaire covers the virtual sensor position in the area.

1. A sensor control device (200) for a connected lighting system (100) comprising a plurality of luminaires (122) arranged in an area, the sensor control device comprising

-a receiving unit (210) arranged to receive sensor data from a plurality of sensors (124) arranged in the area, each sensor being associated with a sub-area comprised in the area, the sensor data being indicative of one or more occupant positions of occupants in the sub-area and/or a plurality of illuminance levels at a plurality of different points in the sub-area,

-a virtual sensor position storage (246) storing virtual sensor positions of virtual sensors for occupancy and/or illumination in the area,

-a processing unit arranged to

-obtain, from the received sensor data, an occupant position in a sub-area associated with the sensor data and/or a plurality of illuminance levels of a plurality of points in the sub-area associated with the sensor data,

-integrate the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points into an integrated area-wide occupancy and/or illuminance map,

-define a virtual sensor at the virtual sensor position and calculate an occupancy state and/or an illuminance level of the virtual sensor from the virtual sensor position and the integrated occupancy and/or illuminance map,

-use the occupancy state and/or illuminance level from the virtual sensor to control a luminaire of the plurality of luminaires that covers the virtual sensor position in the area.

2. Sensor control device according to claim 1, wherein the processing unit is arranged to

-convert the occupant positions and/or the plurality of points in the sub-area from sub-area-wide coordinates to area-wide coordinates before integrating the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points.

3. Sensor control device for a connected lighting system according to claim 1 or 2, comprising

-a luminaire location storage (248) storing locations of the plurality of luminaires, the processing unit being arranged to select a luminaire from the plurality of luminaires having an illumination coverage comprising the virtual sensor location.

4. Sensor control device for a connected lighting system according to any of the preceding claims, wherein the virtual sensor position is selected in a zone around a luminaire, said zone being defined as a zone where the light intensity of the controlled luminaire contributes at least a predetermined percentage to the total light intensity at the virtual sensor position.

5. The sensor control device for a connected lighting system according to any of the preceding claims, wherein the sensor data comprises a series of coordinates indicative of a sub-area range of the occupant position and/or an illuminance level of the plurality of points, the occupant position and/or points being relative to the sensor.

6. The sensor control device for a connected lighting system according to any one of the preceding claims, wherein at least one of the plurality of sensors is a multimodal sensor arranged to provide sensor data indicative of a plurality of distinct occupant positions in the sub-area and a plurality of distinct illuminance levels at a plurality of different points.

7. The sensor control device for a connected lighting system according to any of the preceding claims, wherein a first sub-area associated with a first sensor and a second sub-area associated with a second sensor overlap, the processor being arranged to integrate the occupancy states and/or the illuminance sensor values of points in the overlapping part of the first and second sub-areas by applying a merging routine arranged to merge the sensor data in the overlapping part of the first and second sub-areas.

8. Sensor control device for a connected lighting system according to any of the preceding claims, wherein the processor circuit is arranged to

-calculate an occupancy state from the virtual sensor position and the integrated occupancy map of the area by: selecting occupancy states from the integrated map obtained for points in another sub-area around the virtual sensor position, and assigning an occupied state to the virtual sensor if any of the selected occupancy states indicates occupancy, and/or

-calculate an illuminance level from the virtual sensor position and the integrated illuminance map of the area by: selecting, from the integrated map, illuminance levels obtained for points in another sub-area around the virtual sensor position, and assigning the virtual sensor an illuminance level by interpolation from the selected illuminance levels.

9. Sensor control device for a connected lighting system according to any of the preceding claims, wherein the lighting system is configured with a plurality of control zones, luminaires of the connected lighting system being assigned to control zones for joint control of the luminaires, control of at least one of the control zones depending on the virtual sensor.

10. Sensor control device for a connected lighting system according to claim 9, wherein the lighting system comprises a lighting controller (120) configured to actuate at least a part of the luminaires, the lighting controller having a sensor input configured to receive sensor values, the lighting controller being configured with a control map indicating the assignment of luminaires to control zones and the dependence of luminaires in a control zone on sensor values received at the input, the processor circuit of the sensor control device being arranged to provide sensor values to the sensor input, at least one of the sensor values being a sensor value calculated for a virtual sensor.

11. The sensor control device for a connected lighting system of claim 9, wherein the processor circuit is arranged to

-select a zone from the integrated occupancy map in which the integrated occupancy map indicates occupancy,

-select a virtual sensor position for the virtual sensor in the zone,

-select a luminaire having a position in or near said zone from a luminaire position database,

-define a control map, wherein a control zone comprises the selected luminaire, and wherein the control zone is controlled by the virtual sensor,

-configure the lighting system, e.g. a lighting controller (120), with the control map.

12. A connected lighting system comprising a sensor control device as claimed in any one of the preceding claims, the connected lighting system comprising

-a plurality of sensors arranged in the area, each sensor being associated with a sub-area comprised in the area and comprising a transmitter for transmitting sensor data to the sensor control device, the sensor data being indicative of one or more occupant positions of occupants in the sub-area and/or a plurality of illuminance levels at a plurality of different points in the sub-area,

-a plurality of luminaires arranged in the area, the luminaires comprising a receiver for receiving luminaire actuation commands from the lighting system.

13. A sensor control method for a connected lighting system comprising a plurality of luminaires arranged in an area, the sensor control method comprising

-receiving sensor data from a plurality of sensors arranged in the area, each sensor being associated with a sub-area comprised in the area, the sensor data being indicative of one or more occupant positions of occupants in the sub-area and/or a plurality of illuminance levels at different points in the sub-area,

-storing virtual sensor positions of virtual sensors for occupancy and/or illuminance in the area,

-obtaining, from the received sensor data, an occupant position in a sub-area associated with the sensor data and/or a plurality of illuminance levels of a plurality of points in the sub-area associated with the sensor data,

-converting the occupant positions and/or the plurality of points in the sub-area into area-wide coordinates and integrating the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points into an integrated area-wide occupancy and/or illuminance map,

-defining a virtual sensor at the virtual sensor location and calculating an occupancy state and/or an illuminance level of the virtual sensor from the virtual sensor location and the integrated occupancy and/or illuminance map,

-using the occupancy state and/or the illuminance level from the virtual sensor to control a luminaire of the plurality of luminaires covering the virtual sensor position in the area.

14. A computer-readable medium (1000) comprising transitory or non-transitory data (1020) representing instructions that cause a processor system to perform the method according to claim 13.

Technical Field

The present invention relates to a sensor control device for a connected lighting system, a sensor control method for a connected lighting system, and a computer readable medium.

Background

European patent application EP2987390, entitled "calibration operation of a luminaire" (incorporated herein by reference), discloses a known lighting system. The known lighting system comprises one or more luminaires. The luminaires are arranged to emit artificially generated light into a space. The space will tend to include some amount of ambient light, such as sunlight or other natural light, at least at certain times of the day.

Each luminaire includes a respective light sensor, a presence sensor, and a controller associated with the respective lighting device. The presence sensor is arranged to sense the presence of a living being, typically a human user, in the area of the space illuminated by the respective lighting device. The lamps are controlled based on the detected presence and may be dimmed in a granular manner, i.e. per individual luminaire. Each controller controls the light of its respective lighting device based on its respective light sensor. To this end, the controller is calibrated so that the light emitted from the device can be controlled to provide a specified light level at a certain point or height within a room or other space (e.g., within a workspace plane, such as at the height of a desk).

Disclosure of Invention

A sensor control device for a connected lighting system is provided. The connected lighting system comprises a plurality of luminaires arranged in an area. The sensor control device comprises

-a receiving unit arranged to receive sensor data from a plurality of sensors arranged in the area, each sensor being associated with a sub-area comprised in the area, the sensor data being indicative of one or more occupant positions of occupants in the sub-area and/or a plurality of illuminance levels at different points in the sub-area,

-a virtual sensor position storage storing virtual sensor positions of virtual sensors in the area for occupancy and/or illuminance,

-a processing unit arranged to

-obtain, from the received sensor data, an occupant position in a sub-area associated with the sensor data and/or a plurality of illuminance levels of a plurality of points in the sub-area associated with the sensor data,

-integrate the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points into an integrated area-wide occupancy and/or illuminance map,

-define a virtual sensor at a virtual sensor position and calculate an occupancy state and/or an illuminance level of the virtual sensor from the virtual sensor position and the integrated occupancy and/or illuminance map,

-use the occupancy state and/or the illuminance level from the virtual sensor to control a luminaire of the plurality of luminaires, the luminaire covering the virtual sensor position in the area.
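The virtual-sensor step above can be sketched in code. This is a minimal illustration, not the claimed implementation: it assumes the integrated maps are plain two-dimensional grids, uses a square neighbourhood of cells around the virtual sensor position, substitutes simple averaging for the interpolation the text leaves open, and all function and variable names are invented here.

```python
# Illustrative sketch only: grid maps, neighbourhood radius, and all
# names are assumptions, not part of the described system.

def virtual_occupancy(occupancy_map, pos, radius=1):
    """Occupancy for a virtual sensor at grid position `pos`:
    occupied if any cell in a small neighbourhood is occupied."""
    x, y = pos
    rows, cols = len(occupancy_map), len(occupancy_map[0])
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if 0 <= x + dx < cols and 0 <= y + dy < rows:
                if occupancy_map[y + dy][x + dx]:
                    return True
    return False

def virtual_illuminance(illuminance_map, pos, radius=1):
    """Illuminance for a virtual sensor at `pos`: average of the
    integrated map over a small neighbourhood (a crude stand-in
    for interpolation)."""
    x, y = pos
    rows, cols = len(illuminance_map), len(illuminance_map[0])
    samples = [illuminance_map[y + dy][x + dx]
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if 0 <= x + dx < cols and 0 <= y + dy < rows]
    return sum(samples) / len(samples)
```

Towards the lighting controller, such a virtual sensor then behaves like a physical sensor reporting a binary occupancy state and a single illuminance value.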

By defining a virtual sensor based on the received sensor information, the physical location of a sensor is decoupled from the location used for controlling a luminaire. This provides great flexibility when reorganizing lighting settings. Without moving the physical sensors, a new control map may be created based on occupancy or illuminance at different locations. Furthermore, the lighting control may even be changed using a conventional lighting controller that expects sensor information at its input. The lighting controller need not be aware that the sensor information does not come directly from physical sensors but from virtual sensors.

Also, the system may be used to reduce the number of sensors needed. For example, with a few sensors that each cover a larger area but provide information about multiple occupants and/or multiple illuminance levels, no separate sensor is required for each luminaire. In particular, the system may be used with luminaires that do not have their own sensors. This allows modern intelligent connected lighting systems to be used with conventional luminaires that lack integrated sensors. At the same time, advanced and more complex control maps can still be used. For example, if some portion of an office is unoccupied, it may be desirable to reduce the lighting there. However, if a person is present in an adjacent area, the lighting may be dimmed to a higher level than if the adjacent area were also empty.

The sensor control device is an electronic device. For example, the sensor control device may be implemented in a computer or a server. The sensor control device may also be integrated in the lighting controller. The sensor control apparatus or the sensor control method as described herein can be applied in a wide range of practical applications. Such practical applications include offices, hospitals, public places, and the like.

The method according to the invention can be implemented on a computer as a computer-implemented method, or can be implemented in dedicated hardware, or in a combination of both. Executable code for the method according to the invention may be stored on a computer program product. Examples of computer program products include memory devices, optical memory devices, integrated circuits, servers, online software, and so forth. Preferably, the computer program product comprises non-transitory program code stored on a computer readable medium for performing the method according to the invention when said program product is executed on a computer.

In a preferred embodiment, the computer program comprises computer program code adapted to perform all the steps of the method according to the invention when the computer program is run on a computer. Preferably, the computer program is embodied on a computer readable medium.

Another aspect of the invention provides a method of making a computer program available for download. This aspect is used when the computer program is uploaded to, for example, Apple's App Store, Google's Play Store, or Microsoft's Windows Store, and when the computer program is available for download from such stores.

Drawings

Further details, aspects and embodiments of the invention will be described, by way of example only, with reference to the accompanying drawings. Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. In the figures, elements corresponding to elements already described may have the same reference numerals. In the drawings:

figure 1 schematically shows an example of embodiment of a connected lighting system,

figure 2 schematically shows an example of embodiment of a control procedure,

figure 3 schematically shows an example of embodiment of a connected lighting system,

figure 4a schematically shows an example of an embodiment of a luminaire in an open office space,

figure 4b schematically shows an example of an embodiment of a physical sensor in an open office space,

figure 4c schematically shows an example of an embodiment of office furniture in an open office space,

figure 4d schematically shows an example of an embodiment of a virtual sensor in an open office space,

figure 4e schematically shows the overlay of figures 4c and 4d,

figure 4f schematically shows an example of an embodiment of office furniture in an open office space,

figure 4g schematically shows an example of an embodiment of a virtual sensor in an open office space,

figure 4h schematically shows the overlay of figures 4f and 4g,

figure 5a schematically shows an example of an embodiment of an occupancy map of an integrated area range,

figure 5b schematically shows an example of an embodiment of a cluster in an occupancy map for an integrated area range of an open office space,

figure 5c schematically shows an example of an embodiment of a virtual sensor in an open office space,

figure 5d schematically shows an example of an embodiment of a control zone of an open office space,

figure 6a schematically shows an example of an embodiment of virtual sensors and luminaires in an open office space,

figure 6b schematically shows an example of an embodiment of added luminaires and virtual sensors,

figure 7a schematically shows an example of an embodiment of two physical sensors with overlapping sub-areas,

figure 7b schematically shows an example of an embodiment of two physical sensors with overlapping sub-areas,

figure 8 schematically shows an example of embodiment of a sensor control method,

figure 9a schematically shows a computer-readable medium having a writeable part comprising a computer program according to an embodiment,

fig. 9b schematically shows a representation of a processor system according to an embodiment.

Detailed Description

While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail one or more specific embodiments, with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the invention to the specific embodiments illustrated and described.

Hereinafter, elements of the embodiments are described in operation, for the sake of understanding. It will be apparent, however, that the respective elements are arranged to perform the functions described as being performed by them.

Furthermore, the invention is not limited to the embodiments, and the invention lies in each and every novel feature or combination of features described herein or recited in mutually different dependent claims.

Fig. 1 schematically shows an example of an embodiment of a connected lighting system 100. The connected lighting system 100 includes a plurality of luminaires 122 and a plurality of sensors 124 arranged in an area. The connected lighting system 100 further comprises a sensor control device 200. The embodiment shown in fig. 1 also includes a lighting controller 120. Note that embodiments without a lighting controller are also possible.

For example, the area may include an interior area of a building, such as an office, laboratory, store floor, or other room; or may include outdoor areas such as gardens, parks, squares, shopping centers, or stadiums; or a coverage area such as a kiosk. The area may also be an open office. Several examples of embodiments are given below in which the area is an open office; the skilled person may adapt such examples to other areas.

The plurality of luminaires 122 may take the form of fixtures integrated into a room or of stand-alone units. Each luminaire comprises a respective lighting device, such as an LED (light-emitting diode) or a filament lamp, and any associated fixtures or accessories. For example, the luminaire may be mounted on the ceiling or a wall of a room. The lighting device of each luminaire is arranged to emit artificially generated light into the area. In addition, the area will tend to include some amount of ambient light, such as sunlight or other natural light, at least at certain times of the day. For example, if the area is a room, it will typically include one or more openings, such as windows and/or skylights in the side walls of the room. The windows allow other light to enter the room from the outside, mainly natural light, including sunlight.

The sensor control device 200 comprises a processing unit arranged to perform the activities of the sensor control device. Examples of processing units are shown herein. Fig. 1 and 3 show functional units that may be functional units of a processor circuit. For example, fig. 1 and 3 may be used as blueprints for a possible functional organization of processor circuitry. The processor circuit is not shown separately from the units in these figures. For example, the functional elements shown in the figures may be implemented in whole or in part in computer instructions stored at device 200 (e.g., in an electronic memory of device 200), and executable by a microprocessor of device 200. In a hybrid embodiment, the functional units are implemented partly in hardware (e.g. as a co-processor) and partly in software stored and executed on the device 200.

The connected lighting system 100 comprises a plurality of sensors arranged in the area. Each sensor is associated with a sub-area included in the area. A sensor may be an occupancy sensor or an illuminance sensor, or both. An occupancy sensor senses the presence of individuals in the sub-area. An illuminance sensor senses illuminance levels in the sub-area. The sensor is arranged to encode the sensed information in sensor data. Generally, the sensor data is digital data transmitted to the sensor control device 200 over a digital messaging network. For example, the sensor may include a digital transmitter for periodically transmitting sensor data to the sensor control device 200. In an embodiment, the sensor data is reported to the sensor control device 200 together with a sensor ID and an optional timestamp.

As an occupancy sensor, the sensor provides sensor data indicating one or more occupant positions of occupants in the sub-area. In other words, the sensor is arranged to detect the position of a person in the sub-area associated with the sensor, relative to the sensor. In an embodiment, the sensor is capable of detecting at least two persons in the sub-area, although the sensor may be arranged to detect more, e.g. three or more. There may be a maximum; for example, the sensor may be limited to detecting up to 10 different people. This limit may affect the accuracy of the system, but if the limit is large enough compared to the size of the sub-area, this will generally not be a problem. In an embodiment, the sensor data indicates the locations of a plurality of distinct individuals within the sub-area. In an embodiment, the sensor data is indicative of the illuminance at a plurality of distinct points or cells within the sub-area. The illuminance values within a cell (e.g. a small part of the sub-area) may be averaged.

As an illuminance sensor, the sensor provides sensor data indicating a plurality of illuminance levels at a plurality of different points in the sub-area. For example, the sensor may detect and report the illuminance level at 8 or more points, 16 or more points, 128 or more points, and so on.
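For illustration, one hypothetical encoding of a single sensor report carrying both kinds of information might look as follows; the field names and units are invented here, since the text does not prescribe a data format:

```python
# Hypothetical report from one multimodal sensor; the format is
# illustrative only, not specified by the described system.
sensor_report = {
    "sensor_id": "S-07",                  # identifies the sensor
    "timestamp": "2020-03-17T09:15:00Z",  # optional timestamp
    # occupant positions, in metres, relative to the sensor
    "occupants": [(-1.0, 3.0), (5.0, -4.0)],
    # illuminance samples in lux at points relative to the sensor
    "illuminance": [((0, 0), 430), ((1, 0), 431), ((0, 1), 429)],
}

def occupant_count(report):
    """Number of distinct occupants reported for the sub-area."""
    return len(report["occupants"])
```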

These sensors therefore differ from the conventional sensors used for lighting control in known lighting systems, in which the occupancy value is binary and the light sensor measurement indicates the illuminance at a single point or a single average over the sub-area.

One, more than one, or even all of the sensors may be multimodal sensors. A multimodal sensor is arranged to provide sensor data indicative of a plurality of distinct occupant positions in the sub-area and a plurality of distinct illuminance levels at a plurality of different points. In other words, with a single multimodal sensor there is no need for separate occupancy and illuminance sensors. The embodiments described herein may be adapted to multimodal sensors by replacing separate occupancy and illuminance sensors with a single multimodal sensor; or, vice versa, embodiments using multimodal sensors may be adapted by replacing the multimodal sensors with separate occupancy and illuminance sensors.

In an embodiment, the multimodal sensor (e.g., a camera-based vision sensor) generates data at a high spatial granularity. Conventional lighting or occupancy sensors, by contrast, typically average over a relatively large area. For example, a high-granularity multimodal sensor may have a granularity of one measurement per square centimeter of floor space (e.g., office floor space) or better.

Some or all of the sensors may be integrated with a luminaire, but this is not essential. For example, a sub-area may relate to (e.g. be the same as) the coverage of a luminaire, but this need not be the case. In practice, the sensors may be installed as an upgrade of the lighting system. In an embodiment, a sensor, in particular a multimodal sensor, may be mounted independently of and/or separately from the luminaires. Typically, the sub-area associated with a sensor will be larger, or even much larger, than the coverage area of a luminaire.

The occupancy and/or illumination sensor may be a vision sensor; for example, the sensors may comprise a camera for recording an image of the sub-area. For example, the sensor may be mounted in the ceiling of the area.

The sensor data may be raw, e.g., raw measurement data that requires further processing downstream. For example, the sensor data may include digital images recorded by the sensor. The sensor data may also be pre-processed, for example as indicated below. The latter has the advantage of significantly reducing the system's bandwidth requirements. For example, the sensor may generate data in the form of user positions and illuminance values for a plurality of points or grid cells within its sensing region. The points or cells may be at a predefined spatial granularity. In an embodiment, the sensor data generated by the sensor comprises a series of coordinates indicating, over the sub-area, the occupant positions and/or the illuminance levels of a plurality of points, the occupant positions and/or points being relative to the sensor.

The processing of visual data (e.g., of digital images, for illuminance) may use methods known per se in the art. For example, the paper "Detecting Illumination in Images" by Graham Finlayson et al. discloses a method for detecting illumination in an image, e.g. determining which pixels are illuminated by different lamps. The method uses a sensor that includes a chromagenic camera. The chromagenic camera takes two pictures of each scene: one captured normally and the other captured through a colored filter. The camera may be used directly for estimating the illuminant colors. A combinatorial search may be used to improve the estimate.

The processing of visual data (e.g., of digital images, for occupancy) may also use methods known per se in the art. For example, the paper "Illumination-Invariant Change Detection" by Daniel Toth et al. discloses the detection of moving objects in a sequence of images acquired by a static camera. The method analyzes the gray-scale differences between successive frames. The motion detection algorithm is combined with a homomorphic filter that effectively suppresses variations in scene illumination.

The processing of the digital images may be done in the sensor, in which case the sensor data may include information directly identifying the individuals and the illuminance across the sub-area: for example, one or more coordinate pairs of individuals in the sub-area and/or a plurality of illuminance values at a plurality of points in the sub-area. Alternatively, this processing may be performed at the sensor control device 200.

The sensor control device 200 includes a receiving unit 210. The receiving unit 210 is arranged to receive sensor data from the plurality of sensors 124. For example, the sensors 124 and the receiving unit 210 may communicate over a digital network, e.g. a local area network (LAN), such as a Wi-Fi network or a ZigBee network. The digital network may be partially or wholly wireless, or partially or wholly wired. For example, the receiving unit 210 may include a digital network receiver or transmitter/receiver, such as a Wi-Fi transmitter/receiver.

The sensor control device 200 comprises an input unit 220, the input unit 220 being arranged to obtain from the received sensor data the occupant positions in the sub-area associated with the sensor data and/or a plurality of illuminance levels of a plurality of points in the sub-area associated with the sensor data. The input unit 220 may be more or less complex depending on the complexity of the sensors. For example, if most or all of the processing is done in the sensors, the input unit 220 may only need to parse the sensor data to obtain from it the occupant positions, e.g., coordinates relative to the sensor, and/or the illuminance levels of a plurality of points, e.g., arranged as a series of illuminance levels or a series of coordinates with corresponding illuminance levels. On the other hand, the sensors may be less intelligent, in which case the input unit 220 may process the received sensor data to obtain occupancy and illuminance values, for example by performing the processing described above.

The sensor control device 200 includes a sensor data aggregation unit 230. The sensor data aggregation unit 230 is arranged to convert the occupant positions and/or the plurality of points in the sub-area into area-wide coordinates and to integrate the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points into an integrated area-wide occupancy and/or illuminance map. For example, the sensor control device 200 may include a physical sensor location storage 242 that stores the physical locations of the plurality of sensors 124 in the area. For example, the storage 242 may associate a sensor ID with its location.

For example, in an embodiment, a sensor may report sensor data such as {(-1, +3), (5, -4), (2, 3)} to indicate the detection of three individuals in the sub-area associated with the sensor. The coordinates are relative to the sensor and may be expressed in some suitable unit, for example in meters or centimeters. The sensor data aggregation unit 230 may retrieve from the physical sensor location storage 242 the location of the particular sensor, identified e.g. by a sensor ID received with the sensor data; for example, the sensor may be located at particular coordinates, such as (15, 13). The latter are area-wide coordinates, e.g. relative to a fixed reference point shared by all sensors in the area. The sensor data aggregation unit 230 may convert the local sub-area-wide coordinates into area-wide coordinates, for example by adding the local coordinates to the coordinates of the sensor. In this example, the coordinates are converted into {(14, 16), (20, 9), (17, 16)}.
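The conversion worked through above can be sketched as follows; the function name is illustrative:

```python
# Convert sensor-relative (sub-area-wide) coordinates to area-wide
# coordinates by adding the sensor's own area-wide position.

def to_area_coords(sensor_pos, local_positions):
    sx, sy = sensor_pos
    return [(sx + x, sy + y) for (x, y) in local_positions]

# The worked example from the text: a sensor at (15, 13) reporting
# three occupants at sensor-relative positions.
area = to_area_coords((15, 13), [(-1, 3), (5, -4), (2, 3)])
# area is now [(14, 16), (20, 9), (17, 16)]
```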

The situation is similar for illuminance values. For example, a sensor may report {{(0, 0), 430}, {(1, 0), 431}, {(0, 1), 429}, …}, which combines sub-area coordinates with illuminance levels. Converted to area-wide coordinates, these become: {{(15, 13), 430}, {(16, 13), 431}, {(15, 14), 429}, …}. The coordinates for which illuminance is reported may also be known in advance. For example, if the sensor control device and the sensor have access to a list of predetermined locations, e.g., the coordinates of a plurality of points, the sensor can simply report (430, 431, 429, …).
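The coordinate conversion in the two examples above can be sketched as follows; this is an illustrative sketch only, and the function name `to_area_coords` is an assumption, not taken from the text:

```python
def to_area_coords(sensor_pos, local_points):
    """Translate sensor-relative coordinates into area-wide coordinates
    by adding the sensor's own area-wide position to each point."""
    sx, sy = sensor_pos
    return [(sx + x, sy + y) for (x, y) in local_points]

# The worked example: a sensor at area coordinate (15, 13) reporting
# three occupants at sensor-relative coordinates.
occupants = to_area_coords((15, 13), [(-1, 3), (5, -4), (2, 3)])
# → [(14, 16), (20, 9), (17, 16)]
```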

A sensor may also report occupancy or illuminance by including raw data, e.g., image data, in the sensor data. For example, an image taken by the sensor encodes the locations of individuals within its field of view. Image recognition software may be used to detect objects, in particular individuals, in the image. The pixels of the image may correspond to particular coordinates relative to the sensor. For example, after an image has been received at the sensor control device and an individual in the image has been identified, the position of the individual relative to the sensor, e.g., as coordinates, is known. The local coordinates may then be converted to area-wide coordinates.

After the local sub-area coordinates have been converted, they are integrated into an integrated area-wide occupancy and/or illuminance map. For example, the integrated area-wide occupancy map and the integrated area-wide illuminance map may be implemented as a two-dimensional array, e.g., a grid, in which occupancy or illuminance states are written at the array locations. Arrays are convenient but not required; other suitable data structures include linked lists. For example, in one embodiment, a linked list is used for the integrated area-wide occupancy map and an array for the integrated area-wide illuminance map. The integrated maps may be stored in map storage 244.
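Such integrated maps might be held, for instance, in a simple mapping keyed by grid coordinate; a dictionary is used here for brevity, though the array or linked-list variants mentioned above would work equally well. The data values reuse the worked example; all names are illustrative:

```python
# Sketch of integrated area-wide maps keyed by grid coordinate.
illuminance_map = {}
occupancy_map = {}

# illuminance samples, already converted to area-wide coordinates
for pos, lux in [((15, 13), 430), ((16, 13), 431), ((15, 14), 429)]:
    illuminance_map[pos] = lux

# occupant detections, already converted to area-wide coordinates
for pos in [(14, 16), (20, 9), (17, 16)]:
    occupancy_map[pos] = True
```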

In most cases, the data reported by the sensors can be copied directly into the map. Handling overlapping sensors may require some care. This situation is shown in fig. 7a. Fig. 7a shows two sensors, indicated as small squares, and their corresponding sub-areas. For the left square, its sub-area is drawn with a solid line; for the right square, with a dashed line. Fig. 7b shows two detections in the overlapping portion of the sub-areas. Both sensors detect the same individual, but at slightly different places. The difference may be caused by measurement inaccuracies, e.g., by slight inaccuracies in the sensor positions. The detection of the left sensor is represented by the solid-line ellipse and the detection of the right sensor by the dashed-line ellipse.

The sensor data aggregation unit 230 is arranged to apply a merging routine arranged to merge sensor data in the overlapping part of the left and right sub-areas. For example, in an embodiment, the sensor data aggregation unit 230 is arranged to prioritize some sensors over others, using only the data of the prioritized sensor in the overlap region. For example, if the left sensor has a higher priority than the right sensor, the merging routine may ignore data from the right sensor. This applies to occupancy as well as illuminance. For example, all sensors may be assigned a priority value, which may be used to determine which sensor takes priority. All priority values may be different; for example, priority values may be assigned randomly.

For example, in one embodiment, the merging routine is arranged to map individuals identified by the left sensor to individuals identified by the right sensor, e.g., by mapping each left individual to the nearest right individual. After matching, the routine may average the coordinates of the matched individuals. In the case of fig. 7b, this results in a detected individual between the left and right detections.
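A minimal sketch of such a matching-and-averaging merge; nearest-neighbour matching is one possible mapping, and the names are illustrative:

```python
import math

def merge_detections(left, right):
    """Map each detection of the left sensor to the nearest detection of
    the right sensor and average the coordinates (one possible merging
    routine for the overlap region)."""
    merged = []
    for p in left:
        q = min(right, key=lambda r: math.dist(p, r))
        merged.append(((p[0] + q[0]) / 2, (p[1] + q[1]) / 2))
    return merged
```

Applied to the situation of fig. 7b, this would place the merged detection midway between the two reported positions.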

For illuminance, a similar merging can be applied. For example, the illuminance level in the overlapping region may be taken as an average. The average may be weighted based on the distances to the left and right sensors within the overlap region. This has the advantage of a smoother transition from the left sensor to the right sensor. Other interpolations may be used, e.g., non-linear and/or polynomial interpolation.
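A distance-weighted average of this kind could be sketched as follows; inverse-distance weighting is one possible choice of weighting, and the names are illustrative:

```python
import math

def blended_lux(point, left_pos, right_pos, lux_left, lux_right):
    """Blend two sensors' illuminance readings at a point in the overlap,
    weighting each reading by the inverse distance to its sensor so that
    the nearer sensor dominates (smooth left-to-right transition)."""
    wl = 1.0 / (math.dist(point, left_pos) + 1e-9)
    wr = 1.0 / (math.dist(point, right_pos) + 1e-9)
    return (wl * lux_left + wr * lux_right) / (wl + wr)
```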

The sensor control device 200 comprises a virtual sensor position storage 246 arranged to store the position of the virtual sensor in the area. The sensor control device 200 includes a virtual sensor unit 270, the virtual sensor unit 270 defining a virtual sensor at a virtual sensor location and calculating an occupancy state and/or an illuminance level of the virtual sensor from the virtual sensor location and the integrated occupancy and/or illuminance map.

In an embodiment, the occupancy status and/or illuminance may be read directly from the integrated map. Interpolation may be used where the granularity of the virtual sensor locations differs from the granularity of the map. The denser the measurement points in the area, the more accurate the virtual sensor values will be.

For example, to calculate the occupancy state from the virtual sensor position and the integrated occupancy map of the area, the virtual sensor unit 270 may select from the integrated occupancy map the occupancy states acquired for points in a further sub-area around the virtual sensor position, and assign an occupied state to the virtual sensor if any of the selected occupancy states indicates occupancy. For example, the virtual sensor unit 270 draws, in the integrated map, a virtual circle indicating the further sub-area, using the position of the virtual sensor as center and a predetermined value as radius. If the map indicates occupancy at any point within the virtual circle, the virtual sensor is assigned an occupied value; otherwise it is assigned an unoccupied value.

For example, to calculate the illuminance level from the virtual sensor location and the integrated illuminance map of the area, the virtual sensor unit 270 may acquire from the integrated map the illuminance levels acquired for points in a further sub-area around the virtual sensor location, and assign an illuminance level to the virtual sensor by interpolating the selected illuminance levels. The illuminance may be averaged over this further sub-area. The further sub-areas used for occupancy and for illuminance need not be identical.
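The two virtual-sensor computations just described can be sketched as follows; circular further sub-areas and simple averaging are assumptions consistent with the text, and the function names are illustrative:

```python
import math

def virtual_occupancy(vpos, occupant_positions, radius):
    """Occupied if any detected occupant lies within the virtual circle
    of the given radius around the virtual sensor position."""
    return any(math.dist(vpos, p) <= radius for p in occupant_positions)

def virtual_illuminance(vpos, lux_samples, radius):
    """Average the illuminance samples that fall within the virtual
    circle; returns None if no sample falls inside."""
    inside = [lux for pos, lux in lux_samples if math.dist(vpos, pos) <= radius]
    return sum(inside) / len(inside) if inside else None
```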

In an embodiment, the diameter of the virtual circle or virtual measurement zone is selected to be larger than the minimum distance between measurement points reported by the physical sensors.

The sensor control device 200 comprises a light controller output 280 arranged to control a luminaire of a plurality of luminaires covering a virtual sensor position in the area using the occupancy state and/or the illuminance level from the virtual sensor. For example, control may be direct, e.g., by sending digital commands directly to the luminaire instructing it to increase or decrease the light level. Virtual sensor information (e.g., occupancy and/or illumination level) may also be sent directly to the luminaire. The luminaire can then use these values as if they were inputs taken from real sensors and increase or decrease the light level accordingly. Fig. 1 shows a further different embodiment, in which an existing lighting network infrastructure is used.

Fig. 1 shows a light controller 120. The light controller 120 may be conventional and arranged to control luminaires arranged in one or more control zones based on sensor input. However, the light controller 120 is wired to receive sensor data from the (at least one) virtual sensor defined by the sensor control device 200. The light controller 120 is unaware of the fact that it receives sensor information not from a physical sensor but from a virtual sensor. This allows upgrading of existing light control networks without the need to replace existing luminaires and/or light controllers.

For example, in one embodiment, a plurality of illuminators are organized in a plurality of control zones. The luminaires of the connected lighting system are assigned to a control zone for joint control of the luminaires in the control zone. Typically, each luminaire is assigned to one and only one control zone. Control of at least one of the control zones relies on the virtual sensor(s). The latter is most conveniently done using a light controller such as controller 120 as an intermediary. The lighting controller 120 may be a lighting controller device. More complex control maps may use multiple virtual sensors.

For example, the lighting controller 120 may be configured to actuate at least a portion of the luminaires. The lighting controller 120 has a sensor input configured to receive a sensor value. The sensor control device 200 is arranged to provide sensor values to the sensor inputs, at least one of the sensor values being a sensor value calculated for a virtual sensor. The virtual and non-virtual sensor values may be mixed. For example, a non-virtual sensor may be acquired directly from a physical sensor.

The lighting controller 120 is configured with a control map indicating the assignment of luminaires to control zones and the dependence of luminaires in the control zones on sensor values received at the input.

Fig. 2 schematically shows an example of an embodiment of the control process. The control process according to fig. 2 may be performed, for example, using the connected lighting network of fig. 1.

Fig. 2 shows a closed-loop version of the lighting controller. It takes as inputs the illuminance value of the virtual sensor and the illuminance set point for that location; the set point may be defined based on the occupancy state. The latter may also be obtained from a virtual sensor. For example, if there is occupancy, a higher illuminance level may be defined, e.g., corresponding to 500 lux within the workspace; if there is no occupancy but there is occupancy at a set of adjacent locations, a lower level may be defined, e.g., corresponding to 300 lux within the workspace. The lighting controller may be a proportional-integral-derivative (PID) controller or a variant thereof. The output of the controller is a dimming level, which is used to actuate the corresponding luminaire(s).

Fig. 2 shows at 310 a virtual occupancy state, for example, obtained from a sensor control device. At 320, the virtual occupancy state is used to define a set point for the location, e.g., a sub-region in the area. At 360, a virtual illumination value is also obtained, for example, from a sensor control device (which may be the same sensor control device). Block 330 represents lighting control, where light is controlled (possibly in part) based on the illumination difference between the setpoint and the virtual illumination value. This results in actuation 340 of one or more illuminators. For example, the output of the controller 330 may be a dimming level. At 350, the output of the luminaire is changed (possibly increased) due to sunlight and/or other light sources. This in turn will lead to updated values of the virtual illumination sensor, which in turn may lead to a change in the actuation of the luminaire.
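The loop of fig. 2 can be sketched with a simple proportional controller standing in for the PID controller; the set points (500 lux when occupied, 300 lux for adjacent occupancy) follow the description above, while the gain, clamping, and function names are assumptions:

```python
def setpoint_for(occupied, neighbours_occupied):
    """Illuminance set point from virtual occupancy, per the description:
    500 lux if occupied, 300 lux if only adjacent locations are occupied."""
    if occupied:
        return 500
    if neighbours_occupied:
        return 300
    return 0

def control_step(setpoint_lux, measured_lux, dim_level, k_p=0.002):
    """One proportional control step (a simplified stand-in for a PID):
    nudge the dimming level toward the set point, clamped to [0, 1]."""
    error = setpoint_lux - measured_lux
    return min(1.0, max(0.0, dim_level + k_p * error))
```

Iterating `control_step` with fresh virtual illuminance readings reproduces the feedback loop of blocks 310-360.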

Thus, fig. 2 shows a lighting controller that receives as inputs a set point determined from a virtual occupancy sensor and a virtual sensor illuminance value. Based on these, the luminaire is actuated. As a result, the values of the virtual sensors are updated, which may in turn lead to a further adaptation of the luminaire. Instead of the set point, the occupancy value may also be input directly to the lighting controller. Note that the lighting controller in fig. 2 may be unaware that virtual sensors, rather than physical sensors, are being used. The set point may be set by the light controller based on the received occupancy input. Alternatively, the set point may be determined by the sensor control device and provided to the lighting controller, possibly in place of the occupancy information itself.

Fig. 3 schematically shows an example of an embodiment of a connected lighting system 101. The connected lighting system of fig. 3 is similar to the connected lighting system of fig. 1. Important differences are discussed below.

The sensor control device 200 of fig. 3 has an additional component: a luminaire location storage 248. The luminaire location storage 248 is configured to store the locations of a plurality of luminaires, e.g., of each of the plurality of luminaires. The sensor control device is arranged to select, from the plurality of luminaires, a luminaire whose illumination coverage includes the virtual sensor position.

In other words, the sensor control device 200 of fig. 3 also allows virtual sensors to be defined, e.g., using the virtual sensor position storage 246, but additionally one or more luminaires may be selected to illuminate that position. For example, luminaires near the virtual sensor location may be selected from the luminaire location storage 248; the illumination coverage of a selected luminaire may include the virtual sensor location.

The illumination coverage of a luminaire may be defined as the region in which the light intensity is at least a predetermined percentage, e.g., at least 90% or at least 80%, of the luminaire's light intensity.

For example, the virtual sensor unit 270 may define a virtual sensor as in fig. 1, but the light controller output 280 may select a luminaire to be controlled from the luminaire location storage 248, e.g. one or more luminaires near the virtual sensor location. If multiple virtual sensor locations are defined, the light controller output 280 may select multiple luminaires for the multiple virtual sensors.
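Selecting luminaires near a virtual sensor could look like the following sketch; a plain distance threshold stands in for a real coverage test, and all names, IDs, and coordinates are illustrative:

```python
import math

def select_luminaires(vpos, luminaire_locations, max_dist=2.5):
    """Select the luminaires whose illumination coverage, approximated
    here by a distance threshold, includes the virtual sensor position."""
    return sorted(lid for lid, pos in luminaire_locations.items()
                  if math.dist(vpos, pos) <= max_dist)

luminaires = {"L1": (0, 0), "L2": (2, 0), "L3": (10, 10)}
# a virtual sensor at (1, 0) is covered by L1 and L2 but not L3
```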

Instead of selecting luminaires near the virtual sensor, virtual sensors near the luminaires may also be selected. For example, a virtual sensor location may be selected within the illumination coverage around a luminaire. In an embodiment according to the latter, a conventional lighting network may be simulated in which each luminaire has its own integrated sensor for occupancy and/or illuminance. Interestingly, a conventional lighting controller that expects each luminaire to have its own sensors may then be used with luminaires that do not have any sensors, the physical sensors being replaced by virtual sensors.

The virtual sensor values may be used to control the selected luminaires. In both cases, a control zone may be defined and virtual sensors assigned to the control zone as needed. In an embodiment, the virtual sensors are also assigned virtual sensor ids, which may be used in the control map. The virtual sensor id may be reported to the lighting controller along with the virtual sensor value.

As will be explained below, the sensor control device can be used to reconfigure an existing lighting network with great flexibility, without investing in new luminaires and lighting controllers. However, the virtual sensor according to the invention may also be used for different applications. For example, the sensor control device 200 according to fig. 3 may be configured for a system in which control zones are dynamically allocated. For example, the dynamic configuration may be performed by the light controller output 280.

For example, the sensor control device 200 may be arranged to select a zone in which the integrated occupancy map indicates occupancy. For example, a clustering algorithm may be applied to the integrated occupancy map. The clustering algorithm may, for example, combine Matlab's "imdilate" command with its "bwlabel" command. More advanced embodiments may use, e.g., a k-means clustering algorithm.

Once a cluster of individuals has been computed, a virtual sensor location is selected for a virtual sensor in the zone. Multiple sensors may also be defined for a cluster. Likewise, luminaires having locations in or near the zone are selected from the luminaire location storage 248. Given one or more virtual sensors and one or more luminaires, a control map is defined, wherein a control zone comprises the selected luminaires and is controlled by the one or more virtual sensors. The lighting system is then configured using the control map. For example, the control map may be uploaded to a lighting controller, such as controller 120. The effect is that a control zone is created specifically for the locations where individuals happen to be at that moment.
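A simple stand-in for the clustering step groups nearby detections by single-link distance; the Matlab and k-means approaches mentioned above are alternatives, and the link distance and names here are assumptions:

```python
import math

def cluster(points, link_dist=3.0):
    """Greedy single-link clustering: any two detections closer than
    link_dist end up in the same cluster."""
    clusters = []
    for p in points:
        # clusters that p links to, directly or via any member
        hit = [c for c in clusters if any(math.dist(p, q) <= link_dist for q in c)]
        merged = [p] + [q for c in hit for q in c]
        clusters = [c for c in clusters if c not in hit] + [merged]
    return clusters
```

Each resulting cluster can then be given one or more virtual sensors and a set of nearby luminaires, forming a control zone as described above.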

Fig. 4, 5 and 6 illustrate various configurations and/or uses of embodiments of a lighting control network and/or sensor control device according to the present invention. These examples are all shown in an open office where multiple office workers may work. Examples may be adapted to other lighting situations.

In fig. 4a, the luminaires in an office space are schematically shown as small circles. In fig. 4b, the sensors, e.g., multimode sensors, are schematically shown as small squares. Note that the number of sensors is much smaller than the number of luminaires. Fig. 4c shows a schematic example of office furniture; a number of tables and chairs are shown. Using the sensors of fig. 4b, a plurality of virtual sensors are defined. The virtual sensor positions are shown in fig. 4d. Fig. 4e shows an overlay of figs. 4c and 4d. Note that the virtual sensors are defined exactly where they are needed, regardless of where the physical sensors happen to be.

Sometimes the office space is rearranged, for example because a new company moves into the building. The existing lighting infrastructure is retained, including the luminaires of fig. 4a and the sensors of fig. 4b. The light controllers may also be retained, although they may be reconfigured. Fig. 4f schematically shows the new office furniture. Note that the furniture has different sizes and locations. Fig. 4g shows the new virtual sensor positions. For example, the new virtual sensor locations may be uploaded into virtual sensor location storage 246. Fig. 4h shows the overlay of figs. 4f and 4g. Note that the new virtual sensors are advantageously arranged to fit the new arrangement shown in fig. 4f. The lighting controller may be arranged to use the virtual sensors and an appropriate control map defined in relation to fig. 4g; the control map is not shown in these figures. Thus, the lighting network can be adapted to a new situation in an open office without physical rearrangement, requiring only reconfiguration of the sensor control device and the lighting controllers.

The locations of the virtual sensors may be determined by a human operator. For the scenarios in figs. 4a-4h no control zones are indicated; however, appropriate groups of luminaires may be assigned to the virtual sensors as needed. Determining the control map may be done manually, but may also be done automatically. For example, a luminaire may be controlled by the virtual sensor closest to it.

Fig. 5a schematically shows an example of an embodiment of an integrated area-wide occupancy map. In these figures, detected individuals are indicated with small ovals. For example, the sensors of fig. 4b may be used to generate the integrated map of fig. 5a. A clustering algorithm is applied to the map of fig. 5a; the result is shown schematically in fig. 5b: two clusters have been found, schematically indicated with solid lines. Next, the sensor control device may assign virtual sensors to the detected clusters. In this case two sensors are assigned, but a single sensor, or more than two sensors, could also be used. The one or more virtual sensors are configured for occupancy as well as illuminance. Next, luminaires close to the clusters are selected. For example, a possible rule is to select only luminaires that still provide, say, x% of their light at the location of the virtual sensor; x% may be, for example, 50% or 80%. In fig. 5d, two control zones are indicated with dashed lines. The virtual sensors are assigned to their corresponding control zones, and the control zones are then uploaded to the lighting controller. In embodiments according to these principles, the assignment of virtual sensors and control zones may be fully automated.

The effect is that the individuals receive light controlled by occupancy and, in particular, illuminance, even if this specific allocation of control zones was not anticipated beforehand.

Fig. 6a schematically shows an example of an embodiment of virtual sensors and luminaires in an open office space. Here, the system is used to simulate luminaires with integrated sensors, by selecting the virtual sensor locations to be the same as the physical locations of the luminaires. In fig. 6b, a luminaire is added to the system. The dashed line in fig. 6b indicates the illumination coverage of the new luminaire. For example, illumination coverage may be defined as the area where the luminaire still has 80% of its intensity. A virtual sensor is selected at a point in the illumination coverage. The location may be chosen to be the same as the luminaire location, but as shown, this is not essential.

In various embodiments of the sensor control device, the receiving unit may comprise an input interface, which may be selected from various alternatives. For example, the input interface may be a network interface to a local or wide area network, such as the internet, an application interface (API), or the like.

The sensor control device may have a user interface which may include well-known elements such as one or more buttons, a keypad, a display, a touch screen, etc. The luminaire may also have a user interface. The user interface may be arranged for accommodating user interaction for changing the lighting at a specific location or a specific luminaire or the like. The user interface may be arranged for accommodating user interaction to obtain information from the sensor control device, e.g. to obtain virtual sensor information, e.g. on a display connected to the sensor control device.

Various storage devices, such as physical sensor location storage 242, map storage 244, virtual sensor location storage 246, and luminaire location storage, may be implemented as electronic memory (such as flash memory) or magnetic memory (such as a hard disk), among others. The storage may also be implemented as an application interface to an offsite storage (e.g., a cloud storage). The storage device may comprise a plurality of discrete memories which together constitute the storage device. The storage device may also be a temporary memory, such as RAM. In the case of temporary storage, the sensor control apparatus contains some means of acquiring data prior to use, such as for example acquiring them from cloud storage over an optional network connection (not shown).

Typically, the sensor control device 200, the luminaire and the lighting controller each comprise a microprocessor (not shown separately) executing suitable software stored at the sensor control device 200; for example, the software may have been downloaded and/or stored in a corresponding memory, e.g. a volatile memory such as RAM or a non-volatile memory such as Flash (not separately shown). Alternatively, the sensor control device 200, the luminaire and the lighting controller may be implemented in whole or in part in programmable logic, for example as Field Programmable Gate Arrays (FPGAs). The device may be implemented in whole or in part as a so-called Application Specific Integrated Circuit (ASIC), i.e. an Integrated Circuit (IC) tailored to its specific use. For example, the circuit may be implemented in CMOS, e.g., using a hardware description language such as Verilog, VHDL, etc.

In an embodiment, the sensor control device 200 may comprise a receiving circuit, an input circuit, a sensor data aggregation circuit, a physical sensor location storage circuit, a map storage circuit, a virtual sensor location storage circuit, a luminaire location storage circuit, a virtual sensor circuit, and a light controller circuit. These circuits implement the corresponding units described herein. They may be processor circuits and memory circuits, with the processor circuits executing instructions represented electronically in the memory circuits.

The processor circuit may be implemented in a distributed manner, for example, as a plurality of sub-processor circuits. The storage device may be distributed over a plurality of distributed sub-storage devices. Some or all of the memory may be electronic memory, magnetic memory, or the like. For example, the storage device may have volatile and non-volatile portions. Portions of the memory device may be read-only.

In an embodiment, the connected lighting system is provided with a plurality of luminaires and sensors, wherein the lighting data is collected in a back-end database or cloud. The lighting system may use a plurality of sensor inputs, e.g. in the form of occupancy and light measurements, to control the light output of the luminaire and adapt the artificial lighting conditions to prevailing environmental conditions. The system may use advanced sensors such as vision sensors and other systems such as indoor positioning systems in conjunction with the lighting system. Such sensing systems may generate output in a variety of modes. For example, the vision sensor need not act merely as a binary occupancy sensor, but may provide the illumination level and user position within its sensing region. The multi-mode sensor data can be used to actuate a luminaire in the lighting system.

The multimode sensors need not be part of the lighting system in the conventional sense; they can be installed and maintained separately from the luminaires.

In one embodiment, the system has a plurality of multimodal sensors, such as vision sensors, within an indoor space, such as an open office. Each sensor generates data in the form of user positions and illuminance values for the cells within its sensing region. The cells may be at a predefined spatial granularity. This data may be reported to the sensor data aggregator along with a sensor ID and a timestamp.

Fig. 8 schematically illustrates an example of an embodiment of a sensor control method 800. The sensor control method 800 is arranged for a connected lighting system, such as the system 100 or 101, for example. The connected lighting system comprises a plurality of luminaires arranged in an area. The sensor control method 800 includes

Receiving 810 sensor data from a plurality of sensors arranged in a region, each sensor being associated with a sub-region comprised in the region, the sensor data being indicative of one or more occupant positions of occupants in the sub-region and/or a plurality of illuminance levels at a plurality of different points in the sub-region,

storing 820 virtual sensor positions in the area for virtual sensors of occupancy and/or illuminance,

-acquiring 830 from the received sensor data an occupant position in a sub-area associated with the sensor data and/or a plurality of illuminance levels of a plurality of points in the sub-area associated with the sensor data,

-converting 840 the occupant positions and/or the plurality of points in the sub-area into coordinates of an area extent and integrating 850 the occupancy states of the plurality of occupant positions and/or the plurality of illuminance levels of the plurality of points into an integrated area-extent occupancy and/or illuminance map,

defining 860 a virtual sensor at the virtual sensor location and calculating an occupancy state and/or an illuminance level of the virtual sensor from the virtual sensor location and the integrated occupancy and/or illuminance map,

- using 870 the occupancy state and/or the illuminance level from the virtual sensor to control a luminaire of the plurality of luminaires, which luminaire covers the virtual sensor position in the area.

As will be clear to a person skilled in the art, many different ways of performing the method are possible. For example, the order of the steps may be changed, or some of the steps may be performed in parallel. In addition, other method steps may be inserted between the steps. The inserted steps may represent refinements of the method, such as those described herein, or may be unrelated to the method. For example, steps 830 and 840 may be performed at least partially in parallel. Furthermore, a given step may not have fully ended before the next step is started.

The method according to the invention may be performed using software comprising instructions for causing a processor system to perform the method 800. The software may include only those steps that are employed by a particular sub-entity of the system. The software may be stored on a suitable storage medium such as a hard disk, floppy disk, memory, optical disk, and the like. The software may be transmitted as signals along a wired or wireless link or using a data network (e.g., the internet). The software may be made available for download and/or remote use on a server. The method according to the invention may be performed using a bitstream arranged to configure programmable logic, e.g. a Field Programmable Gate Array (FPGA), to perform the method.

It will be appreciated that the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of source code, object code, a code intermediate source and object code such as partially compiled form, or in any other form suitable for use in the implementation of the method according to the invention. Embodiments relating to a computer program product include computer-executable instructions corresponding to each of the process steps of at least one of the methods set forth. These instructions may be subdivided into subroutines and/or stored in one or more files that may be linked statically or dynamically. Another embodiment relating to a computer program product comprises computer-executable instructions corresponding to each of the means of at least one of the systems and/or products set forth.

Fig. 9a shows a computer-readable medium 1000 with a writeable portion 1010 comprising a computer program 1020, the computer program 1020 comprising instructions for causing a processor system to perform a sensor control method according to an embodiment. The computer program 1020 may be embodied on the computer readable medium 1000 as physical indicia or by means of magnetization of the computer readable medium 1000. However, any other suitable embodiment is also conceivable. Further, it will be appreciated that although the computer-readable medium 1000 is illustrated herein as an optical disc, the computer-readable medium 1000 may be any suitable computer-readable medium, such as a hard disk, solid state memory, flash memory, etc., and may be non-recordable or recordable. The computer program 1020 comprises instructions for causing a processor system to perform the sensor control method.

Fig. 9b shows a schematic representation of a processor system 1140 according to an embodiment. The processor system includes one or more integrated circuits 1110. The architecture of one or more integrated circuits 1110 is schematically illustrated in fig. 9 b. The circuitry 1110 comprises a processing unit 1120, e.g. a CPU, for running computer program means to perform the method according to the embodiments and/or to implement modules or units thereof. The circuit 1110 includes a memory 1122 for storing programming code, data, and the like. A portion of the memory 1122 may be read-only. The circuit 1110 may include a communication element 1126, such as an antenna, a connector, or both, among others. Circuitry 1110 may include an application specific integrated circuit 1124 for performing some or all of the processing defined in the method. The processor 1120, memory 1122, application specific IC 1124, and communication element 1126 may be connected to each other via an interconnect 1130, such as a bus. The processor system 1110 may be arranged for contact and/or contactless communication using an antenna and/or a connector, respectively.

For example, in an embodiment, the sensor control device may comprise a processor circuit and a memory circuit, the processor being arranged to execute software stored in the memory circuit. For example, the processor circuit may be an Intel Core i7 processor, an ARM Cortex-R8, etc. The memory circuit may be a ROM circuit, or a non-volatile memory, e.g. a flash memory. The memory circuit may be a volatile memory, e.g. an SRAM memory. In the latter case, the device may comprise a non-volatile software interface, e.g. a hard drive, a network interface, etc., arranged for providing the software.

It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments.

In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

In the claims, references placed between parentheses refer to reference signs in the drawings of exemplifying embodiments or to formulas of embodiments, and serve only to increase the intelligibility of the claims. These references shall not be construed as limiting the claims.

List of reference numerals in fig. 1 and 3:

100. 101 connected lighting system

120 light controller

122 plurality of luminaires

124 physical sensors

200 sensor control device

210 receiving unit

220 input unit

230 sensor data aggregation unit

242 physical sensor location storage

244 map storage device

246 virtual sensor location storage

248 luminaire position storage

270 virtual sensor unit

280 light controller output terminal
