Bluetooth NLC vs DALI+: Capacity and Performance

Whenever I come across a discussion of wireless communications over the Thread protocol (such as this one), there is an almost 100% chance people don't understand the underlying physics and traffic patterns. This is the result of Thread marketing since 2014, which highlights the IPv6 protocol as its key strength. It typically goes like this: "Thread is based on IPv6, and there are more IPv6 addresses than grains of sand in the Universe". And people read this as "because Thread is based on IPv6, it is a highly scalable wireless protocol", treating the potential number of static addresses as equivalent to the dynamic situation where these devices actually send data.

If we consider the networked lighting control domain, the most prevalent pattern today is a motion sensor in every luminaire. The motion sensor usually has multiple functions: it detects motion, of course. But it is also a light level sensor enabling daylight harvesting (dimming the lights down when there is sufficient natural light coming in). And it outputs a light dimming signal (0-10V analog or DALI digital) to the LED driver.

In DALI+ systems there is typically a room, zone, or area controller: a small programmable box that collects the signals from the sensors and tells the lights what to do. For example: there are people in the room (occupancy detected), the light level is 150 lux (too low), so increase the level by dimming up.

From the underlying Thread network perspective this typically means four network messages:

1. Sensors sending the occupancy / light level data to their associated Thread routers;
2. The Routers relaying the messages to the controller;
3. The controller responding with a multicast (group) command to the lights;
4. The group command, too, is relayed by the Thread routers.

Messages 1 and 2 are additionally acknowledged at the link layer. Messages 3 and 4 are usually sent 3 times each, for reliability, as there is no acknowledgment for multicast messages. 
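For readers who like to count, the per-event frame tally can be sketched in a few lines of Python. The breakdown follows the message flow above; the repeat count of 3 for multicasts is the assumption already stated, not a spec-mandated value:

```python
# Rough tally of 802.15.4 frames put on air per single sensor event
# in a DALI+ over Thread setup, following the four-message flow above.
# Counts are illustrative assumptions, not values from a specification.

MULTICAST_REPEATS = 3  # no ACK for multicasts, so send 3x for reliability

def frames_per_event() -> int:
    sensor_to_router = 1 + 1       # message 1: data frame + link-layer ACK
    router_to_controller = 1 + 1   # message 2: data frame + link-layer ACK
    controller_multicast = MULTICAST_REPEATS  # message 3, repeated 3x
    router_relay = MULTICAST_REPEATS          # message 4, repeated 3x
    return (sensor_to_router + router_to_controller
            + controller_multicast + router_relay)

print(frames_per_event())  # 10 frames on air per sensor event
```

So a single "motion detected" event costs the channel roughly ten transmissions, before any retries caused by collisions.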

Thread uses the IEEE 802.15.4 standard at a 250 kbps data rate. 802.15.4 uses the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) scheme to control how devices share a communication channel without interfering with each other (interference effectively kills the messages, so it must be avoided).

Now let's assume each sensor reports an event (motion or light level change) once per second, on average. This is a good approximation, although at busy peak moments there may be more (many people moving, or sunrise, when the light level changes significantly). And let's assume we want at most 10% collision / interference (which is still a lot). With these assumptions: how many sensors can the DALI+ over Thread network handle?

The short answer is: about 20. Yes, twenty. Not two hundred and definitely not thousands.

(I'd be happy to do a deeper dive into my calculations, so reach out if you want to understand the details)
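For a flavor of the arithmetic, here is a simplified sketch. The frame size and the "saturation" reading are my assumptions (near-maximum 802.15.4 frames, ten frames per event as per the message flow above); a real CSMA/CA analysis is more involved, but the order of magnitude comes out the same:

```python
# Back-of-envelope channel-load estimate for DALI+ over Thread.
# Assumptions (illustrative, not measured): near-maximum 802.15.4
# frames of ~127 bytes, 10 frames on air per sensor event
# (2 data + 2 ACKs + 2 multicasts sent 3x each), 1 event/sensor/second.

DATA_RATE_BPS = 250_000        # 802.15.4 at 2.4 GHz
FRAME_BYTES = 127              # assumed near-max frame size
FRAMES_PER_EVENT = 10          # see the message flow above
EVENTS_PER_SENSOR_PER_S = 1.0

def channel_utilization(sensors: int) -> float:
    airtime_per_frame = FRAME_BYTES * 8 / DATA_RATE_BPS   # ~4 ms
    airtime_per_event = airtime_per_frame * FRAMES_PER_EVENT
    return sensors * EVENTS_PER_SENSOR_PER_S * airtime_per_event

# A shared CSMA/CA channel degrades badly well before 100% load;
# at 20 sensors this model already puts raw airtime demand above 80%.
print(f"{channel_utilization(20):.0%}")
```

In other words, at 20 sensors the raw airtime demand alone is already past the point where a contention-based channel delivers acceptable collision rates.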

Now let's compare this to Bluetooth® NLC. What are the key differences?

1. Bluetooth® NLC uses 1Mbps data rate (4x faster than Thread) and on top of that has ultra compact packet size.
2. The different security and network architecture allows any device to talk directly to any other device in a stateless fashion. Thread relies on network "links" that require child supervision / keep-alive packets. That leads to the whole concept of "self-healing", which is another lie (or at least a misconception: Bluetooth® NLC never needs it and can be considered "always healthy").
3. Bluetooth® NLC uses distributed lighting control architecture (each sensor is also a controller), removing the need for the message routers and bypassing the congestion created by bi-directional traffic over multi-hop half-duplex network.

All in all, with Bluetooth® NLC we have a much more efficient radio technology and a much more efficient network traffic pattern. Combined, they give us a practical limit of around 200 devices in the same area. That assumes each device sending one message (sensor data) per second, and it leaves room for much higher bursts while maintaining the expected high reliability and low latency.
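The same back-of-envelope style, applied to Bluetooth® NLC. The packet size and repeat count here are assumed round numbers (a maximum-size BLE advertising packet, three repeats per message), not measurements:

```python
# Back-of-envelope load estimate for Bluetooth NLC (Bluetooth mesh).
# Assumptions (illustrative): advertising packets of at most ~47 bytes
# on air at 1 Mbps, each sensor message repeated 3 times for
# reliability, and no controller round-trip: with distributed control
# the sensor's multicast IS the lighting command.

DATA_RATE_BPS = 1_000_000
PACKET_BYTES = 47              # max BLE advertising packet on air
REPEATS = 3                    # assumed retransmissions per message

def utilization(devices: int, events_per_s: float = 1.0) -> float:
    airtime = PACKET_BYTES * 8 / DATA_RATE_BPS   # ~376 us per packet
    return devices * events_per_s * airtime * REPEATS

# ~23% load at 200 devices, leaving headroom for bursts and relaying.
print(f"{utilization(200):.0%}")
```

Even with the conservative maximum packet size, 200 devices at one message per second occupy well under a quarter of the channel, which is why the 200 figure has comfortable margin.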

So there we have it. DALI+: 20 per area; Bluetooth® NLC: 200. A 10x (order of magnitude) difference in the real world.

Also remember that 20 sensors per area is a really low number, very often entirely insufficient, as a typical area is an entire building floor. We have Bluetooth® NLC projects with much bigger areas; as I said, the 200 number is conservative.

Comments

  1. Allan Organ (zencontrol), Aug 12, 2025, 1:07:00 AM

    We like to see analysis of technical content here — it helps create good, robust standards and open protocols.

    A few notes though, to make sure we’re operating in the real world and making better comparisons:

    Sensor event rates — Per 62386-303, the default report time is 20 s. This means a sensor will send “still occupied” events at a maximum frequency of once every 20 s. This setting is configurable, and with standard occupancy timeouts well in excess of 3 minutes, it can be extended significantly without impacting system performance. The default hold time (before a movement sensor reverts to reporting unoccupied) is 15 minutes.

    Lighting control behaviour — While Messages 1 and 2 will be sent for each event message (with the worst case being the first movement across a floor), an application controller absolutely does not send new lighting instructions on every sensor trigger (Messages 3 and 4). The lights will be turned on once and could remain on for an entire day before needing to send an off command if the area is continuously occupied. In our worst-case scenario of 3-minute timeouts, with a new person walking through every 3.5 minutes, the lighting command rate is less than one command per minute. Further, lights are typically controlled in groups, and even with small groups of four lights each, the number of commands needed for class-leading lighting control is further reduced.

    Daylight harvesting — This follows a similar pattern, with lower priority than event messages and a longer default report time (30 s). Light sensors also have built-in hysteresis to ensure the network is not flooded with inconsequential changes in light levels. The application controller will consider lux inputs from multiple sensors and update levels only as appropriate. This again means the real-world command rate required for good daylight harvesting is almost an order of magnitude less than the article suggests.

    What does this mean in practice?
    That the real limit is likely to be 5–10 times higher than proposed, potentially even more if the controller reduces event rates to only what’s needed for good lighting control.

    In terms of practical systems, zencontrol controllers target mesh networks with around 120 devices (half lights, half sensors), in line with the standard DALI address limit of 64 output devices and 64 input devices. If we were only doing lighting control, our testing suggests we could double that mesh size (two DALI+ subnets) without penalty to lighting performance. Like Silvair though, we prefer to reserve additional bandwidth for other building data — environmental monitoring, energy management, people counting, and similar applications.

    If you’re interested, I’d be more than happy to share network traffic from thousands of live controllers we have deployed, each with more than 20 sensors, so we can be sure of best practice.

    — Allan

