What is it?
Adaptive Data Rate (ADR) is a scheme that enables a network server to individually control, for each end device in the network, the data rate (how fast data is transmitted over the air), the radio frequency (RF) transmission power (the amount of power fed into the transmitted signal), the spreading factor (how much each data bit is spread in time), the channels used, and the number of retransmissions made while sending uplinks, i.e., messages transmitted from end devices to the server.
This mechanism helps optimize a network’s airtime (the time during which a signal is being transferred from a sender to a receiver) and energy consumption, preserving the battery life of end devices and decreasing interference.
How does it work?
A network server estimates how close an end device is to the nearest gateway from the Received Signal Strength Indicator (RSSI) of the messages that device transmits. This way, the server can allocate optimal settings to each end device. For example, end devices close to gateways can use a lower spreading factor and a higher data rate, while devices located farther away need a higher spreading factor, and therefore a lower data rate, to be heard reliably.
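To make the decision process above concrete, here is a minimal sketch of a network-server-side ADR step, loosely modeled on the commonly cited LoRaWAN ADR approach: the server looks at recent uplink signal quality, computes how much link margin is left over the demodulation floor of the current spreading factor, and spends that margin first on lowering the spreading factor (raising the data rate) and then on lowering transmission power. All thresholds, tables, and function names here are illustrative assumptions, not values from this article; real servers typically use SNR history and vendor-specific parameters.

```python
# Sketch of a network-server ADR decision. All constants below are
# illustrative assumptions for demonstration, not normative values.

# Approximate demodulation-floor SNR (dB) per spreading factor:
# a higher SF tolerates a weaker signal but yields a lower data rate.
REQUIRED_SNR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

INSTALLATION_MARGIN_DB = 10.0  # safety margin against fading (assumed)
SNR_STEP_DB = 3.0              # link-budget gain per SF step (assumed)
TX_POWER_STEP_DB = 2.0         # power reduction granularity (assumed)


def adr_decision(snr_history, sf, tx_power_dbm,
                 min_sf=7, min_tx_power_dbm=2.0):
    """Return new (sf, tx_power_dbm) based on recent uplink SNR readings."""
    snr_max = max(snr_history)  # best recent uplink
    margin = snr_max - REQUIRED_SNR[sf] - INSTALLATION_MARGIN_DB
    steps = int(margin // SNR_STEP_DB)
    # Spend surplus margin first on lowering SF (raising the data rate)...
    while steps > 0 and sf > min_sf:
        sf -= 1
        steps -= 1
    # ...then on lowering the device's transmission power.
    while steps > 0 and tx_power_dbm > min_tx_power_dbm:
        tx_power_dbm -= TX_POWER_STEP_DB
        steps -= 1
    return sf, tx_power_dbm


# A device near a gateway (strong SNR at SF12) is moved to a lower SF:
print(adr_decision([6.0, 5.0, 4.0], sf=12, tx_power_dbm=14.0))   # (7, 14.0)
# A device already at SF7 with ample margin gets its power reduced:
print(adr_decision([20.0], sf=7, tx_power_dbm=14.0))             # (7, 4.0)
```

Note the ordering: spreading-factor reductions come before power reductions because a lower SF shortens airtime, which benefits both the device's battery and overall network capacity.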