iBeacon / Bluetooth Low Energy (BLE) devices - maximum number of beacons

I would like to track a large number of beacons (~500) at once within a 50-100 m radius via an app on an iPhone (5s). I've had a look at the spec and online and I can't see any limit on the number of beacons you can track at once using BLE. Does anyone know if there is a limit on the number of beacons you can track, or whether an iPhone 5s would be up to the task of tracking that many beacons?

You used the word track, but iOS has two different methods: monitoring and ranging.
You can monitor a maximum of 20 regions. (This is in the documentation for the startMonitoringForRegion: method.) Region limits mostly come into play if your app is in the background. The OS will alert your app when you enter or leave a region that you're monitoring (give or take a few minutes). The OS will even launch your app just to let it know what happened (although only for a short time).
The other method is ranging, which finds all the beacons within the Bluetooth range of the device (typically around 100 feet, give or take). If your beacons are spread out over 100 miles, then you probably won't run into any practical limit here. I haven't found any documentation on a ranging limit; I only have four beacons to test with, and four at a time works.
Here's one way to handle your situation. Give all 500 of your beacons the same UUID, and make a beacon region using the initWithProximityUUID:identifier: method. (The identifier is just for you -- it doesn't affect anything.) Start monitoring for that beacon region. That way, your app will be notified whenever any of your 500 beacons is found (give or take a few minutes). Once notified, you can use startRangingBeaconsInRegion: to find all the beacons in range, then use the major and minor values to figure out which beacons the user is near.
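In Swift, a minimal sketch of this approach might look like the following (the Objective-C selectors above map to CLLocationManager's startMonitoring(for:) and startRangingBeacons(in:); the UUID here is a placeholder for whatever your beacons share):

```swift
import CoreLocation

// Minimal sketch: monitor one region covering all 500 beacons (same UUID),
// then range to read major/minor once any of them is detected.
final class BeaconTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!, // placeholder
        identifier: "all-my-beacons") // identifier is only for your own bookkeeping

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()
        manager.startMonitoring(for: region) // counts against the 20-region limit
    }

    // Fired when any beacon with the shared UUID comes into range.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard let beaconRegion = region as? CLBeaconRegion else { return }
        manager.startRangingBeacons(in: beaconRegion)
    }

    // Ranging callback: use major/minor to tell the 500 beacons apart.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            print("Beacon \(beacon.major)/\(beacon.minor) proximity: \(beacon.proximity.rawValue)")
        }
    }
}
```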

I'll add to Tim Tisdall's answer, which sets out the right framework. I can't speak to the specific capabilities of the iPhone 5s, or iOS in general, but I don't see any reason why it wouldn't return every ADV_IND packet (i.e. beacon transmission) that it receives.
The question is, will the 500 beacons be able to transmit their ADV_IND packets without collisions?
It takes about 0.128 ms to transmit an ADV_IND packet. The time between advertising transmissions is configurable between 20 ms and 10,240 ms (in steps of 0.625 ms), so the probability of collisions depends on how the beacons are configured.
Based on the Poisson distribution, the probability of a collision for any given ADV_IND packet is 1-exp(-2*N*(0.128/AI)), where N is the number of beacons within range, AI is the time in milliseconds of the advertising interval (assuming all the beacons are configured the same), and the 0.128 is the time in milliseconds it takes to send the ADV_IND packet. (See http://www3.cs.stonybrook.edu/~jgao/CSE590-fall09/aloha-analysis.pdf if you want an explanation.)
For 500 beacons with the maximum advertising interval of about 10 seconds, there will be a collision about once every 81 packets (or about 6 out of 500). If you're willing to wait for a couple intervals (i.e. 30 seconds), there's a good chance you'll be able to receive all 500 ADV_IND packets.
On the other hand, if the advertising interval is smaller, say 500ms, you'll have a collision about 23% of the time (or 113 out of 500). You'd have to wait for several more intervals to improve the probability that you'd see the broadcasts from all the beacons.
The other way to look at it is that the more beacons you have, the longer you have to wait to be sure you've received all their packets. (The math to calculate the delay needed to receive all the packets with a given probability, as a function of the number of beacons and the advertising interval, is too much for me today.)
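If you want to experiment with the numbers yourself, here's the formula above as a small Swift function; it reproduces the two worked examples (the 0.128 ms packet time is taken from above, everything else is an input):

```swift
import Foundation

/// Probability that a given ADV_IND packet collides, per the ALOHA
/// approximation above: 1 - exp(-2 * N * (t / AI)).
/// - beacons: number of beacons in range (N)
/// - intervalMs: advertising interval in ms (AI), assumed equal for all beacons
func collisionProbability(beacons: Int, intervalMs: Double) -> Double {
    let packetMs = 0.128 // ADV_IND transmit time in ms
    return 1 - exp(-2 * Double(beacons) * (packetMs / intervalMs))
}

let p10s = collisionProbability(beacons: 500, intervalMs: 10_240)
print(p10s, 1 / p10s)          // ~0.0124 -> a collision about every 81 packets
print(collisionProbability(beacons: 500, intervalMs: 500)) // ~0.226, i.e. ~23%
```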
One caveat: if you want to connect to these beacons, as opposed to just receiving the ADV_IND packet, the connection setup requires an exchange of two more packets on the advertising channels, so the probability of a collision on the advertising channels goes up a bit.

If I am reading your question right, you want to put all 500 iBeacons within 100 meters of each other, meaning their transmissions will overlap. You will probably run into radio congestion problems long before you run into any limitations of iOS7 or your phone.
I have successfully tested 20 iBeacons in close proximity without problems, but 500 iBeacons is an extreme density. This discussion on the hardware issue suggests you may run into trouble.
At a minimum, collisions between the transmissions of 500 iBeacons will make it take longer for your iOS device to see each one. Normally, iOS7 provides a ranging update once per second for each beacon, but you may find that you get updates much less often. Whether less frequent updates are acceptable depends entirely on your application.
Even if delays are acceptable, I would absolutely test this before counting on it working at all. Unfortunately, that means getting your hands on lots of iBeacons.

I don't agree. It is true that BLE beacons only transmit advertising data, but transmitting that data takes about 3 ms (considering all three advertising channels).
With 500 beacons, without considering any collisions, the scanner will take about 1.5 s to see them all.
But if all the beacons are configured the same way (same advertising interval), collisions are inevitable, which leads to undiscovered beacons. Even if the advertising interval differs between beacons, collisions still occur. To reduce the collision probability you could use a longer advertising interval, but that leads to longer discovery latency.
This reasoning is very rough; it ignores many effects and is just an order-of-magnitude calculation.
In any case, the question is not easy: many parameters play a role, some known and some unknown. But I've been working with BLE for about a year and, to me, 500 is a huge number, and there's a real chance you won't see the majority of the nodes because of collisions.

I was doing some research into iBeacons because of this question (I had no idea what it was about).
It seems that on the "beacon" side of things, all that happens is that general advertising packets are sent out. It's similar to how a device advertises that you can connect to it. However, you don't actually connect to iBeacons; the phone just reads those advertising packets. There's no built-in limitation on how many advertising packets a device can receive.
So, it wouldn't surprise me if 500 iBeacons ran with no issues. The advertising packets are small and spaced out in time (they repeat every X ms). There's no communication from the phone to the iBeacon; the phone simply receives the packets it hears. If there's interference on one packet, it'll likely catch the next one.
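To illustrate that receive-only model, here's a minimal Swift/CoreBluetooth sketch that just listens for advertisement packets. (Caveat: iOS surfaces iBeacon frames through CoreLocation rather than CoreBluetooth, so this shows the general listening mechanism, not iBeacon parsing specifically.)

```swift
import CoreBluetooth

// Minimal sketch: passively receiving BLE advertisement packets.
// No connection is ever made; we just read whatever is broadcast.
final class AdvertisementListener: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Scan for all advertisements; allowing duplicates lets us see every packet.
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Heard \(peripheral.identifier) at RSSI \(RSSI)")
    }
}
```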

Related

Get Latency of Bluetooth Headphones UWP C++

I want to make sure the latency between my app and the bluetooth headphones is accounted for, but I have absolutely no idea how I can get this value. The closest thing I found was:
BluetoothLEPreferredConnectionParameters.ConnectionLatency which is only available on Windows 11... Otherwise there isn't much to go on.
Any help would be appreciated.
Thanks,
Peter
It's very difficult to get the exact latency because it is affected by many parameters, but you're on the right track: the connection parameters are a factor in this equation. I don't have much knowledge of UWP, but I can give you the general parameters that affect speed/latency, and then you can check their availability in the API or even contact the Windows technical team to see if they are supported.
When you make a connection with a remote device, the following factors impact the speed/latency of the connection (a rough worked example of the first two follows the list):-
Connection Interval: this specifies the interval at which the packets are sent during a connection. The lower the value, the higher the speed. The minimum value as per the Bluetooth spec is 7.5ms.
Slave Latency: this is the value you originally mentioned - it specifies the number of connection events the peripheral is allowed to skip when it has no data to send. A value of 0 means the peripheral listens at every connection event, which gives you the fastest, most robust connection.
Connection PHY: this is the modulation on which the packets are sent. If both devices support the 2M PHY, then the connection should be quicker.
Data Length/MTU Extension: these are two separate features, but I am grouping them together because the effect is the same - more bytes are sent per packet, which results in a higher throughput. The maximum value is 251 bytes per packet.
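As a rough sketch of how the first two parameters combine (platform-neutral math, not a UWP API; Swift is used just for consistency with the rest of this page): the worst case before the peripheral must respond is about connection interval × (slave latency + 1).

```swift
/// Rough worst-case delay before a peripheral is obliged to respond:
/// the peripheral may skip up to `slaveLatency` connection events,
/// each `connectionIntervalMs` apart.
func worstCaseLatencyMs(connectionIntervalMs: Double, slaveLatency: Int) -> Double {
    return connectionIntervalMs * Double(slaveLatency + 1)
}

print(worstCaseLatencyMs(connectionIntervalMs: 7.5, slaveLatency: 0)) // 7.5 ms (fastest)
print(worstCaseLatencyMs(connectionIntervalMs: 30, slaveLatency: 4))  // 150 ms
```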
You can find more information about these parameters here:-
A Practical Guide to BLE Throughput
Maximizing BLE Throughput: Everything You Need to Know
Bluetooth 5 Speed - How to Achieve Maximum Throughput
And below are some other links that might help you understand what is supported on UWP:-
Bluetooth Developer FAQ
BluetoothLEConnectionParameters.OptimizedProperty
Bluetooth LE Preferred Connection Parameter Class
Bluetooth LE Connection PHY class
How to Change MTU Size and PHY on Windows UWP C++

Connection interval dependent on transmission frequency?

I'm new to BLE, and to Bluetooth in general, but I'm on a project that involves communication via BT 5.
As the BLE communication has to transmit anywhere from around 2 bytes to 1 MB at a time, I'm looking for a way to optimize the transmission time.
I know the pros and cons of the lowest transmission rate (125 kbps) and the highest (2 Mbps), and of DLE with 251 PDU bytes, but from what I see on different forums and in articles, the throughput mostly depends on the connection parameters, such as the connection interval and the packets per connection event. So where does the transmission rate come in?
I've tried searching this forum for an answer, and several others, and even the BT core specification, but I haven't been able to find a solution to my problem.
If you read my answer at Why is BLE 4.2 faster than BLE 4.1, you can see that there are many components affecting the overall transfer speed.
You first have the radio transmission rate itself, which sets the upper limit.
You then have the per-packet overhead, which becomes less apparent the longer your packets are.
The connection interval and the length of each connection event matter if you want the throughput to be high. If there is only one connection and the Bluetooth chip is not too stupid, the connection event will fill the whole connection interval, and therefore the connection interval doesn't really matter. However, if other conflicting radio events are scheduled in a way that forces the connection event to close, the transmission cannot continue until the next connection event. In that case, the throughput will be higher if you lower the connection interval. So as a summary, it depends heavily on which Bluetooth stack the chip runs, how it's configured by the host, and how many active connections you have.
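To make that concrete, here's a back-of-the-envelope ceiling (a sketch with assumed numbers; the packets-per-event figure in particular depends entirely on the stack):

```swift
/// Back-of-the-envelope throughput ceiling. All inputs are assumptions that
/// depend on the stack: how many data packets fit in one connection event,
/// and how many payload bytes each carries (up to 244 bytes of ATT payload
/// when DLE allows 251-byte PDUs).
func throughputBytesPerSec(payloadBytes: Int,
                           packetsPerEvent: Int,
                           connectionIntervalMs: Double) -> Double {
    let eventsPerSec = 1000.0 / connectionIntervalMs
    return Double(payloadBytes * packetsPerEvent) * eventsPerSec
}

// E.g. 244-byte payloads, 4 packets per event, 30 ms interval:
print(throughputBytesPerSec(payloadBytes: 244, packetsPerEvent: 4,
                            connectionIntervalMs: 30)) // ~32,500 bytes/s
```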
The transmission rate sets your underlying bit rate, but on top of that sit the different layers of the BLE protocol, which reduce the realizable throughput. This article has a general derivation of how the different layers impact throughput, in case that's useful!

BLE slave logic to increase battery duration

I've been looking for answers on the net for two days, and it seems like I can't find my answer, so I'm finally posting it here hoping I've just missed something.
I'm designing a BLE slave device to log humidity in a room twice a day. The device has to run for at least two years before getting recharged.
What is the BLE logic to ensure long battery life?
1) Is a long advertising / connection interval enough?
2) Do I need to implement an RTC with an interrupt to wake up my device and start advertising to get connected?
3) Should I use advertising packets only, and include my data in them?
I think I'm just missing something about Bluetooth Low Energy, and it's getting in the way of creating a BLE device.
Thank you very much for your help, and have a good day!
You can calculate power consumption at https://devzone.nordicsemi.com/power/ for Nordic's chips. If the device is not advertising and has no active connection (i.e. it's just sleeping), it consumes almost no power at all and will definitely run for 2 years, even on a CR2016 battery. So if possible, if you have for example a button that can start advertising only when needed, that would be good.
Otherwise, if you want it to be always available, you must advertise. How long the advertising interval should be depends on how much connection setup latency you can accept. If you have a BLE scanner that scans 100% of the time, the connection setup time will equal the advertising interval. If you have a low-power BLE scanner that only scans, for example, 10% of the time, you'll have to multiply your advertising interval by 10 to get the expected setup time. It all comes down to simple math :)
I'd suggest creating a connection rather than just putting the data in the advertising packets, since then you get an acknowledgement that the data has arrived.
Note that with a connection interval of 4 seconds and a stable constant connection, you can get several years of battery time on a coin cell battery.
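To put that "simple math" in one place, here's a rough Swift sketch; every numeric constant in it (charge per advertising event, sleep current, battery capacity) is a placeholder assumption to be replaced with datasheet values or output from Nordic's power profiler:

```swift
/// Expected connection setup time when the scanner only listens part-time.
func expectedSetupTimeMs(advIntervalMs: Double, scannerDutyCycle: Double) -> Double {
    return advIntervalMs / scannerDutyCycle
}

/// Rough battery life estimate from an average-current model.
func batteryLifeYears(advIntervalMs: Double,
                      chargePerAdvEventMicroAs: Double, // µA·s per advertising event (assumed)
                      sleepCurrentMicroA: Double,       // assumed sleep current
                      batteryCapacitymAh: Double) -> Double {
    let eventsPerSec = 1000.0 / advIntervalMs
    let avgCurrentMicroA = sleepCurrentMicroA + eventsPerSec * chargePerAdvEventMicroAs
    let capacityMicroAh = batteryCapacitymAh * 1000
    let hours = capacityMicroAh / avgCurrentMicroA
    return hours / (24 * 365)
}

print(expectedSetupTimeMs(advIntervalMs: 1000, scannerDutyCycle: 0.1)) // ~10 s

// E.g. 2 s advertising interval, ~10 µA·s per event, 1 µA sleep, CR2016 (~90 mAh):
print(batteryLifeYears(advIntervalMs: 2000, chargePerAdvEventMicroAs: 10,
                       sleepCurrentMicroA: 1, batteryCapacitymAh: 90)) // ~1.7 years
```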

Does RSSI depend on beacon and device?

I've read in many places that RSSI is highly environment-specific (e.g., walls or weather), which can make it difficult to infer which beacon is closest in a Euclidean sense. I also gather that RSSI is measured in arbitrary units from 0 (good signal strength) to -100 (bad signal strength). In spite of these challenges, I have questions about the following two thought experiments concerning the reliability of RSSI values for beacon <--> device communication.
Experiment 1. Given a particular beacon and two devices located at the exact same location, will those two devices register the same RSSI for that beacon?
Experiment 2. Given a particular device and two beacons located at the exact same location, will those two beacons register the same RSSI for that device?
To formalize this in a statistical sense: will p(signal | beacon1, device1) = p(signal | beacon2, device2) if beacon1-device1 are placed in exactly the same environment as beacon2-device2?
Since different antennas and devices have different RF properties, I'm going to go ahead and say that unless your beacons/devices are identical to each other, then no, you should not expect the same RSSI reading, even if their locations are identical. This is because the device cannot know how much power is in an RF signal before it passes through its circuitry, and better and bigger antennas will transmit/receive better than crappier ones.
That said, an RSSI value of 0 will be read as 0 by both devices, and likewise for the maximum RSSI value, assuming the two devices use the same RSSI scale, which doesn't have to be the case. As Wikipedia says: "As an example, Cisco Systems cards have a RSSI_Max value of 100 and will report 101 different power levels, where the RSSI value is 0 to 100. Another popular Wi-Fi chipset is made by Atheros. An Atheros based card will return an RSSI value of 0 to 127 (0x7f) with 128 (0x80) indicating an invalid value."
Anyway, if your devices are identical, then I would expect the readings to be identical as well, or at least very close to each other.
Besides differences in hardware and transmission power, timing also matters. If the interval between two measurements, whether made by the same device/beacon or by different ones, exceeds the channel coherence time, the RSSI can vary drastically. Coherence time in an indoor environment is on the order of 1 s, and 10-100 times smaller outdoors.

Bluetooth Ping Latency

I am currently working on a project involving a Lego Mindstorms kit. The brick is the NXT, and I was curious about its Bluetooth ping rates.
I ran a test of 100 pings and got some interesting results: the latencies seemed to fall into bands. I increased the test to 10,000 pings, which highlighted this trend even more clearly. Does anyone know what could cause this to happen?
In case it is relevant, the distance between the sender and receiver was about 3 metres.
A few reasons:
Buffering, and the internal timers that flush buffers, can cause it.
It also depends on the ping interval (i.e. the time between subsequent pings): the link might go into power-save mode during inactivity, and it takes a finite time to come back up.
The size of the ping packets.
Which Bluetooth profile is being used here?
