Bluetooth advertising and scan - python-3.x

Hello, I'm hoping someone can help me with Bluetooth advertising & scanning. This is for a school STEM project.
I'm working with a Circuit Playground Bluefruit development board and using CircuitPython. I'm trying to implement a social distancing badge that lights up when another user is too near. The general idea is to set up an advertising beacon and then switch to scanning.
I've had a go at the code but would appreciate someone reviewing it. I'm not sure whether the EddystoneUID part works; for example, when I'm scanning, will it only pick up my specific UID?
Thank you
import time

from adafruit_circuitplayground.bluefruit import cpb
import adafruit_ble
from adafruit_ble_eddystone import uid

emitter = adafruit_ble.BLERadio()
advertisement = uid.EddystoneUID(emitter.address_bytes)
receiver = adafruit_ble.BLERadio()

while cpb.switch:
    # The code only runs while the switch is to the left,
    # so there's an option to turn the board off if needed.
    emitter.start_advertising(advertisement)
    time.sleep(0.5)
    emitter.stop_advertising()
    # Advertises its "address" roughly every half second.

    # Then the same board scans for advertisements.
    for found in receiver.start_scan():
        if found.rssi > -80:
            # If the RSSI is stronger than -80 dBm (around 2 m, yet to be confirmed),
            # the NeoPixels flash bright red three times, then turn off.
            for _ in range(3):
                cpb.pixels.fill((250, 0, 0))
                time.sleep(0.2)
                cpb.pixels.fill((0, 0, 0))
                time.sleep(0.2)
    receiver.stop_scan()

It appears that at the moment your code will trigger if it detects any Bluetooth advertisement with an RSSI greater than -80 dBm. Is that what you want? Typically you would check that the other device is actually broadcasting/advertising your notification service.
Typically a service like this would not use just the RSSI number. The advertisement data also includes the transmit power (in dBm) at which the advertisement was broadcast. Using the difference between those two dBm numbers allows a rough approximation of distance, as explained in the following article: https://medium.com/personaldata-io/inferring-distance-from-bluetooth-signal-strength-a-deep-dive-fe7badc2bb6d
If all of your devices are the same and broadcasting at the same dBm then calculating the distance is less important.
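If you do want to turn those two numbers into a rough distance figure, the usual starting point is the log-distance path-loss model. Here is a minimal sketch; the path-loss exponent n and the example TX power value are assumptions you would tune for your badges and environment:

def estimate_distance_m(rssi_dbm, tx_power_dbm, n=2.0):
    """Rough distance estimate from measured RSSI and advertised TX power.

    n is the path-loss exponent: ~2 in free space, larger indoors.
    The result is only an approximation, as the article above explains.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

# Example: TX power -59 dBm (a typical calibrated value at 1 m), measured RSSI -75 dBm
print(estimate_distance_m(-75, -59))  # ~6.3 m under free-space assumptions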
So in conclusion, instead of broadcasting emitter.address_bytes, I would broadcast some specific value that all badges share. Then, when you scan, look in the advertisement.ServiceData for that value. Only if it is your exposure notification should you calculate whether it is too close.
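As a rough illustration of that idea (untested, and built on the same adafruit_ble scanning API the question already uses), the badge loop might look something like this; the timeout and RSSI values are placeholders to tune:

import time
import adafruit_ble
from adafruit_ble_eddystone import uid

radio = adafruit_ble.BLERadio()
badge_advert = uid.EddystoneUID(radio.address_bytes)

while True:
    # Advertise for a moment, then listen, as in the question's loop.
    radio.start_advertising(badge_advert)
    time.sleep(0.5)
    radio.stop_advertising()

    # The class filter means only Eddystone UID advertisements are yielded;
    # phones, laptops and headphones are ignored automatically.
    for adv in radio.start_scan(uid.EddystoneUID, minimum_rssi=-80, timeout=5):
        # Ideally also compare a shared namespace value that every badge is
        # flashed with, so strangers' Eddystone beacons are ignored too
        # (check the adafruit_ble_eddystone docs for the attribute name).
        if adv.rssi > -80:
            pass  # another badge is roughly within 2 m -- flash the pixels here
    radio.stop_scan()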
For further reference, here is a link to the Google/Apple Exposure Notification Bluetooth® Specification that was released last year.

Related

ESP8266 analogRead() microphone Input into playable audio

My goal is to record audio using an electret microphone hooked into the analog pin of an ESP8266 (12E) and then be able to play this audio on another device. My circuit is:
In order to check the output of the microphone I connected the circuit to the oscilloscope and got this:
In the "gif" above you can see the waves made by my voice when talking to microphone.
Here is my code on the ESP8266:
void loop() {
  sensorValue = analogRead(sensorPin);
  Serial.print(sensorValue);
  Serial.print(" ");
}
I would like to play the audio in the Audacity software in order to get an understanding of the result. Therefore, I copied the numbers from the serial monitor and pasted them into the Python code below, which maps the data to the (-1, 1) interval:
def mapPoint(value, currentMin, currentMax, targetMin, targetMax):
    currentInterval = currentMax - currentMin
    targetInterval = targetMax - targetMin
    valueScaled = float(value - currentMin) / float(currentInterval)
    return round(targetMin + (valueScaled * targetInterval), 5)

class mapper():
    def __init__(self, raws):
        self.raws = raws.split(" ")
        self.raws = [float(i) for i in self.raws]

    def mapAll(self):
        self.mappeds = [mapPoint(i, min(self.raws), max(self.raws), -1, 1) for i in self.raws]
        self.strmappeds = str(self.mappeds).replace(",", "").replace("]", "").replace("[", "")
        return self.strmappeds
This takes the string of numbers, maps them onto the target interval (-1, +1) and returns a space (" ") separated string of data ready to import into the Audacity software (Tools > Sample Data Import, then select the text file containing the data). The result of importing data from almost 5 seconds of voice:
is about half a second long, and when I play it I hear unintelligible noise. I also tried lower frequencies, but there was only noise there, too.
The suspected causes for the problem are:
1- The ESP8266 is not capable of reading the analog pin fast enough to return meaningful data (which is probably not the case, since its clock speed is around 100 MHz).
2- The way the software gathers and outputs the data is not the most optimized way (in the loop, Serial.print, etc.).
3- The microphone circuit output is too noisy (which might be the case, but as observed from the oscilloscope test, my voice should make a difference in the output audio, which was not audible in Audacity).
4- The way I mapped and prepared the data for the Audacity.
Is there something else I could try?
Are there similar projects out there? (To my surprise, I couldn't find anything that was done transparently!)
What would be the right way to do this? (It could be a very useful and economical method for recording, transmitting and analyzing audio.)
There are many issues with your project:
You do not set a bias voltage on A0. The ADC can only measure voltages between ground and VCC. With the microphone removed from the circuit, the voltage at A0 should be close to VCC/2. This is usually achieved by adding a voltage divider made of 2 resistors between VCC and GND, connected directly to A0, between the cap and A0.
Also, your circuit looks weird... Is the 47uF cap connected directly to the 3.3V ? If that's the case, you should connect it to pin 2 of the microphone instead. This would also indicate that right now your ADC is only recording noise (no bias voltage will do that).
You do not pace your input, meaning that you do not have a constant sampling rate. That is a very important issue. I suggest you set yourself a realistic target that is well within the limits of the ADC and the limits of your serial port. The transfer rate in bytes/sec of a serial port is usually about equal to baud rate / 8. For 9600 baud, that's only about 1200 bytes/sec, which means that once the samples are converted to text, your max transfer rate drops to about 400 samples per second. This issue needs to be addressed and the maximum calculated before you begin, as the overall attainable sample rate is limited by the slower of the ADC sample rate and the serial transfer rate.
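To put numbers on that before you start, a quick back-of-the-envelope calculation in Python (the 3-characters-per-sample figure is the same rough assumption behind the 400 samples/second estimate above):

def max_sample_rate(baud_rate, chars_per_sample=3):
    """Ceiling on samples/second imposed by the serial link, using the
    rule of thumb above (baud / 8 bytes per second) and assuming each
    sample costs about 3 characters once printed as text plus a space."""
    bytes_per_second = baud_rate / 8
    return bytes_per_second / chars_per_sample

print(max_sample_rate(9600))    # ~400 samples/s, far below audio rates
print(max_sample_rate(115200))  # ~4800 samples/s, still below telephone quality (8000)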
The way to grab samples depends a lot on your needs and what you are trying to do with this project, your audio bandwidth, resolution and audio quality requirements for the application and the amount of work you can put into it. Reading from a loop as you are doing now may work with a fast enough serial port, but the quality will always be poor.
The way that is usually done is with a timer interrupt starting the ADC measurement and an ADC interrupt grabbing the result and storing it in a small FIFO, while the main loop transfers from this ADC fifo to the serial port, along the other tasks assigned to the chip. This cannot be done directly with the Arduino libraries, as you need to control the ADC directly to do that.
Here a short checklist of things to do:
Get the full ESP8266 datasheet from Expressif. Look up the actual specs of the ADC, mainly: the sample rates and resolutions available with your oscillator, and also its electrical constraints, at least its input voltage range and input impedance.
Once you know these numbers, set yourself some targets; the math needed for a successful project needs input numbers. What is your application? Do you want to record audio or just detect a nondescript noise? What are the minimum requirements needed for things to work?
Look up in the Arduino documentation how to set up a timer interrupt and an ADC interrupt.
Look up in the datasheet which registers you'll need to access to configure and run the ADC.
Fix the voltage bias issue on the ADC input. Nothing can work before that's done, and you do not want to destroy your processor.
Make sure the input AC voltage (the 'swing' voltage) is large enough to give you the results you want. It is not unusual to have to amplify a mic signal (with an opamp or a transistor), just for impedance matching.
Then you can start writing code.
This may sound awfully complex for such a small task, but that's what the average day of an embedded programmer looks like.
[EDIT] Your circuit would work a lot better if you simply replaced the 47uF DC blocking capacitor with a series resistor. Its value should be in the 2.2k to 7.6k range, to keep the circuit impedance within the 10k Ohms or so needed by the ADC. This would ensure that the input voltage to A0 stays within the operating limits of the ADC (GND-3.3V on the NodeMCU board, 0-1V with the bare chip).
The signal may still be too weak for your application, though. What is the amplitude of the signal on your scope? How many bits of resolution does that range cover once converted by the ADC? For example, for a 0.1 V peak-to-peak signal (SIG = 0.1), an ADC range of 0-3.3 V (RNG = 3.3) and 10 bits of resolution (RES = 1024), you'll have
binary-range = RES * (SIG / RNG)
             = 1024 * (0.1 / 3.3)
             = 1024 * 0.0303
             = 31.03
A range of 31 means around log2(31) (~= 5) useful bits of resolution. Is that enough for your application?
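The same arithmetic in a couple of lines of Python, if you want to plug in your own scope readings (the 0.1 V figure is just the example value above):

import math

def effective_bits(signal_vpp, adc_range_v=3.3, adc_counts=1024):
    """ADC counts actually used by the signal swing, and the
    corresponding number of useful bits of resolution."""
    counts = adc_counts * (signal_vpp / adc_range_v)
    return counts, math.log2(counts)

counts, bits = effective_bits(0.1)
print(counts, bits)  # ~31 counts -> ~5 useful bits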
As an aside: the ADC will give you positive values with a DC offset. You will probably need to filter the digital output with a DC-blocking filter before playback. https://manual.audacityteam.org/man/dc_offset.html
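If you would rather remove the offset before importing into Audacity, a simple one-pole DC blocker is a standard recipe (nothing specific to this project); a minimal Python sketch:

def dc_block(samples, r=0.995):
    """One-pole DC blocker: y[n] = x[n] - x[n-1] + r * y[n-1].
    r close to 1 removes the DC offset while leaving the audio band mostly intact."""
    out = []
    prev_x = 0.0
    prev_y = 0.0
    for x in samples:
        y = x - prev_x + r * prev_y
        out.append(y)
        prev_x, prev_y = x, y
    return out

# Example: a constant offset of 512 decays toward 0 after the first sample.
print(dc_block([512.0] * 5))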

yj-16009 iBeacon Proximity BLT beacon

I'm making a project with an ESP32-WROOM, so I bought the yj-16009 iBeacon (DataSheet) and I'm trying to get it to work as a wireless Bluetooth proximity sensor like in this Video.
I used this code from the video, and the ESP32 is monitoring and showing BLE scanning results like this. The results shown are after I turned off every Bluetooth device around, so first, I don't understand what it is reading. Second, after I turn on the iBeacon, the results remain in the same range of numbers no matter whether I bring the iBeacon closer or move it farther away. Therefore I came to the conclusion that it doesn't recognize the iBeacon sensor for some reason.
I also downloaded an app named LightBlue, which does recognize the iBeacon sensor.
My question is whether anyone knows how to make the ESP32 recognize the iBeacon sensor. Another thing: I tried to find information about this sensor and there is no info about it anywhere. I have read in other questions here that it might need to be programmed somehow, which I don't know how to do because there is no info online. So if anyone is familiar with this kind of sensor and can help me figure out how to make the iBeacon work like in the video above as a Bluetooth proximity device, it would be a blessing.
The code you reference just scans for any BLE advertisements (iBeacon or otherwise) and prints out the RSSI signal strength of each detection. The reason you do not see the RSSI change when you move the beacon is that the ESP32 is probably picking up non-iBeacon adverts from your phone, laptop and other Bluetooth-enabled devices in the vicinity which are not moving (there are more around you than you think!).
In order to make the device detect iBeacon only (and not all the other devices) you need to change the C code to do a few more things:
Access the bytes of the advertisement payload and use them as follows:
Compare the beginning of these bytes to see if they include the iBeacon byte sequence FF 4C 00 02 15
If the above byte sequence is not in the advertising data, ignore that detection — it is not an iBeacon advert
If it does include that byte sequence, decode the next 16 bytes as the iBeacon uuid, the next two bytes as the major and the next two bytes as the minor. See my answer here: What is the iBeacon Bluetooth Profile, and the parsing sketch after this list.
Print out the identifiers along with the RSSI that the code already prints.
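For reference, here is a rough sketch of that parsing logic in plain Python (not the ESP32 C code you would actually modify); adv_bytes stands for the raw advertisement payload your scan callback receives, and the function name is just for illustration:

IBEACON_PREFIX = bytes.fromhex("ff4c000215")  # manufacturer-data header for iBeacon

def parse_ibeacon(adv_bytes):
    """Return (uuid, major, minor) if adv_bytes contains an iBeacon frame, else None."""
    idx = adv_bytes.find(IBEACON_PREFIX)
    if idx == -1:
        return None  # not an iBeacon advert -- ignore this detection
    body = adv_bytes[idx + len(IBEACON_PREFIX):]
    if len(body) < 20:
        return None  # truncated frame
    uuid = body[0:16].hex()
    major = int.from_bytes(body[16:18], "big")
    minor = int.from_bytes(body[18:20], "big")
    return uuid, major, minor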

Does RSSI depend on beacon and device?

I've read in many places that RSSI is highly environment specific (e.g., walls or weather) which can make it difficult to infer which beacon is the closest in a Euclidean sort of way. I also gather that RSSI is measured in arbitrary units from 0 (good connection strength) to -100 (bad connection strength). In spite of these challenges, I have questions about the following two thought experiments related to the reliability of the RSSI values for beacon <--> device communications.
Experiment 1. Given a particular beacon and two devices located at the exact same location, will those two devices register the same RSSI for that beacon?
Experiment 2. Given a particular device and two beacons located at the exact same location, will those two beacons register the same RSSI for that device?
To formalize this in a statistical sense, will p(signal | beacon1, device1) = p(signal | beacon2, device2) if beacon1-device1 are placed in exactly the same environment as beacon2-device2?
Since different antennas and devices have different RF properties, I'm going to go ahead and say that unless your beacons/devices are identical to each other, then no, you should not expect the same RSSI reading, even if their locations are identical. This is because the device cannot know how much power is in an RF signal before it passes through its circuitry, and better and bigger antennas will transmit/receive better than crappier ones.
That said, an RSSI of 0 will be read as 0 by both devices, and likewise for maximum RSSI values, assuming the two devices use the same RSSI scale, which doesn't have to be the case, as Wikipedia says: "As an example, Cisco Systems cards have a RSSI_Max value of 100 and will report 101 different power levels, where the RSSI value is 0 to 100. Another popular Wi-Fi chipset is made by Atheros. An Atheros based card will return an RSSI value of 0 to 127 (0x7f) with 128 (0x80) indicating an invalid value."
Anyway, if your devices are identical, then I would expect the readings to be identical as well, or at least very close to each other.
Besides the differences in hardware and transmission power, timing is also important. If the interval between two measurements conducted by either the same device/beacon or different ones exceeds the channel coherence time, RSSI may vary drastically. Coherence time in indoor environments is on the order of 1 s, and 10-100 times smaller outdoors.

iBeacon / Bluetooth Low Energy (BLE devices) - maximum number of beacons

I would like to track a large number of beacons (~500) at once within a 50-100 m radius via an app on an iPhone (5s). I've had a look at the spec and online, and I can't see whether there is any limit on the number of beacons you can track at once using BLE. Does anyone know if such a limit exists, or whether an iPhone 5s would be up to the task of tracking that many beacons?
You used the word track, but iOS has two different methods: monitoring and ranging.
You can set a maximum of 20 regions to monitor. (Found in documentation for the startMonitoringForRegion: method.) Region limits mostly come into play if your app is in the background. The OS will alert your app when you enter or leave a region that you're monitoring (give or take a few minutes). The OS will even launch your app just to let it know what happened (although only for a short time).
The other method is ranging, which is to find all the beacons within the Bluetooth range of the device (typically around 100 feet give or take). If your beacons are spread out over 100 miles, then you probably won't run into any practical limit here. I have not found any documentation for this, and I have only four beacons that I'm testing with, and four at a time works.
Here's one way to handle your situation. Make all your 500 beacons use the same UUID, and make a beacon region using the initWithProximityUUID:identifier: method. (The identifier is just for you -- it doesn't affect anything.) Start monitoring for that beacon region. That way, your app will be notified whenever one of your 500 beacons is found (give or take a few minutes). Once notified, you can use startRangingBeaconsInRegion: to find all the beacons around that area, then use the major and minor values to figure out which beacons the user is near.
I'll add to Tim Tisdall's answer, which sets out the right framework. I can't speak to the specific capabilities of the iPhone 5s, or iOS in general, but I don't see any reason why it wouldn't return every ADV_IND packet (i.e. beacon transmission) that it receives.
The question is, will the 500 beacons be able to transmit their ADV_IND packets without collisions?
It takes about 0.128ms to transmit an ADV_IND packet. The time between advertising transmissions is configurable between 20ms and 10240ms (at intervals of 0.625ms), so the probability of collisions depends on the configuration of the beacons.
Based on the Poisson distribution, the probability of a collision for any given ADV_IND packet is 1-exp(-2*N*(0.128/AI)), where N is the number of beacons within range, AI is the time in milliseconds of the advertising interval (assuming all the beacons are configured the same), and the 0.128 is the time in milliseconds it takes to send the ADV_IND packet. (See http://www3.cs.stonybrook.edu/~jgao/CSE590-fall09/aloha-analysis.pdf if you want an explanation.)
For 500 beacons with the maximum advertising interval of about 10 seconds, there will be a collision about once every 81 packets (or about 6 out of 500). If you're willing to wait for a couple intervals (i.e. 30 seconds), there's a good chance you'll be able to receive all 500 ADV_IND packets.
On the other hand, if the advertising interval is smaller, say 500ms, you'll have a collision about 23% of the time (or 113 out of 500). You'd have to wait for several more intervals to improve the probability that you'd see the broadcasts from all the beacons.
The other way to look at it is that the more beacons you have, the longer you have to wait to make sure you receive all their packets. (The math to calculate the delay to receive the packets with a certain probability from the number of beacons and the advertising interval is too much for me today.)
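If you want to play with these numbers yourself, the formula above is easy to reproduce (the 0.128 ms packet time is the same assumption used in the answer):

import math

def collision_probability(n_beacons, adv_interval_ms, packet_ms=0.128):
    """Poisson/ALOHA-style estimate from the answer above:
    P(collision) = 1 - exp(-2 * N * t_packet / adv_interval)."""
    return 1.0 - math.exp(-2.0 * n_beacons * packet_ms / adv_interval_ms)

# Reproduce the two cases discussed above (values are approximate):
print(collision_probability(500, 10240))  # ~0.012 -> about 6 of 500 packets collide
print(collision_probability(500, 500))    # ~0.23  -> about 113 of 500 packets collide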
One caveat: if you want to connect to these beacons, as opposed to just receiving the ADV_IND packet, that requires an exchange of two more packets on the advertising channels, and the probability of a collision in the advertising channels goes up a bit.
If I am reading your question right, you want to put all 500 iBeacons within 100 meters of each other, meaning their transmissions will overlap. You will probably run into radio congestion problems long before you run into any limitations of iOS7 or your phone.
I have successfully tested 20 iBeacons in close proximity without problems, but 500 iBeacons is an extreme density. This discussion of the hardware issue suggests you may run into trouble.
At a minimum, the collisions of the transmissions of 500 iBeacons will make it take longer for your iOS device to see each iBeacon. Normally, iOS 7 provides a ranging update once per second for each beacon in range, but you may find that you get updates much less often. It all depends on your application whether or not less frequent updates are acceptable.
Even if delays are acceptable, I would absolutely test this before counting on it working at all. Unfortunately, that means getting your hands on lots of iBeacons.
I don't agree. It is true that BLE beacons only transmit advertising data, but the transmission of that data lasts about 3 ms (considering the three advertising channels).
With 500 beacons, WITHOUT considering any collisions, the scanner will take about 1.5 s to see them all.
But if all beacons are configured the same way (same advertising interval), it is inevitable that collisions occur, which leads to undiscovered beacons. Even if the advertising interval differs between beacons, collisions still occur. To reduce the collision probability one should use a longer advertising interval, but this leads to a longer discovery latency.
This reasoning is very rough; it doesn't account for many effects, but it is an order-of-magnitude calculation.
By the way, the question is not easy; there are many parameters that play a role, some known and some unknown. But I have been working with BLE for about a year and, to me, 500 is a huge number, and there is the possibility that you won't see the majority of nodes because of collisions.
I was doing some research into iBeacons because of this question (I had no idea what it was about).
It seems that on the "beacon" side of things all that happens is that general advertising packets are sent out. It's similar to how a device advertises that you can connect to it. However, you don't actually connect to iBeacons; the phone just reads those advertising packets. There's no built-in limitation on how many advertising packets a device can receive.
So, it wouldn't surprise me if 500 iBeacons ran with no issues. The advertising packets are small and are spaced out (time-wise, they are repeated every X ms). There's no communication going from the phone to the iBeacon; the phone is simply receiving the packets it hears. If there's interference on one packet, it'll likely manage to get the next one.

Motorola XT910 reads RSSI equal to 0 from Bluetooth Low Energy tags (TI CC2540, TI CC2541, Blue Radios tags)

I develop an Android application running on Motorola RAZR XT910 with OS version 4.0.4.
This application uses the Motorola_ICS_R2_sdkaddon_100 BluetoothGattService.jar and BluetoothGatt.jar libraries and communicates with Bluetooth
Low Energy Sensor Tags (TI CC2540, TI CC2541, Blue Radios Sensor Tags).
During the discovery procedure I always read an RSSI value equal to 0. I use the following code
to read the RSSI value on receiving the BluetoothDevice.ACTION_FOUND intent:
short rssi = intent.getShortExtra(BluetoothDevice.EXTRA_RSSI,(short) 0);
Also, for non-Bluetooth Low Energy devices, the RSSI value I read is fine (not equal to 0).
Can anyone help me?
Thanks
I found a similar issue on the TI discussion site about using the Vendor Specific query for RSSI. It seems that it works for Classic BT but returns an error code of 2 for BLE.
http://e2e.ti.com/support/low_power_rf/f/660/t/289391.aspx
It might be a fundamental limitation that you can't get the value. In your case, getShortExtra might not return the error code (2) and just return 0 as the RSSI.
