I am trying to determine the maximum data transfer speed between an Android mobile phone and a BLE peripheral.
Wikipedia indicates that this is "125 kbit/s – 1 Mbit/s – 2 Mbit/s" (https://en.wikipedia.org/wiki/Bluetooth_Low_Energy#Technical_details)
However, when I tried to implement a POC with a BLE peripheral sending 20-byte notifications every 10 ms (which should give a data transfer speed of 16 kbit/s) and a mobile app subscribing to these notifications, I only received a fraction of the emitted notifications (about 10%, i.e. 1600 bit/s). If I increase the number of notifications sent, the number of notifications received does not increase (sometimes it even decreases).
My tests were done using react-native-ble-plx for the mobile phone and react-native-ble-peripheral for the fake BLE peripheral.
How can I achieve a 1 Mbit/s data transfer speed with a BLE peripheral?
You cannot increase the BLE data transfer speed. However, you can request an increase of the Maximum Transmission Unit (MTU), i.e. the largest packet size, specified in bytes, that can be sent between your BLE central and peripheral:
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
bluetoothGatt.requestMtu(mtu);
}
Then handle the response in your BluetoothGattCallback:
@Override
public void onMtuChanged(BluetoothGatt gatt, int mtu, int status) {
    // mtu is the value actually negotiated with the peripheral;
    // status tells you whether the exchange succeeded.
}
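As a rough sketch of how the two calls fit together (the class name below is made up, and 247 is just an example MTU that the peripheral may or may not accept), you can also ask for a shorter connection interval, which tends to help throughput more than the MTU alone:
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.bluetooth.BluetoothProfile;
import android.os.Build;
import android.util.Log;

public class ThroughputGattCallback extends BluetoothGattCallback {
    @Override
    public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
        if (newState == BluetoothProfile.STATE_CONNECTED
                && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            // Ask for a short connection interval and a larger MTU.
            // Both are requests; the stack and the peripheral may grant less.
            gatt.requestConnectionPriority(BluetoothGatt.CONNECTION_PRIORITY_HIGH);
            gatt.requestMtu(247); // example value, not a guaranteed maximum
        }
    }

    @Override
    public void onMtuChanged(BluetoothGatt gatt, int mtu, int status) {
        if (status == BluetoothGatt.GATT_SUCCESS) {
            // Each notification can now carry up to (mtu - 3) bytes of payload.
            Log.d("BLE", "Negotiated MTU: " + mtu);
        }
    }
}
Even if both requests are granted, the application-level throughput is still limited by how many packets the peripheral can queue per connection event, so the 1 Mbit/s PHY figure is an upper bound rather than something you will see end to end.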
Related
Screen grab from Wireshark showing traffic when the problem occurs
Short question: referring to the Wireshark capture, what is causing the master to send LL_CHANNEL_MAP_IND, and why is it taking so long?
We are working on a hardware/software project that uses the TI WL18xx as a Bluetooth controller. One of the main functions of this product is to communicate with our sensor hardware over a Bluetooth Low Energy connection. We have encountered an issue that we have had difficulty pinpointing, but which we feel may reside in the TI WL18xx hardware/firmware. Intermittently, after a second Bluetooth Low Energy device is connected, the response times for writes and notifications on one of the characteristics of one of the connected devices become very long.
Details
The host product device is running our own embedded Linux image on a TI AM4376x processor. The kernel is 4.14.79 and our communication stack sits on top of BlueZ 5. The Wi-Fi/Bluetooth chip is the Jorjin WG7831-BO, running TIInit_11.8.32.bts firmware version 4.5. It is based on the TI WL1831. The sensor devices we connect to are our own, and we use a custom command protocol that uses two characteristics to perform command handshakes. These devices work very well on a number of other platforms, including Mac, Windows, Linux, Chrome, etc.
The workflow that is causing problems is this:
A user space application allows the user to discover and connect to our sensor devices over BLE, one device at a time.
The initial connection requires a flurry of command/response type communication over the aforementioned BLE characteristics.
Once connected, the traffic is reduced significantly to occasional notifications of new measurements, and occasional command/response exchanges triggered by the user.
A single device always seems stable and performant.
When the user connects to a second device, the initial connection proceeds as expected.
However, once the second device's connection process completes, we start to see the command/response times on the initially connected device become hundreds of times longer.
The second device communication continues at expected speeds.
This problem only occurs with the first device about 30% of the time we follow this workflow.
Traces
Here is a short snippet from a trace log that interleaves our library's debug output with btmon traces.
Everything seems fine until line 4102 of that log, at which point we see the following:
ACL Data TX: Handle 1025 flags 0x00 dlen 22 #1081 [hci0] 00:12:48.654867
ATT: Write Command (0x52) len 17
Handle: 0x0014
Data: 580fd8c71bff00204e000000000000
D2PIO_SDK: GMBLNGIBlobSource.cpp(1532) : Blob cmd sent: 1bh to GDX-FOR 07100117; length = 15; rolling counter = 216; timestamp = 258104ms .
HCI Event: Number of Completed Packets (0x13) plen 5 #1082 [hci0] 00:12:49.387892
Num handles: 1
Handle: 1025
Count: 1
ACL Data RX: Handle 1025 flags 0x02 dlen 23 #1083 [hci0] 00:12:51.801225
ATT: Handle Value Notification (0x1b) len 18
Handle: 0x0016
Data: 9810272f1bd8ff00204e000000000000
D2PIO_SDK: GMBLNGIBlobSource.cpp(1745) : GetNextResponse(GDX-FOR 07100117) returns 1bh cmd blob after 3139=(261263-258124) milliseconds.
The elapsed time reported by GetNextResponse() for most cmds should be < 30 milliseconds. Response times were short when we opened and sent a bunch of cmds to device A.
The response times remained short when we opened and sent a bunch of cmds to device B. But on the first subsequent cmd sent to device A, the response time is more than 3 seconds!
Note that a Bluetooth radio can only do one thing at a time, receive or transmit, on one single frequency. If you have two connections and two connection events happen at the same time, the firmware must decide which one to prioritize and which one to skip. Maybe the firmware isn't smart enough to handle your specific case. Try other connection parameters to see if something gets better. You can also try another Bluetooth dongle from a different manufacturer.
I have an external ADC that can sample data at 40 ksps, and the controller needs to poll the data from it. I cannot reach that sampling rate by repeatedly reading it from a while(1) loop; I only get roughly 8 ksps. The ADC has no external pin to notify the controller that data is ready.
1) How can I make the polling fast enough to reach that sampling rate?
2) Since I am sampling at that rate, I also need to transfer the data over USB simultaneously, so how should I implement the buffering scheme to get a minimum delay between consecutive data packets?
FYI:
My SYSCLK: 168 MHz
SPI clock: 10.5 MHz
Controller: STM32F4
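For a sense of the budget: 40 ksps leaves only 1 s / 40 000 = 25 µs per conversion, whereas my current 8 ksps corresponds to about 125 µs per conversion. Assuming one 24-bit frame per conversion, the SPI transfer itself at 10.5 MHz only takes roughly 24 / 10.5 MHz ≈ 2.3 µs, so I suspect the per-call software overhead, not the SPI clock, is what limits me.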
If I were to use the ALTBeacon library in Android, what is the fastest scan rate for individual bluetooth low energy (BLE) beacon packets that can be captured and recorded into memory?
This is defined by the underlying Android Bluetooth LE scanning APIs. I am unaware of a limit, but 10 Hz is typical, and I have tested up to 40 Hz from a single device.
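As a minimal sketch of the knobs available on the library side (the class name and scan-period values below are only illustrative, not documented limits), you can ask the AltBeacon library to scan back-to-back:
import android.content.Context;
import org.altbeacon.beacon.BeaconManager;

public class FastScanConfig {
    // Illustrative helper: configure near-continuous scanning with the
    // AltBeacon Android Beacon Library.
    public static void apply(Context context) {
        BeaconManager beaconManager = BeaconManager.getInstanceForApplication(context);
        beaconManager.setForegroundScanPeriod(1100L);      // 1.1 s scan window
        beaconManager.setForegroundBetweenScanPeriod(0L);  // no pause between windows
    }
}
Even with back-to-back scanning, the packet rate you can record is bounded by the beacon's own advertising interval; a beacon advertising every 100 ms yields roughly the 10 Hz figure mentioned above.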
I'm new to USB development, and I'm quite confused about what data rates are realistic.
I'm trying to develop an external sound card connected to an AVR32 processor, which supports USB Full Speed (12 Mb/s). I'll use USB Audio Class 1 to send the audio data to a PC. I need to send 24-bit, 48 kHz, 2-channel audio as INput to the computer, but also 24-bit, 48 kHz, 1-channel audio as OUTput from the computer. Streaming both ways.
That gives me a data rate of 24 bit * 48 kHz * 3 channels = 3.5 Mb/s, which should be possible using USB Full Speed?
I understand that the audio class sends data via isochronous transfers, but I'm confused about how many transactions (e.g. IN = 256 bytes) it is possible to make in one frame. According to the USB specification (http://www.usb.org/developers/docs/usb20_docs/#usb20spec -> Table 5-4), it seems to be possible to send more than one transaction per frame?
Is it possible to send both IN and OUT packets within one frame?
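My rough per-frame arithmetic, assuming the standard 1 ms full-speed frame: 48 000 samples/s ÷ 1 000 frames/s = 48 samples per frame, so the IN stream needs 48 × 2 channels × 3 bytes = 288 bytes per frame and the OUT stream needs 48 × 1 × 3 = 144 bytes per frame, which (if I read Table 5-4 correctly) fits within the 1023-byte maximum of a single full-speed isochronous transaction.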
Thanks in advance!
I have an application that sends 8000 audio packets per second. Initially, for experimentation purposes, I prepare a buffer of 8 audio packets and then make an ioctl call, passing the buffer to my driver.
I am using a USB analyser. From the analyser I can see that the inter-packet gap (IPG) is usually around 20-40 µs, which is fine, but sometimes it shows 200-300 µs. Is it the USB subsystem (USB core/HCD) that plays a role here, or my ioctl implementation? Making more than 1000 ioctl calls and copy_from_user() calls per second may be the culprit behind the late submission of packets. Moreover, I am using USB 3.0, which is capable of a 5 Gbps data rate.
The code flow in my driver looks like this:
switch (cmd)
{
case SEND_AUDIO:
    /* Copy the user-space buffer holding 8 audio packets into kernel memory. */
    copy_from_user(..., ..., ...);
    /* Submit the packets one at a time; each usb_bulk_msg() call blocks
       until that transfer completes. */
    for (i = 0; i < 8; i++) {
        usb_bulk_msg(...);
    }
    break;
}