I'm in a situation where we are hooking up to a device that may speak a variety of different baud rates depending on the model, some of which may be non-standard, like 10000, but that's another problem for another day.
Ideally I could use Qt to auto-detect the baud rate, but from my research that's likely not possible for a few reasons, which I'm okay with. However, is there any native Linux-based method to auto-detect the baud rate of the connected device? Even a third-party open-source application could suffice.
Linux serial drivers don't support autobauding because most hardware doesn't support it, and there's no agreement on how it should work: it's highly application-specific.
If you're using FTDI serial adapters, most of them support bit-bang mode, and you can use one in that mode as a digital oscilloscope to capture a bitstream that's very easy to autobaud on.
On other devices, the simplest way towards autobauding is to set the port to 2-3x the highest baud rate you expect, treat the input data like a chunked digital-oscilloscope capture, taking framing-error bits into account, and use heuristics to detect the baud rate. It will succeed in a surprising number of cases, but you must get the statistical model of the data source right. I don't know of any pre-canned solutions for that.
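As an illustration of the core heuristic, here is a sketch that assumes you already have a raw line capture at a known, sufficiently high sample rate (e.g. from the FTDI bit-bang trick above); the shortest run of identical samples approximates one bit time:

#include <stddef.h>
#include <stdint.h>

/* Sketch: estimate the baud rate of a captured bitstream sampled at
 * sample_rate Hz. The shortest run of identical samples approximates
 * one bit time. Real code should filter out one-sample glitches first,
 * e.g. by using a run-length histogram instead of a bare minimum. */
double estimate_baud(const uint8_t *samples, size_t n, double sample_rate)
{
    size_t shortest = n;
    size_t run = 1;

    for (size_t i = 1; i < n; i++) {
        if (samples[i] == samples[i - 1]) {
            run++;
        } else {
            if (run < shortest)
                shortest = run;
            run = 1;
        }
    }
    if (shortest == n)
        return 0.0;                       /* no transitions in the capture */
    return sample_rate / (double)shortest;
}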
Some additional kernel support could be had to better timestamp the input from the UART (whether hardware or USB) and thus decrease the uncertainty in your data and, with it, the number of samples you need to take to detect the baud rate.
Some of which may be non-standard, like 10000, but that's another problem for another day.
No biggie. I figured it out 16 years ago :) This is the answer you're looking for. If you think that the API is sick as in very, very sick, then you'd be right.
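In case it helps the next reader: one Linux-specific route to non-standard rates like 10000, and quite possibly the "very, very sick" API alluded to above, is the termios2/BOTHER ioctl interface. A minimal sketch:

#include <asm/termbits.h>   /* struct termios2, TCGETS2/TCSETS2, BOTHER */
#include <sys/ioctl.h>

/* Sketch: set an arbitrary baud rate (e.g. 10000) via termios2.
 * Linux-specific; do not include <termios.h> in the same file, as the
 * kernel and glibc definitions collide. */
int set_custom_baud(int fd, int baud)
{
    struct termios2 tio;

    if (ioctl(fd, TCGETS2, &tio) < 0)
        return -1;
    tio.c_cflag &= ~CBAUD;
    tio.c_cflag |= BOTHER;       /* take the rate from c_ispeed/c_ospeed */
    tio.c_ispeed = baud;
    tio.c_ospeed = baud;
    return ioctl(fd, TCSETS2, &tio);
}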
I'm building a Linux-based cashless device and trying to communicate with the VMC in vending machines directly over UART, without needing an additional hardware adapter to convert between 8-bit and 9-bit frame data.
I'm only using the cashless device; I have no intention of connecting any other peripheral to the VMC.
I've read earlier questions about this: some stressed the need for an adapter, others suggested possible hacks to achieve the 9-bit to 8-bit conversion, but I still can't find a confirmed, stable, working solution.
My question is: is it possible (and reliable) to achieve this using a pure software solution, and how?
Thanks
Yes.
The 9th bit is a control bit: it indicates whether the frame is to be interpreted as an address or as data. If you are communicating with one device and sending only data, you want to strip the 9th bit out and look only at data frames. Check and see if it's always zero:
if (controlBit == 0) {
    ProcessData(byte);                               /* data frame */
} else {
    printf("This is an address: 0x%02X\n", byte);    /* address frame */
}
EDIT:
Many people have reported that your connection will not be stable without special hardware due to timing problems.
Instead of reinventing the wheel, you can use open-source code as a starting point.
https://github.com/mhaqs/vendiverse/wiki/Programming-the-VMC
This way you don't have to make the same mistakes over and over again.
I am currently working on a Raspberry Pi (Raspbian Jessie/Stretch). The issue is that I want to connect two FTDI FT2232H adapters serially at 12 Mbps, but because 12 Mbps is not a standard speed, Raspbian does not let me add that baud rate. I would like to know if someone has transmitted at that speed, or if someone knows how to achieve a bit rate of 12 Mbps given the maximum baud rate in Raspbian (4,000,000).
PS: I changed the UART clock to 64,000,000, modified the "termbits.h" header, and created termios structures, but nothing worked.
Thanks.
The data sheet for the FT2232H does advertise support for 12 Mbaud (not 12 Mbps). But it looks like it comes in different modules with support for RS232, RS422, and RS485, the most typical being RS232.
I've never heard of anyone operating an RS232 connection at 12,000,000 baud. The typical maximum that almost everything supports is 115,200. The highest I've seen is 921,600. Typical RS232 cables start running into interference issues at the higher baud rates.
I suspect the 12 Mbaud spec is for RS422/RS485 operation, which requires different cabling and is designed for higher speeds.
If you're using an FT2232H with RS232, the speeds you're looking for are likely unrealistic. If you're using it with RS422/RS485 you can probably get there, but it will be a much more specialized endeavor. Linux does appear to support RS485, but there's not nearly as much documentation out there as for RS232.
Can you provide any more information about the USB adapters you're using?
I had an assignment for college where we needed to play a precompiled WAV, stored as an integer array, through PWM and a DAC. I wanted more of a challenge, so I went out of my way and created an audio DAC over USB using the microcontroller in question, the STM32F051. It basically listens to my sound card output using a WASAPI loopback recorder, reduces the resolution from 16 to 12 bits (since the DAC on the STM32 only has 12-bit resolution), and sends it over USART using 10x the sample rate as the baud rate (in my case 960,000). All done in C#.
On the microcontroller I simply use an interrupt for the USART and push the received data to the DAC.
It works pretty well, much better than PWM, and at a decent sample frequency of 48kHz.
But... here it comes: when there is a (mostly) high-pitched symphonic melody, it starts to sound "wobbly".
Here a video where you can hear it: https://youtu.be/xD3uTP9etuA?t=88
I read up a bit about DIY DACs on the internet, and someone somewhere (I don't remember where) mentioned that MCUs in general have interrupt jitter. So my basic question is: is interrupt jitter actually causing this? If so, are there ways to limit the jitter?
Or is this something entirely different?
I am thinking of trying to compact the PCM data sent over serial. As said before, the resolution is 12 bits, but each sample is sent as two 8-bit packets forming 16 bits, hence the baud rate is twice the sample rate. My plan is to shift the 12 bits to the MSB and add four bits of the next 12-bit value to the current 16-bit variable, so only 12 byte transfers are needed instead of 16 per 8 samples (I might read up on more efficient ways of compacting data for transport). I would then put the samples in a buffer and use another timer that triggers at 48 kHz to send them to the DAC. Would this concept work, or would I just be wasting time?
For code, here is the project: https://github.com/EldinZenderink/SoundOverSerial
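For the packing step specifically, here is a minimal sketch of the idea, two 12-bit samples into three bytes, so 8 samples take 12 byte transfers instead of 16 (the names are illustrative, not taken from the linked project):

#include <stddef.h>
#include <stdint.h>

/* Sketch: pack pairs of 12-bit samples into 3 bytes (24 bits). */
size_t pack12(const uint16_t *samples, size_t n, uint8_t *out)
{
    size_t o = 0;

    for (size_t i = 0; i + 1 < n; i += 2) {
        uint16_t a = samples[i]     & 0x0FFF;
        uint16_t b = samples[i + 1] & 0x0FFF;

        out[o++] = (uint8_t)(a >> 4);               /* a[11:4]          */
        out[o++] = (uint8_t)((a << 4) | (b >> 8));  /* a[3:0] | b[11:8] */
        out[o++] = (uint8_t)b;                      /* b[7:0]           */
    }
    return o;                                       /* bytes written */
}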
I am trying to figure out what the maximum throughput of a Bluetooth 2.1 SPP connection is.
I found two publications concerned with the topic (1, 2), and they both show diagrams of the throughput as a function of the signal-to-noise ratio (which I can assume to be perfect for my purposes) and the kind of ACL packet used. My problem is, I have no idea which ACL packets are used. How is this decision made? Is it made on the fly, as in "whatever is needed to transfer the current data is used"?
Furthermore, in the Serial Port Profile specification (chapter 2.3) I found this sentence:
This profile requires support for one-slot packets only. This means that this profile
ensures that data rates up to 128 kbps can be used. Support for higher rates is optional.
The last sentence really confuses me. How do I find out whether this "option" applies in my case?
This means that in SPP mode, all Bluetooth modules should work up to 128 kbps, and some modules may work even faster.
Under SPP sits RFCOMM, which tries to deliver the packets as quickly as possible. If one one-slot packet is sent per timeslot, you get the 128 kbps. The firmware of the Bluetooth module or the HCI driver, however, can do things differently.
There are SPP speeds of up to 480 kbps reported; however, this requires that both SPP modules are from the same vendor (e.g. BlueGiga iWrap modules can do this speed).
At the other end of the scale, if you're connecting to an unknown device, for example a BT112 or an RN41 module to an Android device, the actual usable SPP bandwidth can be much lower than 128 kbps (I have measurements as low as 10 kbps).
In the case of some old-generation iPhones, the usable SPP bandwidth is around 8 kbps.
It is a wise idea to treat "standards" and "datasheets" very conservatively, and to do your own measurements if actual net data bandwidth is critical.
Even though BT+EDR has a theoretical on-the-air bit rate of 3 Mbps (1 Mbps for basic-rate BT), the actual usable net data bandwidth is way smaller.
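If you want such a measurement yourself on Linux, here is a minimal sketch that times a bulk write over an RFCOMM TTY (the /dev/rfcomm0 path assumes the channel was bound beforehand, e.g. with "rfcomm bind 0 <bdaddr>"; the transfer size is arbitrary):

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/rfcomm0", O_WRONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    unsigned char buf[1024];
    memset(buf, 0xA5, sizeof buf);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    size_t total = 0;
    for (int i = 0; i < 256; i++) {          /* 256 KiB total */
        ssize_t n = write(fd, buf, sizeof buf);
        if (n < 0) { perror("write"); return 1; }
        total += (size_t)n;
    }
    tcdrain(fd);                             /* wait until actually sent */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (double)(t1.tv_sec - t0.tv_sec)
                + (double)(t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("net throughput: %.1f kbps\n", (double)total * 8.0 / secs / 1000.0);
    close(fd);
    return 0;
}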
Hello there. I am a newbie trying to prove the working of an RS232 full-modem port and also an RS422 port (RX, TX, RTS, CTS).
These two ports are on a custom-designed board, and I need to prove they are working.
I am able to confirm they work at the register level, but I need to prove they work using software like Minicom or some other custom program.
How can I prove these ports work from one PC to a different PC using DB9 connections, and with a loopback too?
Can someone help me with this? Do I need any extra hardware to prove this works in Linux?
The most common type of serial port test is probably a loopback test. Create a test fixture that connects the output pins of the port to the input pins (TX->RX, RTS->CTS, etc.). If you do not have a matching input pin for every output pin, you will need to do a three-way connection.
After you create the loopback, you will need to write software that exercises the pins. If TX and RX are connected, you can send a byte and verify that it was echoed back. For the control pins, toggle them and make sure the other side of the connection saw the transition. Make sure you exercise every pin of the serial port.
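A minimal sketch of such an exercise on Linux (the device path is an assumption, and real code should check every return value):

#include <fcntl.h>
#include <sys/ioctl.h>
#include <termios.h>
#include <unistd.h>

/* Sketch: loopback exercise assuming TX->RX and RTS->CTS are jumpered. */
int loopback_test(const char *dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetspeed(&tio, B115200);
    tio.c_cc[VMIN]  = 0;
    tio.c_cc[VTIME] = 10;               /* 1 s read timeout */
    tcsetattr(fd, TCSANOW, &tio);

    /* Data loopback: every byte written must be echoed back. */
    unsigned char out = 0x55, in = 0;   /* 0x55: alternating bit pattern */
    write(fd, &out, 1);
    if (read(fd, &in, 1) != 1 || in != out)
        return -2;

    /* Control-line loopback: assert RTS, expect CTS to follow. */
    int rts = TIOCM_RTS, status = 0;
    ioctl(fd, TIOCMBIS, &rts);          /* set RTS */
    ioctl(fd, TIOCMGET, &status);
    if (!(status & TIOCM_CTS))
        return -3;
    ioctl(fd, TIOCMBIC, &rts);          /* clear RTS again */

    close(fd);
    return 0;
}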
Note that you should run a TX->RX data loopback at multiple baud rates. It is possible for there to be a signal integrity issue in the design that only shows up at higher baud rates. It is also possible for there to be a bad signal connection on the board that is masked by inductance and capacitance at higher baud rates. Therefore, it is a good idea to run a data loopback at the slowest baud rate, the fastest baud rate, and 1-2 in the middle.
Another thing you should do is a baud rate accuracy test. This will prove the clock driving the UART is running at the right frequency and being divided properly. Transmit X bytes at a certain baud rate and verify they arrive in the expected amount of time. To get an accurate number, you will need to bypass any buffering in the OS serial port driver (e.g. use direct register I/O), and make sure to account for the start/stop-bit overhead.
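The expected-time arithmetic, as a sketch (assuming 8N1 framing, i.e. 10 bits on the wire per byte):

#include <stddef.h>

/* Sketch: expected wire time for a transfer at 8N1 (start + 8 data +
 * stop = 10 bits per byte). E.g. 11520 bytes at 115200 baud should take
 * ~1.0 s; a significant deviation points at a clock/divider problem. */
double expected_seconds(size_t nbytes, double baud)
{
    return (double)nbytes * 10.0 / baud;
}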
However, a loopback test is not exhaustive. It only proves the device can talk to itself. The device may still have some flaw (e.g. voltage levels) that cannot be detected locally. So, you should also run some tests with external hardware. Cable your board to another system and run a test (e.g. with minicom). Make sure they can talk to each other.
Even an external communication test can miss something. Your board can still have poor signal quality that merely happens to be good enough for the other device. To accurately verify the signal quality and timing, you need an oscilloscope.
While you are running communication tests, connect the scope probes to the various signals and verify the signal integrity. Make sure that voltage levels are valid, that you see clean low/high bit transitions, and that the timing of the data pin appears correct for the specified baud rate. (A scope can be a more accurate way to measure baud rate than the software-based method described earlier.)