I have an XV-11 lidar sensor from an old vacuum cleaner and I want to use it for a robot project.
During my research I saw a very interesting and simple approach that uses Matplotlib to display all the distances as scatter points, e.g. https://udayankumar.com/2018/08/01/working-with-lidar/. But when I run that Python code on an RPi 3, a Matplotlib window does pop up with all the distances; the refresh rate is just far too slow to view in real time. The Matplotlib display falls a few dozen seconds behind the sensor readings.
My next idea was to do something myself with the following display code, but I got the same result: good readings, but delayed a lot.
import numpy as np
import matplotlib.pyplot as plt

points = []
plt.ion()

# ... inside the loop that reads one (angle, dist_mm) pair per lidar sample:
angle_rad = np.radians(angle)
x = dist_mm * np.cos(angle_rad)
y = dist_mm * np.sin(angle_rad)
points.append([x, y])

pts = np.array(points)
plt.scatter(pts[:, 0], pts[:, 1])   # re-plots every accumulated point on each reading

if angle == 356:                    # one full revolution collected
    plt.draw()
    plt.pause(0.0001)
    plt.clf()                       # clears the whole figure, axes included
    points = []
    print("-----------")
What am I doing wrong, or what can I improve in this case? My expectation is something like this:
Lidar animation, source: https://github.com/Hyun-je/pyrplidar (although that example uses a different lidar sensor).
You are clearing and re-creating the axes, background etc. every time. At the very least you can limit this drawing/re-drawing to only the relevant plot points for a degree of improvement.
If you're not familiar with this, I'd start with the animation guidance at https://matplotlib.org/stable/api/animation_api.html, which introduces some basics like updating only parts of the figure.
If you're still churning out too much data to update, then limiting the frequency with which you read your data, or more specifically the rate at which you redraw, might result in more stability too.
It is probably also worth hunting down more general guidance on real-time plotting, e.g. "update frame in matplotlib with live camera preview".
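Not tested against an XV-11, but a minimal sketch of the "update only the artist" approach with FuncAnimation could look like the following; read_sweep() is a hypothetical placeholder for whatever collects one revolution of (angle, distance) pairs from your lidar:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

fig, ax = plt.subplots()
scat = ax.scatter([], [], s=2)
ax.set_xlim(-6000, 6000)   # fix the limits once so the axes are not rebuilt each sweep
ax.set_ylim(-6000, 6000)

def update(_frame):
    angles_deg, dists_mm = read_sweep()        # placeholder: one full revolution
    angles_rad = np.radians(angles_deg)
    xy = np.c_[dists_mm * np.cos(angles_rad),
               dists_mm * np.sin(angles_rad)]
    scat.set_offsets(xy)                       # move the existing points only
    return scat,                               # blitting needs the changed artists back

ani = FuncAnimation(fig, update, interval=50, blit=True)
plt.show()

Because the scatter artist is reused and blit=True only redraws the changed region, this avoids rebuilding the whole figure on every revolution.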
I am using a plotting tool named 'trend' on Linux (Raspbian GNU/Linux 11 (bullseye)).
I am plotting live data from an ADC to get an (almost) real-time plot.
The problem I see is that after the initial screen-wide plot, the historic plot stays visible as well, which is distracting.
How do I get rid of the historic plot?
Picture of what I have, with the real-time line and the historic data line (the one which is faint on the left and gets stronger toward the right side of the screen).
The manual of 'trend' is somewhat unfinished and can be found under this link: https://www.thregr.org/~wavexx/software/trend/files/trend.1.html
It would be even better if there were a more elaborate manual for the 'trend' tool. As you may guess, a Google search for 'trend' brings up all kinds of trading, plotting and statistical results...
My question is: I want to calculate the speed of my arm for slap detection. I am using OpenPose to get the body points (25 in total, using the body_25 model), and from these together with the time I want to deduce the speed of my arm. I googled through OpenPose, Stack Overflow and GitHub, but could not succeed.
Velocity = Distance / Time = dx/dt
dx = frame_3_bodypoints - frame_1_bodypoints
dt = ?
I don't know how to find dt from OpenPose. Is there a way I can get it? Any thoughts would be a great help!
I've never used OpenPose, but Newtonian physics would indicate that a slap corresponds to a sudden change in the velocity of the hand.
I think it's a reasonable first approximation to assume that the Δt between frames is constant. Instantaneous variation in frame rate is called jitter, and I would expect jitter to be small for modern recording devices. In any case, I don't know how to get the instantaneous frame rate with the tools (OpenCV, PIL) that I am familiar with, and I couldn't find any references to frame rate or time in the OpenPose docs.
For calculating velocity and delta-velocity, you have choices. Straight-up linear velocity of the hand is probably the easiest. For position changes, use the Euclidean distance between positions (Δs = sqrt((x2-x1)^2 + (y2-y1)^2)).
You could also calculate an angular velocity between the hand and the elbow, but that would be a little more involved and prone to noise.
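As a minimal sketch of the linear-velocity option, assuming you already have the body_25 keypoints for each frame and take Δt from the video's nominal frame rate (OpenCV's CAP_PROP_FPS reports the average rate, not the instantaneous one; the file name is a placeholder):

import cv2
import numpy as np

cap = cv2.VideoCapture("slap.mp4")      # hypothetical input video
fps = cap.get(cv2.CAP_PROP_FPS)         # nominal (average) frame rate
dt = 1.0 / fps                          # assume constant spacing between frames

RIGHT_WRIST = 4                         # index of RWrist in the body_25 layout

def hand_speeds(keypoints_per_frame):
    """keypoints_per_frame: list of (25, 3) arrays (x, y, confidence), one per frame."""
    speeds = []
    for prev, curr in zip(keypoints_per_frame, keypoints_per_frame[1:]):
        x1, y1 = prev[RIGHT_WRIST][:2]
        x2, y2 = curr[RIGHT_WRIST][:2]
        ds = np.hypot(x2 - x1, y2 - y1)  # Euclidean distance in pixels
        speeds.append(ds / dt)           # pixels per second
    return speeds

A sudden spike in the returned speeds (or in their frame-to-frame difference) is then the candidate for the slap.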
I'm on macOS 10.13.4 and I want to write a GUI in Python 3 that can measure intervals on a scanned electrocardiogram or ECG (the heart rhythm of a patient). I have some programming knowledge in Python, but I'm not a professional computer scientist.
For this I need to do the following steps:
Rotation: Rotate the image so that the ECG baseline is perfectly horizontal (the ECG on the scan can be rotated by a few degrees of angle); see the rotation sketch after this list. Here is an image of an ECG which needs to be rotated:
Set the scale: ECG paper always has a grid of squares which defines the time on the horizontal axis (40 ms for a small square and 200 ms for a big square) and the amplitude of the signal in mV on the vertical axis (0.1 mV for a big square). I want to measure unit time and unit amplitude and set them as the reference. Here is an image of the scale:
Measure intervals: Measure the differences, in terms of time interval and amplitude, between the different ECG points (P wave, QRS complex, T wave). Here is an image of the different intervals:
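For the rotation step, a minimal sketch using only PIL could look like this, assuming the skew angle has already been estimated somehow (angle_deg and the file names are placeholders):

from PIL import Image

angle_deg = 2.5                                   # hypothetical measured skew
img = Image.open("ecg_scan.png")
deskewed = img.rotate(angle_deg,
                      expand=True,                # keep the corners in frame
                      resample=Image.BICUBIC,
                      fillcolor="white")          # pad with a paper-like background
deskewed.save("ecg_deskewed.png")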
For the GUI, I'm thinking of using Tkinter (any other thoughts?).
What would be a simple solution for this in terms of image processing?
- I tried to install OpenCV but got stuck during the installation.
- Is the Python Imaging Library (PIL) a good alternative?
- Any other thoughts?
Many thanks in advance!
I keep getting "USRP Overflow detected". I've read this is no big deal, but it's really annoying. I'm plotting a 40 MHz BW at 20 MSPS. This is an N210 and I'm connected through a switch.
It seems to plot fine, but the scale on the Y-axis is constantly changing. Can I fix this?
Finally, the X-axis is from 0 to 500e-3. This makes no sense to me given my settings. Can someone please help me understand this?
In response to the question "It seems to plot fine, but the scale on the Y-axis is constantly changing. Can I fix this?": you can bring up the plot menu using the small down arrow on the plot view. From there select Settings..., and under the Plot section there are fields for the plot min and max, which default to AUTO.
"USRP Overflow detected; I've read this is no big deal ..."
It really is a big deal. It means your PC was not fast enough at processing the samples that came from the USRP, so some samples had to be dropped. This is the worst thing that can happen to your signal.
You will need to make your signal processing faster (for example, instead of processing everything live, first store the samples to an SSD and process them offline later, or buy a significantly faster PC if you think that would help your specific application), or reduce the sampling rate.
"I'm plotting a 40 MHz BW at 20 MSPS"
Nyquist says you're not. You can't observe 40 MHz of bandwidth with 20 MS/s; it's mathematically impossible.
"It seems to plot fine, but the scale on the Y-axis is constantly changing. Can I fix this?"
I don't know the graphical sinks of REDHAWK, but this sounds like autoscaling, so yes, you can probably disable that feature.
"Finally, the X-axis is from 0 to 500e-3. This makes no sense to me given my settings. Can someone please help me understand this?"
You don't tell us what you're plotting. Time values given some trigger, converting complex samples to their magnitude? Or is it some kind of power spectrum?
In the latter case, this is most probably normalized frequency for a real signal; you have to read it as "frequency in units of the sampling rate".
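A quick sanity check of that reading, assuming the axis really is frequency in units of the sampling rate:

fs = 20e6             # 20 MS/s
x_max = 500e-3        # right edge of the plotted X-axis
print(x_max * fs)     # 10e6 -> 0.5 * fs = 10 MHz, the Nyquist edge for a real signal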
I wish to calculate the position of a small remote-controlled car (relative to its starting position). The car moves on a flat surface, for example a room floor.
I am using an accelerometer and a gyroscope; to be precise, this board: http://www.sparkfun.com/products/9623
As a first step I just took the accelerometer data in the x and y axes (since the car moves on a surface) and double-integrated the data to get position. The formulae I used were:
vel_new = vel_old + (acc_old + (acc_new - acc_old) / 2.0) * SAMPLING_TIME;
pos_new = pos_old + (vel_old + (vel_new - vel_old) / 2.0) * SAMPLING_TIME;
vel_old = vel_new;
pos_old = pos_new;
where acc_new is the value just measured by the accelerometer.
The above formulae are based on this application note: http://perso-etis.ensea.fr/~pierandr/cours/M1_SIC/AN3397.pdf
However, this gives a horrible error.
After reading other similar questions on this forum, I found out that I need to subtract the gravity component from the above acceleration values (each time from acc_new), somehow using the gyroscope. This idea is explained very well in the Google Tech Talk "Sensor Fusion on Android Devices: A Revolution in Motion Processing" at 23:49.
Now my problem is: how do I subtract that gravity component?
I get angular velocity from the gyroscope. How do I convert it into acceleration so that I can subtract it from the output of the accelerometer?
It won't work; these sensors are not accurate enough to calculate position.
The reason is also explained in the video you are referring to.
The best you could do is get the velocity from the rpm of the wheels. If you also know the heading that goes with that velocity, you can integrate the velocity to get position. Far from perfect, but it should give you a reasonable estimate of the position.
I do not know how you could get the heading of the car; it depends on the hardware you have.
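In case it helps, here is a minimal sketch of that wheel-odometry idea (velocity from wheel rpm plus a known heading, integrated over time); wheel_rpm() and heading_rad() are hypothetical placeholders for whatever your hardware actually provides:

import math
import time

WHEEL_CIRCUMFERENCE_M = 0.20          # assumed wheel circumference
x, y = 0.0, 0.0                       # position relative to the start
last = time.monotonic()

while True:
    now = time.monotonic()
    dt = now - last
    last = now

    v = wheel_rpm() / 60.0 * WHEEL_CIRCUMFERENCE_M   # m/s from wheel speed
    theta = heading_rad()                            # heading in radians

    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt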
I'm afraid Ali's answer is quite true when it comes to those devices. However, why don't you try searching for "arduino dead reckoning"? That will turn up stories of people trying similar boards.
Here's a link that came up in such a search and that I think may help you:
Converting IMU to INS
Even though it seems like all of them failed, you may come across workarounds that reduce the errors to acceptable amounts, or you can calibrate your algorithm against some other sensor to put it back on track, since the accumulated squared error of the acceleration, along with the gyro's white noise, destroys the accuracy.
One reason you have a huge error is that the equation appears to be wrong.
Example: to get the updated velocity, use vel_new = vel_old + ((acc_old + acc_new) / 2) * SAMPLING_TIME.
It looks like you had an extra accel term in the equation. Using this alone will not correct all the error; you also need, as you say, to correct for the influence of gravity and other things.