I am using an MPU6050 IMU to map the path of a device (with the starting point as the origin). For this I need to convert the accelerometer and gyroscope readings into Cartesian coordinates. I think I need to continuously sample the accelerometer readings and keep adding (integrating) each sample to the previous point for each axis. At startup the previous point will be (0, 0, 0).
I know this on paper, but I don't think it will be that simple. How will I know when the device is moving backwards, i.e., towards the origin?
The MPU6050 provides acceleration and gyro readings on all axes, and I have already fetched those values. But I don't know how to continue. So what I need is an "inertial navigation system" which takes the acceleration and angular velocity vectors as well as the current position as input and returns the new position. I know this will accumulate errors, but I am not concerned about that for now.
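To make the integration loop concrete, here is a minimal planar (2-D) dead-reckoning sketch. It is an illustration of the idea, not a complete INS: it assumes gravity has already been subtracted from the accelerometer reading, and it will drift quickly in practice. Note that "moving backwards" falls out of the math naturally: deceleration integrates into a negative velocity, which then decrements the position. All names here are my own.

```python
import math

def ins_step(pos, vel, heading, accel_body, gyro_z, dt):
    """One dead-reckoning update (planar case, for illustration).

    pos, vel   -- (x, y) tuples in the world frame
    heading    -- yaw angle in radians, integrated from the gyro
    accel_body -- (ax, ay) accelerometer reading in the body frame (m/s^2),
                  assumed gravity-compensated
    gyro_z     -- yaw rate from the gyroscope (rad/s)
    dt         -- sample period (s)
    """
    # Integrate the gyro to track orientation.
    heading += gyro_z * dt
    # Rotate the body-frame acceleration into the world frame.
    c, s = math.cos(heading), math.sin(heading)
    ax = c * accel_body[0] - s * accel_body[1]
    ay = s * accel_body[0] + c * accel_body[1]
    # Integrate acceleration -> velocity -> position.
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel, heading
```

A full 3-D version replaces the single heading angle with a quaternion or rotation matrix updated from all three gyro axes, but the structure (rotate, then doubly integrate) is the same.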
If someone can guide me in this that would be great. Any hints or pointers will be appreciated.
Kiran G
Kiran,
To answer that question it would help to know what kind of gyro you are using or plan to use. The approach differs considerably depending on whether the output is an analog signal (voltage or current loop) or some kind of (usually serial) bus.
Please note that you will most likely also have to filter the signal based on the expected dynamics of the environment.
Related
I am very new to Python and programming in general, so I am hoping someone might be able to point me in the right direction in my search for a solution.
In controlling a linear actuator, I need to be sure the speed is constant over varying loads.
The motor controller receives byte values from 0 (stopped) to 127 (full speed) over serial to control speed.
The position is a voltage from 0 V to 5 V, with 5 V being fully retracted and 0 V being fully extended.
I have looked at simple-pid, but I'm not sure I understand how I would apply it in this case.
Essentially, every 0.x seconds I would check the position relative to the previous position, calculate the speed, and then update the speed value being sent over the serial port so that I reach my speed setpoint.
Could someone detail what should go where, and specifically what the tunings are for?
https://pypi.org/project/simple-pid/
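To show where the pieces go, here is a stripped-down PID class mirroring the structure of what simple-pid does internally, followed by the loop you describe as comments. The gains kp/ki/kd are "the tunings": kp reacts to the current speed error, ki removes steady-state error under load, and kd damps overshoot. `read_position` and `send_speed_byte` are hypothetical stand-ins for your serial I/O; the gain values shown are made-up starting points.

```python
class PID:
    """Minimal PID controller (simplified sketch of what simple-pid provides)."""
    def __init__(self, kp, ki, kd, setpoint, output_limits=(0, 127)):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.lo, self.hi = output_limits
        self._integral = 0.0
        self._last_error = None

    def update(self, measured_speed, dt):
        error = self.setpoint - measured_speed
        self._integral += error * dt
        derivative = 0.0 if self._last_error is None else (error - self._last_error) / dt
        self._last_error = error
        out = self.kp * error + self.ki * self._integral + self.kd * derivative
        # Clamp to the byte range the motor controller accepts.
        return max(self.lo, min(self.hi, out))

# Control loop sketch (read_position / send_speed_byte are hypothetical):
#   pid = PID(kp=40, ki=5, kd=1, setpoint=0.5)   # target speed: 0.5 V/s
#   prev = read_position()
#   while True:
#       time.sleep(DT)
#       pos = read_position()
#       speed = (pos - prev) / DT      # speed estimated from position delta
#       prev = pos
#       send_speed_byte(int(pid.update(speed, DT)))
```

With the real simple-pid library, the `PID` object plays the same role: you construct it with the gains and setpoint, set `output_limits = (0, 127)`, and call it with the measured speed each cycle.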
Thanks!
I am trying to use a K-Team Khepera III robot for basic navigation and obstacle avoidance. I can successfully read the IR proximity sensors using MATLAB over the Bluetooth COM port (command 'N'). However, I am unable to make sense of those values. The values range from 0 (no obstacle nearby) to around 3916 (obstacle touching the sensor). This does not give me distance in any sense. I am looking for any guidance on how to 'decode' these sensor readings.
I also tried using the ultrasonic sensors, but the distance readings don't seem very accurate to me. (I am unsure how the number of echoes affects the accuracy; I just set it to 1.)
I am sure that the readings are correct and not garbage values (confirmed using the K-Team's Interface Khepera III software on Windows).
I would really appreciate any help or resource that explains how to use these sensors in detail.
Thanks in advance.
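One common approach with raw IR proximity values like these (which measure reflected-light intensity, not distance, and so depend on the target's surface) is to calibrate empirically: place a target at several known distances, record the raw reading at each, and interpolate between the recorded points at run time. A sketch, with an entirely made-up calibration table — the actual numbers must come from your own measurements:

```python
# Hypothetical calibration table: (raw reading, distance in cm), sorted by
# descending raw value. Measure these yourself for each sensor and target.
CAL = [(3916, 0.5), (3000, 1.0), (1500, 2.5), (600, 5.0), (200, 10.0), (0, 25.0)]

def raw_to_cm(raw):
    """Convert a raw IR reading to distance by piecewise-linear interpolation."""
    if raw >= CAL[0][0]:
        return CAL[0][1]                     # saturated: target touching sensor
    for (r1, d1), (r2, d2) in zip(CAL, CAL[1:]):
        if r2 <= raw <= r1:
            t = (raw - r2) / (r1 - r2)       # position between the two points
            return d2 + t * (d1 - d2)
    return CAL[-1][1]                        # below table: max calibrated range
```

Because the intensity-vs-distance curve is strongly nonlinear and varies with reflectivity, many people skip the conversion entirely and threshold the raw values directly for obstacle avoidance ("turn away if any sensor exceeds N").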
I know this question has been asked many times. Until yesterday I thought the answer was "yes, it is possible, but you cannot obtain an accurate estimate of your position". My idea is to hold a BLE badge in my hand and, with four other devices positioned on the ceiling, obtain my current position using trilateration. After weeks of research, I concluded that this method could not be as accurate as I'd like, so I moved on.
Now, what about this video? Youtube by Loopd.
They use Bluetooth badges, but how do they obtain these results?
Thanks to everyone
The results of Bluetooth LE indoor location can be quite accurate, but it requires some processing of the raw signals rather than simple triangulation. Essentially you weight different beacons differently in your position calculation based on how far away they are and filter to smooth the result.
There is a working example as open source at http://vor.space/
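To illustrate the weighting idea (this is a generic sketch, not what vor.space or Loopd specifically do): convert each beacon's RSSI to an approximate distance with a log-distance path-loss model, then weight nearer beacons more heavily in a weighted centroid. The `tx_power` and `n` parameters are assumptions that must be calibrated for your space.

```python
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model: rssi = tx_power - 10*n*log10(d).

    tx_power -- RSSI measured at 1 m from the beacon (assumed value)
    n        -- environment-dependent path-loss exponent (assumed value)
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

def weighted_position(beacons):
    """beacons: list of ((x, y), rssi). Nearer beacons get more weight."""
    total_w = x = y = 0.0
    for (bx, by), rssi in beacons:
        d = rssi_to_distance(rssi)
        w = 1.0 / max(d, 0.1) ** 2          # inverse-square distance weighting
        total_w += w
        x += w * bx
        y += w * by
    return x / total_w, y / total_w
```

On top of this you would smooth successive position estimates (e.g. an exponential moving average or Kalman filter), since raw RSSI is very noisy indoors.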
I was wondering if there is an existing API for tracking the tops of people's heads with the Kinect, e.g., when the Kinect is facing downwards from a ceiling.
If not, how might I implement such a thing with its depth data?
No. The Kinect expects to be facing a standing (or seated, given the appropriate flag) human. All APIs (official or 3rd party) that have a notion of skeleton tracking expect this.
If you wish to track someone from above, you will need to use a library such as OpenCV (or EmguCV, for C# development). Well, you don't have to, but they offer utilities to help with computer vision and image processing. These libraries don't care whether you are using a Kinect or just a regular RGB camera.
Using the Kinect from above, you could use the depth data to help locate and track blobs. With the Kinect at a known distance from the floor, have a few people walk under it and see what z-coordinates you get out of it -- you can then assume that anything within a certain z-coordinate range is a person walking across the screen (vs. a cat, or something else).
You will need to use standard image processing techniques (see OpenCV reference above) to initially find the blobs within the image. Once found, the depth data from the Kinect might be useful but I think you'll find it isn't ultimately necessary if you're just watching people walk across the floor.
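The core of that approach — threshold the depth frame to a head-height z-range, then group the surviving pixels into blobs — can be sketched without OpenCV using a simple flood-fill connected-components pass (OpenCV's `connectedComponents` does this faster and more robustly; the toy grid and z values below are assumptions):

```python
def find_person_blobs(depth, z_min, z_max, min_size=3):
    """Label 4-connected blobs whose depth falls in [z_min, z_max].

    depth: 2-D list of per-pixel distances from the ceiling-mounted Kinect.
    Returns a list of blobs, each a list of (row, col) pixel coordinates.
    """
    rows, cols = len(depth), len(depth[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or not (z_min <= depth[r][c] <= z_max):
                continue
            # Flood-fill one connected component of in-range pixels.
            stack, blob = [(r, c)], []
            seen[r][c] = True
            while stack:
                y, x = stack.pop()
                blob.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx]
                            and z_min <= depth[ny][nx] <= z_max):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if len(blob) >= min_size:        # discard specks (cats, noise)
                blobs.append(blob)
    return blobs
```

The `min_size` cut implements the "person vs. cat" filter from above: anything in head-height range but too small to be a person is dropped. Tracking across frames then just means matching blob centroids between successive frames.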
We built a Kinect-driven experience where the sensors had to point downward to detect users walking along a wall. We used openTSPS to do all the work of taking the camera input and doing blob detection and handing off tracked "persons" to (in our case) a Processing app. It works really well for us.
http://opentsps.com/
Given that GPS will not work precisely in closed environments like rooms, I'm interested to know whether the accelerometer can be used to find the position of an object relative to a certain point. If not, what other technology does the iPhone 4 provide for this?
Thanks
The accelerometer cannot reliably provide location information, even with the aid of the gyroscope. GPS is the best you are likely to get.
OTOH, work-arounds abound in the augmented-reality space. Consider the ARDefender game.