I have a CSV file with traffic density data per road segment of a certain highway, measured in annual average daily traffic (AADT). Now I want to visualize this data.
Since I have the locations (lat and lon) of the road segments, my idea is to create lines between these points and give each line a color that corresponds to its AADT value. So, for example, road segments/lines with high AADT would be marked red and those with low AADT green.
Which package should I use for this visualization?
It is difficult to say without any information about the structure of the data.
Is it just points? Is it a shapefile? You should probably start with geopandas.
You can use Plotly as well, just as shown in the link.
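As a minimal sketch of the geopandas approach (the column names AADT, lat_start, lon_start, lat_end, lon_end are assumptions about the CSV layout and would need to match your file):

```python
import pandas as pd
import geopandas as gpd
import matplotlib.pyplot as plt
from shapely.geometry import LineString

# Assumed CSV layout: one row per road segment with start/end coordinates and AADT.
df = pd.read_csv("aadt_segments.csv")

# Build one LineString per road segment from its start and end points.
geometry = [
    LineString([(row.lon_start, row.lat_start), (row.lon_end, row.lat_end)])
    for row in df.itertuples()
]
gdf = gpd.GeoDataFrame(df, geometry=geometry, crs="EPSG:4326")

# Color lines by AADT: a reversed red-yellow-green colormap so high traffic shows up red.
ax = gdf.plot(column="AADT", cmap="RdYlGn_r", linewidth=3, legend=True)
ax.set_title("AADT per road segment")
plt.show()
```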
I have a signal: points in the frequency domain (nanometers, then converted to terahertz), along with magnitude levels in mW. My signal looks like the attached picture. I would like to know a way to calculate the center frequency.
One theory suggests finding the -3 dB cutoff frequencies on both ends. However, I could not find out how to do that, so please tell me how to calculate the -3 dB cutoff frequencies so that I can apply the following formula: (f1 + f2)/2.
Or suggest a better way of finding the center frequency.
You could perform this measurement as an OBW (occupied bandwidth) measurement. -3 dB is the point where the signal power has dropped to half (50%) of the total power, measured in watts.
The way to do it manually is to get the whole signal spectrum into an Excel table, for example 1000 points, measure the total power Ptot, and start accumulating power from the lowest frequency until you reach 25% of Ptot; the frequency at that point will be Flow. Do the same, but starting from the highest frequency, until you reach 25% of Ptot; that will be Fhigh. The center will be (Flow + Fhigh)/2.
Sorry if it's not very clear, but if you look for OBW measurements you should find better explanations on the net. Most modern spectrum analyzers have this function built in.
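A minimal numerical sketch of that manual OBW procedure (assuming the spectrum is available as arrays of frequency in THz and power in mW; the 25% edge thresholds follow the description above):

```python
import numpy as np

def obw_center_frequency(freq_thz, power_mw, edge_fraction=0.25):
    """Center frequency from an OBW-style measurement.

    Accumulate power from the low-frequency end until edge_fraction of the
    total power is reached (f_low), do the same from the high end (f_high),
    and return (f_low + f_high) / 2.
    """
    freq = np.asarray(freq_thz, dtype=float)
    power = np.asarray(power_mw, dtype=float)  # power must be linear (mW/W), not dB
    order = np.argsort(freq)
    freq, power = freq[order], power[order]

    p_total = power.sum()

    # First frequency where the cumulative power from the bottom exceeds 25% of Ptot.
    cum_low = np.cumsum(power)
    f_low = freq[np.searchsorted(cum_low, edge_fraction * p_total)]

    # Same from the top: cumulative power over the reversed spectrum.
    cum_high = np.cumsum(power[::-1])
    f_high = freq[::-1][np.searchsorted(cum_high, edge_fraction * p_total)]

    return (f_low + f_high) / 2.0
```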
I am looking for real-time image processing to measure a velocity and output it to another control system.
I have attached an image of a yellow stripe. It has markings on the surface that I would like to detect automatically and use for the calculation. In the first step the material moves in one direction only, here for example to the right. Only the horizontal part of the movement is of interest to me, i.e. effectively only the velocity along the x-axis. But the material moves relatively fast: at maximum speed, within 28 ms the current mark (spike) is at the position of the one in front of it.
The idea is to use a Raspberry Pi 4 with a camera at its maximum of 120 fps. So every 8.3 ms a picture is generated, which should make it possible to clearly detect the movement of the marks.
My questions are:
Is it possible to process the images and do the detection fast enough to get the velocity in near real time? And which algorithm should I use for this configuration? It would be best if I could use two or three markers per image and average their velocities.
And I would like to use the velocity as an input signal for another system. What is the easiest and fastest way to send the information directly to another control system?
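Not an authoritative answer, but a minimal sketch of one possible approach under the assumptions described above (consecutive grayscale frames, movement only along x): estimate the horizontal shift between two frames with OpenCV's phase correlation and convert it to a velocity using the frame rate and a pixel-to-millimetre scale factor that would have to be calibrated for the real setup.

```python
import cv2
import numpy as np

FPS = 120.0            # camera frame rate from the question
MM_PER_PIXEL = 0.05    # assumption: calibration factor, must be measured for the real setup

def horizontal_velocity(prev_frame, curr_frame):
    """Estimate horizontal velocity (mm/s) from two consecutive grayscale frames."""
    # phaseCorrelate expects single-channel float32 images of identical size.
    prev_f = np.float32(prev_frame)
    curr_f = np.float32(curr_frame)
    (dx, dy), _response = cv2.phaseCorrelate(prev_f, curr_f)
    # Only the x component matters here; dy is ignored by design.
    return dx * MM_PER_PIXEL * FPS

# Example loop with a camera (device index 0 is an assumption):
# cap = cv2.VideoCapture(0)
# ok, prev = cap.read()
# prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
# while True:
#     ok, frame = cap.read()
#     if not ok:
#         break
#     gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
#     v = horizontal_velocity(prev, gray)
#     prev = gray
#     # send v to the control system here, e.g. over a serial port or a UDP socket
```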
To get familiar with front-end web development, I'm creating a weather app. Most of the tutorials I found display the temperature, humidity, chance of rain, etc.
Looking at the Dark Sky API, I see the "Time Machine Request" returns observed weather conditions, and the response contains a 'precipIntensity' field: The intensity (in inches of liquid water per hour) of precipitation occurring at the given time. This value is conditional on probability (that is, assuming any precipitation occurs at all).
So it made me wonder about creating a 'radar image' of precipitation intensity.
Assuming other weather APIs are similar, is generating a radar image of precipitation as straightforward as:
Create a grid of latitude/longitude coordinates.
Submit a request for weather data for each coordinate.
Build a color-coded grid of received precipitation intensity values and smooth between them.
Or would that be considered a misuse of the data?
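A rough sketch of the gridded approach described above, with the per-point API call left as a hypothetical fetch_precip_intensity(lat, lon) placeholder and a simple Gaussian blur standing in for the smoothing step (this is an illustration of the idea, not a recommendation):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

def fetch_precip_intensity(lat, lon):
    """Hypothetical placeholder: a real version would request precipIntensity
    from a weather API for this coordinate."""
    return 0.0

# 1. Build a grid of latitude/longitude coordinates (example bounding box is an assumption).
lats = np.linspace(29.0, 31.0, 20)
lons = np.linspace(-99.0, -97.0, 20)

# 2. Request precipitation intensity for every grid point.
grid = np.array([[fetch_precip_intensity(lat, lon) for lon in lons] for lat in lats])

# 3. Smooth and color-code the values.
smoothed = gaussian_filter(grid, sigma=1.0)
plt.imshow(smoothed, origin="lower",
           extent=[lons.min(), lons.max(), lats.min(), lats.max()],
           cmap="Blues")
plt.colorbar(label="precipIntensity (in/hr)")
plt.show()
```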
This would most likely end up as a very low resolution product. I will explain.
Weather observations come in from sources ranging from mesonet stations and airports to programs like the Citizen Weather Observer Program. All of these thousands of observations are fed into the NOAA MADIS system, a centralized server that stores all observations. The companies that provide the APIs pull the data from MADIS.
The problem with the observed conditions is twofold. The first is that the stations are highly clustered in urban areas. In Texas, for example, there are hundreds of stations in central Texas near the cities of San Antonio and Austin, but 100 miles west there is essentially nothing. Generating a radar image with this method would involve extreme interpolation.
The second problem is observation time. The input from rain gauges is often delayed by several minutes to an hour or more, which would give inaccurate data.
If you want a gridded product, the best answer would be to use MRMS (Multi-Radar/Multi-Sensor) data from the NWS. It is not an API; these are .grib files that must be downloaded and processed. This is the live viewer, and if you want to work on the data itself you can use the NOAA Weather and Climate Toolkit to view and/or process it via GUI or batch processing (you can export to GeoTIFF and colorize it with GDAL tools). The actual MRMS data is located here, and for the basic usage you are looking for, you could use the latest data in the "MergedReflectivityComposite" folder (that is how other radar apps show rain). If you want actual precipitation intensity, check the "PrecipRate" folder.
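A minimal sketch of reading one of those MRMS GRIB2 files with xarray and the cfgrib engine (the file name is a placeholder for a downloaded, decompressed product; a PrecipRate grid can be plotted the same way):

```python
import xarray as xr
import matplotlib.pyplot as plt

# Placeholder file name: e.g. the latest file from the MergedReflectivityComposite folder.
ds = xr.open_dataset("MRMS_MergedReflectivityComposite_latest.grib2", engine="cfgrib")

# These products typically contain a single 2-D field on a lat/lon grid.
field = ds[list(ds.data_vars)[0]]
field.plot(cmap="turbo")   # quick colorized view
plt.title("MRMS merged reflectivity composite")
plt.show()
```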
For anything else except radar (warning polygons, etc) the NWS has an API that is located here.
If you have other questions, I will be happy to help.
I ran an impulse response analysis on a value-weighted stock index and a few variables in Python and got the following results:
I am not sure how to interpret these results.
Can anyone please help me out?
You might want to check the book "New Introduction to Multiple Time Series Analysis" by Helmut Lütkepohl (2005) for a fairly dense treatment of the theory behind the method.
In the meantime, a simple way to interpret your plots is this. Let's say your variables are VW, SP500, oil, uts, prod, cpi, n3 and usd. They are all parts of the same system; what the impulse response analysis does is assess how much one variable impacts another, independently of the other variables. It is therefore a pairwise shock from one variable to another. Your first plot is VW -> VW, which is pretty much an autocorrelation plot. Now look at the other plots: apparently SP500 exerts the largest impact on VW (you can see a peak in the blue line reaching 0.25). The y-axis is given in standard deviations and the x-axis in lag periods. So in your example, a shock to SP500 causes a 0.25 change in VW at the lag of whatever is on your x-axis (I can't see it from your figure). Similarly, you can see n3 negatively impacting VW at a given period.
There is an interesting link, which you probably already know, that shows an example application of the Python statsmodels VAR for impulse response analysis.
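For reference, a minimal sketch of what such an analysis looks like with statsmodels (the file name, variable names and lag settings are assumptions based on the plots described above):

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# df: one column per variable, e.g. ['VW', 'SP500', 'oil', 'uts', 'prod', 'cpi', 'n3', 'usd']
df = pd.read_csv("returns.csv", index_col=0, parse_dates=True)

model = VAR(df)
results = model.fit(maxlags=12, ic="aic")   # let AIC pick the lag order

# Impulse responses up to 10 periods ahead; orth=True uses orthogonalized shocks.
irf = results.irf(10)
irf.plot(orth=True)               # grid of pairwise response plots like those in the question
irf.plot_cum_effects(orth=True)   # cumulative responses, sometimes easier to read
```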
I used this method to assess how one variable impacts another in a plant-water-atmosphere system; there are some explanations there, as well as interpretations of similar plots. Take a look:
Use of remote sensing indicators to assess effects of drought and human-induced land degradation on ecosystem health in Northeastern Brazil
Good luck!
When lowering a borehole seismometer down a borehole there is little control over the orientation of the horizontal components. Once we have estimated or calculated the angle, we need to rotate the reference system to produce traces of seismic data in the North and East directions.
http://docs.obspy.org/packages/autogen/obspy.core.stream.Stream.rotate.html shows ways of changing between ZNE, ZRT, LQT, and LQR. I don't need to do this, and can't do it with data that is not yet in any of those reference systems. I just want to input two traces that represent seismic data in some pair of perpendicular horizontal directions and produce two traces that represent the seismic data in the North and East directions.
Sorry for the really late answer; we don't usually monitor Stack Overflow. In any case, the current ObsPy master (you will have to update to the dev version if you are still on 0.9.2) has a new function to rotate arbitrarily oriented components to ZNE:
https://github.com/obspy/obspy/blob/a8cf88bfc28b7d06b88427ca626f67418922aa52/obspy/signal/rotate.py#L188-251
Hope it helps!
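As a rough usage sketch (assuming a recent ObsPy version where this function is exposed as obspy.signal.rotate.rotate2zne; the file name, the vertical trace, and the horizontal azimuths are assumptions for the example):

```python
from obspy import read
from obspy.signal.rotate import rotate2zne

st = read("borehole_3c.mseed")   # placeholder file containing Z, H1, H2 traces
z, h1, h2 = st[0], st[1], st[2]

# Estimated sensor orientation (degrees clockwise from north) of the two
# horizontal components; these numbers are assumptions for the example.
azimuth_h1 = 37.0
azimuth_h2 = azimuth_h1 + 90.0

# rotate2zne takes three components with their azimuth and dip and returns
# data arrays aligned with Z, N and E.
z_data, n_data, e_data = rotate2zne(
    z.data, 0.0, -90.0,          # vertical component: azimuth 0, dip -90 (SEED convention)
    h1.data, azimuth_h1, 0.0,    # first horizontal component
    h2.data, azimuth_h2, 0.0,    # second horizontal component
)

# Overwrite the horizontal traces with the rotated data and relabel the channels.
h1.data, h2.data = n_data, e_data
h1.stats.channel = h1.stats.channel[:-1] + "N"
h2.stats.channel = h2.stats.channel[:-1] + "E"
```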