python3: Split time series by diurnal periods
I have the following dataset:
01/05/2020,00,26.3,27.5,26.3,80,81,73,22.5,22.7,22.0,993.7,993.7,993.0,0.0,178,1.2,-3.53,0.0
01/05/2020,01,26.1,26.8,26.1,79,80,75,22.2,22.4,21.9,994.4,994.4,993.7,1.1,22,2.0,-3.54,0.0
01/05/2020,02,25.4,26.1,25.4,80,81,79,21.6,22.3,21.6,994.7,994.7,994.4,0.1,335,2.3,-3.54,0.0
01/05/2020,03,23.3,25.4,23.3,90,90,80,21.6,21.8,21.5,994.7,994.8,994.6,0.9,263,1.5,-3.54,0.0
01/05/2020,04,22.9,24.2,22.9,89,90,86,21.0,22.1,21.0,994.2,994.7,994.2,0.3,268,2.0,-3.54,0.0
01/05/2020,05,22.8,23.1,22.8,90,91,89,21.0,21.4,20.9,993.6,994.2,993.6,0.7,264,1.5,-3.54,0.0
01/05/2020,06,22.2,22.8,22.2,92,92,90,20.9,21.2,20.8,993.6,993.6,993.4,0.8,272,1.6,-3.54,0.0
01/05/2020,07,22.6,22.6,22.0,91,93,91,21.0,21.2,20.7,993.4,993.6,993.4,0.4,284,2.3,-3.49,0.0
01/05/2020,08,21.6,22.6,21.5,92,92,90,20.2,20.9,20.1,993.8,993.8,993.4,0.4,197,2.1,-3.54,0.0
01/05/2020,09,22.0,22.1,21.5,92,93,92,20.7,20.8,20.2,994.3,994.3,993.7,0.0,125,2.1,-3.53,0.0
01/05/2020,10,22.7,22.7,21.9,91,92,91,21.2,21.2,20.5,995.0,995.0,994.3,0.0,354,0.0,70.99,0.0
01/05/2020,11,25.0,25.0,22.7,83,91,82,21.8,22.1,21.1,995.5,995.5,995.0,0.8,262,1.5,744.8,0.0
01/05/2020,12,27.9,28.1,24.9,72,83,70,22.3,22.8,21.6,996.1,996.1,995.5,0.7,228,1.9,1392.,0.0
01/05/2020,13,30.4,30.4,27.7,58,72,55,21.1,22.6,20.4,995.9,996.2,995.9,1.6,134,3.7,1910.,0.0
01/05/2020,14,31.7,32.3,30.1,50,58,48,20.2,21.3,19.7,995.8,996.1,995.8,3.0,114,5.4,2577.,0.0
01/05/2020,15,32.9,33.2,31.8,44,50,43,19.1,20.5,18.6,994.9,995.8,994.9,0.0,128,5.6,2853.,0.0
01/05/2020,16,33.2,34.4,32.0,46,48,41,20.0,20.0,18.2,994.0,994.9,994.0,0.0,125,4.3,2700.,0.0
01/05/2020,17,33.1,34.5,32.7,44,46,39,19.2,19.9,18.5,993.4,994.1,993.4,0.0,170,1.6,2806.,0.0
01/05/2020,18,33.6,34.2,32.6,41,47,40,18.5,20.0,18.3,992.6,993.4,992.6,0.0,149,0.0,2319.,0.0
01/05/2020,19,33.5,34.7,32.1,43,49,39,19.2,20.4,18.3,992.3,992.6,992.3,0.3,168,4.1,1907.,0.0
01/05/2020,20,32.1,33.9,32.1,49,51,41,20.2,20.7,18.5,992.4,992.4,992.3,0.1,192,3.7,1203.,0.0
01/05/2020,21,29.9,32.2,29.9,62,62,49,21.8,21.9,20.2,992.3,992.4,992.2,0.0,188,2.9,408.0,0.0
01/05/2020,22,28.5,29.9,28.4,67,67,62,21.8,22.0,21.7,992.5,992.5,992.3,0.4,181,2.3,6.817,0.0
01/05/2020,23,27.8,28.5,27.8,71,71,66,22.1,22.1,21.5,993.1,993.1,992.5,0.0,225,1.6,-3.39,0.0
02/05/2020,00,27.4,28.2,27.3,75,75,68,22.5,22.5,21.7,993.7,993.7,993.1,0.5,139,1.5,-3.54,0.0
02/05/2020,01,27.3,27.7,27.3,72,75,72,21.9,22.6,21.9,994.3,994.3,993.7,0.0,126,1.1,-3.54,0.0
02/05/2020,02,25.4,27.3,25.2,85,85,72,22.6,22.8,21.9,994.4,994.5,994.3,0.1,256,2.6,-3.54,0.0
02/05/2020,03,25.5,25.6,25.3,84,85,82,22.5,22.7,22.1,994.3,994.4,994.2,0.0,329,0.7,-3.54,0.0
02/05/2020,04,24.5,25.5,24.5,86,86,82,22.0,22.5,21.9,993.9,994.3,993.9,0.0,290,1.2,-3.54,0.0
02/05/2020,05,24.0,24.5,23.5,87,88,86,21.6,22.1,21.3,993.6,993.9,993.6,0.7,285,1.3,-3.54,0.0
02/05/2020,06,23.7,24.1,23.7,87,87,85,21.3,21.6,21.3,993.1,993.6,993.1,0.1,305,1.1,-3.51,0.0
02/05/2020,07,22.7,24.1,22.5,91,91,86,21.0,21.7,20.7,993.1,993.3,993.1,0.6,220,1.1,-3.54,0.0
02/05/2020,08,22.9,22.9,22.6,92,92,91,21.5,21.5,21.0,993.2,993.2,987.6,0.0,239,1.5,-3.53,0.0
02/05/2020,09,22.9,23.0,22.8,93,93,92,21.7,21.7,21.4,993.6,993.6,993.2,0.0,289,0.4,-3.53,0.0
02/05/2020,10,23.5,23.5,22.8,92,93,92,22.1,22.1,21.6,994.3,994.3,993.6,0.0,256,0.0,91.75,0.0
02/05/2020,11,26.1,26.2,23.5,80,92,80,22.4,23.1,22.2,995.0,995.0,994.3,1.1,141,1.9,789.0,0.0
02/05/2020,12,28.7,28.7,26.1,69,80,68,22.4,22.7,22.1,995.5,995.5,995.0,0.0,116,2.2,1468.,0.0
02/05/2020,13,31.4,31.4,28.6,56,69,56,21.6,22.9,21.0,995.5,995.7,995.4,0.0,65,0.0,1762.,0.0
02/05/2020,14,32.1,32.4,30.6,48,58,47,19.8,22.0,19.3,995.0,995.6,990.6,0.0,105,0.0,2657.,0.0
02/05/2020,15,34.0,34.2,31.7,43,48,42,19.6,20.1,18.6,993.9,995.0,993.9,3.0,71,6.0,2846.,0.0
02/05/2020,16,34.7,34.7,32.3,38,48,38,18.4,20.3,18.3,992.7,993.9,992.7,1.4,63,6.3,2959.,0.0
02/05/2020,17,34.0,34.7,32.7,42,46,38,19.2,20.0,18.4,991.7,992.7,991.7,2.2,103,4.8,2493.,0.0
02/05/2020,18,34.3,34.7,33.6,41,42,38,19.1,19.4,18.0,991.2,991.7,991.2,2.0,141,4.8,2593.,0.0
02/05/2020,19,33.5,34.5,32.5,42,47,39,18.7,20.0,18.4,990.7,991.4,989.9,1.8,132,4.2,1317.,0.0
02/05/2020,20,32.5,34.2,32.5,47,48,40,19.7,20.3,18.7,990.5,990.7,989.8,1.3,191,4.2,1250.,0.0
02/05/2020,21,30.5,32.5,30.5,59,59,47,21.5,21.6,20.0,979.8,990.5,979.5,0.1,157,2.9,345.5,0.0
02/05/2020,22,28.6,30.5,28.6,67,67,59,21.9,21.9,21.5,978.9,980.1,978.7,0.6,166,2.2,1.122,0.0
02/05/2020,23,27.2,28.7,27.2,74,74,66,22.1,22.2,21.6,978.9,979.3,978.6,0.0,246,1.7,-3.54,0.0
03/05/2020,00,26.5,27.2,26.0,77,80,74,22.2,22.5,22.0,979.0,979.1,978.7,0.0,179,1.4,-3.54,0.0
03/05/2020,01,26.0,26.6,26.0,80,80,77,22.4,22.5,22.1,979.1,992.4,978.7,0.0,276,0.6,-3.54,0.0
03/05/2020,02,26.0,26.5,26.0,79,81,75,22.1,22.5,21.7,978.8,979.1,978.5,0.0,290,0.6,-3.53,0.0
03/05/2020,03,25.3,26.0,25.3,83,83,79,22.2,22.4,21.8,978.6,989.4,978.5,0.5,303,1.0,-3.54,0.0
03/05/2020,04,25.3,25.6,24.6,81,85,81,21.9,22.5,21.7,978.1,992.7,977.9,0.7,288,1.5,-3.00,0.0
03/05/2020,05,23.7,25.3,23.7,88,88,81,21.5,21.9,21.5,977.6,991.8,977.3,1.2,256,1.8,-3.54,0.0
03/05/2020,06,23.3,23.7,23.3,91,91,88,21.7,21.7,21.5,976.9,977.6,976.7,0.4,245,1.8,-3.54,0.0
03/05/2020,07,23.0,23.6,23.0,91,91,89,21.4,21.9,21.3,976.7,977.0,976.4,0.9,257,1.9,-3.54,0.0
03/05/2020,08,23.4,23.4,22.9,90,92,90,21.7,21.7,21.3,976.8,976.9,976.5,0.4,294,1.6,-3.52,0.0
03/05/2020,09,23.0,23.5,23.0,88,90,87,21.0,21.6,20.9,992.1,992.1,976.7,0.8,263,1.6,-3.54,0.0
03/05/2020,10,23.2,23.2,22.5,91,92,88,21.6,21.6,20.8,993.0,993.0,992.2,0.1,226,1.5,29.03,0.0
03/05/2020,11,26.0,26.1,23.2,77,91,76,21.6,22.1,21.5,993.8,993.8,982.1,0.0,120,0.9,458.1,0.0
03/05/2020,12,26.6,27.0,25.5,76,80,76,22.1,22.5,21.4,982.7,994.3,982.6,0.3,121,2.3,765.3,0.0
03/05/2020,13,28.5,28.7,26.6,66,77,65,21.5,23.1,21.2,982.5,994.2,982.4,1.4,130,3.2,1219.,0.0
03/05/2020,14,31.1,31.1,28.5,55,66,53,21.0,21.8,19.9,982.3,982.7,982.1,1.2,129,3.7,1743.,0.0
03/05/2020,15,31.6,31.8,30.7,50,55,49,19.8,20.8,19.2,992.9,993.5,982.2,1.1,119,5.1,1958.,0.0
03/05/2020,16,32.7,32.8,31.1,46,52,46,19.6,20.7,19.2,991.9,992.9,991.9,0.8,122,4.4,1953.,0.0
03/05/2020,17,32.3,33.3,32.0,44,49,42,18.6,20.2,18.2,990.7,991.9,979.0,2.6,133,5.9,2463.,0.0
03/05/2020,18,33.1,33.3,31.9,44,50,44,19.3,20.8,18.9,989.9,990.7,989.9,1.1,170,5.4,2033.,0.0
03/05/2020,19,32.4,33.2,32.2,47,47,44,19.7,20.0,18.7,989.5,989.9,989.5,2.4,152,5.2,1581.,0.0
03/05/2020,20,31.2,32.5,31.2,53,53,46,20.6,20.7,19.4,989.5,989.7,989.5,1.7,159,4.6,968.6,0.0
03/05/2020,21,29.7,32.0,29.7,62,62,51,21.8,21.8,20.5,989.7,989.7,989.4,0.8,154,4.0,414.2,0.0
03/05/2020,22,28.3,29.7,28.3,69,69,62,22.1,22.1,21.7,989.9,989.9,989.7,0.3,174,2.0,6.459,0.0
03/05/2020,23,26.9,28.5,26.9,75,75,67,22.1,22.5,21.7,990.5,990.5,989.8,0.2,183,1.0,-3.54,0.0
The second column is time (hour). I want to separate the dataset by morning (06-11), afternoon (12-17), evening (18-23) and night (00-05). How can I do it?
You can use pd.cut:
bins = [-1,5,11,17,24]
labels = ['night', 'morning', 'afternoon', 'evening']
df['day_part'] = pd.cut(df['hour'], bins=bins, labels=labels)
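With these bins, each label lines up with its interval in order (night = hours 0-5, morning = 6-11, afternoon = 12-17, evening = 18-23). A minimal sketch of pulling the parts apart afterwards, assuming the hour column is indeed named hour:
morning = df[df['day_part'] == 'morning']   # rows with hour 6-11
night = df[df['day_part'] == 'night']       # rows with hour 0-5
# or split all four periods at once:
parts = {name: group for name, group in df.groupby('day_part')}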
I added column names, including Hour for the second column.
Then I used read_csv, which drops the leading zeroes when parsing, so the Hour column is a plain int.
To split the rows (i.e. add a column marking the diurnal period), use:
df['period'] = pd.cut(df.Hour, bins=[0, 6, 12, 18, 24], right=False,
labels=['night', 'morning', 'afternoon', 'evening'])
Then you can, for example, use groupby to process each group (see the sketch below).
Because I passed right=False, the bins are closed on the left side, which makes the bin limits more natural: there is no need for -1 as an hour, and every limit except the last is simply the starting hour of its period.
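A minimal end-to-end sketch of this approach, assuming the sample above is saved as data.csv and inventing throwaway names for the 17 measurement columns (only Date and Hour matter here):
import pandas as pd

cols = ['Date', 'Hour'] + [f'v{i}' for i in range(1, 18)]   # 19 comma-separated fields per row
df = pd.read_csv('data.csv', header=None, names=cols)       # Hour loses its leading zero and becomes int

df['period'] = pd.cut(df.Hour, bins=[0, 6, 12, 18, 24], right=False,
                      labels=['night', 'morning', 'afternoon', 'evening'])

# process (or save) each diurnal subset separately
for name, part in df.groupby('period'):
    print(name, len(part))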
Related
Pine script - Security function not show correct on different timeframe
I'm a newbie and I'm trying to get Ichimoku data on a 4-hour timeframe, but it does not show the correct value when I shift.
//@version=4
study(title="test1", overlay=true)
conversionPeriods = input(9, minval=1, title="Conversion Line Length")
basePeriods = input(26, minval=1, title="Base Line Length")
laggingSpan2Periods = input(52, minval=1, title="Leading Span B Length")
displacement = input(26, minval=1, title="Displacement")
donchian_M240(len) => avg(security(syminfo.tickerid, 'D', lowest(len)), security(syminfo.tickerid, 'D', highest(len)))
tenkanSen_M240 = donchian_M240(conversionPeriods)
kijunSen_M240 = donchian_M240(basePeriods)
senkoSpanA_M240 = avg(tenkanSen_M240, kijunSen_M240)
plot(senkoSpanA_M240[25], title="senkoSpanA_M240[25]")
The value of senkoSpanA_M240[25] keeps changing when I'm on M5, M15, M30, H1, H4 or D1. Can you help, please?
The reason it keeps changing when you change timeframes is that you are using a historical bar reference, [25], on senkoSpanA_M240. This means it looks up the value of senkoSpanA_M240 from 25 bars ago, and those 25 bars are counted in whichever timeframe is currently selected, so the calculation shifts with the chart resolution. What exactly are you trying to achieve with the [25]?
How to do a vector of dates in python? [duplicate]
I'm trying to generate a date range of monthly data where the day always falls at the beginning of the month:
pd.date_range(start='1/1/1980', end='11/1/1991', freq='M')
This generates 1/31/1980, 2/29/1980, and so on. Instead, I just want 1/1/1980, 2/1/1980, ... I've seen other questions asking about generating data that always falls on a specific day of the month, with answers saying it wasn't possible, but the beginning of the month surely must be possible!
You can do this by changing the freq argument from 'M' to 'MS':
d = pandas.date_range(start='1/1/1980', end='11/1/1990', freq='MS')
print(d)
This should now print:
DatetimeIndex(['1980-01-01', '1980-02-01', '1980-03-01', '1980-04-01', '1980-05-01', '1980-06-01', '1980-07-01', '1980-08-01', '1980-09-01', '1980-10-01', ... '1990-02-01', '1990-03-01', '1990-04-01', '1990-05-01', '1990-06-01', '1990-07-01', '1990-08-01', '1990-09-01', '1990-10-01', '1990-11-01'], dtype='datetime64[ns]', length=131, freq='MS', tz=None)
Look at the offset aliases section of the documentation: 'M' stands for month end frequency, while 'MS' stands for month start frequency.
It is worth noting that pandas.date_range() only includes dates within the defined interval, which may not be what you expect:
start = "2020-03-08"
end = "2021-03-08"
pd.date_range(start, end, freq='MS')
results in
DatetimeIndex(['2020-04-01', '2020-05-01', '2020-06-01', '2020-07-01', '2020-08-01', '2020-09-01', '2020-10-01', '2020-11-01', '2020-12-01', '2021-01-01', '2021-02-01', '2021-03-01'], dtype='datetime64[ns]', freq='MS')
For 'MS', a workaround to include the first day of the opening month is to work only with the year and month of the start date:
pd.date_range(start[:7], end, freq='MS')
will then give
DatetimeIndex(['2020-03-01', '2020-04-01', '2020-05-01', '2020-06-01', '2020-07-01', '2020-08-01', '2020-09-01', '2020-10-01', '2020-11-01', '2020-12-01', '2021-01-01', '2021-02-01', '2021-03-01'], dtype='datetime64[ns]', freq='MS')
If you wish to keep the same starting day for each month, you can then add the offset with pd.DateOffset():
pd.date_range(start[:7], end, freq='MS') + pd.DateOffset(days=7)
will give
DatetimeIndex(['2020-03-08', '2020-04-08', '2020-05-08', '2020-06-08', '2020-07-08', '2020-08-08', '2020-09-08', '2020-10-08', '2020-11-08', '2020-12-08', '2021-01-08', '2021-02-08', '2021-03-08'], dtype='datetime64[ns]', freq=None)
As mentioned in the comments, note that this workaround can run into trouble for offsets greater than or equal to 28.
How to choose certain elements of a matrix to create a new one with np.array?
I have a matrix called "times" of form (1,517) where are the times of a whole day 24 hours (in seconds Epoch time) and I want to create a new matrix with the times of each half hour, that is, starting from the first time then the one that corresponds to half hour later and so on until completing all the half hours that there are in a day, that is, 48 I created a delta of time with dt = timedelta (hours = 0.5) dts = timedelta.total_seconds (dt) but I do not know how to do to indicate that my new matrix takes those elements print(times.shape) Out[4]: (1, 517) print(times) array([[1.55079361e+09, 1.55079377e+09, 1.55079394e+09, 1.55079410e+09, 1.55079430e+09, 1.55079446e+09, 1.55079462e+09, 1.55079479e+09, 1.55079495e+09, 1.55079512e+09, 1.55079528e+09, 1.55079544e+09, 1.55079561e+09, 1.55079577e+09, 1.55079594e+09, 1.55079614e+09, 1.55079630e+09, 1.55079646e+09, 1.55079663e+09, 1.55079679e+09, 1.55079695e+09, 1.55079712e+09, 1.55079728e+09, 1.55079744e+09, 1.55079761e+09, 1.55079781e+09, 1.55079797e+09, 1.55079814e+09, 1.55079830e+09, 1.55079846e+09, 1.55079863e+09, 1.55079879e+09, 1.55079895e+09, 1.55079912e+09, 1.55079928e+09, 1.55079945e+09, 1.55079964e+09, 1.55079981e+09, 1.55079997e+09, 1.55080014e+09, 1.55080030e+09, 1.55080046e+09, 1.55080063e+09, 1.55080079e+09, 1.55080096e+09, 1.55080112e+09, 1.55080128e+09, 1.55080148e+09, 1.55080164e+09, 1.55080181e+09, 1.55080197e+09, 1.55080214e+09, 1.55080230e+09, 1.55080246e+09, 1.55080263e+09, 1.55080279e+09, 1.55080296e+09, 1.55080312e+09, 1.55080332e+09, 1.55080348e+09, 1.55080364e+09, 1.55080381e+09, 1.55080397e+09, 1.55080414e+09, 1.55080430e+09, 1.55080446e+09, 1.55080463e+09, 1.55080479e+09, 1.55080496e+09, 1.55080516e+09, 1.55080532e+09, 1.55080548e+09, 1.55080565e+09, 1.55080581e+09, 1.55080597e+09, 1.55080614e+09, 1.55080630e+09, 1.55080646e+09, 1.55080663e+09, 1.55080683e+09, 1.55080699e+09, 1.55080716e+09, 1.55080732e+09, 1.55080748e+09, 1.55080765e+09, 1.55080781e+09, 1.55080797e+09, 1.55080814e+09, 1.55080830e+09, 1.55080847e+09, 1.55080866e+09, 1.55080883e+09, 1.55080899e+09, 1.55080916e+09, 1.55080932e+09, 1.55080948e+09, 1.55080965e+09, 1.55080981e+09, 1.55080998e+09, 1.55081014e+09, 1.55081030e+09, 1.55081050e+09, 1.55081066e+09, 1.55081083e+09, 1.55081099e+09, 1.55081116e+09, 1.55081132e+09, 1.55081148e+09, 1.55081165e+09, 1.55081181e+09, 1.55081198e+09, 1.55081214e+09, 1.55081234e+09, 1.55081250e+09, 1.55081266e+09, 1.55081283e+09, 1.55081299e+09, 1.55081316e+09, 1.55081332e+09, 1.55081348e+09, 1.55081365e+09, 1.55081381e+09, 1.55081398e+09, 1.55081418e+09, 1.55081434e+09, 1.55081450e+09, 1.55081467e+09, 1.55081483e+09, 1.55081499e+09, 1.55081516e+09, 1.55081532e+09, 1.55081548e+09, 1.55081565e+09, 1.55081585e+09, 1.55081601e+09, 1.55081618e+09, 1.55081634e+09, 1.55081650e+09, 1.55081667e+09, 1.55081683e+09, 1.55081699e+09, 1.55081716e+09, 1.55081732e+09, 1.55081749e+09, 1.55081768e+09, 1.55081785e+09, 1.55081801e+09, 1.55081818e+09, 1.55081834e+09, 1.55081850e+09, 1.55081867e+09, 1.55081883e+09, 1.55081900e+09, 1.55081916e+09, 1.55081932e+09, 1.55081952e+09, 1.55081968e+09, 1.55081985e+09, 1.55082001e+09, 1.55082018e+09, 1.55082034e+09, 1.55082050e+09, 1.55082067e+09, 1.55082083e+09, 1.55082100e+09, 1.55082116e+09, 1.55082136e+09, 1.55082152e+09, 1.55082168e+09, 1.55082185e+09, 1.55082201e+09, 1.55082218e+09, 1.55082234e+09, 1.55082250e+09, 1.55082267e+09, 1.55082283e+09, 1.55082300e+09, 1.55082320e+09, 1.55082336e+09, 1.55082352e+09, 1.55082369e+09, 1.55082385e+09, 1.55082401e+09, 1.55082418e+09, 1.55082434e+09, 
1.55082450e+09, 1.55082467e+09, 1.55082487e+09, 1.55082503e+09, 1.55082520e+09, 1.55082536e+09, 1.55082552e+09, 1.55082569e+09, 1.55082585e+09, 1.55082601e+09, 1.55082618e+09, 1.55082634e+09, 1.55082651e+09, 1.55082670e+09, 1.55082687e+09, 1.55082703e+09, 1.55082720e+09, 1.55082736e+09, 1.55082752e+09, 1.55082769e+09, 1.55082785e+09, 1.55082802e+09, 1.55082818e+09, 1.55082834e+09, 1.55082854e+09, 1.55082870e+09, 1.55082887e+09, 1.55082903e+09, 1.55082920e+09, 1.55082936e+09, 1.55082952e+09, 1.55082969e+09, 1.55082985e+09, 1.55083002e+09, 1.55083018e+09, 1.55083038e+09, 1.55083054e+09, 1.55083070e+09, 1.55083087e+09, 1.55083103e+09, 1.55083120e+09, 1.55083136e+09, 1.55083152e+09, 1.55083169e+09, 1.55083185e+09, 1.55083202e+09, 1.55083222e+09, 1.55083238e+09, 1.55083254e+09, 1.55083271e+09, 1.55083287e+09, 1.55083303e+09, 1.55083320e+09, 1.55083336e+09, 1.55083352e+09, 1.55083369e+09, 1.55083389e+09, 1.55083405e+09, 1.55083422e+09, 1.55083438e+09, 1.55083454e+09, 1.55083471e+09, 1.55083487e+09, 1.55083503e+09, 1.55083520e+09, 1.55083536e+09, 1.55083553e+09, 1.55083572e+09, 1.55083589e+09, 1.55083605e+09, 1.55083622e+09, 1.55083638e+09, 1.55083654e+09, 1.55083671e+09, 1.55083687e+09, 1.55083704e+09, 1.55083720e+09, 1.55083736e+09, 1.55083756e+09, 1.55083772e+09, 1.55083789e+09, 1.55083805e+09, 1.55083822e+09, 1.55083838e+09, 1.55083854e+09, 1.55083871e+09, 1.55083887e+09, 1.55083904e+09, 1.55083920e+09, 1.55083940e+09, 1.55083956e+09, 1.55083972e+09, 1.55083989e+09, 1.55084005e+09, 1.55084022e+09, 1.55084038e+09, 1.55084054e+09, 1.55084071e+09, 1.55084087e+09, 1.55084104e+09, 1.55084124e+09, 1.55084140e+09, 1.55084156e+09, 1.55084173e+09, 1.55084189e+09, 1.55084205e+09, 1.55084222e+09, 1.55084238e+09, 1.55084254e+09, 1.55084271e+09, 1.55084291e+09, 1.55084307e+09, 1.55084324e+09, 1.55084340e+09, 1.55084356e+09, 1.55084373e+09, 1.55084389e+09, 1.55084405e+09, 1.55084422e+09, 1.55084438e+09, 1.55084455e+09, 1.55084474e+09, 1.55084491e+09, 1.55084507e+09, 1.55084524e+09, 1.55084540e+09, 1.55084556e+09, 1.55084573e+09, 1.55084589e+09, 1.55084606e+09, 1.55084622e+09, 1.55084638e+09, 1.55084658e+09, 1.55084674e+09, 1.55084691e+09, 1.55084707e+09, 1.55084724e+09, 1.55084740e+09, 1.55084756e+09, 1.55084773e+09, 1.55084789e+09, 1.55084806e+09, 1.55084822e+09, 1.55084842e+09, 1.55084858e+09, 1.55084874e+09, 1.55084891e+09, 1.55084907e+09, 1.55084924e+09, 1.55084940e+09, 1.55084956e+09, 1.55084973e+09, 1.55084989e+09, 1.55085006e+09, 1.55085026e+09, 1.55085042e+09, 1.55085058e+09, 1.55085075e+09, 1.55085091e+09, 1.55085107e+09, 1.55085124e+09, 1.55085140e+09, 1.55085156e+09, 1.55085173e+09, 1.55085193e+09, 1.55085209e+09, 1.55085226e+09, 1.55085242e+09, 1.55085258e+09, 1.55085275e+09, 1.55085291e+09, 1.55085307e+09, 1.55085324e+09, 1.55085340e+09, 1.55085357e+09, 1.55085376e+09, 1.55085393e+09, 1.55085409e+09, 1.55085426e+09, 1.55085442e+09, 1.55085458e+09, 1.55085475e+09, 1.55085491e+09, 1.55085508e+09, 1.55085524e+09, 1.55085540e+09, 1.55085560e+09, 1.55085576e+09, 1.55085593e+09, 1.55085609e+09, 1.55085626e+09, 1.55085642e+09, 1.55085658e+09, 1.55085675e+09, 1.55085691e+09, 1.55085708e+09, 1.55085724e+09, 1.55085744e+09, 1.55085760e+09, 1.55085776e+09, 1.55085793e+09, 1.55085809e+09, 1.55085826e+09, 1.55085842e+09, 1.55085858e+09, 1.55085875e+09, 1.55085891e+09, 1.55085908e+09, 1.55085928e+09, 1.55085944e+09, 1.55085960e+09, 1.55085977e+09, 1.55085993e+09, 1.55086009e+09, 1.55086026e+09, 1.55086042e+09, 1.55086058e+09, 1.55086075e+09, 1.55086095e+09, 1.55086111e+09, 1.55086128e+09, 1.55086144e+09, 
1.55086160e+09, 1.55086177e+09, 1.55086193e+09, 1.55086209e+09, 1.55086226e+09, 1.55086242e+09, 1.55086259e+09, 1.55086278e+09, 1.55086295e+09, 1.55086311e+09, 1.55086328e+09, 1.55086344e+09, 1.55086360e+09, 1.55086377e+09, 1.55086393e+09, 1.55086410e+09, 1.55086426e+09, 1.55086442e+09, 1.55086462e+09, 1.55086478e+09, 1.55086495e+09, 1.55086511e+09, 1.55086528e+09, 1.55086544e+09, 1.55086560e+09, 1.55086577e+09, 1.55086593e+09, 1.55086610e+09, 1.55086626e+09, 1.55086646e+09, 1.55086662e+09, 1.55086678e+09, 1.55086695e+09, 1.55086711e+09, 1.55086728e+09, 1.55086744e+09, 1.55086760e+09, 1.55086777e+09, 1.55086793e+09, 1.55086810e+09, 1.55086830e+09, 1.55086846e+09, 1.55086862e+09, 1.55086879e+09, 1.55086895e+09, 1.55086911e+09, 1.55086928e+09, 1.55086944e+09, 1.55086960e+09, 1.55086977e+09, 1.55086997e+09, 1.55087013e+09, 1.55087030e+09, 1.55087046e+09, 1.55087062e+09, 1.55087079e+09, 1.55087095e+09, 1.55087111e+09, 1.55087128e+09, 1.55087144e+09, 1.55087161e+09, 1.55087180e+09, 1.55087197e+09, 1.55087213e+09, 1.55087230e+09, 1.55087246e+09, 1.55087262e+09, 1.55087279e+09, 1.55087295e+09, 1.55087312e+09, 1.55087328e+09, 1.55087344e+09, 1.55087364e+09, 1.55087380e+09, 1.55087397e+09, 1.55087413e+09, 1.55087430e+09, 1.55087446e+09, 1.55087462e+09, 1.55087479e+09, 1.55087495e+09, 1.55087512e+09, 1.55087528e+09, 1.55087548e+09, 1.55087564e+09, 1.55087580e+09, 1.55087597e+09, 1.55087613e+09, 1.55087630e+09, 1.55087646e+09, 1.55087662e+09, 1.55087679e+09, 1.55087695e+09, 1.55087712e+09, 1.55087732e+09, 1.55087748e+09, 1.55087764e+09, 1.55087781e+09, 1.55087797e+09, 1.55087813e+09, 1.55087830e+09, 1.55087846e+09, 1.55087862e+09, 1.55087879e+09, 1.55087899e+09, 1.55087915e+09, 1.55087932e+09, 1.55087948e+09, 1.55087964e+09, 1.55087981e+09]])
First we create an array with a date range between the first and last entry of times t = np.arange(np.datetime64(datetime.datetime.fromtimestamp(times[0,0])), np.datetime64(datetime.datetime.fromtimestamp(times[0,-1])), np.timedelta64(30, 'm')) Output for t array(['2019-02-22T01:00:10.000000', '2019-02-22T01:30:10.000000', '2019-02-22T02:00:10.000000', '2019-02-22T02:30:10.000000', '2019-02-22T03:00:10.000000', '2019-02-22T03:30:10.000000', '2019-02-22T04:00:10.000000', '2019-02-22T04:30:10.000000', '2019-02-22T05:00:10.000000', '2019-02-22T05:30:10.000000', '2019-02-22T06:00:10.000000', '2019-02-22T06:30:10.000000', '2019-02-22T07:00:10.000000', '2019-02-22T07:30:10.000000', '2019-02-22T08:00:10.000000', '2019-02-22T08:30:10.000000', '2019-02-22T09:00:10.000000', '2019-02-22T09:30:10.000000', '2019-02-22T10:00:10.000000', '2019-02-22T10:30:10.000000', '2019-02-22T11:00:10.000000', '2019-02-22T11:30:10.000000', '2019-02-22T12:00:10.000000', '2019-02-22T12:30:10.000000', '2019-02-22T13:00:10.000000', '2019-02-22T13:30:10.000000', '2019-02-22T14:00:10.000000', '2019-02-22T14:30:10.000000', '2019-02-22T15:00:10.000000', '2019-02-22T15:30:10.000000', '2019-02-22T16:00:10.000000', '2019-02-22T16:30:10.000000', '2019-02-22T17:00:10.000000', '2019-02-22T17:30:10.000000', '2019-02-22T18:00:10.000000', '2019-02-22T18:30:10.000000', '2019-02-22T19:00:10.000000', '2019-02-22T19:30:10.000000', '2019-02-22T20:00:10.000000', '2019-02-22T20:30:10.000000', '2019-02-22T21:00:10.000000', '2019-02-22T21:30:10.000000', '2019-02-22T22:00:10.000000', '2019-02-22T22:30:10.000000', '2019-02-22T23:00:10.000000', '2019-02-22T23:30:10.000000', '2019-02-23T00:00:10.000000', '2019-02-23T00:30:10.000000'], dtype='datetime64[us]') Now, we want to calculate this back to seconds. To do this, we create a lambda function which does this for a single element of the array and use np.apply_along_axis to perform this operation element-wise on the array. f = lambda x: (x - np.datetime64('1970-01-01T00:00:00Z'))/np.timedelta64(1,'s') np.apply_along_axis(f, 0, t) output array([1.55079721e+09, 1.55079901e+09, 1.55080081e+09, 1.55080261e+09, 1.55080441e+09, 1.55080621e+09, 1.55080801e+09, 1.55080981e+09, 1.55081161e+09, 1.55081341e+09, 1.55081521e+09, 1.55081701e+09, 1.55081881e+09, 1.55082061e+09, 1.55082241e+09, 1.55082421e+09, 1.55082601e+09, 1.55082781e+09, 1.55082961e+09, 1.55083141e+09, 1.55083321e+09, 1.55083501e+09, 1.55083681e+09, 1.55083861e+09, 1.55084041e+09, 1.55084221e+09, 1.55084401e+09, 1.55084581e+09, 1.55084761e+09, 1.55084941e+09, 1.55085121e+09, 1.55085301e+09, 1.55085481e+09, 1.55085661e+09, 1.55085841e+09, 1.55086021e+09, 1.55086201e+09, 1.55086381e+09, 1.55086561e+09, 1.55086741e+09, 1.55086921e+09, 1.55087101e+09, 1.55087281e+09, 1.55087461e+09, 1.55087641e+09, 1.55087821e+09, 1.55088001e+09, 1.55088181e+09])
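As a side note, that last subtraction is already vectorized over datetime64 arrays, so a slightly simpler sketch of the same conversion (without np.apply_along_axis) would be:
import numpy as np

# t is the datetime64 array built above
epoch = np.datetime64('1970-01-01T00:00:00')
seconds = (t - epoch) / np.timedelta64(1, 's')   # float64 array of Epoch seconds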
How to create a time array in python for seasonal data
I am working with paleoclimate data (536-550 CE) in NetCDF format, which I imported with xarray. The time format is a bit strange:
import xarray as xr
ds_tas_01 = xr.open_dataset('ue536a01_temp2_seasmean.nc')
ds_tas_01['time']
<xarray.DataArray 'time' (time: 61)>
array([15360215.25, 15360430.75, 15360731.75, 15361031.75,
       15370131.75, 15370430.75, 15370731.75, 15371031.75,
       15380131.75, 15380430.75, 15380731.75, 15381031.75,
       15390131.75, 15390430.75, 15390731.75, 15391031.75,
       15400131.75, 15400430.75, 15400731.75, 15401031.75,
       15410131.75, 15410430.75, 15410731.75, 15411031.75,
       15420131.75, 15420430.75, 15420731.75, 15421031.75,
       15430131.75, 15430430.75, 15430731.75, 15431031.75,
       15440131.75, 15440430.75, 15440731.75, 15441031.75,
       15450131.75, 15450430.75, 15450731.75, 15451031.75,
       15460131.75, 15460430.75, 15460731.75, 15461031.75,
       15470131.75, 15470430.75, 15470731.75, 15471031.75,
       15480131.75, 15480430.75, 15480731.75, 15481031.75,
       15490131.75, 15490430.75, 15490731.75, 15491031.75,
       15500131.75, 15500430.75, 15500731.75, 15501031.75,
       15501231.75])
Coordinates:
  * time     (time) float64 1.536e+07 1.536e+07 1.536e+07 ... 1.55e+07 1.55e+07
Attributes:
    standard_name:  time
    bounds:         time_bnds
    units:          day as %Y%m%d.%f
    calendar:       proleptic_gregorian
    axis:           T
So I want to make my own time array that I can use to plot the climate data. For monthly data I used:
import numpy as np
time = np.arange('0536-01-31', '0551-01-31', dtype='datetime64[M]')
which gives me an array with the years and months between those two dates. I then grouped my data by season using cdo seasmean ('djf', 'mam', 'jja', 'son') and got 61 values instead of 180. Is there a way to regroup the 'time' array to seasonal values, or create a new time array that corresponds to the seasonal data?
I made it work by setting the step in np.arange (note that the third positional argument is the step; there is no steps keyword):
time = np.arange('0536-01-31', '0551-01-31', 3, dtype='datetime64[M]')
This gives one time stamp every three months, so essentially one per 'season'.
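Since np.arange excludes the stop value, it is worth checking the length against the 61 seasonal means shown in the question (the last value there, 15501231.75, is a trailing partial season). A small sketch of that check; the appended date is an assumption:
import numpy as np

time = np.arange('0536-01-31', '0551-01-31', 3, dtype='datetime64[M]')
print(len(time), time[:4])   # 60 values: ['0536-01' '0536-04' '0536-07' '0536-10']

# if the axis needs to match all 61 seasonal values, append one stamp for the trailing partial season
time = np.append(time, np.datetime64('0550-12'))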
Resampling Time Series Data (Pandas Python 3)
Trying to convert data at daily frequency to weekly frequency.
In:
weeklyaapl = pd.DataFrame()
weeklyaapl['Open'] = aapl.Open.resample('W').iloc[0]
# here I am trying to take the first value of aapl.Open
# that falls within the week
Out:
ValueError: .resample() is now a deferred operation use .resample(...).mean() instead of .resample(...)
I want the true open (the first open that prints for the week), i.e. the open of the first day in that week. Instead it wants me to take the mean of the daily open values for a given week using .mean(), which is not the information I need. I can't seem to interpret the error, and the documentation isn't helping either.
I think you need:
aapl.resample('W').first()
Output:
             Open   High    Low  Close     Volume
Date
2010-01-10  30.49  30.64  30.34  30.57  123432050
2010-01-17  30.40  30.43  29.78  30.02  115557365
2010-01-24  29.76  30.74  29.61  30.72  182501620
2010-01-31  28.93  29.24  28.60  29.01  266424802
2010-02-07  27.48  28.00  27.33  27.82  187468421
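Applied to the snippet from the question, a minimal sketch (assuming aapl is a DataFrame indexed by daily dates with an Open column) would be:
# first daily Open within each week, i.e. the "true" weekly open
weeklyaapl = aapl.Open.resample('W').first().to_frame(name='Open')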