Why does Linux system time add and subtract one hour every 6 months? - linux

When calculating the number of seconds between 2 consecutive days there should be 86400 seconds (24*60*60).
But twice a year that's not the case...
One day of the year has only 23 hours, and 6 months later another day has 25 hours.
Why does this happen?
I ran some code to check the number of seconds between consecutive days from 2005 till 2019,
and every day has exactly 24 hours except two days each year, where there are 23 and 25.
Here is a summary of my results.
The difference column is 86400 minus the number of seconds between this day and the previous one:
+------------+------------+-------------------+
| dates | difference | number_of_seconds |
+------------+------------+-------------------+
| 2005-04-02 | 3600 | 82800 |
| 2005-10-10 | -3600 | 90000 |
| 2006-04-01 | 3600 | 82800 |
| 2006-10-02 | -3600 | 90000 |
| 2007-03-31 | 3600 | 82800 |
| 2007-09-17 | -3600 | 90000 |
| 2008-03-29 | 3600 | 82800 |
| 2008-10-06 | -3600 | 90000 |
| 2009-03-28 | 3600 | 82800 |
| 2009-09-28 | -3600 | 90000 |
| 2010-03-27 | 3600 | 82800 |
| 2010-09-13 | -3600 | 90000 |
| 2011-04-02 | 3600 | 82800 |
| 2011-10-03 | -3600 | 90000 |
| 2012-03-31 | 3600 | 82800 |
| 2012-09-24 | -3600 | 90000 |
| 2013-03-30 | 3600 | 82800 |
| 2013-10-28 | -3600 | 90000 |
| 2014-03-29 | 3600 | 82800 |
| 2014-10-27 | -3600 | 90000 |
| 2015-03-28 | 3600 | 82800 |
| 2015-10-26 | -3600 | 90000 |
| 2016-03-26 | 3600 | 82800 |
| 2016-10-31 | -3600 | 90000 |
| 2017-03-25 | 3600 | 82800 |
| 2017-10-30 | -3600 | 90000 |
| 2018-03-24 | 3600 | 82800 |
| 2018-10-29 | -3600 | 90000 |
+------------+------------+-------------------+
Here is an example of the code that I ran as part of my full script:
echo $((($(date +%s --date 2006-03-31)-$(date +%s --date 2006-03-30))))
echo $((($(date +%s --date 2006-04-01)-$(date +%s --date 2006-03-31))))
echo $((($(date +%s --date 2006-04-02)-$(date +%s --date 2006-04-01))))

The date command with the %s format gives you the wall-clock time in seconds since the epoch, and your location observes daylight saving time. So when you change to or from summer time you either gain or lose an hour.
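A quick way to see this, assuming GNU date and an installed tz database (Europe/Berlin is used here only as an example DST zone; its 2006 transitions fell on 2006-03-26 and 2006-10-29):

```shell
# Day containing the spring-forward transition: 23 hours
spring=$(( $(TZ=Europe/Berlin date --date 2006-03-27 +%s) - $(TZ=Europe/Berlin date --date 2006-03-26 +%s) ))
# Day containing the fall-back transition: 25 hours
fall=$(( $(TZ=Europe/Berlin date --date 2006-10-30 +%s) - $(TZ=Europe/Berlin date --date 2006-10-29 +%s) ))
# The same dates measured in UTC, which has no DST: always 24 hours
utc=$(( $(TZ=UTC date --date 2006-03-27 +%s) - $(TZ=UTC date --date 2006-03-26 +%s) ))
echo "$spring $fall $utc"   # -> 82800 90000 86400
```

Running the original script with TZ=UTC in the environment would make every difference come out as exactly 86400.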


How do I append the result of a PowerQuery to itself?

Let's say I have a table as follows
| make | model | license | mileage | book value |
|-----------|-----------|---------|---------|------------|
| ford | F150 | 123456 | 34000 | 35000 |
| chevrolet | Silverado | 555778 | 32000 | 29000 |
Let's pretend I had to unpivot and all that, which I've done; I just used simplified data for this question. Now let's assume I run the query today (July 30th). I want my result to be:
| Date | make | model | license | mileage | book value |
|------------|-----------|-----------|---------|---------|------------|
| 2020-07-30 | ford | F150 | 123456 | 34000 | 35000 |
| 2020-07-30 | chevrolet | Silverado | 555778 | 32000 | 29000 |
I want to add the day the query is run. However, here's where I am stuck. Let's say I ran the query tomorrow, I want it to add the new values to the bottom of the existing result:
| Date | make | model | license | mileage | book value |
|------------|-----------|-----------|---------|---------|------------|
| 2020-07-30 | ford | F150 | 123456 | 34000 | 35000 |
| 2020-07-30 | chevrolet | Silverado | 555778 | 32000 | 29000 |
| 2020-07-31 | ford | F150 | 123456 | 34200 | 35000 |
| 2020-07-31 | chevrolet | Silverado | 555778 | 32156 | 29000 |
This would allow me to track the fleet over time.
Any help would be greatly appreciated
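Power Query M aside, the pattern itself ("stamp each snapshot row with the run date and append it to a growing history") can be sketched in shell; `fleet.csv` and `fleet_history.csv` are made-up names for the illustration:

```shell
# Today's snapshot of the fleet (stand-in for the query output)
printf '%s\n' 'ford,F150,123456,34000,35000' \
              'chevrolet,Silverado,555778,32000,29000' > fleet.csv

# Prefix each row with the run date and append to the history file;
# running this again tomorrow adds tomorrow's rows below today's.
while IFS= read -r row; do
  printf '%s,%s\n' "$(date +%F)" "$row" >> fleet_history.csv
done < fleet.csv
```

In Power Query terms the equivalent is usually a query that appends the dated snapshot to a previously loaded history table, but the shell version shows the shape of the data flow.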

Creating a table of schedules in excel from 3 references

I'm trying to create a table that makes it easy to view the schedules from the raw data.
The raw data looks like this:
Jobname: deletearchive
Start_time: 00:00,01:00,23:45
Days_of_week: su,mo,sa
I would like to put them into columns with a header for each day (su, mo, ...) and times from 00:00 to 23:00. One challenge I see is that I need to consider 3 criteria: the day (su to sa), the time (some are not at minute 00 and need to be rounded down) and the job name. I have 500 job names with different schedules.
+---------------+-------+-------+-------+-------+-------+-------+--+--+--+--+--+--+--+
| JOBNAME | su | su | su | mo | mo | mo | | | | | | | |
+---------------+-------+-------+-------+-------+-------+-------+--+--+--+--+--+--+--+
| | 00:00 | 01:00 | 23:00 | 00:00 | 01:00 | 23:00 | | | | | | | |
| deletearchive | yes | yes | 23:45 | yes | yes | 23:45 | | | | | | | |
| JOB2 | | | | | | | | | | | | | |
| JOB3 | | | | | | | | | | | | | |
+---------------+-------+-------+-------+-------+-------+-------+--+--+--+--+--+--+--+
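The fan-out step (one row per job into one cell per day/hour combination) can be sketched in shell, with the example record hard-coded; minutes are truncated so 23:45 falls into the 23:00 slot:

```shell
job=deletearchive
times="00:00,01:00,23:45"
days="su,mo,sa"

# Emit one (job, day, hour-slot) line per start time per day,
# rounding the minutes down to the hour.
for d in $(echo "$days" | tr ',' ' '); do
  for t in $(echo "$times" | tr ',' ' '); do
    echo "$job $d ${t%%:*}:00"
  done
done
```

Each emitted line corresponds to one "yes" cell in the target grid; in Excel the same cross-product is typically built with a lookup keyed on job, day, and truncated hour.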

Maximum number of concurrent LOAD FROM S3 requests in Aurora Mysql

I've been using the LOAD FROM S3 feature of Aurora Mysql. To max out the box and the bandwidth, I submitted 30 concurrent requests for S3 load. On checking the processlist in Mysql, I see only 4 S3 LOAD queries running at any point in time. Is that a hard limit? Or am I doing something incorrect?
Here are my scripts for submitting the loads:
Single LOAD from S3:
cat load-one-table.sh
export MYSQL_PWD=[REDACTED]
echo "Shard #$1"
mysql -u [REDACTED] -B -e "load data from s3 's3://A/B/C/D' into table DB.table_"$1;
Concurrent LOADS:
cat parallel-load.sh
for i in $(seq 1 $1); do
  echo "Starting Task #$i"
  nohup ./load-one-table.sh $i &
  echo "Submitted Task #$i"
done
Trigger:
./parallel-load.sh 30
I do see all 30 requests submitted in nohup logs.
Explicitly checking that all loads are running on the client:
ps auxww | grep -c "load-one-table"
31
# The result shows 31, since there is one extra process match for the grep
Checking accepted requests on the server:
show full processlist;
+-----+----------+---------------------+--------------------+---------+------+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------+
| Id | User | Host | db | Command | Time | State | Info |
+-----+----------+---------------------+--------------------+---------+------+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------+
| 606 | REDACTED | localhost | NULL | Query | 60 | executing | load data from s3 's3://REDACTED' into table REDACTED_2 |
| 607 | REDACTED | localhost | NULL | Query | 60 | executing | load data from s3 's3://REDACTED' into table REDACTED_4 |
| 608 | REDACTED | localhost | NULL | Query | 60 | executing | load data from s3 's3://REDACTED' into table REDACTED_1 |
| 609 | REDACTED | localhost | NULL | Query | 60 | executing | load data from s3 's3://REDACTED' into table REDACTED_3 |
| 610 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 611 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 612 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 613 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 614 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 615 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 616 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 617 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 618 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 619 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 620 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 621 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 622 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 623 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 624 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 625 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 626 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 627 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 628 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 629 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 630 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 631 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 632 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 633 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 634 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
| 635 | REDACTED | localhost | NULL | Sleep | 60 | cleaning up | NULL |
+-----+----------+---------------------+--------------------+---------+------+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------+
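Whatever the server-side cap turns out to be, one client-side workaround is to throttle submissions so the surplus connections don't sit in Sleep. A sketch assuming GNU xargs; the echo is a stand-in for ./load-one-table.sh so the snippet runs on its own:

```shell
# Run at most 4 tasks concurrently; xargs starts the next task as soon
# as a slot frees up. Replace the echo with ./load-one-table.sh {}.
seq 1 30 | xargs -P 4 -I{} sh -c 'echo "Loading shard #{}"'
```

This keeps the queue on the client rather than as idle MySQL connections, and -P can be tuned to match the observed concurrency limit.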

How do I create a condensed list in Excel using formulas (not VBA)?

I am new to this site and haven't done much in Excel for decades (yes, decades), so I've forgotten more than I know now.
Background: I am working on a simple pay-sheet checking spreadsheet. One worksheet is an input timesheet for data entry, a complex one does all the calculations (hourly rate, shift loading, tax formula, etc.), and the final worksheet presents the results in the same format as the pay slip. Having finished the complex formulas in the calculation sheet, I am now stuck on condensing the results for the final sheet. I have tried numerous functions including VLOOKUP, INDEX, MATCH, RANK.EQ, SMALL and others, as per examples from other questions on this site. Sample data is:
+----+-----------------------------------------------------+----------------+------------+--------------+--------+--------+-----+--------+
| | A | B | C | D | E | F | G | H |
+----+-----------------------------------------------------+----------------+------------+--------------+--------+--------+-----+--------+
| 1 | Sample data: | | | | | | | |
| 2 | Monday | Ordinary Hours | 30/04/2018 | Day Shift | 10.85 | 21.85 | 1 | 237.07 |
| 3 | Tuesday | Ordinary Hours | 1/05/2018 | | | 21.85 | 1 | |
| 4 | Wednesday | Ordinary Hours | 2/05/2018 | | | 21.85 | 1 | |
| 5 | Thursday | Ordinary Hours | 3/05/2018 | | | 21.85 | 1 | |
| 6 | Friday | Ordinary Hours | 4/05/2018 | | | 21.85 | 1 | |
| 7 | | | | | | | | |
| 8 | | | | | | | | |
| 9 | Monday | Ordinary Hours | 7/05/2018 | | | 21.85 | 1 | |
| 10 | Tuesday | Ordinary Hours | 8/05/2018 | | | 21.85 | 1 | |
| 11 | Wednesday | Ordinary Hours | 9/05/2018 | Day Shift | 10.85 | 21.85 | 1 | 237.07 |
| 12 | Thursday | Ordinary Hours | 10/05/2018 | Day Shift | 10.85 | 21.85 | 1 | 237.07 |
| 13 | Friday | Ordinary Hours | 11/05/2018 | | | 21.85 | 1 | |
| 14 | | | | | | | | |
| 15 | Monday | Overtime 1.5 | 30/04/2018 | | | 21.85 | 1.5 | |
| 16 | Tuesday | Overtime 1.5 | 1/05/2018 | Overtime 1.5 | 2 | 21.85 | 1.5 | 65.55 |
| 17 | Wednesday | Overtime 1.5 | 2/05/2018 | | | 21.85 | 1.5 | |
| 18 | Thursday | Overtime 1.5 | 3/05/2018 | | | 21.85 | 1.5 | |
| 19 | Friday | Overtime 1.5 | 4/05/2018 | | | 21.85 | 1.5 | |
| 20 | Saturday | Overtime 1.5 | 5/05/2018 | | | 21.85 | 1.5 | |
| 21 | | | | | | | | |
| 22 | Monday | Overtime 1.5 | 7/05/2018 | | | 21.85 | 1.5 | |
| 23 | Tuesday | Overtime 1.5 | 8/05/2018 | | | 21.85 | 1.5 | |
| 24 | Wednesday | Overtime 1.5 | 9/05/2018 | | | 21.85 | 1.5 | |
| 25 | Thursday | Overtime 1.5 | 10/05/2018 | | | 21.85 | 1.5 | |
| 26 | Friday | Overtime 1.5 | 11/05/2018 | | | 21.85 | 1.5 | |
| 27 | Saturday | Overtime 1.5 | 12/05/2018 | | | 21.85 | 1.5 | |
| 28 | | | | | | | | |
| 29 | | | | | | | | |
| 30 | Required result on separate sheet in same workbook: | | | | | | | |
| 31 | Taxable Allowances | Comments | Qty | Rate | Factor | Amount | | |
| 32 | Ordinary Hours | 30/04/2018 | 10.85 | 21.85 | 1 | 237.07 | | |
| 33 | Ordinary Hours | 9/05/2018 | 10.85 | 21.85 | 1 | 237.07 | | |
| 34 | Ordinary Hours | 10/05/2018 | 10.85 | 21.85 | 1 | 237.07 | | |
| 35 | Overtime 1.5 | 1/05/2018 | 2 | 21.85 | 1.5 | 65.55 | | |
| 36 | | | | | | | | |
| 37 | | | | | | | | |
| 38 | | | | | | | | |
| 39 | | | | | | | | |
| 40 | | | | | | | | |
+----+-----------------------------------------------------+----------------+------------+--------------+--------+--------+-----+--------+
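The "condense" operation amounts to keeping only the rows where an amount was produced (column H non-blank). A sketch of that logic in awk rather than an Excel formula, with inline stand-in data for columns B, C and H:

```shell
# Keep only rows whose Amount field (3rd column here) is non-empty.
printf '%s\n' \
  'Ordinary Hours|30/04/2018|237.07' \
  'Ordinary Hours|1/05/2018|' \
  'Ordinary Hours|9/05/2018|237.07' \
  'Overtime 1.5|1/05/2018|65.55' \
  'Overtime 1.5|2/05/2018|' |
awk -F'|' '$3 != ""'
```

The Excel-formula equivalent of this filter is typically an INDEX/SMALL/IF array formula that pulls the k-th non-blank row, but the awk version shows the selection rule in isolation.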

Excel VBA extrapolate values

I have a file that has data stored in the following way (weekly data example):
+----------+----------+----------+----------+----------+----------+
| | WK1 | WK2 | WK3 | WK4 | WK5 |
+----------+----------+----------+----------+----------+----------+
| DT Begin | 29.12.14 | 05.01.15 | 12.01.15 | 19.01.15 | 26.01.15 |
| DT End | 04.01.15 | 11.01.15 | 18.01.15 | 25.01.15 | 01.02.15 |
| XData | 50 | 10 | 10 | 10 | 50 |
+----------+----------+----------+----------+----------+----------+
My problem is to aggregate the XData on a monthly basis. For that I want to break the data down into days and then calculate the average.
Edit: I changed the table as it was not clear what I meant. This averages to ((50*4)+(10*21)+(50*6))/31 = 22.90
+------------+-------+
| Date | Value |
+------------+-------+
| 01.01.2015 | 50 |
| 02.01.2015 | 50 |
| 03.01.2015 | 50 |
| 04.01.2015 | 50 |
| 05.01.2015 | 10 |
| 06.01.2015 | 10 |
| 07.01.2015 | 10 |
| 08.01.2015 | 10 |
| 09.01.2015 | 10 |
| 10.01.2015 | 10 |
| 11.01.2015 | 10 |
| 12.01.2015 | 10 |
| 13.01.2015 | 10 |
| 14.01.2015 | 10 |
| 15.01.2015 | 10 |
| 16.01.2015 | 10 |
| 17.01.2015 | 10 |
| 18.01.2015 | 10 |
| 19.01.2015 | 10 |
| 20.01.2015 | 10 |
| 21.01.2015 | 10 |
| 22.01.2015 | 10 |
| 23.01.2015 | 10 |
| 24.01.2015 | 10 |
| 25.01.2015 | 10 |
| 26.01.2015 | 50 |
| 27.01.2015 | 50 |
| 28.01.2015 | 50 |
| 29.01.2015 | 50 |
| 30.01.2015 | 50 |
| 31.01.2015 | 50 |
+------------+-------+
| Average | 22.90 |
+------------+-------+
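The daily expansion and averaging above can be sketched outside VBA; a minimal awk version, with the January 2015 week boundaries hard-coded from the table:

```shell
# Each week contributes its value to every day in [begin, end]
# (day numbers within January 2015); the month average is sum/days.
awk 'BEGIN {
  n = split("1-4:50 5-11:10 12-18:10 19-25:10 26-31:50", weeks, " ")
  for (i = 1; i <= n; i++) {
    split(weeks[i], parts, ":")          # "1-4" and "50"
    split(parts[1], range, "-")          # begin day and end day
    for (d = range[1] + 0; d <= range[2] + 0; d++) {
      sum += parts[2]; days++
    }
  }
  printf "%.2f\n", sum / days            # -> 22.90 for January
}'
```

The VBA version would do the same three things: clip each week to the month, weight the week's value by the number of overlapping days, and divide by the days in the month.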
After having done this calculation I want to summarize the data as follows for the entire year:
+-------+-------+-------+------+------+
| | Jan | Feb | Mar | ... |
+-------+-------+-------+------+------+
| XData | 22.90 | 22.00 | 23.1 | ... |
+-------+-------+-------+------+------+
Being a newbie in Excel VBA, I have extreme trouble doing this.
I know how to get the value of a cell (Range.Value), but not how to find data in a particular week (as WK1 exists for 2014 as well); Range.Find with a date other than the one in the cell itself does not seem to work.
What I am asking for is a way to approach this problem. My particular difficulties are to:
find the data in the worksheet
split the week values into day values (see table above)
copy the data or hold it in some sort of data structure
calculate the average (this should be easy then)
fill in the data on a monthly basis
As you can see, I have trouble even getting started - any hints would be greatly appreciated. Maybe I'm thinking of this entirely too complicated? Thank you!
