Removing the date from a POSIXct timestamp in R [duplicate]
How would I extract the time from a series of POSIXct objects discarding the date part?
For instance, I have:
times <- structure(c(1331086009.50098, 1331091427.42461, 1331252565.99979,
1331252675.81601, 1331262597.72474, 1331262641.11786, 1331269557.4059,
1331278779.26727, 1331448476.96126, 1331452596.13806), class = c("POSIXct",
"POSIXt"))
which corresponds to these dates:
"2012-03-07 03:06:49 CET" "2012-03-07 04:37:07 CET"
"2012-03-09 01:22:45 CET" "2012-03-09 01:24:35 CET"
"2012-03-09 04:09:57 CET" "2012-03-09 04:10:41 CET"
"2012-03-09 06:05:57 CET" "2012-03-09 08:39:39 CET"
"2012-03-11 07:47:56 CET" "2012-03-11 08:56:36 CET"
Now, I have some values for a parameter measured at those times:
val <- c(1.25343125e-05, 0.00022890575,
3.9269125e-05, 0.0002285681875,
4.26353125e-05, 5.982625e-05,
2.09575e-05, 0.0001516951251,
2.653125e-05, 0.0001021391875)
I would like to plot val against the time of day, irrespective of the specific day on which val was measured.
Is there a specific function that would allow me to do that?
You can use strftime to convert datetimes to any character format:
> t <- strftime(times, format="%H:%M:%S")
> t
[1] "02:06:49" "03:37:07" "00:22:45" "00:24:35" "03:09:57" "03:10:41"
[7] "05:05:57" "07:39:39" "06:47:56" "07:56:36"
But that doesn't help very much, since you want to plot your data. One workaround is to strip the date element from your times, and then to add an identical date to all of your times:
> xx <- as.POSIXct(t, format="%H:%M:%S")
> xx
[1] "2012-03-23 02:06:49 GMT" "2012-03-23 03:37:07 GMT"
[3] "2012-03-23 00:22:45 GMT" "2012-03-23 00:24:35 GMT"
[5] "2012-03-23 03:09:57 GMT" "2012-03-23 03:10:41 GMT"
[7] "2012-03-23 05:05:57 GMT" "2012-03-23 07:39:39 GMT"
[9] "2012-03-23 06:47:56 GMT" "2012-03-23 07:56:36 GMT"
Now you can use these datetime objects in your plot:
plot(xx, rnorm(length(xx)), xlab="Time", ylab="Random value")
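To plot the measured values instead of random ones, here is a minimal sketch along the same lines (my own illustration, assuming the xx and val objects defined above):
## same workaround, but plotting the measured values
plot(xx, val, xlab = "Time of day", ylab = "val")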
For more help, see ?DateTimeClasses
The data.table package has a function as.ITime, which can do this efficiently:
library(data.table)
x <- "2012-03-07 03:06:49 CET"
as.IDate(x) # Output is "2012-03-07"
as.ITime(x) # Output is "03:06:49"
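as.ITime is vectorised, so it also works on the whole POSIXct vector from the question. A minimal sketch of using it for the plot (my own illustration; an ITime is stored as seconds since midnight, so dividing by 3600 gives the hour of day):
library(data.table)
itimes <- as.ITime(times)        # seconds since midnight, class "ITime"
plot(as.numeric(itimes) / 3600, val, xlab = "Hour of day", ylab = "val")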
There have been previous answers that showed the trick. In essence:
you must retain POSIXct types to take advantage of all the existing plotting functions
if you want to 'overlay' several days' worth on a single plot, highlighting the intra-daily variation, the best trick is to impose the same day (and month, and even year if need be, which is not the case here)
which you can do by overriding the day-of-month and month components when in POSIXlt representation, or just by offsetting the 'delta' relative to 0:00:00 between the different days.
So with times and val as helpfully provided by you:
## impose month and day based on first obs
ntimes <- as.POSIXlt(times) # convert to 'POSIX list type'
ntimes$mday <- ntimes[1]$mday # and $mon if it differs too
ntimes <- as.POSIXct(ntimes) # convert back
par(mfrow=c(2,1))
plot(times,val) # old times
plot(ntimes,val) # new times
yields two stacked plots contrasting the original and the modified time scales.
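The other variant mentioned above, offsetting each observation's 'delta' relative to 0:00:00 of its own day, can be sketched as follows (my own illustration, not part of the original answer, again assuming times and val as given):
## alternative: seconds since each observation's own midnight, anchored to day 1
secs    <- as.numeric(times) - as.numeric(as.POSIXct(trunc(times, "days")))
ntimes2 <- as.POSIXct(trunc(times[1], "days")) + secs
plot(ntimes2, val)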
Here's an update for those looking for a tidyverse method to extract hh:mm:ss from a POSIXct object. Note that the time zone is not included in the output.
library(hms)
as_hms(times)
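For plotting, hms values are difftime-based and are understood by ggplot2's time scale, so a full pipeline might look like the following sketch (my own illustration; the data frame and column names are assumptions, not part of the original answer):
library(hms)
library(ggplot2)
df <- data.frame(clock = as_hms(times), value = val)
ggplot(df, aes(clock, value)) +
  geom_point() +
  scale_x_time() +
  labs(x = "Time of day", y = "val")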
Many solutions have been provided, but I have not seen this one, which uses package chron:
library(chron)
hours = times(strftime(times, format="%T"))   # chron's times() constructor, not the POSIXct vector above
plot(val~hours)
(sorry, I am not entitled to post an image, you'll have to plot it yourself)
I can't find anything that deals with clock times exactly, so I'd just use some functions from package:lubridate and work with seconds-since-midnight:
require(lubridate)
clockS = function(t){hour(t)*3600+minute(t)*60+second(t)}
plot(clockS(times),val)
You might then want to look at some of the axis code to figure out how to label axes nicely.
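One way to do that is to suppress the default axis and draw one at whole hours; a sketch (my own, assuming clockS, times and val from above):
plot(clockS(times), val, xaxt = "n", xlab = "Time of day", ylab = "val")
hrs <- seq(0, 24, by = 6)                          # label every six hours
axis(1, at = hrs * 3600, labels = sprintf("%02d:00", hrs))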
The time_t value for midnight GMT is always divisible by 86400 (24 * 3600). The value for seconds-since-midnight GMT is thus time %% 86400.
The hour in GMT is (time %% 86400) / 3600 and this can be used as the x-axis of the plot:
plot((as.numeric(times) %% 86400)/3600, val)
To adjust for a time zone, adjust the time before taking the modulus, by adding the number of seconds that your time zone is ahead of GMT. For example, US central daylight saving time (CDT) is 5 hours behind GMT. To plot against the time in CDT, the following expression is used:
plot(((as.numeric(times) - 5*3600) %% 86400)/3600, val)
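If you would rather not hard-code the offset, as.POSIXlt exposes the local-time components directly, so a possible alternative is the following sketch (my own illustration, not part of the original answer):
lt  <- as.POSIXlt(times)                      # honours the vector's own time zone
hrs <- lt$hour + lt$min / 60 + lt$sec / 3600  # decimal hours since local midnight
plot(hrs, val, xlab = "Hour of day (local time)", ylab = "val")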
Related
How to determine the appropriate timezone to apply for historical dates in a given region in python3
I'm using python3 on Ubuntu 20.04. I have a trove of files with naive datetime strings in them, dating back more than 20 years. I know that all of these datetimes are in the Pacific Timezone, and I would like to convert them all to UTC datetimes. However, whether they are relative to PDT or PST is a bigger question. Since the dates on which the PDT/PST switch happens have themselves changed over the last 20 years, it's not just a matter of applying a simple date/month threshold to figure out whether to use the PDT or the PST offset. Is there an elegant way to make this determination and apply it?
Note upfront, for Python 3.9+: use zoneinfo from the standard library; there is no need anymore for a third-party library. Here's what you can do with dateutil to set the timezone and convert to UTC; dateutil takes the DST changes from the IANA database.
from datetime import datetime
import dateutil.tz

datestrings = ['1991-04-06T00:00:00',  # PST
               '1991-04-07T04:00:00',  # PDT
               '1999-10-30T00:00:00',  # PDT
               '1999-10-31T02:01:00',  # PST
               '2012-03-11T00:00:00',  # PST
               '2012-03-11T02:00:00']  # PDT

# to naive datetime objects
dateobj = [datetime.fromisoformat(s) for s in datestrings]

# set timezone (with pytz, use localize() instead of replace):
tz_pacific = dateutil.tz.gettz('US/Pacific')
dtaware = [d.replace(tzinfo=tz_pacific) for d in dateobj]

# check if DST applies:
# for d in dtaware: print(d.dst())
# 0:00:00
# 1:00:00
# 1:00:00
# 0:00:00
# 0:00:00
# 1:00:00

# convert to UTC:
dtutc = [d.astimezone(dateutil.tz.UTC) for d in dtaware]

# check output
# for d in dtutc: print(d.isoformat())
# 1991-04-06T08:00:00+00:00
# 1991-04-07T11:00:00+00:00
# 1999-10-30T07:00:00+00:00
# 1999-10-31T10:01:00+00:00
# 2012-03-11T08:00:00+00:00
# 2012-03-11T09:00:00+00:00
Now if you'd like to be absolutely sure that DST (PDT vs. PST) is set correctly, you'd have to set up test cases and verify against the IANA database, I guess...
python3: Split time series by diurnal periods
I have the following dataset: 01/05/2020,00,26.3,27.5,26.3,80,81,73,22.5,22.7,22.0,993.7,993.7,993.0,0.0,178,1.2,-3.53,0.0 01/05/2020,01,26.1,26.8,26.1,79,80,75,22.2,22.4,21.9,994.4,994.4,993.7,1.1,22,2.0,-3.54,0.0 01/05/2020,02,25.4,26.1,25.4,80,81,79,21.6,22.3,21.6,994.7,994.7,994.4,0.1,335,2.3,-3.54,0.0 01/05/2020,03,23.3,25.4,23.3,90,90,80,21.6,21.8,21.5,994.7,994.8,994.6,0.9,263,1.5,-3.54,0.0 01/05/2020,04,22.9,24.2,22.9,89,90,86,21.0,22.1,21.0,994.2,994.7,994.2,0.3,268,2.0,-3.54,0.0 01/05/2020,05,22.8,23.1,22.8,90,91,89,21.0,21.4,20.9,993.6,994.2,993.6,0.7,264,1.5,-3.54,0.0 01/05/2020,06,22.2,22.8,22.2,92,92,90,20.9,21.2,20.8,993.6,993.6,993.4,0.8,272,1.6,-3.54,0.0 01/05/2020,07,22.6,22.6,22.0,91,93,91,21.0,21.2,20.7,993.4,993.6,993.4,0.4,284,2.3,-3.49,0.0 01/05/2020,08,21.6,22.6,21.5,92,92,90,20.2,20.9,20.1,993.8,993.8,993.4,0.4,197,2.1,-3.54,0.0 01/05/2020,09,22.0,22.1,21.5,92,93,92,20.7,20.8,20.2,994.3,994.3,993.7,0.0,125,2.1,-3.53,0.0 01/05/2020,10,22.7,22.7,21.9,91,92,91,21.2,21.2,20.5,995.0,995.0,994.3,0.0,354,0.0,70.99,0.0 01/05/2020,11,25.0,25.0,22.7,83,91,82,21.8,22.1,21.1,995.5,995.5,995.0,0.8,262,1.5,744.8,0.0 01/05/2020,12,27.9,28.1,24.9,72,83,70,22.3,22.8,21.6,996.1,996.1,995.5,0.7,228,1.9,1392.,0.0 01/05/2020,13,30.4,30.4,27.7,58,72,55,21.1,22.6,20.4,995.9,996.2,995.9,1.6,134,3.7,1910.,0.0 01/05/2020,14,31.7,32.3,30.1,50,58,48,20.2,21.3,19.7,995.8,996.1,995.8,3.0,114,5.4,2577.,0.0 01/05/2020,15,32.9,33.2,31.8,44,50,43,19.1,20.5,18.6,994.9,995.8,994.9,0.0,128,5.6,2853.,0.0 01/05/2020,16,33.2,34.4,32.0,46,48,41,20.0,20.0,18.2,994.0,994.9,994.0,0.0,125,4.3,2700.,0.0 01/05/2020,17,33.1,34.5,32.7,44,46,39,19.2,19.9,18.5,993.4,994.1,993.4,0.0,170,1.6,2806.,0.0 01/05/2020,18,33.6,34.2,32.6,41,47,40,18.5,20.0,18.3,992.6,993.4,992.6,0.0,149,0.0,2319.,0.0 01/05/2020,19,33.5,34.7,32.1,43,49,39,19.2,20.4,18.3,992.3,992.6,992.3,0.3,168,4.1,1907.,0.0 01/05/2020,20,32.1,33.9,32.1,49,51,41,20.2,20.7,18.5,992.4,992.4,992.3,0.1,192,3.7,1203.,0.0 01/05/2020,21,29.9,32.2,29.9,62,62,49,21.8,21.9,20.2,992.3,992.4,992.2,0.0,188,2.9,408.0,0.0 01/05/2020,22,28.5,29.9,28.4,67,67,62,21.8,22.0,21.7,992.5,992.5,992.3,0.4,181,2.3,6.817,0.0 01/05/2020,23,27.8,28.5,27.8,71,71,66,22.1,22.1,21.5,993.1,993.1,992.5,0.0,225,1.6,-3.39,0.0 02/05/2020,00,27.4,28.2,27.3,75,75,68,22.5,22.5,21.7,993.7,993.7,993.1,0.5,139,1.5,-3.54,0.0 02/05/2020,01,27.3,27.7,27.3,72,75,72,21.9,22.6,21.9,994.3,994.3,993.7,0.0,126,1.1,-3.54,0.0 02/05/2020,02,25.4,27.3,25.2,85,85,72,22.6,22.8,21.9,994.4,994.5,994.3,0.1,256,2.6,-3.54,0.0 02/05/2020,03,25.5,25.6,25.3,84,85,82,22.5,22.7,22.1,994.3,994.4,994.2,0.0,329,0.7,-3.54,0.0 02/05/2020,04,24.5,25.5,24.5,86,86,82,22.0,22.5,21.9,993.9,994.3,993.9,0.0,290,1.2,-3.54,0.0 02/05/2020,05,24.0,24.5,23.5,87,88,86,21.6,22.1,21.3,993.6,993.9,993.6,0.7,285,1.3,-3.54,0.0 02/05/2020,06,23.7,24.1,23.7,87,87,85,21.3,21.6,21.3,993.1,993.6,993.1,0.1,305,1.1,-3.51,0.0 02/05/2020,07,22.7,24.1,22.5,91,91,86,21.0,21.7,20.7,993.1,993.3,993.1,0.6,220,1.1,-3.54,0.0 02/05/2020,08,22.9,22.9,22.6,92,92,91,21.5,21.5,21.0,993.2,993.2,987.6,0.0,239,1.5,-3.53,0.0 02/05/2020,09,22.9,23.0,22.8,93,93,92,21.7,21.7,21.4,993.6,993.6,993.2,0.0,289,0.4,-3.53,0.0 02/05/2020,10,23.5,23.5,22.8,92,93,92,22.1,22.1,21.6,994.3,994.3,993.6,0.0,256,0.0,91.75,0.0 02/05/2020,11,26.1,26.2,23.5,80,92,80,22.4,23.1,22.2,995.0,995.0,994.3,1.1,141,1.9,789.0,0.0 02/05/2020,12,28.7,28.7,26.1,69,80,68,22.4,22.7,22.1,995.5,995.5,995.0,0.0,116,2.2,1468.,0.0 
02/05/2020,13,31.4,31.4,28.6,56,69,56,21.6,22.9,21.0,995.5,995.7,995.4,0.0,65,0.0,1762.,0.0 02/05/2020,14,32.1,32.4,30.6,48,58,47,19.8,22.0,19.3,995.0,995.6,990.6,0.0,105,0.0,2657.,0.0 02/05/2020,15,34.0,34.2,31.7,43,48,42,19.6,20.1,18.6,993.9,995.0,993.9,3.0,71,6.0,2846.,0.0 02/05/2020,16,34.7,34.7,32.3,38,48,38,18.4,20.3,18.3,992.7,993.9,992.7,1.4,63,6.3,2959.,0.0 02/05/2020,17,34.0,34.7,32.7,42,46,38,19.2,20.0,18.4,991.7,992.7,991.7,2.2,103,4.8,2493.,0.0 02/05/2020,18,34.3,34.7,33.6,41,42,38,19.1,19.4,18.0,991.2,991.7,991.2,2.0,141,4.8,2593.,0.0 02/05/2020,19,33.5,34.5,32.5,42,47,39,18.7,20.0,18.4,990.7,991.4,989.9,1.8,132,4.2,1317.,0.0 02/05/2020,20,32.5,34.2,32.5,47,48,40,19.7,20.3,18.7,990.5,990.7,989.8,1.3,191,4.2,1250.,0.0 02/05/2020,21,30.5,32.5,30.5,59,59,47,21.5,21.6,20.0,979.8,990.5,979.5,0.1,157,2.9,345.5,0.0 02/05/2020,22,28.6,30.5,28.6,67,67,59,21.9,21.9,21.5,978.9,980.1,978.7,0.6,166,2.2,1.122,0.0 02/05/2020,23,27.2,28.7,27.2,74,74,66,22.1,22.2,21.6,978.9,979.3,978.6,0.0,246,1.7,-3.54,0.0 03/05/2020,00,26.5,27.2,26.0,77,80,74,22.2,22.5,22.0,979.0,979.1,978.7,0.0,179,1.4,-3.54,0.0 03/05/2020,01,26.0,26.6,26.0,80,80,77,22.4,22.5,22.1,979.1,992.4,978.7,0.0,276,0.6,-3.54,0.0 03/05/2020,02,26.0,26.5,26.0,79,81,75,22.1,22.5,21.7,978.8,979.1,978.5,0.0,290,0.6,-3.53,0.0 03/05/2020,03,25.3,26.0,25.3,83,83,79,22.2,22.4,21.8,978.6,989.4,978.5,0.5,303,1.0,-3.54,0.0 03/05/2020,04,25.3,25.6,24.6,81,85,81,21.9,22.5,21.7,978.1,992.7,977.9,0.7,288,1.5,-3.00,0.0 03/05/2020,05,23.7,25.3,23.7,88,88,81,21.5,21.9,21.5,977.6,991.8,977.3,1.2,256,1.8,-3.54,0.0 03/05/2020,06,23.3,23.7,23.3,91,91,88,21.7,21.7,21.5,976.9,977.6,976.7,0.4,245,1.8,-3.54,0.0 03/05/2020,07,23.0,23.6,23.0,91,91,89,21.4,21.9,21.3,976.7,977.0,976.4,0.9,257,1.9,-3.54,0.0 03/05/2020,08,23.4,23.4,22.9,90,92,90,21.7,21.7,21.3,976.8,976.9,976.5,0.4,294,1.6,-3.52,0.0 03/05/2020,09,23.0,23.5,23.0,88,90,87,21.0,21.6,20.9,992.1,992.1,976.7,0.8,263,1.6,-3.54,0.0 03/05/2020,10,23.2,23.2,22.5,91,92,88,21.6,21.6,20.8,993.0,993.0,992.2,0.1,226,1.5,29.03,0.0 03/05/2020,11,26.0,26.1,23.2,77,91,76,21.6,22.1,21.5,993.8,993.8,982.1,0.0,120,0.9,458.1,0.0 03/05/2020,12,26.6,27.0,25.5,76,80,76,22.1,22.5,21.4,982.7,994.3,982.6,0.3,121,2.3,765.3,0.0 03/05/2020,13,28.5,28.7,26.6,66,77,65,21.5,23.1,21.2,982.5,994.2,982.4,1.4,130,3.2,1219.,0.0 03/05/2020,14,31.1,31.1,28.5,55,66,53,21.0,21.8,19.9,982.3,982.7,982.1,1.2,129,3.7,1743.,0.0 03/05/2020,15,31.6,31.8,30.7,50,55,49,19.8,20.8,19.2,992.9,993.5,982.2,1.1,119,5.1,1958.,0.0 03/05/2020,16,32.7,32.8,31.1,46,52,46,19.6,20.7,19.2,991.9,992.9,991.9,0.8,122,4.4,1953.,0.0 03/05/2020,17,32.3,33.3,32.0,44,49,42,18.6,20.2,18.2,990.7,991.9,979.0,2.6,133,5.9,2463.,0.0 03/05/2020,18,33.1,33.3,31.9,44,50,44,19.3,20.8,18.9,989.9,990.7,989.9,1.1,170,5.4,2033.,0.0 03/05/2020,19,32.4,33.2,32.2,47,47,44,19.7,20.0,18.7,989.5,989.9,989.5,2.4,152,5.2,1581.,0.0 03/05/2020,20,31.2,32.5,31.2,53,53,46,20.6,20.7,19.4,989.5,989.7,989.5,1.7,159,4.6,968.6,0.0 03/05/2020,21,29.7,32.0,29.7,62,62,51,21.8,21.8,20.5,989.7,989.7,989.4,0.8,154,4.0,414.2,0.0 03/05/2020,22,28.3,29.7,28.3,69,69,62,22.1,22.1,21.7,989.9,989.9,989.7,0.3,174,2.0,6.459,0.0 03/05/2020,23,26.9,28.5,26.9,75,75,67,22.1,22.5,21.7,990.5,990.5,989.8,0.2,183,1.0,-3.54,0.0 The second column is time (hour). I want to separate the dataset by morning (06-11), afternoon (12-17), evening (18-23) and night (00-05). How I can do it?
You can use pd.cut:
bins = [-1, 5, 11, 17, 24]
labels = ['night', 'morning', 'afternoon', 'evening']
df['day_part'] = pd.cut(df['hour'], bins=bins, labels=labels)
With these bins, hours 0-5 fall in the first interval (night), 6-11 in the second (morning), 12-17 in the third (afternoon) and 18-23 in the last (evening).
I added column names, including Hour for the second column. Then I used read_csv, which reads the source text "dropping" the leading zeroes, so the Hour column is just int. To split the rows (i.e. add a column marking the diurnal period), use:
df['period'] = pd.cut(df.Hour, bins=[0, 6, 12, 18, 24], right=False,
                      labels=['night', 'morning', 'afternoon', 'evening'])
Then you can e.g. use groupby to process your groups. Because I used the right=False parameter, the bins are closed on the left side, so the bin limits are more natural (no need for -1 as an hour) and, except for the last one, are just the starting hours of each period - quite natural notation.
Problem with transforming a string date to a datetime variable in Matlab
I have a variable called FOUNDATION_DATE which includes the following date observations in string format:
'01/Jan/12' '' '' '' '01/Jan/08' '' '01/Jan/44' '' '' '14/Oct/08' '' '' '12/Jul/04' '03/Aug/05' '20/Apr/10' '30/Dec/98' '09/Apr/16' '01/Jan/10' '01/Dec/01' '01/Jan/93'
I am using the Matlab function datetime to transform the above observations into the datetime data type. The code is
datetime(FOUNDATION_DATE,'InputFormat','dd/MMM/yy')
which gives the following results:
01-Jan-2012 NaT NaT NaT 01-Jan-2008 NaT 01-Jan-2044 NaT NaT 14-Oct-2008 NaT NaT 12-Jul-2004 03-Aug-2005 20-Apr-2010 30-Dec-1998 09-Apr-2016 01-Jan-2010 01-Dec-2001 01-Jan-1993
While for the majority of cases the transformation is done properly, for the observation '01/Jan/44' it is not: the year becomes 2044. This issue appears in many other date observations of my variable (only a small sample is presented here), and it is quite strange that it appears for date observations with years before 1969. Does anyone have a solution for accurately transforming these strings to datetime variables? Any explanation of why this happens?
You want the 'PivotYear' option, which defines which 100-year date range the 2-digit year refers to:
datetime('01/Jan/44', 'InputFormat', 'dd/MMM/yy', 'PivotYear', 1930)
So here the 100-year range is 1930-2029. The default, as documented (therefore not very "strange"), is
year(datetime('now')) - 50   % = 1969 at time of writing (2019)
When the year is given with only two digits, MATLAB makes an assumption about what the first two digits are; you can override this with:
startYear = year(datetime('now')) - 99;
datetime('01/Jan/69', 'InputFormat', 'dd/MMM/yy', 'PivotYear', startYear)
That will make any two-digit date up until today be interpreted as historic.
Simple datetime conversion from integer or string
Is there a simple way to convert a start and end time input into a list of evenly separated times? The input can be string or integer with format 1000, "1000", or "10:00" in 2400hr format. I've managed to accomplish this in a messy-looking way; is there a tighter, more efficient way to create this list? As you'll notice, I created an array first and then called .tolist() to make the time transformation iteration easier. The problem is that an input of 1030 or 1015 would need to be translated into 1050 or 1025 to create the right spacing, but perhaps there is a way I could call datetime.timedelta or something and cleanly make the array?
start="1000"
end="1600"
total_minutes=(int(end[:2])*60)+int(end[2:])-(int(start[:2])*60)-int(start[2:])
dog=list(range(0,int(total_minutes),25))
walk=dog_df["Walk Length"][dog_df.index[dog_df["Name"]==self.name][0]]
if walk=='half':
    self.dogarr=np.array([(x-25,x,x+25,x+50) for x in dog])
elif walk=='full':
    self.dogarr=np.array([(x-25,x,x+25,x+50,x+75,x+100) for x in dog])
else:
    self.dogarr=np.array([(x,x+25,x+50) for x in dog])
if int(start[2])!=0:
    start=start[:2]+str(int(int(start[2:])*1.667))
self.dogarr+=(int(start))
self.dogarr=self.dogarr.tolist()
z=0
while z<len(self.dogarr):
    for timespot in self.dogarr[z].copy():
        self.dogarr[z][self.dogarr[z].index(timespot)]=time.strftime('%H%M', time.gmtime(self.dogarr[z][self.dogarr[z].index(timespot)]*36))
    z+=1
self.dogarr=np.array(self.dogarr)
The result looks like:
array([['1115', '1130', '1145', '1200'],
       ['1130', '1145', '1200', '1215'],
       ['1145', '1200', '1215', '1230'],
       ['1200', '1215', '1230', '1245'],
       ['1215', '1230', '1245', '1300']], dtype='<U4')
I'm sure you can figure out how to parse times from any number of existing questions. The crux of your question seems to be how to create evenly separated times within a range. Here's a simple way:
start = datetime.datetime(2018,12,20,10)   # or use strptime etc.
end = datetime.datetime(2018,12,24,18)
count = 10
interval = (end - start) / count

dt = start
while dt <= end:
    print(dt)
    dt += interval
The output is:
2018-12-20 10:00:00
2018-12-20 20:24:00
2018-12-21 06:48:00
2018-12-21 17:12:00
2018-12-22 03:36:00
2018-12-22 14:00:00
2018-12-23 00:24:00
2018-12-23 10:48:00
2018-12-23 21:12:00
2018-12-24 07:36:00
2018-12-24 18:00:00
Momentjs get time in current location
I'm trying to generate a momentjs object for a certain timestamp in the current day of a specified location. For example:
const timeNow = moment().tz('Africa/Cairo')
const startTime = moment('10:00 am', 'HH:mm a')
const endTime = moment('2:30 pm', 'HH:mm a')
Printing the above 3 variables outputs this:
Fri, 12:31 am
Thu, 10:00 am
Thu, 02:30 pm
The first result is in fact the current time in Cairo; however, the other two results are the day before. How can I change it so that they return the current day?
You can simply do:
moment.tz('Africa/Cairo') // <= Moment object
One small note: whenever you see a JavaScript date printed in a browser, it is shown in your system's time zone. Since a JavaScript Date is stored as UTC, browsers display it accordingly. Use moment.format() to get string values.