Kotlin string date formatter

I'm trying to parse a string date:
"AuthDate": "2021-08-19T23:40:52+04:00",
Here is the code for parsing and displaying it:
var date = item?.authDate.toString()
val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ssz")
val parsedDate = formatter.parse(date)
val displayFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd, HH:MM:SS")
text = displayFormatter.format(parsedDate).toString()
This works fine except for one thing: seconds are always displayed as "00".
For example, if authDate is 2021-08-19T23:40:52+04:00,
the displayed authDate is 2021-08-19, 23:40:00,
not 23:40:52 as I want.

val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ssz")
...
val displayFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd, HH:MM:SS")
Notice how the first of these uses mm and ss, while the second uses MM and SS. The former says to parse hours, minutes, then seconds. The latter says to display hours, the month, and then the fraction of a second. See the documentation for a full list of the specifiers, but you're probably looking for
val displayFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd, HH:mm:ss")
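As a quick end-to-end check, here is a minimal sketch (not taken from the original answer: it uses OffsetDateTime.parse, which handles the ISO-8601 offset in the sample value directly) with the corrected display pattern:
import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter

fun main() {
    // Sample AuthDate from the question; OffsetDateTime.parse understands the +04:00 offset out of the box.
    val parsed = OffsetDateTime.parse("2021-08-19T23:40:52+04:00")

    // Corrected pattern: lower-case mm = minute-of-hour, lower-case ss = second-of-minute.
    val displayFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd, HH:mm:ss")
    println(displayFormatter.format(parsed))  // 2021-08-19, 23:40:52
}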

Related

Countdown animation for LocalTime

val dateNow = LocalDateTime.now()
val formatter: DateTimeFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS")
val dateTime = LocalDateTime.parse(it?.LastUpdated, formatter)
println("The date is :" + dateTime)
val duration = Duration.between(dateTime, dateNow).toMinutes()
val countDown = LocalTime.MIN.plus(Duration.ofMinutes(duration)).toString()
So I'm pulling LastUpdated using Retrofit, then getting the difference between the two dates and displaying it in minutes (HH:SS) format.
Let's say, for example, the output of countDown is 24:45.
How can one make that text count down with animation?
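One possible approach (not from the original thread) is Android's CountDownTimer, which fires a callback on a fixed interval. Here is a minimal sketch, assuming an Android TextView named timerText (a hypothetical view) and the minute difference computed above:
import android.os.CountDownTimer
import android.widget.TextView
import java.time.Duration

// Hypothetical helper: starts a once-per-second countdown and renders the remaining time as "mm:ss".
// Assumes it is called from an Android component that owns the TextView.
fun startCountdown(minutesLeft: Long, timerText: TextView) {
    val totalMillis = Duration.ofMinutes(minutesLeft).toMillis()

    object : CountDownTimer(totalMillis, 1_000L) {
        override fun onTick(millisUntilFinished: Long) {
            val remaining = Duration.ofMillis(millisUntilFinished)
            // Update the text every second, e.g. "24:45".
            timerText.text = String.format("%02d:%02d", remaining.toMinutes(), remaining.seconds % 60)
        }

        override fun onFinish() {
            timerText.text = "00:00"
        }
    }.start()
}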

Kotlin - Get difference between datetimes in seconds

Is there any way to get the difference between two datetimes in seconds?
For example
First datetime: 2022-04-25 12:09:10
Second datetime: 2022-05-24 02:46:21
There is a dedicated class for that: Duration (the same class is in the Android docs).
A time-based amount of time, such as '34.5 seconds'.
This class models a quantity or amount of time in terms of seconds and nanoseconds. It can be accessed using other duration-based units, such as minutes and hours. In addition, the DAYS unit can be used and is treated as exactly equal to 24 hours, thus ignoring daylight savings effects. See Period for the date-based equivalent to this class.
Here is example usage:
val date1 = LocalDateTime.now()
val date2 = LocalDateTime.now()
val duration = Duration.between(date1, date2)
val asSeconds: Long = duration.toSeconds()
val asMinutes: Long = duration.toMinutes()
If your date types are from the java.time package, in other words, implementations of Temporal, use the ChronoUnit class:
val diffSeconds = ChronoUnit.SECONDS.between(date1, date2)
Note that this can result in a negative value, so do take its absolute value (abs) if necessary.
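For a concrete illustration, here is a small sketch applying ChronoUnit.SECONDS.between to the two example datetimes from the question, combined with abs:
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import java.time.temporal.ChronoUnit
import kotlin.math.abs

fun main() {
    val pattern = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
    val first = LocalDateTime.parse("2022-04-25 12:09:10", pattern)
    val second = LocalDateTime.parse("2022-05-24 02:46:21", pattern)

    // Whole seconds between the two datetimes; abs() guards against a negative
    // result if the arguments are passed in the opposite order.
    val diffSeconds = abs(ChronoUnit.SECONDS.between(first, second))
    println(diffSeconds)  // 2471831
}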

How to convert a timestamp with 6-digit milliseconds using the to_timestamp function in PySpark

I have a timestamp column in my dataframe with timestamps in a format like 2022-07-28T10:38:50.926866Z, currently stored as strings.
I want to convert this column into actual timestamps. I've searched around, but every time I try to_timestamp with this type of data I get nulls.
Things I've tried:
from pyspark.sql import functions as F
from pyspark.sql.functions import col

df = spark.createDataFrame([("2022-07-28T10:38:50.926866Z",)], ['date_str'])
df.withColumn("ts1", F.to_timestamp(col('date_str'), "yyyy-MM-dd'T'HH:mm:ss.SSSSSS'Z'")).show(truncate=False)
This always gets me null, but when I run something similar on an example with just 3 ms digits, it seems to work:
df = spark.createDataFrame([("2022-07-28T10:38:50.926Z",)],['date_str'])
df.withColumn("ts1", F.to_timestamp(col('date_str'), "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")).show(truncate=False)
I'm completely lost on how to handle this string conversion.
I actually ended up solving this by removing the last 4 characters of each timestamp string first and then running to_timestamp. I don't mind losing the ms, so this worked for me:
# Keep only the first 23 characters, i.e. drop the last 3 fractional digits and the trailing 'Z'
df = df.withColumn("date_str", F.substring("date_str", 1, 23))
df.withColumn("date_str", F.to_timestamp(df.date_str, "yyyy-MM-dd'T'HH:mm:ss.SSS")).show()

Pyspark - Applying custom function on structured streaming

I have 4 columns: ['clienttimestamp', 'sensor_id', 'activity', 'incidents']. From a Kafka stream, I consume the data, preprocess it, and aggregate it in a window.
If I do a groupby with .count(), the stream works very well, writing each window with its count to the console.
This works:
df = df.withWatermark("clientTimestamp", "1 minutes") \
    .groupby(window(df.clientTimestamp, "1 minutes", "1 minutes"), col('sensor_type')).count()
query = df.writeStream.outputMode("append").format('console').start()
query.awaitTermination()
But the real goal is to find the total time for which critical activity was live.
That is, for each sensor_type, I group the data by window, get the list of critical activities, and find the total time that all the critical activity lasted (the code is below). But I am not sure if I am using the UDF in the right way, because the method below does not work! Can anyone provide an example of applying a custom function to each window group and writing the output to the console?
This does not work:
@f.pandas_udf(schemahh, f.PandasUDFType.GROUPED_MAP)
def calculate_time(pdf):
    pdf = pdf.reset_index(drop=True)
    total_time = 0
    index_list = pdf.index[pdf['activity'] == 'critical'].to_list()
    for ind in index_list:
        start = pdf.loc[ind]['clientTimestamp']
        end = pdf.loc[ind + 1]['clientTimestamp']
        diff = start - end
        time_n_mins = round(diff.seconds / 60, 2)
        total_time = total_time + time_n_mins
        largest_session_time = total_time
    new_pdf = pd.DataFrame(columns=['sensor_type', 'largest_session_time'])
    new_pdf.loc[0] = [pdf.loc[0]['sensor_type'], largest_session_time]
    return new_pdf
df = df.withWatermark("clientTimestamp", "1 minutes") \
    .groupby(window(df.clientTimestamp, "1 minutes", "1 minutes"), col('sensor_type'), col('activity')).apply(calculate_time)
query = df.writeStream.outputMode("append").format('console').start()
query.awaitTermination()

Convert scientific notation to datetime

How can I convert a date from seconds to a date format?
I have a table containing information about lat, long, and time.
f_table['dt'] = pd.to_datetime(f_table['dt'])
f_table["dt"]
But the resulting output is wrong: the actual date is 20160628, yet it is converted to 1970.
My desired output:
24-April-2014
The unit needs to be nanoseconds, so you need to multiply by 1e9:
f_table['dt'] = pd.to_datetime(f_table['dt'] * 1e9)
This should work.
from datetime import datetime

# Split your string to extract the timestamp; I am assuming a single space between each float
op = "28.359062 69.693673 5.204486e+08"
ts = float(op.split()[2])

# Timestamp to datetime object
dt = datetime.fromtimestamp(ts)

# Datetime object to string
dt_str = dt.strftime('%m-%B-%Y')
print(dt_str)
# 06-June-1986
