Understanding GTFS service times

I have a problem understanding the GTFS file format, or maybe there is an error in the data. There is a GTFS feed from a public transportation agency called "Verkehrsverbund Mittelthüringen" (VMT). The data is accessible at https://transitfeeds.com/p/verkehrsverbund-mittelth-ringen/1080/latest/routes.
For example, take the trip with ID 9782458 (trips.txt):
2841_0,45,9782458,"Erfurt, Thüringenhalle","",0,,,,
It has service ID 45, with this calendar.txt entry:
45,0,0,0,0,0,0,0,20191101,20200229
Additionally, here are the entries from calendar_dates.txt:
45,20191104,1
45,20191111,1
45,20191118,1
45,20191125,1
45,20191202,1
45,20191209,1
45,20191216,1
45,20191105,1
45,20191112,1
45,20191119,1
45,20191126,1
45,20191203,1
45,20191210,1
45,20191217,1
45,20191106,1
45,20191113,1
45,20191120,1
45,20191127,1
45,20191204,1
45,20191211,1
45,20191218,1
45,20191107,1
45,20191114,1
45,20191121,1
45,20191128,1
45,20191205,1
45,20191212,1
45,20191219,1
45,20191101,1
45,20191108,1
45,20191115,1
45,20191122,1
45,20191129,1
45,20191206,1
45,20191213,1
45,20191220,1
Does this mean that the service is available at all times except from 1 November 2019 to 29 February 2020? My problem now is the output of the search engine transitfeeds.com: it says the trip with ID 9782458 is available on 14 November 2019, which contradicts my understanding of the data, namely that the trip should not run in November. What is the clue I missed? Or is there an error in the data?

The line you pasted indicates that service ID 45 runs on zero days of the week (that's what all those zeros mean), so the start and end dates in the same line don't really mean anything.
If this service actually does run on Nov. 14, this could be represented in the calendar_dates.txt file, which is usually used to represent service changes for special dates.
EDIT: the data you added from calendar_dates.txt does indeed show that service 45 has been added for date 20191114.
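To make the rule concrete, here is a small Python sketch of how a consumer combines calendar.txt and calendar_dates.txt (only a few of the added dates are shown; the full feed lists many more):

```python
from datetime import date

# calendar.txt row "45,0,0,0,0,0,0,0,20191101,20200229":
# no weekday is enabled, so the date range alone never makes the service run.
WEEKDAYS = [0, 0, 0, 0, 0, 0, 0]                  # monday..sunday flags
START, END = date(2019, 11, 1), date(2020, 2, 29)

# calendar_dates.txt: exception_type 1 ADDS service on a specific date
ADDED = {date(2019, 11, 4), date(2019, 11, 5), date(2019, 11, 14)}  # subset
REMOVED = set()                                   # exception_type 2 entries

def service_runs_on(d: date) -> bool:
    if d in ADDED:
        return True
    if d in REMOVED:
        return False
    return START <= d <= END and bool(WEEKDAYS[d.weekday()])

print(service_runs_on(date(2019, 11, 14)))  # True: explicitly added
print(service_runs_on(date(2019, 11, 16)))  # False: a Saturday with no exception
```

So the calendar row does not say "never available between those dates"; it says "no regular weekday service", and every actual operating day is spelled out as an exception.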

Related

How to push complex legacy logs into logstash?

I'd like to use ELK to analyze and visualize our GxP logs, created by our ancient LIMS system.
At least the system runs on SLES, but the whole logging structure is a bit of a mess.
Let me try to give you an impression:
Main_Dir
|-- Log Dir
|     |-- large number of sub dirs with a lot of files in them, some of which may be of interest later
|-- Archive Dir
|     |-- [some dirs which I'm not interested in]
|     |-- gpYYMM         <-- sub dirs created automatically each month: YY = year, MM = month
|     |     |-- gpDD.log <-- log file created automatically each day
|     |-- [more dirs which I'm not interested in]
Important: each medical examination that I need to track is completely logged in the gpDD.log file that corresponds to the date of the order entry. The duration of a complete examination ranges from minutes (if no material is available) through several hours or days (e.g. 48 h for a Covid-19 examination) up to several weeks for a microbiological sample. Example: all information about a Covid-19 sample that reached us on December 30th is logged in ../gp2012/gp30.log, even if the examination was performed on January 4th and the validation / creation of the report was finished on January 5th.
Could you please give me some guidance on the right tool to use (I guess either Logstash or Filebeat) and how to implement the log transfer?
Logstash file input:
input {
  file {
    path => "/Main Dir/Archive Dir/gp*/gp*.log"
  }
}
Filebeat input:
- type: log
  paths:
    - /Main Dir/Archive Dir/gp*/gp*.log
In both cases the path glob works. However, if you need further processing of the lines, I would suggest using at least Logstash as a passthrough (with the beats input if you do not want to install Logstash on the source machine itself, which is understandable).
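Since the order-entry date is encoded only in the path (gpYYMM/gpDD.log), it can be useful to recover it during processing. A hypothetical Python sketch; the century 2000-2099 and the helper name are my assumptions, not part of the original setup:

```python
import re
from datetime import date

# Matches ".../gpYYMM/gpDD.log" as described in the question.
PATH_RE = re.compile(r"gp(?P<yy>\d{2})(?P<mm>\d{2})/gp(?P<dd>\d{2})\.log$")

def entry_date(path: str) -> date:
    """Recover the order-entry date from an archive log path (assumes 20YY)."""
    m = PATH_RE.search(path)
    if m is None:
        raise ValueError(f"not a gpYYMM/gpDD.log path: {path}")
    return date(2000 + int(m["yy"]), int(m["mm"]), int(m["dd"]))

print(entry_date("/Main Dir/Archive Dir/gp2012/gp30.log"))  # 2020-12-30
```

In a Logstash pipeline, the same extraction could be done with a grok or dissect filter on the event's file path field instead.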

Dialogflow console returning a different result than the API (where the wrong timezone is used)

I have a DialogFlow agent set up where one intent is used to schedule reminders. I can say something like:
"At 5pm remind me to go for a run"
And it returns the sentence to remind about (in this case 'go for a run') as well as the time to set the reminder, using @sys.date-time.
This works as intended; I am able to get the correct time because it just sends the time without a timezone attached.
When I use a command such as:
"In 15 minutes remind me to go for a run"
it sends the result with the timezone applied, in this case the incorrect one.
So right now, a result for the date-time using the API was:
2020-11-09T14:20:33+01:00
which is an hour more than it should be.
I have checked the DialogFlow agent's default time zone where it is set to:
(GMT0:00) Africa/Casablanca
I am fairly certain that is the correct one for London time. However, moving to a different timezone changes the result and actually gives the correct time for that timezone (just not for my timezone),
leaving me to wonder whether this time zone is broken.
Regardless though, the Dialogflow console on the webpage returns the correct date-time but in a different format using 'startDateTime' and 'endDateTime', something that the agent does not do when sent using the API.
I have checked all configurations within the program and cannot find any evidence of code setting a different timezone. I have in fact tried to add the London timezone when a query is sent, but this does not resolve the issue.
Does anyone have any advice on how to solve this?
EDIT:
After receiving a good suggestion from a user, I am reminded of the most puzzling part of this issue: changing the timezone between GMT -1:00 and GMT 0:00 actually makes a difference of two hours.
Around 1pm I queried it to get the time in 15 minutes.
When it was set to GMT -1:00 Atlantic/Cape_Verde, the time returned was:
2020-11-10T12:21:15-01:00
When it was set to GMT 0:00 Africa/Casablanca, the time returned was:
2020-11-10T14:22:07+01:00
Neither is the correct time and despite the timezone suggesting a 1 hour difference, it is actually 2 hours apart.
For London the correct timezone should be GMT -1, and Casablanca is GMT +1; I used this web page to determine this.
If you are in London, it is recommended to configure your agent to use the correct time zone (GMT -1).
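For what it's worth, the observed two-hour gap matches the actual IANA offsets rather than the labels in the dropdown: Morocco has kept Africa/Casablanca on UTC+01:00 year-round (outside Ramadan) since 2018, despite the "GMT 0:00" label. A quick check with Python's zoneinfo (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Probe the offsets on the date from the question (2020-11-10).
probe = datetime(2020, 11, 10, 12, 0)

for tz in ("Africa/Casablanca", "Atlantic/Cape_Verde", "Europe/London"):
    offset = probe.replace(tzinfo=ZoneInfo(tz)).utcoffset()
    print(f"{tz}: {offset}")
# Africa/Casablanca is +01:00 here, Atlantic/Cape_Verde is -01:00
# (hence the two-hour spread), and Europe/London is +00:00 in November.
```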

How to set date time format in Logic App

I have created an API which generates an Excel file that includes date columns. When I execute the code locally, it works well and displays the dates as expected.
When I execute the API via a Logic App, it changes the format of the date fields. How can I set the date-time format in a Logic App?
When you have a fixed date-time format, you can use the Logic App function "formatDateTime" as follows:
formatDateTime(triggerBody()?['myDate'], 'yyyy-MM-dd')
You can find it under Expressions - Date and Time - "See more" - formatDateTime
I found some useful documentation on Microsoft's site here:
https://learn.microsoft.com/en-us/azure/kusto/query/format-datetimefunction
The problem is that your local machine is running a different locale than the machine running your deployed code.
You could set CultureInfo.CurrentCulture to make sure the right CultureInfo is used.
CultureInfo...
Provides information about a specific culture (called a locale for unmanaged code development). The information includes the names for the culture, the writing system, the calendar used, the sort order of strings, and formatting for dates and numbers.
You could also use the appropriate DateTimeFormatInfo when writing the date to Excel; it
Provides culture-specific information about the format of date and time values.
Logic Apps run on UTC, so you need to convert between UTC and your local time.
In my case that is 'SE Asia Standard Time' (UTC+07:00; Bangkok, Hanoi, Jakarta). Here is what I use:
Date: convertTimeZone(utcNow(), 'UTC', 'SE Asia Standard Time', 'MMMM dd, yyyy')
Time: convertTimeZone(utcNow(), 'UTC', 'SE Asia Standard Time', 'hh:mm:ss.ff tt')
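For comparison, the same UTC-to-local conversion sketched in Python ('SE Asia Standard Time' corresponds to Asia/Bangkok, a fixed UTC+07:00; the sample instant is made up):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

utc_now = datetime(2021, 1, 15, 8, 30, 0, tzinfo=timezone.utc)  # stand-in for utcNow()
local = utc_now.astimezone(ZoneInfo("Asia/Bangkok"))            # UTC -> UTC+07:00

print(local.strftime("%B %d, %Y"))    # like 'MMMM dd, yyyy'
print(local.strftime("%I:%M:%S %p"))  # like 'hh:mm:ss tt'
```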

ElementTree: selecting the right tree?

For the data contained here: http://www.playonline.com/ff11us/polnews/news.xml
I have this Python code:
import xml.etree.ElementTree as ET
import urllib.request
rss1 = "http://www.playonline.com/ff11us/polnews/news.xml"
f = urllib.request.urlopen(rss1)
objects = ET.fromstring(f.read().decode())
print([el.attrib.get('title') for el in objects.findall('*/item')])
But it's returning an empty list. Am I selecting the wrong tree? Do I need to select a child tree or something similar?
Trying to get a grasp on the layout:
>>> for o in objects:
... print(o.tag)
...
{http://purl.org/rss/1.0/}channel
{http://purl.org/rss/1.0/}item
{http://purl.org/rss/1.0/}item
{http://purl.org/rss/1.0/}item
{http://purl.org/rss/1.0/}item
{http://purl.org/rss/1.0/}item
So I thought selecting item would work, or do I need to match that entire namespaced tag?
I can select on ./{http://purl.org/rss/1.0/}item to get the parent items, but how do I get the data (link) from the child items?
I think there is nothing wrong with your original code, except that it needs to use namespaces (to fully qualify each element). I added the namespace dict to your code; see if it works for you.
import xml.etree.ElementTree as etree
import urllib.request
pon_url = "http://www.playonline.com/ff11us/polnews/news.xml"
response = urllib.request.urlopen(pon_url)
xmlstr = response.read().decode()
root = etree.fromstring(xmlstr)
# These are the namespaces; we use them later during 'findall'
ns = {'ns1':'http://purl.org/rss/1.0/','ns2':'http://purl.org/dc/elements/1.1/'}
#etree.dump(root)
print("\nItems:\n")
print([i.text for i in root.findall('./ns1:item/ns1:title',ns)])
print("\nlinks:\n")
print([i.text for i in root.findall('./ns1:item/ns1:link',ns)])
print("\nDescriptions:\n")
print([i.text for i in root.findall('./ns1:item/ns1:description',ns)])
print("\nDates\n")
print([i.text for i in root.findall('./ns1:item/ns2:date',ns)])
Result:
Items:
['FINAL FANTASY XI Updated (Nov. 9)', 'All Worlds Maintenance (Nov. 9)', 'Temporary Suspension of NA GM Petition Service (Nov. 5)', 'Recovery from Fenrir World Technical Difficulties (Oct. 29)', 'Fenrir World Several Areas Technical Difficulties (Oct. 29)']
links:
['http://www.playonline.com/ff11us/polnews/news24770.shtml', 'http://www.playonline.com/ff11us/polnews/news24767.shtml', 'http://www.playonline.com/ff11us/polnews/news24765.shtml', 'http://www.playonline.com/ff11us/polnews/news24762.shtml', 'http://www.playonline.com/ff11us/polnews/news24759.shtml']
Descriptions:
['A version update was performed on FINAL FANTASY XI at the following time.<br><br>* Clients will update automatically upon launch after the date and time below. After the following time, the update will automatically begin after you press the "Play" button. After that, please follow the instructions on the screen.<br><br>[Date & Time]<br>Nov. 9, 2016 21:00 (PST)<br><br>[Affected Service]<br>FINAL FANTASY XI<br><br>[Update Details]<br>http://sqex.to/DYa', 'At the following time, we will be performing server maintenance. During this period, FINAL FANTASY XI will be unavailable.<br><br>We apologize for any inconvenience this may cause and thank you for your patience.<br><br>* This maintenance will be accompanied by a client program version update. After maintenance is complete, the update will automatically begin after you press the "Play" button. After that, please follow the instructions on the screen.<br><br>* The World Transfer Service will be unavailable starting 30 minutes before the maintenance.<br><br>*After the maintenance ends, a spike in access is expected.<br>If you encounter congestion errors such as "POL-1160" and "POL-0010" during the confirmation screen or while downloading, we apologize for the inconvenience, and we ask that you try again after waiting for some time.<br><br>To ensure a smooth version update, we ask for your understanding and cooperation.<br><br>*Update details will be announced on Nov. 9, 2016 (PST)<br><br>[Date & Time]<br>Nov. 9, 2016 21:00 to 23:00 (PST)<br>* Completion time is subject to change.<br><br>[Affected Service]<br>- FINAL FANTASY XI', 'At the following time, the North American GM petition service will be temporarily unavailable.<br><br>The European GM petition service will be operating normally and users may still place GM calls for urgent issues. 
However, please be aware that there may be significant delays until these GM calls can be answered.<br><br>We apologize for the inconvenience and thank you for your patience in this matter.<br><br>[Date & Time]<br>Nov. 5, 2016 1:00 to 4:00 (PDT)<br><br>[Affected Services]<br>- The North American GM Petition service for FINAL FANTASY XI<br><br>[Cause]<br>Building Maintenance', 'At the time below, players were unable to access several areas on the Fenrir World.<br><br>We are pleased to announce that the issue has been addressed. We apologize for any inconvenience this may have caused.<br><br>[Date & Time]<br>Oct. 29, 2016 from 2:58 to 4:32 (PDT)<br><br>[Details]<br>- Unable to access certain areas on the Fenrir World<br><br>[Cause]<br>- Server equipment issue<br><br>[Affected Areas]<br>Unable to access the following areas:<br>- Newton Movalpolos<br>- Abyssea - Konschtat<br>- Lufaise Meadows<br>- Monarch Linn<br>- The Garden of Ru'Hmet<br>- Dynamis - Tavnazia<br>- Aydeewa Subterrane<br>- La Vaule [S]<br>- West Ronfaure<br>- North Gustaberg<br>- South Gustaberg<br>- Cape Teriggan<br>- East Sarutabaruta<br>- Ru'Aun Gardens<br>- Fort Ghelsba<br>- Qulun Dome<br>- Castle Oztroja<br>- Castle Zvahl Keep [S]<br>- Sacrificial Chamber<br>- Throne Room<br>- Ranguemont Pass<br>- Ve'Lugannon Palace<br>- Dynamis - Windurst<br>- Dangruf Wadi<br>- Outer Horutoto Ruins<br>- Ifrit's Cauldron<br>- Qu'Bia Arena<br>- Cloister of Tremors<br>- Abyssea - Attohwa<br>- Ship bound for Selbina<br>- Jeuno-Windurst Airship<br>- Northern San d'Oria<br>- Windurst Waters<br>- Lower Jeuno', 'We are currently experiencing technical difficulties with certain areas on the Fenrir World. The issue is currently under investigation, and new updates will follow as additional information becomes available.<br><br>We apologize for any inconvenience this may be causing, and we thank you for your understanding.<br><br>[Date & Time]<br>From Oct. 
29, 2016 2:58 (PDT)<br><br>[Affected]<br>- Certain areas on the Fenrir World<br><br>[Cause]<br>Under investigation<br><br>[Details]<br>Under investigation']
Dates
['Thu, 10 Nov 2016 15:06:13 JST', 'Mon, 07 Nov 2016 14:14:32 JST', 'Fri, 04 Nov 2016 17:18:30 JST', 'Sat, 29 Oct 2016 20:05:44 JST', 'Sat, 29 Oct 2016 20:05:44 JST']
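As a side note, Python 3.8+ also accepts a {*} wildcard in the path, which avoids spelling out the namespace map entirely. A self-contained sketch with an inline RSS 1.0-style sample (not the live feed):

```python
import xml.etree.ElementTree as ET

sample = """\
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/">
  <channel><title>news</title></channel>
  <item><title>First item</title><link>http://example.com/1</link></item>
  <item><title>Second item</title><link>http://example.com/2</link></item>
</rdf:RDF>"""

root = ET.fromstring(sample)

# {*} matches any namespace, so no namespace dict is needed.
titles = [t.text for t in root.findall('./{*}item/{*}title')]
links = [l.text for l in root.findall('./{*}item/{*}link')]
print(titles)  # ['First item', 'Second item']
print(links)   # ['http://example.com/1', 'http://example.com/2']
```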

Strange sybase behavior around daylight savings time (DST)

I've got a strange sybase behavior which I do not understand.
Situation
I have a table (MY_TABLE) with several columns of type smalldatetime. For illustration purposes let's assume the following table and data:
MY_TABLE:
ID   | TS_INIT               | TS_LASTCHANGE        | MY_TEXT
4711 | 3/31/2013 12:00:00 AM | 3/31/2013 3:00:00 AM | someText
TS_INIT and TS_LASTCHANGE are of type smalldatetime.
When executing the following statement I get the above result:
SELECT ID, TS_INIT, TS_LASTCHANGE, MY_TEXT
FROM MY_TABLE
WHERE ID = 4711
go
My client is running in UTC+1 (Berlin) and has daylight savings time (DST) enabled.
I am not sure in what time zone the server is running and whether or not DST is enabled.
Problem
When I execute this (note that it is 03:00h):
SELECT ID, TS_INIT, TS_LASTCHANGE, MY_TEXT
FROM MY_TABLE
WHERE ID = 4711 AND TS_LASTCHANGE = "2013-03-31 03:00:00:000"
go
I get NO results but when I execute this (note that it is 02:00h this time):
SELECT ID, TS_INIT, TS_LASTCHANGE, MY_TEXT
FROM MY_TABLE
WHERE ID = 4711 AND TS_LASTCHANGE = "2013-03-31 02:00:00:000"
go
I do again get the above result which is saying TS_LASTCHANGE is
3/31/2013 3:00:00 AM
Note that the result prints 03:00h, even though I queried for 02:00h.
Why is the first query returning no results even though there should be a match, and why is the second query returning a result even though there should be none?
Note also that 3/31/2013 3:00:00 AM is the first moment in DST (at least in the year 2013) and 3/31/2013 2:00:00 AM should never ever exist at all because when transitioning from winter to summer time, the clock switches from 01:59:59 to 03:00:00 (as per this site).
Database: Adaptive Server Enterprise V15.0.3
Client: Aqua Data Studio V16.0.5
EDIT:
When querying with TS_INIT, everything works as one would expect (only a result for 3/31/2013 12:00:00 AM).
Aqua Data Studio is written in Java.
The problem you are having has to do with the fact that Java is aware of timezones, while databases have no concept of a timezone when they store dates and times. When the time comes back from the database, the database's JDBC driver puts it into a Java date and just assumes the timezone is irrelevant. The problem happens when you try to display a time which the JVM thinks is invalid: a valid date is presented instead, which basically pushes the time by an hour. In your (Berlin) timezone, daylight saving time in 2013 started on March 31 at 2:00 AM, and one of your rows contains a time that is invalid according to the JVM.
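The effect described above can be reproduced outside the Java/JDBC stack. A Python sketch (zoneinfo, Python 3.9+) of how the nonexistent 02:00 in the Berlin timezone gets pushed forward to 03:00 when normalized:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

berlin = ZoneInfo("Europe/Berlin")

# 2013-03-31 02:00 never existed in Berlin: clocks jumped from 01:59:59
# straight to 03:00:00 (CET -> CEST).
gap = datetime(2013, 3, 31, 2, 0, tzinfo=berlin)

# Normalizing the invalid wall-clock time through UTC lands on 03:00 CEST,
# the same "time pushed by an hour" the client displays.
normalized = gap.astimezone(timezone.utc).astimezone(berlin)
print(normalized)  # 2013-03-31 03:00:00+02:00
```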
This has been a known design issue with Java, which is being addressed by JSR-310 (LocalDate, OffsetDate, ZonedDate) for inclusion in Java SE 8. You can read more about it here:
https://today.java.net/pub/a/today/2008/09/18/jsr-310-new-java-date-time-api.html#jsr-310-datetime-concepts
https://jcp.org/en/jsr/detail?id=310
http://docs.google.com/View?id=dfn5297z_8d27fnf
Workaround
The only workaround is probably to trick the JVM by setting its timezone to GMT. If you are running ADS 16 on Windows and you launch ADS with the desktop shortcut (which runs datastudio.exe), you need to modify the datastudio.ini file in its folder. Add a new entry: vmarg.5=-Duser.timezone=gmt
This link explains where to find datastudio.ini:
https://www.aquaclusters.com/app/home/project/public/aquadatastudio/wikibook/Documentation14/page/50/Launcher-Memory-Configuration#windows
Once you have made the change, restart ADS. Then go to Help -> About -> System and double-check your user.timezone setting to make sure it is GMT. Then give it a try.
With the above change there might be side effects wherever timezones are involved, e.g. in the Table Data Editor -> Insert Current Date&Time, which would insert a GMT time, so there would be an offset.
