Outlook REST API future recurring events are getting created with wrong start time - outlook-restapi

I am using the REST API endpoint https://outlook.office.com/api/v1.0/me/events/ to create meetings in Outlook Live. The meeting payload looks like:
{
    "Subject": "Test Meeting",
    "Location": {
        "DisplayName": ""
    },
    "Start": "2017-03-02T18:00:00Z",
    "End": "2017-03-02T19:00:00Z",
    "Body": {
        "ContentType": "HTML",
        "Content": "<html><body>Test Meeting Content<\/body><\/html>"
    },
    "Recurrence": {
        "Pattern": {
            "Type": "Weekly",
            "Interval": 1,
            "Month": 0,
            "Index": "First",
            "FirstDayOfWeek": "Sunday",
            "DayOfMonth": 0,
            "DaysOfWeek": ["Thursday"]
        },
        "Range": {
            "Type": "EndDate",
            "StartDate": "2017-03-02",
            "EndDate": "2017-03-31"
        }
    },
    "Attendees": [
        {
            "EmailAddress": {
                "Address": "starstart@example.com"
            },
            "Type": "Required"
        }
    ]
}
For this weekly recurring event over a month, the first two occurrences are created at the right time, but the remaining three are created an hour late (11:00 AM instead of 10:00 AM).
I even tried the v2.0 endpoint with no luck. I also tried passing a time zone for the meeting start and end dates, but it shows the same behavior.
Did anyone hit this or a similar issue? Any pointers would be of great help, thank you!
Reference to the API: https://msdn.microsoft.com/office/office365/APi/calendar-rest-operations#CreateEvents

The API is technically behaving correctly here. UTC doesn't change, but the time zone you have configured on your client likely does. In the US, Daylight Saving Time starts on March 12, so the appointment "shifts" in the local view so that it always starts at 18:00 UTC, exactly as you specified :)
My guess is that you want the start time to stay constant across the DST change, so what you really want to do is specify the user's time zone in the request. I'd recommend using the v2.0 API, where Start and End change type to DateTimeTimeZone, allowing you to specify the time zone by name:
"Start": {
"DateTime": "2017-03-02T10:00:00",
"TimeZone": "Pacific Standard Time"
},
"End": {
"DateTime": "2017-03-02T11:00:00",
"TimeZone": "Pacific Standard Time"
},
However, if you need to stay with the v1.0 API, you can still specify the time zone in the request using the StartTimeZone and EndTimeZone properties. The extra work is that you have to include the offsets in the Start and End values yourself. For example, Pacific Standard Time is -08:00 from UTC, so the relevant bits would look like:
"Start": "2017-03-02T10:00:00-08:00",
"StartTimeZone": "Pacific Standard Time",
"End": "2017-03-02T11:00:00-08:00",
"EndTimeZone": "Pacific Standard Time",

Related

Selecting appropriate Noda time structures

I have the following data, which I was querying with the .NET time types, and ran into issues with time zones and spans. I was recommended to use Noda Time.
"MarketStates": {
"dataTimeZone": "America/New_York",
"monday": [
{
"start": "04:00:00",
"end": "09:30:00",
"state": "premarket"
},
{
"start": "09:30:00",
"end": "16:00:00",
"state": "market"
}
],
"holidays": [
"1/1/1998",
"1/1/1999",
"1/1/2001"
],
"earlyCloses": {
"7/3/2000": "13:00:00",
"7/3/2001": "13:00:00"
}
}
I am writing a function IsMarketOpen that takes a time to test and the MarketStates JSON database above; it returns true if the given time falls during market hours, and false on a holiday or early close.
For the market states (monday above) I will use a LocalTime.
For the earlyCloses I plan to use ZonedDateTime.
For the time passed into this method, I will use a ZonedDateTime.
For holidays, would I need to keep the time zone? I cannot find a ZonedDate, only OffsetDate or LocalDate.
In summary, should I keep everything as ZonedDateTime (since the time zone is specified in the JSON database snippet above), or use a LocalDateTime and perform the conversion/testing at that point?
Please bear with me on the above question; I didn't realize time handling is actually so hard, and I need guidance on structure selection. I will add extra context per comments if needed. Thank you.
Your earlyCloses looks like it's really a Dictionary<LocalDate, LocalTime>, and holidays is a List<LocalDate>. (As a side note, it's pretty awful that it's not using ISO-8601 for the date format... I can't tell whether those early closes are July 3rd or March 7th.)
The time zone is specified by dataTimeZone, but I'd suggest keeping it as a string in the model, and converting it to a DateTimeZone when you need to.
The thrust of what I'm saying is that I'd encourage you to make the values in your direct model (loaded from JSON and saved to JSON) match what's actually stored in the JSON. You could have a wrapper around that model which (for example) converted the early closes into ZonedDateTime values... but I've generally found it really useful to keep the "plain model" simple, so you can immediately guess the representation in the JSON just from looking at it.

Nodejs Gmail OAuth API to get Users.threads: list

I want to get the list of threads after a specific datetime, but couldn't find any way to do it.
I have stored in the database the datetime of the last time the Gmail API was used to fetch the list of threads.
Using that stored time, I want to retrieve the threads that came after it.
The Gmail API allows you to list threads/messages after a specific timestamp with second accuracy.
Let's say you want to list threads with messages received after Friday, 22-Jul-16 00:00:00 UTC, you would write after:1469145600 in the q-parameter.
Request
GET https://www.googleapis.com/gmail/v1/users/me/threads?q=after%3A1469145600&access_token={YOUR_ACCESS_TOKEN}
Response
{
    "threads": [
        {
            "id": "1561153ce695b9ab",
            "snippet": "Infinite Elgintensity has uploaded Gym Idiots - Spread Eagle Rows & Mike O'Hearn 585-Lb. Squat A montage of gym fails with... Infinite Elgintensity has uploaded Gym Idiots - Spread Eagle Rows",
            "historyId": "895071"
        }
    ],
    "resultSizeEstimate": 1
}
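If the stored datetime is an ISO string, converting it to the whole-second epoch value that the q parameter expects is straightforward (afterQuery is just an illustrative helper):

```javascript
// Convert a stored UTC datetime into the epoch-seconds value that
// Gmail's search query syntax expects, and build the q parameter.
function afterQuery(isoUtc) {
  const epochSeconds = Math.floor(Date.parse(isoUtc) / 1000);
  return `after:${epochSeconds}`;
}

const q = afterQuery("2016-07-22T00:00:00Z");
// → "after:1469145600"

// URL-encode it before appending it to the request URL:
const encoded = encodeURIComponent(q);
// → "after%3A1469145600"
```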

Azure Data Factory 'Pending Validation'

I created some pipelines in my Azure Data Factory service to move data from SQL tables to Azure tables, but they never start running. Instead, the source datasets remain in "pending validation" even after I click the Run button in the Azure portal. I have already checked the external properties, which are all set to true. I wonder if there are any other possible reasons.
And here is my table source
{
    "name": "TableSrc",
    "properties": {
        "published": false,
        "type": "AzureSqlTable",
        "linkedServiceName": "LinkedService-AzureSql",
        "typeProperties": {
            "tableName": "myTable"
        },
        "availability": {
            "frequency": "Month",
            "interval": 1
        },
        "external": true,
        "policy": {}
    }
}
I ran into this while trying to set up a pipeline to run daily and thought I could use the "anchorDateTime" availability property. I was able to, but you have to set the "frequency" property of the dataset's "availability" section to the lowest level of granularity that you want to specify. That is, if you want something to run at 6:30 PM UTC every day, your dataset needs to look like this (because you are specifying a time at minute granularity):
"availability": {
"frequency": "Minute",
"interval": 1440,
"anchorDateTime": "2016-01-27T18:30:00Z"
}
and the "scheduler" portion of the pipeline needs to be something like:
"scheduler": {
"frequency": "Minute",
"interval": 1440,
"anchorDateTime": "2016-01-27T18:30:00Z"
}
This will run every 1440 minutes (i.e. every 24 hours). I hope it helps somebody else out, since the Microsoft documentation contradicts itself on this topic (or is at least misleading):
For a daily schedule, if you set anchorDateTime = 10/20/2014 6 AM means that the scheduling will happen every day at 6 AM.
This is actually not true, and two lines later it says:
If the AnchorDateTime has date parts that are more granular than the interval, then the more granular parts will be ignored. For example, if the interval is hourly (frequency: hour and interval: 1) and the AnchorDateTime contains minutes and seconds, then the minutes and seconds parts of the AnchorDateTime will be ignored.
This second part is what I think we're running into and why I suggested the strategy above.
reference: https://msdn.microsoft.com/en-us/library/azure/dn894092.aspx
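To illustrate the truncation rule quoted above, here is a small JavaScript sketch (my own approximation for illustration, not ADF code) of how anchorDateTime parts finer than the frequency get dropped:

```javascript
// Mimic the documented rule: anchorDateTime parts more granular than
// the frequency are ignored when the schedule is computed.
function effectiveAnchor(anchorIso, frequency) {
  const d = new Date(anchorIso);
  if (frequency === "Day") {
    d.setUTCHours(0, 0, 0, 0);      // hours and below dropped
  } else if (frequency === "Hour") {
    d.setUTCMinutes(0, 0, 0);       // minutes and below dropped
  } else if (frequency === "Minute") {
    d.setUTCSeconds(0, 0);          // only seconds and below dropped
  }
  return d.toISOString();
}

// With an hourly frequency, the 18:30 anchor collapses to 18:00...
const hourly = effectiveAnchor("2016-01-27T18:30:00Z", "Hour");
// → "2016-01-27T18:00:00.000Z"

// ...which is why the workaround uses frequency "Minute" with interval 1440.
const minutely = effectiveAnchor("2016-01-27T18:30:00Z", "Minute");
// → "2016-01-27T18:30:00.000Z"
```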
I got the reason... It waits for the next month boundary to start. That means it will start on the first day of the next month, and there is no way to trigger it manually.
I was getting the same problem. Turns out that I had not specified the start time of the pipeline according to UTC.
Well, if you want your pipeline to run, update the active period to dates in the past. You can do that with the PowerShell command below:
Set-AzureDataFactoryPipelineActivePeriod -DataFactoryName $DataFactoryName -PipelineName $PipelineName -StartDateTime $DateInPast -EndDateTime $DateOneDayLessInPast -ResourceGroupName $ResourceGroupName -Force

How do I get the number of Followers gained and lost during a selected time range from Instagram API?

I need to get the number of followers gained and lost, or just the total number of followers, during a selected time range.
For example:
if I send a request to https://api.instagram.com/v1/users/3/
I get this JSON:
{
    "data": {
        "username": "kevin",
        "bio": "CEO & Co-founder of Instagram",
        "website": "",
        "profile_picture": "https://instagramimages-a.akamaihd.net/profiles/profile_3_75sq_1325536697.jpg",
        "full_name": "Kevin Systrom",
        "counts": {
            "media": 1419,
            "followed_by": 1138347,
            "follows": 643
        },
        "id": "3"
    }
}
But I can't get "followed_by" for a selected time range.
I read the Instagram API documentation and can't find an endpoint for the request I want.
For MEDIA I can pass MIN_TIMESTAMP and MAX_TIMESTAMP as parameters, but I am not looking for media; I am looking for the number of followers.
I know it's possible because there is a website, https://minter.io, that gets all this information since the beginning of the account.
PS: I already have authentication with OAuth 2.0.
There is no API to do this; I think you have to keep track of your follower count yourself and update it every day. Set up a cron job to do this via the API.
I just tried minter.io, and I don't know how they show followers from day 1, but I can definitely tell you it is fake and not accurate. I had about 1400 followers at some point and removed them a few months ago, so I'm back to 200-something, and minter.io does not show this at all. I think they just show a fake linear graph for historical data, and going forward they keep track of followers every day.
It is possible to use the field 'followers_count' on the User node as a starting point. This gives the total number of followers at the time of the request. From there you can calculate a running total by subtracting the daily follower counts from the Insights API that you mention.
Get current follower count:
curl -i -X GET "https://graph.facebook.com/v8/<user id>?fields=followers_count&access_token=EAACwX..."
https://developers.facebook.com/docs/instagram-api/reference/user
Get new followers for single day:
curl -i -X GET "https://graph.facebook.com/v8/<user id>/insights?metric=follower_count&period=day&since=2020-10-5T07:00:01&until=2020-10-06T07:00:00
https://developers.facebook.com/docs/instagram-api/reference/user/insights
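Assuming you have already fetched the current followers_count and a day-by-day list of follower_count gains from the Insights API, a rough back-fill could look like the sketch below (backfillTotals is my own helper, and since Insights only reports gains, unfollows make the result approximate):

```javascript
// Walk backwards from today's total: each day's historical total is the
// current total minus the followers gained since that day.
function backfillTotals(currentTotal, dailyGains) {
  // dailyGains: oldest-first array of { date, gained }
  const totals = [];
  let total = currentTotal;
  for (let i = dailyGains.length - 1; i >= 0; i--) {
    total -= dailyGains[i].gained;                    // undo that day's gains
    totals.unshift({ date: dailyGains[i].date, total });
  }
  return totals; // oldest-first, totals as of the start of each day
}

const history = backfillTotals(1000, [
  { date: "2020-10-04", gained: 5 },
  { date: "2020-10-05", gained: 10 }
]);
// history[1].total is the total before Oct 5's gains: 990
// history[0].total is the total before Oct 4's gains: 985
```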

Timestamp in sequelize 1.7.3

Hi, I want to save the date and time in Eastern Time. My Sequelize model is:
var objName = sequelize.define('tbl_name', {
    orders_id: { type: Sequelize.INTEGER, autoIncrement: true, primaryKey: true },
    customers_id: Sequelize.INTEGER,
    date_purchased: { type: Sequelize.DATE, defaultValue: Sequelize.NOW },
    order_total: Sequelize.INTEGER
}, {
    tableName: 'tbl_name',
    timestamps: false
});
Now I want to use timestamps somehow and save them in Eastern Standard Time; can someone guide me on how to do this?
Note: I am using MySQL and Sails.js.
Thanks.
EDIT
EDIT: I also tried the following two snippets, but then it saves 0000-00-00 00:00:00:
date_purchased: { type: Sequelize.DATE, defaultValue: sequelize.literal("FROM_UNIXTIME(UNIX_TIMESTAMP() + (3600 * 2))") },
date_purchased: { type: Sequelize.DATE, defaultValue: sequelize.literal("(now() at time zone 'EST')") }
Are you going to want to manipulate your dates (add, subtract, compare)? If not, then just save them as pre-formatted strings.
Otherwise you are going to run into problems, beyond Sails, with this approach of saving "Eastern time". Remember, Eastern Time is the same as UTC, just with an offset, and that offset changes depending on whether Daylight Saving Time is in effect.
MySQL by default uses the system time zone internally, but it is possible to define a different time zone for the MySQL server globally or even per transaction. However, MySQL is still going to save the time in UTC; all you're telling MySQL is to transform during reads/writes as if it were in the Eastern time zone.
This can get tricky when dealing with so many moving parts: your application server, your DB adapter, and finally your DB.
The safest thing to do is to save all dates as UTC (have your server set up with UTC system time) and then display them in the correct time zone in the application layer.
You have to ask yourself: "Where are my dates coming from?" Are they all generated within the database itself (like your code above)? Are they generated by the application server (such as Sails creating the date and then inserting it), or will dates come from the client (browsers have their own time-zone issues)?
If you do want to manipulate/compare them, then you're best off making sure your dates are saved in UTC and applying the time zone only when a date is displayed to the user. Consider Daylight Saving Time (it changes this weekend!).
How to accomplish this will vary depending on your setup and on the answers to the questions above.
I stopped using date/time methods a while ago; instead I store all my dates as integers holding a Unix timestamp, then format them into the correct time zone before showing them to the client. This ensures that my dates will not be unduly transformed no matter what DB I'm using or where the server is. This is just a personal preference, but I find it has alleviated a big headache.
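A minimal sketch of that approach in Node.js (assuming a Node build with full ICU, the default since Node 13): store a plain UTC epoch integer and convert to Eastern Time only at display.

```javascript
// Store the moment as a plain integer (seconds since epoch, always UTC)...
const savedAt = Math.floor(Date.parse("2016-07-22T00:00:00Z") / 1000);

// ...and convert to Eastern Time only when displaying to the user.
// The IANA zone handles the EST/EDT switch for you.
function formatEastern(epochSeconds) {
  return new Intl.DateTimeFormat("en-US", {
    timeZone: "America/New_York",
    dateStyle: "short",
    timeStyle: "short"
  }).format(new Date(epochSeconds * 1000));
}

const display = formatEastern(savedAt);
// 2016-07-22T00:00:00Z is 8:00 PM on July 21 in New York (EDT, UTC-4)
```

The stored integer never changes meaning, regardless of the DB's or server's time zone settings; only the presentation layer knows about Eastern Time.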
