Unable to edit due dates - Todoist

I am using Todoist's Sync API to manage my tasks. I've successfully created tasks and added due dates, but I'm not able to update or remove due dates from existing tasks.
I'm using the following:
import todoist
api = todoist.TodoistAPI('xxxx')
api.reset_state
api.sync()
TaskList = api.state['items']
item = api.items.get_by_id('4061696598')
item.update(due=None)
api.commit
api.sync()
print('done')
I've tried:
due=None
due=null
due=""
due={}
What am I missing?!

You are not actually calling the commit method: instead of api.commit, you should use api.commit().
(Likewise for api.reset_state() by the way.)
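For reference, a minimal corrected sketch (the token and item ID are the placeholders from the question):
import todoist

api = todoist.TodoistAPI('xxxx')
api.reset_state()       # parentheses: actually call the method
api.sync()

item = api.items.get_by_id('4061696598')
item.update(due=None)   # due=None clears the due date
api.commit()            # without the parentheses, nothing is sent to the server
print('done')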


Azure DevOps build pipeline log ID for a specific task - dynamically [duplicate]

In Azure DevOps, I have a multistage build pipeline. In that pipeline, I have a task named Terraform_init, and I need the log ID of this task dynamically. How do I find the log ID dynamically if I know the display name of the task?
Current situation:
Right now I have figured out the log ID for that task manually. But later I will add more tasks to the pipeline before the Terraform_init task, so its log ID will change.
Why I need this:
After the Terraform_init task, I have another task called Get_logs. This task gets the logs of the Terraform_init task and saves them in a blob. For that, I have to use the following line:
$logs_url = ('https://dev.azure.com/bmw-ai-big-data-platform/{0}/_apis/build/builds/{1}/logs/27?api-version=6.0' -f $($env:SYSTEM_TEAMPROJECTID), $($env:BUILD_BUILDID) )
I will have more tasks before the Terraform_init task, so the ../logs/27.. part will need to be updated every time; I want to avoid this.
Thanks in advance.
Use this code to get the log IDs:
import requests

url = "https://dev.azure.com/<orgname>/<project name>/_apis/build/builds/<build id>/timeline?api-version=6.0"
headers = {
    'Authorization': 'Basic <base64encoded PAT>'
}

response = requests.get(url, headers=headers)
response_json = response.json()

# Each timeline record represents a stage, job, or task of the build;
# match on the display name of the task whose log ID you need.
for record in response_json['records']:
    if record['name'] == 'Initialize job':
        print(record['log']['id'])
After that, use a logging command to output the log ID as a pipeline variable, so you can use it in the following tasks (this works only at runtime; it cannot be achieved at compile time).
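For example, a minimal sketch, assuming the script above runs inside a pipeline script step (the variable name terraformInitLogId is illustrative):
# Inside the loop above: emit an Azure DevOps logging command so that
# later tasks can read the value as $(terraformInitLogId).
print("##vso[task.setvariable variable=terraformInitLogId]%s" % record['log']['id'])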

VdmComplex Changes Not Working with PATCH

I am using the SAP B1 .edmx with SAP Cloud SDK 3.39.0 and trying to update DeliveryNotes with new DocumentPackages. However, the list of DocumentPackage entries that eventually gets sent by the update operation is empty.
Code:
var packagesUpdateDocument = new Document();
packagesUpdateDocument.setDocEntry(1);
var documentPackages = new ArrayList<DocumentPackage>();
var documentPackage = new DocumentPackage();
documentPackage.setNumber(10);
documentPackages.add(documentPackage);
packagesUpdateDocument.setDocumentPackages(documentPackages);
var updateDeliveryPackagesRequest = service.withServicePath("etc")
.updateDeliveryNotes(packagesUpdateDocument);
var updateDeliveryPackagesResponse = updateDeliveryPackagesRequest.tryExecute(serviceLayerDestination);
Looking at the logs of the service layer I can see this is the request which was eventually sent by the client:
PATCH /b1s/v2/DeliveryNotes(1)
{"DocEntry":1,"DocumentPackages":[{}],"#odata.type":"SAPB1.Document"}
From my understanding, PATCH requests will automatically disregard anything the generated client deems as 'unchanged.'
Printing the changed fields:
System.out.println(packagesUpdateDocument.getChangedFields());
Yields:
{
  DocEntry=175017,
  DocumentPackages=[
    DocumentPackage(
      super=VdmObject(customFields={}, changedOriginalFields={}),
      odataType=SAPB1.DocumentPackage,
      number=10,
    )
  ]
  .....
}
I believe the DocumentPackage is not recording the fields which have changed, although I am not certain.
Is there a step I am missing, or is this a feature gap?
As of SAP Cloud SDK 3.42.0 we support updating complex properties with PATCH out-of-the-box. See the release notes for more details.
Yes, this is currently a feature gap. PATCH will only consider properties of the root entity and navigation properties, while disregarding changes in complex properties.
Until that is supported, updating with PUT instead via the .replacingEntity() option should work.

How to access Clockify API through Excel Power Query

I am new to Clockify and to API usage (so I apologize if I don't express myself properly). My objective is to automatically load the detailed report with all time entries into Excel (via Power Query). I have tried the code below and it works ... except that I only get the latest 50 time entries.
This is because, if I understand properly, I should use this base endpoint: "https://api.clockify.me/api/v1" (the one I am using is deprecated). Is there a way to make the code below work with the correct endpoint?
let
    Source = Web.Contents("https://api.clockify.me/api/reports/{your report ID}", [Headers=[#"X-Api-Key"="{your API Key}"]]),
    jsonResponse = Json.Document(Source),
    workspace = jsonResponse[workspace]
in
    workspace
Thanks in advance,

Right way to delete and then reindex ES documents

I have a python3 script that attempts to reindex certain documents in an existing ElasticSearch index. I can't update the documents because I'm changing from an autogenerated id to an explicitly assigned id.
I'm currently attempting to do this by deleting existing documents using delete_by_query and then indexing once the delete is complete:
self.elasticsearch.delete_by_query(
    index='%s_*' % base_index_name,
    doc_type='type_a',
    conflicts='proceed',
    wait_for_completion=True,
    refresh=True,
    body={}
)
However, the index is massive, and so the delete can take several hours to finish. I'm currently getting a ReadTimeoutError, which is causing the script to crash:
WARNING:elasticsearch:Connection <Urllib3HttpConnection: X> has failed for 2 times in a row, putting on 120 second timeout.
WARNING:elasticsearch:POST X:9200/base_index_name_*/type_a/_delete_by_query?conflicts=proceed&wait_for_completion=true&refresh=true [status:N/A request:140.117s]
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='X', port=9200): Read timed out. (read timeout=140)
Is my approach correct? If so, how can I make my script wait long enough for the delete_by_query to complete? There are two timeout parameters that can be passed to delete_by_query - search_timeout and timeout - but search_timeout defaults to no timeout (which I think is what I want), and timeout doesn't seem to do what I want. Is there some other parameter I can pass to delete_by_query to make it wait as long as it takes for the delete to finish? Or do I need to make my script wait some other way?
Or is there some better way to do this using the ElasticSearch API?
You should set wait_for_completion to False. In this case you'll get task details and will be able to track task progress using corresponding API: https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-delete-by-query.html#docs-delete-by-query-task-api
To expand on Random's answer with some code, for ES/Python newbies like me:
from elasticsearch import Elasticsearch

ES = Elasticsearch(['http://localhost:9200'])
query = {'query': {'match_all': {}}}
# With wait_for_completion=False, delete_by_query returns immediately with a task handle
response = ES.delete_by_query(index='index_name', doc_type='sample_doc', wait_for_completion=False, body=query, ignore=[400, 404])
task_id = response['task']
response_task = ES.tasks.get(task_id=task_id)  # check if the task is completed
isCompleted = response_task["completed"]  # if the "completed" key is true, the task is done
One can write a custom function that checks at some interval, in a while loop, whether the task has completed; see the sketch below.
I have used Python 3.x and Elasticsearch 6.x.
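A minimal polling sketch, assuming the ES client and task_id from the snippet above (the 10-second interval is arbitrary):
import time

def wait_for_task(es, task_id, interval=10):
    # Poll the Tasks API until the delete-by-query task reports completion.
    while True:
        status = es.tasks.get(task_id=task_id)
        if status["completed"]:
            return status
        time.sleep(interval)

result = wait_for_task(ES, task_id)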
You can use the 'request_timeout' global param. This will override the connection's timeout settings, as mentioned in the elasticsearch-py documentation.
For example -
es.delete_by_query(index=<index_name>, body=<query>, request_timeout=300)
Or set it at the connection level, for example:
es = Elasticsearch(**get_es_connection_parms(), timeout=60)

ScriptError using Google Apps Script Execution API

Following these guides https://developers.google.com/apps-script/guides/rest/quickstart/target-script and https://developers.google.com/apps-script/guides/rest/quickstart/nodejs, I am trying to use the Execution API in node to return some data that are in a Google Spreadsheet.
I have set the script ID to be the Project Key of the Apps Script file. I have also verified that running the function in the Script Editor works successfully.
However, when running the script locally with node, I get this error:
The API returned an error: Error: ScriptError
I have also made sure the script is associated with the project that I use to authenticate with Google APIs.
Does anyone have any suggestions on what I can do to debug or fix this issue? The error is so generic that I am not sure where to look.
UPDATE: I've included a copy of the code in this JSBin (the year function is the entry point)
https://jsbin.com/zanefitasi/edit?js
UPDATE 2: The error seems to be caused by the inclusion of this line
var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
It seems that I didn't request the right scopes. The Node.js example includes 'https://www.googleapis.com/auth/drive', but I also needed to include 'https://www.googleapis.com/auth/spreadsheets' in the SCOPES array. The ScriptError message is not very informative here.
To find which scopes you need, go to Script Editor > File > Project Properties > Scopes. Remember to delete the old credentials (~/.credentials/old-credential.json) so that the script will request new ones.
EDIT: With the updated information I took a closer look and saw that you are returning a non-basic type. Specifically, you are returning a Sheet object.
The basic types in Apps Script are similar to the basic types in JavaScript: strings, arrays, objects, numbers and booleans. The Execution API can only take and return values corresponding to these basic types -- more complex Apps Script objects (like a Document or Sheet) cannot be passed by the API.
https://developers.google.com/apps-script/guides/rest/api
In your Account "Class":
this.report = spreadsheet.getSheetByName(data.reportSheet);
Old answer:
'data.business_exp' will be null in this context. You need to load the data from somewhere. Every time the script is called, a new instance of it is created, and at the end of the execution chain it is destroyed; any data stored in global objects is lost. You need to save that data to a permanent location, such as the script/user properties, and reload it on each script execution.
https://developers.google.com/apps-script/reference/properties/
