Is there a way to schedule a python script (the script uses Google API to read google sheets) using crontab? - python-3.x

I am trying to schedule a Python script using crontab on a web server, but the script doesn't run at all. The script uses the Google API to read Google Sheets. Other scripts that don't use the Google API run as expected. Why is that?
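A common cause is that cron runs jobs with a minimal environment and a different working directory, so relative paths to the Google API client's credential files (typically credentials.json and token.json in the quickstart samples) fail silently. A sketch of a crontab entry that avoids this; all paths shown are placeholders:

```shell
# Run every day at 06:00. cd into the project first so the script's
# relative credential paths resolve, use an absolute interpreter path,
# and log output so failures are visible.
0 6 * * * cd /home/user/myproject && /usr/bin/python3 read_sheets.py >> /home/user/myproject/cron.log 2>&1
```

Checking the log file after the scheduled time usually reveals the actual exception (often a FileNotFoundError for the credentials file).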

Related

Need a way of running Excel VBA, either through Azure cloud or any other way?

I have a pipeline in Azure that runs a script once per month. The script invokes a VBA macro. The problem is that I can't run this VBA in Azure, since the script requires a copy of Excel in order to run. Is there any way to automate executing the VBA, either within Azure or somewhere else, and then grab the resulting Excel files? I'm open to any ideas. Where else can I run VBA macros external to Azure and then pull those files into Azure Blob Storage?
Thanks
First, use Windows Task Scheduler to run Excel every month and use a macro to save the result to some network location such as OneDrive or SharePoint, send it by email, push it into a database, etc.
Second, use Power Automate and Office Scripts every month to handle the result file if necessary.

Drag'n'Drop local file to browser window via Command Line for uploading

Not quite sure if this is possible at all, so I decided to ask here.
I need to automate some things and I'm looking for a way to drag and drop a local file into a browser window (with a specific URL) via a terminal command, for uploading.
I use a Mac, but I think a Linux solution will fit here as well.
If there is any solution or module in Bash / Python / Node.js, I will gladly give it a try.
Take a look at the requests package in Python.
You can make a POST request and send the information you want to the web server.
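To expand on that: the browser drag-and-drop ultimately just submits a multipart/form-data POST, so requests can send the same thing from the terminal. A minimal sketch; the URL and the form-field name are placeholders you would take from the target page's upload form:

```python
import requests

UPLOAD_URL = "https://example.com/upload"  # placeholder: the form's action URL

def prepare_upload(path, field_name="file"):
    """Build the same multipart/form-data POST a browser sends on drop.

    field_name must match the name attribute of the page's
    <input type="file"> element.
    """
    with open(path, "rb") as f:
        req = requests.Request(
            "POST", UPLOAD_URL, files={field_name: (path, f.read())}
        )
    return req.prepare()

# Send it with: requests.Session().send(prepare_upload("photo.png"))
```

Note that if the site requires a login or a CSRF token, you'd need to replicate those cookies/headers in the session as well.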

Get local Files from Python lambda Function

I want to get a file (a list of files) located on my local machine using a Python Lambda function.
I'm using the os library; it works when I run it locally, but when I run it in AWS my code does not detect the file.
I need to check a folder and, if it has a file, upload that file to S3. This process (verification and upload) will run according to a schedule.
A batch file is not an option.
Thanks for the help
It appears you are referring to: Access data sources on premises - Azure Logic Apps | Microsoft Docs
No, there is no equivalent for AWS Lambda functions.
An AWS Lambda function can access services on the Internet (eg make API calls, access websites), but you would need to code that yourself.
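Since a Lambda function cannot see your local disk, the check-and-upload step has to run on the local machine itself (scheduled with cron or Windows Task Scheduler) and push to S3 from there. A minimal local sketch using boto3; the folder and bucket names are placeholders:

```python
import os

WATCH_DIR = "/path/to/watch"  # placeholder: local folder to check
BUCKET = "my-upload-bucket"   # placeholder: target S3 bucket

def find_files(directory):
    """Return full paths of regular files in directory (non-recursive)."""
    return [
        os.path.join(directory, name)
        for name in sorted(os.listdir(directory))
        if os.path.isfile(os.path.join(directory, name))
    ]

def upload_pending(directory=WATCH_DIR, bucket=BUCKET):
    """If the folder has files, upload each one to S3 under its own name."""
    import boto3  # imported here so the folder scan works without AWS credentials

    s3 = boto3.client("s3")
    for path in find_files(directory):
        s3.upload_file(path, bucket, os.path.basename(path))
```

Scheduling `upload_pending()` via cron (Linux) or Task Scheduler (Windows) covers the "run according to a schedule" requirement without Lambda.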

Azure - Process Message Files in real time

I am working on the Azure platform and use Python 3.x for data integration (ETL) activities with Azure Data Factory v2. I have a requirement to parse message files in .txt format in real time, as and when they are downloaded from Blob Storage to a Windows virtual machine under the path D:/MessageFiles/.
I wrote a Python script to parse the message files; since they are fixed-width files, it parses every file in the directory and generates the output. Once a file is successfully parsed, it is moved to an archive directory. This script runs well on the local disk whenever I invoke it ad hoc.
Now I would like to make this script run continuously in Azure, so that it watches for incoming message files in the directory D:/MessageFiles/ at all times and processes them as soon as new files appear in the path.
Can someone please let me know how to do this? Should I use a stream analytics application to achieve this?
Note: I don't want to use a timer option in the Python script. Instead, I am looking for an option in Azure, using the Python logic only for file parsing.
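For reference, the parse-then-archive step described above can be sketched as follows. The column layout is a made-up placeholder, since the actual record format isn't given in the question:

```python
import os
import shutil

# Placeholder fixed-width layout: (field name, start offset, end offset).
FIELDS = [("id", 0, 8), ("name", 8, 28), ("amount", 28, 38)]

def parse_line(line):
    """Slice one fixed-width record into a dict, stripping padding."""
    return {name: line[start:end].strip() for name, start, end in FIELDS}

def process_dir(src, archive):
    """Parse every .txt file in src, then move it to the archive directory."""
    os.makedirs(archive, exist_ok=True)
    records = []
    for fname in sorted(os.listdir(src)):
        if not fname.endswith(".txt"):
            continue
        path = os.path.join(src, fname)
        with open(path) as f:
            records.extend(
                parse_line(line.rstrip("\n")) for line in f if line.strip()
            )
        shutil.move(path, os.path.join(archive, fname))
    return records
```

The open question remains what triggers `process_dir` in Azure; an event-driven trigger on the blob download would avoid a timer in the Python script itself.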

Google Cloud Shell: How to interface via command-line terminal?

I created a very simple Python hello_world script in the Google Cloud Shell. I wrote and ran it within the in-browser Cloud Shell and it executed and produced the correct output.
My question is: what if I want to run this same hello_world script on a Linux server I am on? I downloaded the Google Cloud SDK, ran 'gcloud init', and set up this hello_world Python project on my Linux server by following the Google Cloud documentation:
https://cloud.google.com/sdk/docs/quickstart-linux
https://cloud.google.com/sdk/downloads
I am just a little confused about how to access these files from my Linux server terminal and run them there. I simply want to run them here and get the same output as I did in the in-browser Google Cloud Shell.
If I'm reading this right, you can simply download the files over to your system. With the shell open, in the upper right corner there is the 'more' option, and from there you will see the ability to download files directly. Sorry if this isn't what you were looking for.
Rick
