I am trying to run a python3 program continuously on GCP. What is the best way to do this?
So far I have tried using a Google Compute Engine virtual machine running Debian Linux. I used nohup, but the program still stops when the SSH connection is broken.
What other ways could I try to run the program through the vm? Are there better alternatives using GCP to run the program continuously?
Python's installation depends on your operating system; the documentation [1] can help you run a Python program on Linux or on Windows without any trouble.
On the other hand, Google App Engine applications [2] are easy to create, easy to maintain, and easy to scale as your traffic and data storage needs change. With App Engine there are no servers to maintain: you simply upload your application and it runs. The documentation [3] could also be very helpful for you.
To learn more about Python on Google Cloud Platform, please see the documentation [4].
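As a very rough sketch of what an App Engine (Python 3 standard environment) service can look like (the route, names, and runtime version mentioned below are illustrative, not taken from the documentation above), the whole application can be a single WSGI app:

# main.py - minimal App Engine (Python 3 standard environment) sketch.
# The route and message are illustrative; App Engine serves the WSGI app
# object (here named `app`) declared in this module.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # App Engine routes incoming HTTP requests to this handler.
    return "Hello from App Engine!"

if __name__ == "__main__":
    # Local testing only; in production App Engine runs the app for you.
    app.run(host="127.0.0.1", port=8080, debug=True)

Deploying is then a matter of adding an app.yaml that declares the runtime (for example runtime: python39) and running gcloud app deploy. Note that App Engine is built around serving HTTP requests, so whether it fits a program that must run continuously depends on what the program actually does.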
[1] https://cloud.google.com/python/setup#installing_python
[2] https://codelabs.developers.google.com/codelabs/cloud-app-engine-python3/#0
[3] https://cloud.google.com/python/getting-started
[4] https://cloud.google.com/python
I want my C++ application to launch an arbitrary app (let's say a python script through a python interpreter) inside a secure enclave (Intel SGX). Is that even possible?
The steps are the following.
My app initializes an enclave and performs its attestation.
Next, it somehow uploads a python interpreter and a python script to the enclave.
It also uploads to the enclave some piece of data to be processed by the script.
Then the script is launched inside the enclave and the data is processed.
Finally, the processing result is uploaded back to the host.
Is this scenario possible? If yes, are there any examples on how to do so?
Microsoft's Open Enclave SDK is also an option.
There are many examples of adding unmodified libraries to SGX and then running toy applications; see oeapkman / apkman, a package manager and toolbox for enclave development.
If AWS Nitro Enclaves would work for you, then the Oblivious framework lets you do what you are describing.
There is a full tutorial and YouTube walkthrough of deploying FastAPI servers, as an example, here.
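Purely as an illustration of the kind of FastAPI service such a tutorial deploys (the endpoint and names here are invented, not taken from the tutorial), a minimal server looks roughly like this:

# main.py - minimal FastAPI sketch; endpoint and payload are invented.
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health():
    # Trivial endpoint; in an enclave deployment the handlers would wrap
    # whatever confidential processing is performed on the uploaded data.
    return {"status": "ok"}

# Served locally with, for example: uvicorn main:app --port 8080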
Disclosure: I work with Oblivious, but this post is in no way an ad or plug; I think it just does what #pgr is asking for.
Feels like I've searched the entire web for an answer...to no avail. I have a puppeteer script that works perfectly locally. My local machine is a little unreliable, so I've been trying to push this script to the cloud so that it can run there. But I have no idea where to start. I'm sitting here with an IBM cloud account with no idea what to do. Can anyone help me out?
Running Puppeteer scripts can be done on any cloud platform that (1) exposes a Node.js environment and (2) enables running a browser (Puppeteer will need to start Chromium).
This could be achieved, for example, using AWS EC2.
AWS Lambda, Google Cloud Functions, and IBM Cloud Functions (and similar services) might also work, but they may need additional work on your side to get the browser running.
For a step-by-step guide, I would suggest checking out this article and this follow-up.
Also, it might just be easier to look into services like Checkly (disclaimer: I work for Checkly), Browserless and similar (a quick search for something along the lines of "run puppeteer online" will return several of those), which allow you to run Puppeteer checks online without requiring any additional setup. Useful if you are serious about using Puppeteer for testing or synthetic monitoring in the long run.
Is there a Python3 library which will determine if we are running on GCP or another cloud architecture as opposed to a native architecture?
Something like platform.platform() or jaraco.docker.is_docker(), but for the cloud?
Yes, you can. Each cloud vendor provides unique interfaces, and usually environment variables, system services, etc., that can be used to detect the cloud vendor. For Google, I use a simple method: connecting to the metadata server for compute services.
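As a rough sketch of that metadata-server check (the function name and timeout below are my own choices, not from any particular library): on Google compute services the metadata server answers at metadata.google.internal and requires the Metadata-Flavor: Google header, so a probe like this distinguishes GCP from a native machine:

# Rough sketch: detect Google compute services by probing the metadata server.
# Function name and timeout are illustrative choices.
import urllib.error
import urllib.request

def running_on_gcp(timeout: float = 1.0) -> bool:
    req = urllib.request.Request(
        "http://metadata.google.internal/computeMetadata/v1/instance/id",
        headers={"Metadata-Flavor": "Google"},  # required by the metadata server
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Name resolution or connection failure: not on a Google compute service.
        return False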
For Python, look at libraries such as cloud-detect. This is not a recommendation, just a link.
No, simply because the Python code is actually running in a virtual machine in the cloud.
I have a docker image which I am running on Google's Cloud Run.
When I want to run the image locally, I have to give my container additional capabilities like the following:
docker run -p 8080:8080 --cap-add=SYS_ADMIN gcr.io/my-project/my-docker-image
Is there a way of configuring Docker's capabilities in Cloud Run?
I stumbled upon this piece of API documentation from Google, but I don't know how to configure my container. I am not even sure that it is relevant to my situation.
Any help would be really appreciated.
Expanding the POSIX capabilities is not an option on Cloud Run or Cloud Run on GKE, as doing so would expand the security exposure of the underlying host.
Adding capabilities is often the easiest way to make something with special system demands work. A more complex but frequently workable alternative is to modify the container environment or the package configuration to get things working.
If what you're trying to do absolutely requires cap-add, this might be addressed in a feature request to the software package... or it may be a novel use case that Cloud Run cannot support but may in the future with your feedback.
My developer has written a web scraping app on Linux on his private machine and asked me to provide him with a Linux server. I set up an account on Google Compute Engine and created a Linux image with enough resources and a sufficiently large SSD drive. Three weeks later he is claiming that working on Google is too complex, quote: "google is complex because their deployment process is separate for all modules. especially i will have to learn about how to set a scheduler and call remote scripts (it looks they handle these their own way)."
He suggests I create an account on Hostgator.com.
I appreciate that I am non-technical, but it cannot be that difficult to use Linux on Google?! Am I missing something? Is there any advice you could give me?
Regarding the suggestion to create an account on Hostgator to utilize what I presume would be a VPS in lieu of a virtual machine on GCE, I would suggest seeking a more concrete example from the developer.
For instance, take the comment about the "scheduler"; let's refer to it as some process that needs to execute on a regular basis (a minimal sketch follows the questions below):
How is this 'process' currently accomplished on the private machine?
How would it be done on the VPS?
What is preventing this 'process' from being done on the GCE VM?
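Purely to make "some process that needs to execute on a regular basis" concrete (this is my own minimal illustration, not code from the developer's app; the function name and interval are invented), such a recurring task on any plain Linux machine can be as simple as:

# Minimal illustration of a recurring task; names and interval are invented.
import time

def scrape_once():
    # Placeholder for whatever the web-scraping app does on each pass.
    print("running one scrape pass")

if __name__ == "__main__":
    while True:
        scrape_once()
        time.sleep(60 * 60)  # wait an hour, then run again

The same script runs unchanged on a GCE VM, a Hostgator VPS, or the developer's private machine, which is why it is worth pinning down the answers to the three questions above.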