How to download Node.js project deployed on Google Cloud - node.js

I have deployed the Node.js code on Google Cloud using the following command:
gcloud app deploy
So, how can I download the Node.js project that was deployed on Google Cloud?

I'm not sure, but if you need a Tomcat server, you can set up and deploy the Node.js application inside it by creating a folder and adding the dist folder files to it.
npm install --save @google-cloud/storage
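Presumably the @google-cloud/storage client is meant for pulling files from a Cloud Storage bucket; a minimal sketch, with a hypothetical bucket and object name:
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
// Download a single object from the bucket to a local file (names are hypothetical).
storage
  .bucket('my-app-bucket')
  .file('dist/index.html')
  .download({ destination: './index.html' })
  .then(() => console.log('Downloaded dist/index.html'))
  .catch(console.error);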

At this point in time it is only possible to download your app's source code for Java, Python, PHP, and Golang. The instructions for Python, Golang, and PHP are similar:
appcfg.py -A [YOUR_PROJECT_ID] -V [YOUR_VERSION_ID] download_app [OUTPUT_DIR]
where:
[YOUR_PROJECT_ID] is your GCP project ID.
[YOUR_VERSION_ID] is the version ID of your application that you want to download.
[OUTPUT_DIR] is the full directory path to where you want your files downloaded.
In Java you use appcfg.sh:
appcfg.sh -A [YOUR_PROJECT_ID] -V [YOUR_VERSION_ID] download_app [OUTPUT_DIR]
See the link above, and also the reference documentation for Java, Python, PHP, and Golang.
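For example, with hypothetical values filled in for the placeholders, the Python/PHP/Golang form would look like:
appcfg.py -A my-sample-project -V v1 download_app ./downloaded-source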

Related

Creating Reactjs app production build without using node

We have just a single webpage with some links; clicking them redirects to different sources. As of now we are using "npm run build" to create the production package.
But because the build files have dependencies on Node, I cannot host it on a particular server.
Is there a way to create the React production build without using Node?
I suggest using Netlify to host your React app easily.
Below are some resources that can help you along the way.
https://www.netlify.com/with/react/
https://www.youtube.com/watch?v=sGBdp9r2GSg
You can create a build and upload it manually to your Netlify account,
you can use the CLI (netlify-cli) as shown below, or you can connect your account to git.
A similar approach can be followed with GitHub Pages, for example.
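For the CLI route, a typical sequence looks like this (assuming the default build output folder is build):
npm install -g netlify-cli
npm run build
netlify deploy --dir=build --prod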
What packages do you have in your package.json file? Did you use a React project template that uses Node server-side features? It seems like you want to host your React project statically, not necessarily get rid of Node and npm.
For example, I've worked on lots of React projects using npm and create-react-app that we were able to host with a .NET backend and Microsoft IIS (instead of Node). The output is .html, .js, and other static files that you can host anywhere.
When you build a React app, the files in the build folder contain everything it needs to run.
If your hosting server has no CI/CD integration, then you must manually deploy only the build folder, not the root folder (the folder that contains package.json).
I believe your issue is just a confusion/misunderstanding of how React works, how to deploy it, and how to run it.
React needs to be built in an environment where Node, npm, and other tools are available. That can be a build server or your local machine.
Once built, a React app is just a folder with a bunch of HTML, CSS, and JS files that run in the client's browser, so there's no dependency on Node anymore.
These static files can be served by any simple static file server (Apache, Nginx, IIS, etc.).
I recommend you build the app locally on your machine and then deploy it manually to your host through FTP, SSH, or a web interface (a rough sketch follows below).
If React is overkill for your needs, then don't use it.
The best approach is to host it on a cloud service that does full CI/CD integrated with git, all automated (Google GCP, AWS, Azure, Netlify, etc.).
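As a rough sketch of the manual SSH route (the server address and web root below are hypothetical):
npm run build
scp -r build/* user@your-server:/var/www/html/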

How to obfuscate Python code with buildpacks?

I am using the pack CLI to build a Docker image for my Python Flask app running with Gunicorn.
Inside the Docker image, my whole code is exposed in the workspace folder.
What can I do to restrict users from accessing the folder, or to obfuscate my code?
I am using the Google buildpack:
pack set-default-builder gcr.io/buildpacks/builder:v1
I'm assuming you're using the Heroku Python Buildpack (and the heroku/buildpacks:18 builder).
You can create a bin/post_compile script, use it to run compileall, and then run rm **/*.py to remove the plain-text sources.
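A minimal sketch of such a hook, assuming the Heroku Python buildpack conventions (and using find instead of the rm glob, since **/ needs bash's globstar option):
#!/usr/bin/env bash
set -e
# Byte-compile every .py file; -b writes the .pyc next to the source so imports keep working.
python -m compileall -b .
# Remove the plain-text sources, leaving only the compiled .pyc files.
find . -name "*.py" -delete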

Google app engine (path file)

I am trying to get Google App Engine to work using Node.js and Puppeteer.
It runs fine in local dev, where the path is just dir/index.html and Puppeteer can read it.
However, when I deploy it to the App Engine flexible environment, the path suddenly becomes /app/index.html and obviously Puppeteer can't read it like it could locally.
Will update the question with code once I get home and am not on my phone.
Thanks!
It looks like you are running your application in the Google App Engine flexible environment. What Flex does is dockerize your application. At least for Python, the application is then placed in the /app directory, which is not unusual when executing Python in Docker.
You can read more about it in Google's official GitHub repository for the 'gcr.io/google_appengine/python' image.
Note: you can debug the running Flex instance or even download the container built for your application.
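One way around it is to resolve the file path relative to the script's own directory instead of relying on the working directory, so it works both locally and under /app in the container; a minimal sketch (the file name and Puppeteer usage are assumptions, since the asker's code isn't posted):
const path = require('path');
const puppeteer = require('puppeteer');

// __dirname is the directory containing this file, so the resulting path is
// correct whether the app runs locally or from /app inside the container.
const indexPath = path.join(__dirname, 'index.html');

(async () => {
  // --no-sandbox is commonly needed when Chromium runs inside a container.
  const browser = await puppeteer.launch({ args: ['--no-sandbox'] });
  const page = await browser.newPage();
  await page.goto('file://' + indexPath);
  await browser.close();
})();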

npm module 'openurl' not working after deploying chatbot to Azure

I am working on chatbot development using Node.js in the MS Bot Framework.
I need to open a webpage during the conversation. I have used the openurl npm module, which works successfully in the local environment. After deploying to Azure from a GitHub repository, the functionality is not working.
Could you please let me know any solution or fix for this?
I am also using other modules like system-sleep and I am facing the same problem. In short, none of the installed custom modules work after deployment to Azure.
var openurl = require('openurl');
var sleep = require('system-sleep');
// openurl opens the URL in the default browser of the machine running the process.
openurl.open("https://google.com");
sleep(10);
While most modules are simply plain-text JavaScript files, some modules are platform-specific binary images. These modules are compiled at install time, usually by using Python and node-gyp. Azure App Service does not support all native modules and might fail when compiling those with very specific prerequisites.
That description is from 'Using Node.js Modules with Azure applications'.
In my experience, the system-sleep module requires Python and node-gyp during installation.
You can try installing the modules on a 32-bit Windows platform in your local environment, and then deploy your application to Azure together with the node_modules folder that contains the compiled modules.
Alternatively, you can use the Azure App Service Editor to install the libraries that are plain-text JavaScript files online.

deploy to google cloud platform flex env does not include directories?

AFAIK, when using the Google Cloud SDK shell to deploy to the Google Cloud Platform flex environment, the deploy does not include the directories. For example, I'm following this Node.js Express tutorial - https://cloud.google.com/nodejs/resources/frameworks/express. I can run the app locally. I deploy using gcloud app deploy, running it in the same directory as app.yaml. However, after the deployment I get an application startup error:
node ./bin/www:
module.js:471
throw err;
Error: Cannot find module '/app/bin/www'
I am able to deploy and run the hello world nodejs tutorial app, but that app has no sub-directories. However, if I modify that app to use EJS and put .ejs files in a 'views' folder and then deploy, the deploy works but the views folder is missing! I've verified that it is missing by using 'fs' in app.js to print out the files and directories of the current folder and guess what - NO directories except the node_modules folder which gets created during the deployment.
I've also tried deploying a python flask app that seems to have the same error (basically it cannot find a template because the templates folder does not exist...).
Has anyone else experienced this? Do I need to do something special in the deploy? I'm very surprised that even Google's own sample tutorial app does not deploy.
Thanks!
I spent my whole day on the same issue. It looks like it was a problem with the new gcloud SDK shell. You can downgrade your version with 'gcloud components update --version=137.0.1'.
It worked for me.
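If you want to see which SDK version you are currently on before downgrading, you can check with:
gcloud version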
