Cloud Functions: how to upload additional file for use in code? - node.js

I need to access a .proto file in my code. Locally I just put it in a folder next to my code, but how do I get at this file from deployed Firebase functions?
const functions = require('firebase-functions');
const grpc = require('grpc');
const path = require('path');

// Build the path in a platform-independent way; the deployed environment is Linux
const PROTO_PATH = path.join(__dirname, 'protos', 'prediction_service.proto');

exports.helloWorld = functions.https.onRequest((request, response) => {
  const tensorflow_serving = grpc.load(PROTO_PATH).tensorflow.serving;
  // ...
});

You'd like to upload 3 files to deploy your Cloud Function:
index.js
package.json
prediction_service.proto
In order to do so via the Developer Console, you'll need to:
Go to the Google Cloud Developer Console > Cloud Functions > Create Function
In the "Source Code" field, choose either:
"ZIP upload" and select a zip file including your 3 files,
"ZIP from Cloud Storage" and select file location on GCS,
"Cloud Source repository" and provide your repo details
Fill in the remaining fields and click "Create"
Once deployed, in the source tab for your function you'll see the three files displayed.
Alternatively, you can use gcloud to deploy your files via the following command:
gcloud beta functions deploy <functionName> --source=SOURCE
where SOURCE can be a ZIP file on Google Cloud Storage, a reference to a source repository, or a local filesystem path. I'd recommend having a look at the documentation for this command for full details.

When you are using Firebase Cloud Functions with TypeScript (your code is in functions/src/index.ts), you need to put the additional files in functions/lib, since that is where __dirname points at runtime. One way to do that is to copy them during the build, as sketched below.
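For illustration, one way to get the extra file into functions/lib, assuming a standard tsc build on a Unix-like system and a file named schema.proto, is to copy it as part of the build script in functions/package.json:
"main": "lib/index.js",
"scripts": {
  "build": "tsc && cp src/schema.proto lib/"
}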

I find this way the easiest when it comes to Firebase Functions:
Put your .proto file into the functions folder of your Firebase project (where index.js and package.json are located).
Deploy your functions as normal with the CLI command firebase deploy --only functions
After deployment, the file is automatically included with the function's source, which you can confirm in the GCP Console (Cloud Functions > your function > Source tab).
And you can access it in your node.js project:
protobuf.load(__dirname + '/schema.proto')
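For context, a fuller sketch of loading the deployed file inside a function, assuming the protobufjs package and a hypothetical message type, might look like this:
const functions = require('firebase-functions');
const path = require('path');
const protobuf = require('protobufjs');

exports.parseMessage = functions.https.onRequest(async (request, response) => {
  // schema.proto was deployed alongside index.js, so __dirname resolves to it
  const root = await protobuf.load(path.join(__dirname, 'schema.proto'));
  // 'mypackage.MyMessage' is a hypothetical type assumed to be defined in schema.proto
  const MyMessage = root.lookupType('mypackage.MyMessage');
  response.send(MyMessage.verify(request.body) || 'payload is valid');
});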

While it is possible to use GCS, it's simpler to include the files in your source.
Put your package.json, index.js (or whatever file is specified in package.json's 'main' field) and other dependent files in a directory.
When you create your function, provide that directory including your other files via the ZIP upload or Cloud Source repository.
Your other files are available at path.join(__dirname, 'your/path/file.ext')
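For example, a function could read one of those bundled files like this (the templates/email.html path is purely illustrative):
const fs = require('fs');
const path = require('path');

// Reads a file that was deployed alongside index.js
const template = fs.readFileSync(path.join(__dirname, 'templates', 'email.html'), 'utf8');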

Related

Location for SSH private key and temporary SFTP download data in Azure functions

I am writing an Azure function that uses the WinSCP library to download files using SFTP and upload them to blob storage. This library doesn't allow getting files as a Stream; the only option is to download them locally. My code also uses a private key file. So I have 2 questions.
sessionOptions.SshPrivateKeyPath = Path.GetFullPath("privateKey2.ppk");
is working locally. I have added this file to the solution with the "Copy to Output Directory" option and it works. But will it work when the Azure function is deployed?
While getting the files, I need to specify a local path where the files will be downloaded.
var transferResult = session.GetFiles(
    file.FullName, Path.GetTempPath() + @"SomeFolder\" + file.Name, false,
    transferOptions);
The second parameter is the local path.
What should I use in place of Path.GetTempPath() that will work when the Azure function is deployed?
For the private key, just deploy it along with your function project. You can simply add it to your VS project.
See also Including a file when I publish my Azure function in Visual Studio.
For the download: The latest version of WinSCP already supports streaming the files. Use the Session.GetFile method.
To answer your question about the temporary location, see:
Azure Functions Temp storage.
Where to store files for Azure function?

Possible to create file in sources directory on Azure DevOps during build

I have a Node script which needs to create a file in the root directory of my application before the build runs.
The data this file will contain is specific to each build that gets triggered; however, I'm having no luck getting this to work on Azure DevOps.
For the writing of the file I'm using fs.writeFile(...), something similar to this:
fs.writeFile(targetPath, file, function(err) { // removed for brevity });
However, this throws an exception:
[Error: ENOENT: no such file or directory, open '/home/vsts/work/1/s/data-file.json']
Locally this works. I'm assuming this has to do with permissions; however, I tried adding a blank version of this file to my project and it still throws this exception.
Possible to create file in sources directory on Azure DevOps during build
The answer is yes. This is a fully supported scenario in Azure DevOps Services if you're using a Microsoft-hosted ubuntu agent.
If you hit this issue when using a Microsoft-hosted agent, it is more likely a path issue. Please check:
The function where the error no such file or directory comes from. Apart from fs.writeFile, do you also use fs.readFile in the xx.js file? If so, make sure the two paths are the same.
The structure of your source files and your real requirement. According to your question you want to create the file in the sources directory /home/vsts/work/1/s, but your first line indicates that you actually want to create it in the root directory of your application.
1). If you want to create the file in the sources directory /home/vsts/work/1/s:
In your .js file, use a targetPath like './data-file.json', and make sure you run node xx.js from the sources directory (leave the CMD/PowerShell/Bash task's working directory blank).
2). If you want to create it in the root of an application folder like /home/vsts/work/1/s/MyApp:
In your .js file, use __dirname, e.g. fs.writeFile(__dirname + '/data-file.json', file, function(err) { // removed for brevity }); and fs.readFile(__dirname + '/data-file.json', ...).
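Putting the second approach together, a minimal sketch of such a build-time script might look like this (data-file.json mirrors the question; BUILD_BUILDID is one of the agent's predefined environment variables):
// create-data-file.js -- hypothetical build-time script
const fs = require('fs');
const path = require('path');

// Resolve the target relative to this script rather than the agent's working directory
const targetPath = path.join(__dirname, 'data-file.json');
const file = JSON.stringify({ buildId: process.env.BUILD_BUILDID || 'local' });

fs.writeFile(targetPath, file, function (err) {
  if (err) throw err;
  console.log('Created ' + targetPath);
});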

Node.js: multi-part file upload via REST API

I would like to upload a file by invoking a REST endpoint with a multipart request.
In particular, I am looking at this API: Google Cloud Storage: Objects: insert
I did read about using multer; however, I did not find any complete example showing how to perform this operation.
Could someone help me with that?
https://cloud.google.com/nodejs/getting-started/using-cloud-storage#uploading_to_cloud_storage
This is a good example of how to use multer to upload a single image to Google Cloud Storage. Use multer to create a file stream for each file (storage: multer.memoryStorage()), and handle the file stream by sending it to your GCS bucket in your callback.
However, the link only shows an example for one image. If you want to handle an array of images, create a for loop in which you create a stream for each file in the request, but only call next() after the loop ends. If you keep next(); inside each loop iteration you will get the error: Error: Can't set headers after they are sent.
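For the single-file case described above, a minimal sketch, assuming the @google-cloud/storage client and a placeholder bucket name, could look like this:
const express = require('express');
const multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const app = express();
const bucket = new Storage().bucket('YOUR_BUCKET_NAME'); // placeholder bucket name

// Keep the upload in memory so it can be streamed straight to GCS
const upload = multer({ storage: multer.memoryStorage() });

app.post('/upload', upload.single('file'), (req, res, next) => {
  if (!req.file) {
    return res.status(400).send('No file uploaded.');
  }
  const blob = bucket.file(req.file.originalname);
  const blobStream = blob.createWriteStream({ resumable: false });

  blobStream.on('error', next);
  blobStream.on('finish', () => res.status(200).send('Uploaded ' + blob.name));
  blobStream.end(req.file.buffer);
});

app.listen(8080);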
There is an example for uploading files with the nodejs client library and multer. You can modify this example and set the multipart option:
Download the sample code and cd into the folder:
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples/
cd nodejs-docs-samples/appengine/storage
Edit the app.yaml file and include your bucket name:
GCLOUD_STORAGE_BUCKET: YOUR_BUCKET_NAME
Then in the source code, you can modify the publicUrl variable according to Objects: insert example:
const publicUrl = format(`https://www.googleapis.com/upload/storage/v1/b/${bucket.name}/o?uploadType=multipart`);
Download a key file for your service account and set the environment variable:
Go to the Create service account key page in the GCP Console.
From the Service account drop-down list, select New service account.
Input a name into the Service account name field.
From the Role drop-down list, select Project > Owner.
Click Create. A JSON file that contains your key downloads to your computer. And finally export the environment variable:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/key/file
After that, you're ready to run npm start, go to the app's frontend, and upload your file.

AWS Lambda function to connect to a Postgresql database

Does anyone know how I can connect to a PostgreSQL database through an AWS Lambda function? I searched online but couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (node.js) that would be great otherwise can you tell me how to go about it?
exports.handler = (event, context, callback) => {
    "use strict";
    const pg = require('pg');
    const connectionStr = "postgres://username:password@host:port/db_name";
    const client = new pg.Client(connectionStr);
    client.connect(function (err) {
        if (err) {
            return callback(err);
        }
        callback(null, 'Connection established');
    });
    context.callbackWaitsForEmptyEventLoop = false;
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
Yes, this makes the difference! Lambda doesn't provide 3rd-party libraries out of the box. As soon as you have a dependency on a 3rd-party library, you need to zip and upload your Lambda code manually or through the API.
For more information: Lambda Execution Environment and Available Libraries
You need to refer to Creating a Deployment Package (Node.js)
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario, so we need to create a deployment package and then upload it. Here's what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure your handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, which has multiple modules in it.
Package the deployment directory into a zip file and upload it to Lambda (example commands are sketched after the note below).
You should be good then.
NOTE: npm install will install modules into a node_modules directory in the current directory unless it finds a node_modules directory in a parent directory. To be safe, first run npm init followed by npm install to ensure the modules are installed in the deployment directory itself.
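For reference, the packaging and upload step might look like this with the AWS CLI (the function name is a placeholder and the function is assumed to already exist):
cd deployment
zip -r function.zip index.js node_modules
aws lambda update-function-code --function-name myFunction --zip-file fileb://function.zip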

Delete folder in Google Cloud Storage using nodejs gcloud api

I am using the gcloud Node.js API to access Google Cloud Storage. I can save/delete/check the existence of files individually, but I didn't find a way to delete a folder or even to list the files in a folder using the gcloud Node.js API.
I have seen people say that the folder hierarchy in GCS is not a real tree structure, but just names. So I tried to use a wildcard to match the file name string, which did not succeed.
I wonder if there is any way to do it. If not, what tool should I use?
The code to list files in a directory should look something like this:
bucket.getFiles({ prefix: 'directoryName/' }, function(err, files) {})
And to delete:
bucket.deleteFiles({ prefix: 'directoryName/' }, function(err) {})
getFiles API documentation
deleteFiles API documentation
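With the newer @google-cloud/storage client the same prefix-based approach applies; a minimal promise-based sketch (bucket and prefix names are placeholders) could look like this:
const { Storage } = require('@google-cloud/storage');

async function deleteFolder() {
  const bucket = new Storage().bucket('my-bucket'); // placeholder bucket name

  // List everything under the "directory" prefix
  const [files] = await bucket.getFiles({ prefix: 'directoryName/' });
  console.log('Deleting:', files.map(f => f.name));

  // Delete every object sharing the prefix in one call
  await bucket.deleteFiles({ prefix: 'directoryName/' });
}

deleteFolder();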
Instead of using the gcloud Node.js API, there are two other ways to do this.
Use the googleapis package to access the standard JSON API and XML API of GCS. googleapis is a lower-level API tool for interacting with Google Cloud services. You can create/list/delete files on GCS. Documentation and examples:
https://cloud.google.com/storage/docs/json_api/v1/objects/delete
https://cloud.google.com/storage/docs/json_api/v1/objects/list
Use child_process to execute the gsutil command-line tool. This is not a standard way of programmatically accessing the Google API, but still a viable solution. Wildcards are allowed when issuing the command. Note that it may not work on Google App Engine. Here is an example.
Nodejs
var exec = require('child_process').exec;
exec("gsutil rm gs://[bucketname]/[directory]/*", function (error, stdout, stderr) {});
As Stephen suggested, using the standard gcloud methods bucket.getFiles and bucket.deleteFiles is the most desirable approach. Since GCS doesn't have the concept of directories, manipulating multiple files naturally has to be a bucket-level operation.
