nodejs .node-xmlhttprequest-sync-1 file created - node.js

I am using a Google Cloud Function with the nodejs12 runtime and I am getting the following error:
EROFS: read-only file system, open '.node-xmlhttprequest-sync-1'
Node.js (Express.js) is creating a file at the same level as index.js, which is not permitted (files should be created under /tmp/ in Cloud Functions).
Why is this file created?
If it is necessary, how do I ensure it is created in /tmp?

This file is created by the node-XMLHttpRequest library when the settings.async flag is set to false: https://github.com/driverdan/node-XMLHttpRequest/blob/master/lib/XMLHttpRequest.js#L480
Are you or any functions you're importing using this library? If so, the quickest fix would be to use async requests.
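If the dependency turns out to be driverdan's xmlhttprequest npm package, a minimal sketch of the asynchronous form (which does not write the temp file) could look like this; the URL is only a placeholder:

const { XMLHttpRequest } = require('xmlhttprequest');

const xhr = new XMLHttpRequest();
// Passing true as the third argument keeps the request asynchronous,
// so the library never writes .node-xmlhttprequest-sync-<pid> to disk.
xhr.open('GET', 'https://example.com/data.json', true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log(xhr.responseText);
  }
};
xhr.send();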

Related

How do you create a folder and write to Heroku's ephemeral storage?

I'm using Node.js, and when my script uses fs.mkdir nothing seems to happen... It works well locally. Is there an alternate command/function I can use to create and write to folders in Heroku's file system?
(Yes, I'm aware the ephemeral filesystem is temporary; in my use case, all files will be deleted after 5 minutes.)
You can use the tmp folder at the root of the filesystem, which is where you will write to and read files from on Heroku. For instance, the first line of code below saves data to a specified file path inside the tmp folder; the second line creates a read stream for that file:
fs.writeFileSync(`/tmp/${filename}.json`, dataToSave)
const fileStream = fs.createReadStream(`/tmp/${filename}.json`)
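To address the fs.mkdir part of the question, here is a minimal sketch (folder and file names are placeholders) that creates a subfolder under /tmp and writes a file into it:

const fs = require('fs');
const path = require('path');

// Create a folder under /tmp; { recursive: true } avoids an error if it already exists.
const dir = path.join('/tmp', 'exports');
fs.mkdirSync(dir, { recursive: true });

// Write a file into the new folder.
fs.writeFileSync(path.join(dir, 'data.json'), JSON.stringify({ ok: true }));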

pd.read_parquet produces: OSError: Passed non-file path

I'm trying to retrieve a DataFrame with pd.read_parquet and I get the following error:
OSError: Passed non-file path: my/path
I access the .parquet file in GCS with a path with prefix gs://
For some unknown reason, the OSError shows the path without the gs:// prefix.
I did suspect it was related to credentials; however, from my Mac I can use gsutil with no problem, and I can access and download the parquet file via the Google Cloud Platform. I just can't read the .parquet file directly from the gs:// path. Any ideas why not?
Arrow does not currently natively support loading data from GCS. There is some progress in implementing this; the tasks are tracked in JIRA.
As a workaround you should be able to use the fsspec GCS filesystem implementation to access objects (I would try installing it and using 'gcs://' instead of 'gs://'), or open the file directly with the gcsfs API and pass it through.

Possible to create file in sources directory on Azure DevOps during build

I have a Node script which needs to create a file in the root directory of my application before it builds.
The data this file will contain is specific to each build that gets triggered; however, I'm having no luck on Azure DevOps in this regard.
To write the file I'm using fs.writeFile(...), something similar to this:
fs.writeFile(targetPath, file, function(err) { // removed for brevity });
However, this throws an exception:
[Error: ENOENT: no such file or directory, open '/home/vsts/work/1/s/data-file.json']
Locally this works. I'm assuming this has to do with permissions; however, I tried adding a blank version of this file to my project and it still throws this exception.
Possible to create file in sources directory on Azure DevOps during build
The answer is yes. This is a fully supported scenario in Azure DevOps Services if you're using a Microsoft-hosted Ubuntu agent.
If you hit this issue when using a Microsoft-hosted agent, it is more likely a path issue. Please check:
The function where the no such file or directory error comes from. Apart from fs.writeFile, do you also use fs.readFile in the xx.js file? If so, make sure the two paths are the same.
The structure of your source files and your real requirement. According to your question you want to create the file in the source directory /home/vsts/work/1/s, but the first line indicates that you actually want to create it in the root directory of your application.
1) If you want to create the file in the source directory /home/vsts/work/1/s:
In your .js file, use a target path like './data-file.json', and make sure you're running the node xx.js command from the source directory (leave the CMD/PowerShell/Bash task's working directory blank).
2) If you want to do that in the root of the application folder, like /home/vsts/work/1/s/MyApp:
In your .js file, use __dirname, e.g. fs.writeFile(__dirname + '/data-file.json', file, function(err) { // removed for brevity }); and fs.readFile(__dirname + '/data-file.json', ...).
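A minimal sketch of the __dirname approach (the data written is just an example; BUILD_BUILDID is the environment variable Azure DevOps maps from Build.BuildId):

const fs = require('fs');
const path = require('path');

// Resolve the target next to the running script so the path is the same
// locally and on the hosted agent, regardless of the task's working directory.
const targetPath = path.join(__dirname, 'data-file.json');

// Write build-specific data; falls back to 'local' outside the pipeline.
const payload = { buildId: process.env.BUILD_BUILDID || 'local' };
fs.writeFileSync(targetPath, JSON.stringify(payload, null, 2));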

Can we read files from server path using any fs method in NodeJs

In my case I need to read file/icon.png from a Cloud Storage bucket, where the path is a token-based URL; the token resides in the header of the request.
I tried to use fs.readFile('serverpath') but it returned an 'ENOENT' ('no such file or directory') error, even though the file exists at that path. Are these methods able to make calls and read files from a server, or do they only work with static local paths? If the latter, how do I read the file from the cloud bucket/server in my case?
I need to pass that file path to the UI to show this icon.
Use this library to handle GCS operations:
https://www.npmjs.com/package/@google-cloud/storage
If you do need to use fs, install gcsfuse (https://cloud.google.com/storage/docs/gcs-fuse), mount the bucket to your local filesystem, then use fs as you normally would.
I would like to complement Cloud Ace's answer by saying that if you have Storage Object Admin permission you can make the URL of the image public and use it like any other public URL.
If you don't want to make the URL public you can get temporary access to the file by creating a signed URL.
Otherwise, you'll have to download the file using the GCS Node.js Client.
I posted this as an answer as it is quite long to be a comment.
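A rough sketch of those last two options using the @google-cloud/storage client (bucket and object names are placeholders, and credentials are assumed to come from the environment):

const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const file = storage.bucket('my-bucket').file('file/icon.png');

// Option 1: generate a temporary signed URL the UI can load directly.
async function getIconUrl() {
  const [url] = await file.getSignedUrl({
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
  });
  return url;
}

// Option 2: download the object, e.g. to the writable /tmp directory.
async function downloadIcon() {
  await file.download({ destination: '/tmp/icon.png' });
}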

Cloud Functions: how to upload additional file for use in code?

I need to get access to a .proto file in my code. Locally I just put it in the folder, but how do I get this file from deployed Firebase Functions?
const functions = require('firebase-functions');
const grpc = require('grpc');

// Use a POSIX path separator so the proto resolves on the deployed (Linux) runtime.
const PROTO_PATH = __dirname + '/protos/prediction_service.proto';

exports.helloWorld = functions.https.onRequest((request, response) => {
  var tensorflow_serving = grpc.load(PROTO_PATH).tensorflow.serving;
  // ...
});
You'd like to upload 3 files to deploy your Cloud Function:
index.js
package.json
prediction_service.proto
In order to do so via the Developer Console, you'll need to:
Go to the Google Cloud Developer Console > Cloud Functions > Create Function
In the "Source Code" field, choose either:
"ZIP upload" and select a zip file including your 3 files,
"ZIP from Cloud Storage" and select file location on GCS,
"Cloud Source repository" and provide your repo details
Fill in the remaining fields and click "Create"
Once deployed, in the source tab for your function you'll see the three files displayed.
Alternatively, you can use gcloud to deploy your files via the following command:
gcloud beta functions deploy <functionName> --source=SOURCE
where SOURCE can be a ZIP file on Google Cloud Storage, a reference to a source repository, or a local filesystem path. I'd recommend having a look at the documentation for this command for full details.
When you are using Firebase Cloud Functions with TypeScript (your code is in functions/src/index.ts), you need to put the additional files in functions/lib.
I find this way the easiest when it comes to Firebase Functions:
Put your .proto file into the functions folder of your Firebase project (where index.js and package.json are located).
Deploy your functions as normal with the CLI command firebase deploy --only functions.
The file is automatically added to the deployed project in GCP, and you can access it in your Node.js project:
protobuf.load(__dirname + '/schema.proto')
While it is possible to use GCS, it's simple to include files in your source.
Put your package.json, index.js (or whatever file is specified in package.json's 'main' field) and other dependent files in a directory.
When you create your function, provide that directory including your other files via the ZIP upload or Cloud Source repository.
Your other files are available at path.join(__dirname, 'your/path/file.ext')
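For example, a minimal sketch of reading a bundled file at runtime (the relative path is the placeholder from above):

const fs = require('fs');
const path = require('path');

// Any file deployed alongside index.js can be resolved relative to __dirname.
const extraFile = fs.readFileSync(path.join(__dirname, 'your/path/file.ext'), 'utf8');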
