How to pass File to Firebase Functions - node.js

I'm trying to pass a file (image) from my React app to Firebase Functions, to then upload it to Pinata using their Node.js SDK, but my cloud function keeps returning: ERR_INVALID_ARG_TYPE
Is it possible to pass a file to Firebase Functions?
I need to eventually convert it to a readable stream to use Pinata.

Callable Cloud Functions only accept JSON types, so you can't pass a File object to them.
What you can do is:
either read the bytes from the file in your client and pass them as an array of numbers (which is a JSON type), or pass them as a base64-encoded string;
or write the file to Cloud Storage through Firebase, and then pass the path to that file to your Cloud Function.
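A minimal sketch of the base64 route, assuming a callable named `uploadImage` (the name and the Firebase calls themselves are illustrative and left as comments; only the encode/decode round trip is shown):

```javascript
// Client side: read the file's bytes and base64-encode them so the
// payload is plain JSON. A stand-in buffer replaces the real file bytes.
const fileBytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]);
const payload = { name: 'photo.png', data: fileBytes.toString('base64') };
// e.g. httpsCallable(functions, 'uploadImage')(payload)

// Server side (inside the callable): decode back into a Buffer before
// handing it to Pinata or converting it to a readable stream.
const decoded = Buffer.from(payload.data, 'base64');
```

The decoded Buffer is byte-for-byte identical to the original file contents.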

I used Frank van Puffelen's advice: first upload the file to Firebase Storage, then download it in my Firebase Function, then convert the file to an array buffer.
Download from Google Cloud Storage
const [bufferFile] = await admin
  .storage()
  .bucket('my-bucket-name')
  .file('file-path')
  .download()
Convert to Array Buffer
const toArrayBuffer = (buf: Buffer) => {
  const ab = new ArrayBuffer(buf.length)
  const view = new Uint8Array(ab)
  for (let i = 0; i < buf.length; ++i) {
    view[i] = buf[i]
  }
  return ab
}
Then you can pass the ArrayBuffer to IPFS or Arweave.
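As a quick sanity check, the helper round-trips bytes correctly; here is a plain-JS copy of the same function run against an illustrative buffer:

```javascript
// Plain-JS version of the toArrayBuffer helper (no TS annotation).
const toArrayBuffer = (buf) => {
  const ab = new ArrayBuffer(buf.length);
  const view = new Uint8Array(ab);
  for (let i = 0; i < buf.length; ++i) {
    view[i] = buf[i];
  }
  return ab;
};

const buf = Buffer.from([1, 2, 3, 255]);
const ab = toArrayBuffer(buf);
```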

Related

How to upload a base64 image URL to Firebase and retrieve its access token?

In a MERN + Firebase project, I have an image data string that I want to upload and then get the access token of that file.
The image data string is of the following form:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOoAAAClCAYAAABSmmH3AAAAAXNSR0IArs4c6QAABFJJREFUeF7t1oENg0AQA8F8/6XSAx8pVWTloQIzZwvOvfd+PAQI/LXAMdS/vo9wBH4ChqoIBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgEBQw0cSUQChqoDBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgGB8zzPDeQUkcC0wHnf11CnK+DlCwJ+fQtXknFewBd1vgIACgK+qIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCDwBaT9ke70WG4vAAAAAElFTkSuQmCC
This is the reference to where the file should be uploaded:
const imageRef: StorageReference = ref(
  storage,
  `/issueImages/${firebaseImageId}`
);
So far, I have attempted to use the 'put' function with the imageRef, and when I try Firebase's uploadBytes(), I have to upload it as a Buffer, and even then I cannot seem to find the access token in the metadata.
To upload a data URL to Firebase, you would use storageRef.putString(url, 'DATA_URL') (legacy) or uploadString(storageRef, url, 'DATA_URL') (modern) depending on the SDK you are using.
When you upload a file to a Cloud Storage bucket, it will not be issued an access token until a client calls its version of getDownloadURL(). So to fix your issue, you would call getDownloadURL() immediately after upload.
If Node is running on a client's machine, you would use:
// legacy syntax
import * as firebase from "firebase";

// reference to file
const imageStorageRef = firebase.storage()
  .ref(`/issueImages/${firebaseImageId}`);

// perform the upload
await imageStorageRef.putString(dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await imageStorageRef.getDownloadURL();
// modern syntax
import { getStorage, getDownloadURL, ref, uploadString } from "firebase/storage";

// reference to file
const imageStorageRef = ref(
  getStorage(),
  `/issueImages/${firebaseImageId}`
);

// perform the upload
await uploadString(imageStorageRef, dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await getDownloadURL(imageStorageRef);
If Node is running on a private server you control, you should opt to use the Firebase Admin SDK instead as it bypasses the rate limits and restrictions applied to the client SDKs.
As mentioned before, the download URLs aren't created automatically. Unfortunately for us, getDownloadURL is a feature of the client SDKs and the Admin SDK doesn't have it. So we can either let a client call getDownloadURL when it is needed or we can manually create the download URL if we want to insert it into a database.
Nico has an excellent write-up on how Firebase Storage URLs work, collating information from the Firebase Extensions GitHub and this StackOverflow thread. In summary, to create (or recreate) a download URL once a file has been uploaded, you can use the following function:
import { uuid } from "uuidv4";

// Original Credit: Nico (#nicomqh)
// https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens
// "file" is an instance of the File class from the Cloud Storage SDK.
// Executing this function more than once will revoke all previous tokens.
async function createDownloadURL(file) {
  const downloadToken = uuid();

  await file.setMetadata({
    metadata: {
      firebaseStorageDownloadTokens: downloadToken
    }
  });

  return `https://firebasestorage.googleapis.com/v0/b/${file.bucket.name}/o/${encodeURIComponent(file.name)}?alt=media&token=${downloadToken}`;
}
This allows us to change the client-side code above into the following so it can run using the Admin SDK:
// assuming firebase-admin is initialized already
import { getStorage } from "firebase-admin/storage";

// reference to file
const imageStorageFile = getStorage()
  .bucket()
  .file(`/issueImages/${firebaseImageId}`);

// perform the upload (strip the "data:image/png;base64," prefix and
// decode to raw bytes first, otherwise the base64 text itself is stored)
const imageBuffer = Buffer.from(dataUrl.split(',')[1], 'base64');
await imageStorageFile.save(imageBuffer, { contentType: 'image/png' });

// get the download URL
const imageStorageDownloadURL = await createDownloadURL(imageStorageFile);
In all of the above examples, a download URL is retrieved and saved to the imageStorageDownloadURL variable. You should store this value as-is in your database. However, if you instead want to store only the access token and reassemble the URL on an as-needed basis, you can extract the token from its ?token= parameter using:
const downloadToken = new URL(imageStorageDownloadURL).searchParams.get('token');
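Going the other way, the URL can be reassembled from the stored token; the bucket name, file path, and token below are all placeholders:

```javascript
// Rebuild a download URL from its parts (all values illustrative).
const bucketName = 'my-app.appspot.com';
const filePath = 'issueImages/some-image.png';
const token = 'my-download-token';

const rebuiltURL = `https://firebasestorage.googleapis.com/v0/b/${bucketName}/o/${encodeURIComponent(filePath)}?alt=media&token=${token}`;
```

Note that the object path must be URI-encoded, so the `/` in the path becomes `%2F`.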

Convert google cloud storage file to base64 string

I am retrieving a PDF file from Google Cloud Storage. I need to convert this file to a base64 string so that I can pass it to an API as a request. This is Node.js.
const { Storage } = require("@google-cloud/storage");
const storage = new Storage(options);
const bucket = storage.bucket(bucketName);
let remoteFile = bucket.file(fileName);
I need to convert this remoteFile object to a base64 string. Actually, I need to pass this remoteFile as an attachment to the SendGrid mail API.
As you can find in the library sample, you need to download the file content first; then you can do what you want with it, such as encoding it in base64.
remoteFile.download().then(function (data) {
  const contents = data[0]; // a Buffer with the file contents
  const base64String = contents.toString('base64');
  // ... use base64String here, e.g. pass it to your API ...
});
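For the SendGrid part, the attachment's content field is expected to be a base64 string. A sketch, with the file contents and names as placeholders and the actual send call left as a comment:

```javascript
// Stand-in for the downloaded file contents (download() resolves with
// [Buffer], so data[0] is a Buffer).
const pdfBuffer = Buffer.from('%PDF-1.4 example');
const base64Content = pdfBuffer.toString('base64');

// SendGrid mail API attachment shape (values are illustrative).
const attachment = {
  content: base64Content,
  filename: 'report.pdf',
  type: 'application/pdf',
  disposition: 'attachment',
};

// sgMail.send({ to, from, subject, text, attachments: [attachment] });
```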

How to Upload PDF File on Azure Blob Storage via Node.js?

const blobServiceClient = await BlobServiceClient.fromConnectionString(connectionString);
const containerClient = await blobServiceClient.getContainerClient(container);
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
console.log(uploadBlobResponse);
console.log(`File uploaded successfully to cloud ${uploadBlobResponse.requestId}`);
I am trying it like this, but blockBlobClient.upload() needs content. I converted the file to base64 and passed that as content, but the file is uploaded corrupted. Any help please?
Check the SDK: the upload method signature is upload(HttpRequestBody, number, BlockBlobUploadOptions). The content is an HttpRequestBody; check what that parameter requires:
a Blob, string, ArrayBuffer, ArrayBufferView, or a function which returns
a new Readable stream whose offset is from the beginning of the data source.
So maybe you could try uploadFile and just pass the file path to upload; I have tried this way and it works.
Also, you could use uploadStream to upload a readable stream of the file.
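One likely cause of the corruption: uploading the base64 text itself, and passing the string length instead of the byte length. Decoding first avoids both problems. A sketch, with placeholder content and the Azure call left as a comment:

```javascript
// Pretend this is the base64 content that was being passed to upload().
const base64Pdf = Buffer.from('%PDF-1.4 example').toString('base64');

// Decode to raw bytes before uploading, and pass the *byte* length,
// not the length of the base64 string.
const pdfBytes = Buffer.from(base64Pdf, 'base64');

// await blockBlobClient.upload(pdfBytes, pdfBytes.length);
```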

How to access a specific file and its contents from within Google Cloud Storage

I need to access a file stored on Google Cloud Storage. This is not a file that was uploaded; instead, it is created by a Cloud Function. I cannot figure out how to access this file using Node.
I have tried many of the things recommended on Stackoverflow.
google-cloud TypeError: gcs.bucket is not a function
Python GAE - How to check if file exists in Google cloud storage
How to read content of JSON file uploaded to google cloud storage using node js
Similarly, I have checked out the sample Google project but that only uses read streams and because I am not uploading the file, I couldn't figure out how to use that.
The closest I have gotten is modifying the first link to get this:
var {Storage} = require('@google-cloud/storage');

var gcs = new Storage({
  keyFilename: path.join("_file.json"),
  projectId: 'ProjectId'
});

const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
This causes a TypeError:
TypeError: bucket is not a function
How can I access and read the json located within the file?
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
chosenBucket is not a function; it's a Bucket object. Do something like this instead:
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket.file('file.json');

const download_options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

await file.download(download_options);
See an example: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
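If you only need the JSON contents in memory, you can skip the destination option entirely: file.download() with no arguments resolves with a [Buffer], which can be parsed directly. Simulated here with a plain Buffer standing in for the download result:

```javascript
// file.download() with no options resolves with [contents: Buffer];
// a plain Buffer with illustrative JSON stands in for it here.
const contents = Buffer.from('{"status":"ok","count":2}');
const parsed = JSON.parse(contents.toString('utf8'));
```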

Meteor and Azure Blob storage

I am looking for a way to upload files to Azure Blob Storage.
I found the azure-storage npm package,
but I'm having a problem with the 'createBlockBlobFromStream' method:
I don't know how to create a stream from a Uint8Array.
xhr.onload = function (e) {
  if (this.status == 200) {
    // Note: .response instead of .responseText
    const blob = new Blob([this.response]);
    console.log(audios[i].file);

    const reader = new FileReader();
    reader.onload = function () {
      const data = new Uint8Array(reader.result);
      Meteor.call('uploadFilesViaSDK', data);
    };
    reader.readAsArrayBuffer(blob);
  }
};
I'm trying to migrate files from S3 to Azure Blob Storage. That's why I download files from S3, read them as an ArrayBuffer, and convert them to a Uint8Array.
Now I am looking for a way to upload this data to Azure via the azure.createBlockBlobFromStream method.
Specifically, I need an example of creating a stream from a Uint8Array.
I'll be grateful for any answer.
In addition to the approach provided by Gaurav, once you have created a stream from the Uint8Array using streamifier, you can use the createWriteStreamToBlockBlob function to write to a block blob from a stream, transmitting the data by calling .pipe():
streamifier.createReadStream(Buffer.from(uint8)).pipe(blobSvc.createWriteStreamToBlockBlob('mycontainer', 'test.txt'));
