Firebase Storage access token creation in node.js [duplicate] - node.js

This question is a duplicate of: Get Download URL from file uploaded with Cloud Functions for Firebase
Basically I have two ways of uploading files to Firebase Storage:
The frontend calls the backend with an image URL, and the backend downloads the image and uploads it via bucket.upload (which I like, because I can set the storageAccessToken).
The user uploads directly from the frontend with a signed URL that was generated in the backend (which I dislike, because I can't set a storageAccessToken).
I have an onFinalize trigger that is called every time a new object is stored in Firebase Storage.
In this trigger I want to create a Firestore doc for the object that was added.
To get the file information I use probe-image-size.
The problem is that I need a download link for the file, and as far as I know I cannot download the file without a storageAccessToken.
So the question is: can I add a storageAccessToken in the onFinalize trigger, or can I add the storageAccessToken when I upload a file from the frontend with the signed URL?

If I understand your question correctly, you would like, in a Cloud Function (the "onFinalize trigger"), to create a Firestore document that contains, among other info (from probe-image-size), a signed URL.
You can generate a signed URL in a Cloud Function as follows, without the need for a storageAccessToken:
exports.generateSignedURL = functions.storage.object().onFinalize(async object => {
  try {
    const bucket = admin.storage().bucket(object.bucket);
    const file = bucket.file(object.name);
    const signedURLconfig = { action: 'read', expires: '08-12-2025' }; // For example...
    const signedURLArray = await file.getSignedUrl(signedURLconfig);
    const url = signedURLArray[0];
    await admin.firestore().collection('signedURLs').add({ fileName: object.name, signedURL: url });
    return null;
  } catch (error) {
    console.log(error);
    return null;
  }
});
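To answer the other half of the question: you can also attach a download token yourself in the onFinalize trigger by setting the firebaseStorageDownloadTokens metadata field on the object. That field is not officially documented, so treat the sketch below as an assumption-based alternative; the uuid dependency and the download URL format are the commonly used convention:
const { v4: uuidv4 } = require('uuid'); // assumed dependency

exports.addDownloadToken = functions.storage.object().onFinalize(async object => {
  const bucket = admin.storage().bucket(object.bucket);
  const file = bucket.file(object.name);

  // Generate a token and store it in the object's custom metadata.
  const token = uuidv4();
  await file.setMetadata({ metadata: { firebaseStorageDownloadTokens: token } });

  // Build the Firebase-style download URL from the token.
  const downloadURL = `https://firebasestorage.googleapis.com/v0/b/${object.bucket}/o/` +
    `${encodeURIComponent(object.name)}?alt=media&token=${token}`;

  await admin.firestore().collection('downloadURLs').add({ fileName: object.name, downloadURL });
});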

Related

Why is the S3 URL not added to the object and submitted to MongoDB?

I am working on a Next.js job-board app and I have a job object with several key/value pairs that get filled in across several sibling components and is submitted by the current component. The current component's job is to upload the job's file attachment to S3, retrieve the URL, store it in the job object, and upload it to MongoDB. The file is successfully added to S3 and I am able to grab the URL. However, after retrieving the file URL from S3, it is not added to the job object, which has a specific string field for the URL. When I click "submit job" in the UI, the job listing is submitted to MongoDB with all the other fields filled with values from the other sibling components, but the URL string is missing. I have tried to find a way to solve this but was unable to, even though the logic seems clear.
I need help. Thank you in advance.
const submit = async (e) => {
  setsubmitjob(true);
  let { url } = await uploadToS3(file);
  setFileUrl(url);
  setjob({
    ...jobarray,
    joblistingattachment: url
  });
  createjoblist();
}

const createjoblist = async () => {
  try {
    const res = await axios.post("http://localhost:3000/api/job", jobarray);
    setuploadstatus(true);
    sleep(3500).then(() => {
      router.push("/");
    });
  } catch (e) {
    console.log("Connection to server failed. Please try again in a minute");
  }
}
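A likely cause, assuming standard React state behaviour, is that setjob updates state asynchronously, so jobarray still holds the old object when createjoblist() runs immediately afterwards. A minimal sketch of one workaround is to build the updated object first and pass it along explicitly (the updatedJob name is hypothetical):
const submit = async (e) => {
  setsubmitjob(true);
  const { url } = await uploadToS3(file);
  setFileUrl(url);

  // Build the full object locally so we don't depend on the not-yet-applied state update.
  const updatedJob = { ...jobarray, joblistingattachment: url };
  setjob(updatedJob);

  // Pass the object directly instead of reading the (possibly stale) jobarray state.
  await createjoblist(updatedJob);
}

const createjoblist = async (job) => {
  const res = await axios.post("http://localhost:3000/api/job", job);
  // ...
}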

How to display images of products stored in an AWS S3 bucket

I was practicing with this tutorial:
https://www.youtube.com/watch?v=NZElg91l_ms&t=1234s
It works like a charm for me, but I am storing images of products in the bucket; let's say I upload 4 images, they are all uploaded fine.
But when I display them I get an access denied error. Since I am rendering a list, the repeated requests may be getting detected as spam.
This is how I am trying to fetch them in my React app:
// rest of the data is from a MySQL database (product name, price)
// 100+ products
{products.map((row) => (
  <div key={row.imgurl}>
    <div className="product-hero">
      <img src={`http://localhost:3909/images/${row.imgurl}`} />
    </div>
    <div className="text-center">{row.productName}</div>
  </div>
))}
Because it fetches 100+ products from the DB and 100 images from AWS, it fails.
Sorry for such a detailed question, but in short: how can I fetch all product images from my bucket?
Note: I am aware that I can only get one image per call, so how can I get all the images one by one in my scenario?
// download code in my app.js
const express = require('express')
const { uploadFile, getFileStream } = require('./s3')
const app = express()

app.get('/images/:key', (req, res) => {
  console.log(req.params)
  const key = req.params.key
  const readStream = getFileStream(key)
  readStream.pipe(res)
})

// s3 file (bucketName and the s3 client are configured elsewhere in this module)
const fs = require('fs')

// uploads a file to s3
function uploadFile(file) {
  const fileStream = fs.createReadStream(file.path)
  const uploadParams = {
    Bucket: bucketName,
    Body: fileStream,
    Key: file.filename
  }
  return s3.upload(uploadParams).promise()
}
exports.uploadFile = uploadFile

// downloads a file from s3
function getFileStream(fileKey) {
  const downloadParams = {
    Key: fileKey,
    Bucket: bucketName
  }
  return s3.getObject(downloadParams).createReadStream()
}
exports.getFileStream = getFileStream
It appears that your code is sending image requests to your back-end, which retrieves the objects from Amazon S3 and then serves the images in response to the request.
A much better method would be to have the URLs in the HTML page point directly to the images stored in Amazon S3. This would be highly scalable and will reduce the load on your web server.
This would require the images to be public so that the user's web browser can retrieve the images. The easiest way to do this would be to add a Bucket Policy that grants GetObject access to all users.
Alternatively, if you do not wish to make the bucket public, you can instead generate Amazon S3 pre-signed URLs, which are time-limited URLs that provide temporary access to a private object. Your back-end can calculate the pre-signed URL with a couple of lines of code, and the user's web browser will then be able to retrieve private objects from S3 for display on the page.
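For illustration, a pre-signed GET URL can be generated in the back-end with the AWS SDK for JavaScript (v2, matching the s3.getObject style used above); the route name, expiry, and the getAllImageKeys helper are just examples, not part of the original code:
// in s3.js, alongside uploadFile/getFileStream
function getSignedImageUrl(fileKey) {
  return s3.getSignedUrl('getObject', {
    Bucket: bucketName,
    Key: fileKey,
    Expires: 60 * 60 // URL is valid for one hour
  })
}
exports.getSignedImageUrl = getSignedImageUrl

// in app.js: return signed URLs so the browser fetches images straight from S3
app.get('/image-urls', async (req, res) => {
  const keys = await getAllImageKeys() // hypothetical helper that lists your product image keys
  res.json(keys.map(key => ({ key, url: getSignedImageUrl(key) })))
})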
I did similar S3 image handling while building my blog's image upload functionality, but I did not use getFileStream() to upload my image.
Because nothing should be done until the image file is fully processed, I used fs.readFile(path, callback) instead to read the data.
My way generates Buffer data, but AWS S3 is smart enough to interpret it as an image. (I have only added a suffix to my filename; I don't know how to apply image headers...)
This is my part of the code for reference:
fs.readFile(imgPath, (err, data) => {
  if (err) { throw err }
  // Once the file is read, upload it to AWS S3
  const objectParams = {
    Bucket: 'yuyuichiu-personal',
    Key: req.file.filename,
    Body: data
  }
  S3.putObject(objectParams, (err, data) => {
    // store image link and read image with link
  })
})
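Regarding the "image headers" mentioned above: S3.putObject also accepts a ContentType parameter, so the browser renders the object as an image instead of downloading it. A small sketch, assuming a multer-style upload where req.file.mimetype is available:
const objectParams = {
  Bucket: 'yuyuichiu-personal',
  Key: req.file.filename,
  Body: data,
  ContentType: req.file.mimetype // e.g. 'image/jpeg', so S3 serves it with the right header
}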

Add attachments to invoices in Xero using the Node.js SDK or API

I am trying to add attachments to existing invoices in Xero.
I am using the xero-node SDK (https://github.com/XeroAPI/xero-node#readme) for this integration, and it provides a method for adding an attachment as follows:
this.xero.accountingApi.createInvoiceAttachmentByFileName(tenantId, invoiceid, filenameInvoice, includeOnline, readStream)
The issue is that it requires an fs.ReadStream object for readStream.
The file I am trying to upload lives in the cloud, and I cannot download it and store it in the file system before sending it to Xero. I want to send the file in Azure cloud storage directly to Xero. I have the URL of the file, so I can fetch its content into a variable by making an HTTP request, but there is no option to send this content to Xero.
There is also an API for this (https://developer.xero.com/documentation/api/attachments), apart from the SDK. But I am not sure how I can send the file I have to this API in the body, as it expects raw data. Are there any specific headers or encodings required to call this API with file content in the body? Simply passing the body of the response I got from the Azure file URL as the body of the Xero Attachments API does not work for me either; it tries for a long time and then times out.
Yes, you are correct. There are additional headers/manipulation you need to do to upload files.
Please check out the sample app - it is set up to show exactly how to upload files: https://github.com/XeroAPI/xero-node-oauth2-app/blob/master/src/app.ts#L1188
Something like the following should get you sorted:
import * as fs from "fs";
const path = require("path");
const mime = require("mime-types");

const totalInvoices = await xero.accountingApi.getInvoices('your-tenantId-uuid', undefined, undefined, undefined, undefined, undefined, undefined, ['PAID']);

// Attachments need to be uploaded to associated objects https://developer.xero.com/documentation/api/attachments
// CREATE ATTACHMENT
const filename = "xero-dev.png";
const pathToUpload = path.resolve(__dirname, "../path-to-your.png");
const readStream = fs.createReadStream(pathToUpload);
const contentType = mime.lookup(filename);

const fileAttached = await xero.accountingApi.createInvoiceAttachmentByFileName(req.session.activeTenant.tenantId, totalInvoices.body.invoices[0].invoiceID, filename, true, readStream, {
  headers: {
    "Content-Type": contentType,
  },
});
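Since the original question is about a file that lives in Azure rather than on local disk, one hedged variation is to fetch the remote file into a buffer first and call the raw Attachments endpoint directly; the endpoint and header names below follow the Xero attachments documentation, but treat the exact combination (and the azureFileUrl, invoiceId, filename, accessToken and tenantId variables) as assumptions, not a verified recipe:
const axios = require("axios");

// Download the file from Azure (azureFileUrl is the URL you already have).
const remote = await axios.get(azureFileUrl, { responseType: "arraybuffer" });

// PUT the raw bytes to the Attachments endpoint for the invoice (raw body, not multipart/form-data).
await axios.put(
  `https://api.xero.com/api.xro/2.0/Invoices/${invoiceId}/Attachments/${encodeURIComponent(filename)}`,
  remote.data,
  {
    headers: {
      "Authorization": `Bearer ${accessToken}`,
      "Xero-Tenant-Id": tenantId,
      "Content-Type": remote.headers["content-type"] || "application/octet-stream",
      "Accept": "application/json",
    },
  }
);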
I ended up adding a link to the file in the History and Notes section of the invoice. Even though this is not the best solution, it serves the purpose of showing invoices to the customer.
Thanks to @SerKnight for your answer.

Uploading data from Firebase Functions to Firebase Storage?

I have a website running with Node.js, with the backend running on Firebase Functions. I want to store a bunch of JSON to Firebase Storage. The snippet below works just fine when I'm running on localhost, but when I deploy it to Firebase Functions it fails with Error: EROFS: read-only file system, open 'export-stock-trades.json'. Does anyone know how to get around this?
fs.writeFile(fileNameToReadWrite, JSON.stringify(jsonObjToUploadAsFile), function(err) {
  bucket.upload(fileNameToReadWrite, {
    destination: destinationPath,
  });
  res.send({ success: true });
});
I can't tell for sure, since much of the context of your function is missing, but it looks like your function is attempting to write a file to local disk first (fs.writeFile) and then upload it (bucket.upload).
On Cloud Functions, the code you write only has write access to /tmp, which is os.tmpdir() in Node. Read more about that in the documentation:
The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.
This is probably what's causing your code to fail.
Incidentally, if the data you want to upload is already in memory, you don't have to write it to a file first as you're doing now. You can use file.save() for that instead.
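For example, a minimal sketch of the file.save() approach inside a Cloud Function (the bucket path and route are placeholders, and the JSON object stands in for whatever data you already have in memory):
exports.exportTrades = functions.https.onRequest(async (req, res) => {
  const jsonObjToUploadAsFile = { /* ...your data... */ };

  const bucket = admin.storage().bucket(); // default bucket of the project
  const file = bucket.file('exports/export-stock-trades.json'); // destination path in Storage

  // Write the in-memory JSON straight to Cloud Storage; no local file needed.
  await file.save(JSON.stringify(jsonObjToUploadAsFile), {
    contentType: 'application/json',
  });

  res.send({ success: true });
});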
Another way this could work is to convert the JSON into a buffer and then perform an action like the code snippet below. I wrote an article on how to do this with Google Cloud Storage, and it works fine with Firebase Storage as well. The only thing you need to change is the "service-account-key.json" file.
The link to the article can be found here: Link to article on medium
const { format } = require('util')
const gc = require('./config/')
const bucket = gc.bucket('all-mighti') // should be your bucket name

/**
 * @param { File } file - file object that will be uploaded
 * @description - This function does the following:
 * - It uploads a file to the image bucket on Google Cloud
 * - It accepts an object as an argument with the
 *   "originalname" and "buffer" as keys
 */
export const uploadImage = (file) => new Promise((resolve, reject) => {
  const { originalname, buffer } = file
  const blob = bucket.file(originalname.replace(/ /g, "_"))
  const blobStream = blob.createWriteStream({
    resumable: false
  })
  blobStream.on('finish', () => {
    const publicUrl = format(
      `https://storage.googleapis.com/${bucket.name}/${blob.name}`
    )
    resolve(publicUrl)
  })
  .on('error', () => {
    reject(`Unable to upload image, something went wrong`)
  })
  .end(buffer)
})
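A possible usage example, assuming an Express route with multer configured with memoryStorage so that req.file exposes "originalname" and "buffer" (multer and the route path are assumptions, not part of the original answer):
const express = require('express')
const multer = require('multer')
const { uploadImage } = require('./uploadImage') // assuming the function above is exported from a local module

const app = express()
const upload = multer({ storage: multer.memoryStorage() })

app.post('/upload', upload.single('image'), async (req, res) => {
  try {
    // req.file has "originalname" and "buffer", which is what uploadImage expects.
    const publicUrl = await uploadImage(req.file)
    res.json({ url: publicUrl })
  } catch (err) {
    res.status(500).send(err)
  }
})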

Unable to upload a file using googleapis on Google Drive

I have a problem with file upload to Google Drive using the googleapis package (v40):
In my web application (written in Vue.js) I need to upload an image file of user 'A' into the Google Drive space of user 'B' using googleapis.
For this, I have created a service account in the Google Cloud console (for user B) and generated the credentials.json for API access (JWT, service-to-service scenario).
In my web application, after getting the access token by means of the service account's JSON credentials, I'm ready to upload the file. But when I call the drive.files.create(...) API I get the following error:
Invalid multipart request with 0 mime parts.
Here is some code:
...
// get google api
const { google } = require("googleapis");
const drive = google.drive("v3");

// get authorization token
let authToken = await getAuthToken() // it works perfectly
console.log(authToken)

let metadata = {
  name: name,
  parents: [idOfTheParentFolder]
}
let media = {
  mimeType: 'image/jpeg',
  body: file
}
let objImage = {
  auth: authToken,
  resource: metadata,
  media: media,
  fields: 'id',
}

drive.files.create(objImage, function (error, success) {
  if (!error) {
    ...
  }
  else {
    // here I got the error in question
  }
})
I've tried exactly this code in a single Node.js file and it works fine, but it does not work in the Vue web application running in the browser.
Let me say first that the API for listing files (drive.files.list()) and the API for creating a folder (drive.files.create()) work just fine in my web application.
Any suggestions?
