unable to make folders in firebase storage bucket - firebase admin - node.js

In my case I have a list of URLs, and I want to download each file from those URLs and organise them in a Firebase Storage bucket. My problem is that I am unable to create folders in the Firebase Storage bucket through Node.js (JavaScript/TypeScript).
Firebase Storage offers ref() and child() methods for uploading files inside a child folder (see this), but those methods are only available in the Firebase client libraries. It is not that we cannot use the client library in Node.js, but some namespaces are hidden when you connect the Firebase client library from Node.js, and Storage is one of them (see this).
I am glad they treat frontend and backend separately, precisely because the two have entirely different security scenarios and use cases. What they provide for Node.js is the Firebase Admin SDK, and in its official documentation I cannot find the ref() or child() methods, nor any other way to name the file I am uploading, nor any method for creating folders or descending into child directories. When I upload a file from my computer it gets saved in the bucket root with the same name it had on my computer. I can create folders manually from the Firebase console, but that does not fulfill my requirement; surely there must be a way to create folders programmatically.
I also tried using the Google Cloud Storage library, const {Storage} = require('@google-cloud/storage');,
but it turned out that the Firebase Admin SDK and the Google Cloud library share the same documentation and the same interface, at least for the file-upload part.
I have spent my day (and night too, since it is 4:46 am) trying different libraries and digging into their documentation, which I also found a little unorganised and lacking in code examples.
Any help would be appreciated. My code snippet so far is below; it is taken from their docs and uploads the file correctly:
import "firebase/firestore"
admin.initializeApp({
credential: admin.credential.cert("./../path-to-service account-cert.json"),
databaseURL: 'gs://bilal-assistant-xxxxx.appspot.com'
});
const quran_bucket = admin.storage().bucket("quran-bucket");
quran_bucket.upload("./my_computer_path/fatiha.mp3", {
gzip: true,
metadata: {
cacheControl: 'public, max-age=31536000',
}
}).then(uploadResponse => {
console.log(` uploaded complete.`);
}).catch((reason: any) => {
console.log("reason: ", reason);
})
All I want is to save the audio file in a folder inside the bucket, not in the bucket root.

According to the API documentation, upload() takes an UploadOptions object as the second parameter. You will want to use the documented destination property of that object to specify the path and name of the file in Storage:
quran_bucket.upload("./my_computer_path/fatiha.mp3", {
    destination: 'audio/juz30/fatiha.mp3',
    gzip: true,
    metadata: {
        cacheControl: 'public, max-age=31536000',
    }
})
You probably don't want to bother gzipping an mp3, as it's already compressed and won't compress much further.

Related

Disable Caching on Google Cloud Storage

I have been using GCS to store my images and the Node.js package to upload these images to my bucket. I have noticed that if I frequently change an image, one of the following happens:
It changes
It serves an old image
It doesn't change
This seems to happen pretty randomly despite setting all of the options properly and even cross-referencing that with GCS.
I upload my images like this:
const options = {
    destination,
    public: true,
    resumable: false,
    metadata: {
        cacheControl: 'no-cache, max-age=0',
    },
};

const file = await this.bucket.upload(tempImageLocation, options);
const { bucket, name, generation } = file[0].metadata;
const imageUrl = `https://storage.googleapis.com/${bucket}/${name}`;
I have debated whether to use the base URL you see there or use this one: https://storage.cloud.google.com.
I can't seem to figure out what I am doing wrong and how to always serve a fresh image. I have also tried ?ignoreCache=1 and other query parameters.
As the official API documentation (accessible here) shows, you should not need the await. This might be affecting your upload at times. If you want to use await, you need to declare your function as async, as shown in the second example from the documentation. Your code should look like this:
const bucketName = 'Name of a bucket, e.g. my-bucket';
const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

async function uploadFile() {
    // Uploads a local file to the bucket
    await storage.bucket(bucketName).upload(filename, {
        // Support for HTTP requests made with `Accept-Encoding: gzip`
        gzip: true,
        // By setting the option `destination`, you can change the name of the
        // object you are uploading to a bucket.
        metadata: {
            // Enable long-lived HTTP caching headers
            // Use only if the contents of the file will never change
            // (If the contents will change, use cacheControl: 'no-cache')
            cacheControl: 'public, max-age=31536000',
        },
    });

    console.log(`${filename} uploaded to ${bucketName}.`);
}

uploadFile().catch(console.error);
While this is untested, it should help you avoid the issue of the images not always being uploaded.
Besides that, as explained in the official documentation on Editing Metadata, you can change the way that metadata (which includes the cache control) is used and managed by your project. This way, you can change your cache configuration as well.
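For instance, here is a minimal sketch of updating the Cache-Control header on an object that is already in the bucket; the bucket and object names are placeholders:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Hypothetical bucket/object names, used for illustration only
async function disableCaching() {
    await storage
        .bucket('my-bucket')
        .file('images/photo.jpg')
        .setMetadata({
            // Tell browsers and proxies not to serve a stale copy
            cacheControl: 'no-cache, max-age=0',
        });
}

disableCaching().catch(console.error);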
I would also like to include the link below, a complete tutorial on how to send images to Cloud Storage with Node.js, in case you want to check a different approach.
Image Upload With Google Cloud Storage and Node.js
Let me know if the information helped you!
You can try changing ?ignoreCache=1 to ?ignoreCache=0.

Upload large files to Google Cloud Storage using Google App Engine

I would like to upload files up to 1GB to Google Cloud Storage. I'm using Google App Engine Flexible. From what I understand, GAE has a 32MB limit on file uploads, which means I have to either upload directly to GCS or break the file into chunks.
This answer from several years ago suggests using the Blobstore API, however there doesn't seem to be an option for Node.js and the documentation also recommends using GCS instead of Blobstore to store files.
After doing some searching, it seems like using signed URLs to upload directly to GCS may be the best option, but I'm having trouble finding any example code on how to do this. Is this the best way, and are there any examples of how to do this using App Engine with Node.js?
Your best bet would be to use the Cloud Storage client library for Node.js to create a resumable upload.
Here's the official code example on how to create the session URI:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file');

file.createResumableUpload(function(err, uri) {
    if (!err) {
        // `uri` can be used to PUT data to.
    }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.createResumableUpload().then(function(data) {
    const uri = data[0];
});
Edit: It seems you can nowadays use the createWriteStream method to perform an upload without having to worry about creating a session URI yourself.
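A rough sketch of such a streamed upload, assuming the @google-cloud/storage client; the bucket, object, and local file names below are placeholders:
const fs = require('fs');
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
// Hypothetical names, replace with your own bucket/object/local path
const file = storage.bucket('my-bucket').file('uploads/large-video.mp4');

fs.createReadStream('./local/large-video.mp4')
    .pipe(file.createWriteStream({
        resumable: true,                 // chunked, restartable upload for large files
        metadata: { contentType: 'video/mp4' },
    }))
    .on('error', err => console.error('Upload failed:', err))
    .on('finish', () => console.log('Upload complete.'));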

Upload File to Firebase Storage using Admin SDK

According to the Docs, I have to pass the filename to the function in order to upload a file.
// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, {
    // Support for HTTP requests made with `Accept-Encoding: gzip`
    gzip: true,
    metadata: {
        // Enable long-lived HTTP caching headers
        // Use only if the contents of the file will never change
        // (If the contents will change, use cacheControl: 'no-cache')
        cacheControl: 'public, max-age=31536000',
    },
});
I am using the Firebase Admin SDK (Node.js) in my server-side code, and clients send files as form-data, which I receive as file objects. How then do I upload these when the function only accepts a filename pointing to a file path?
I want to be able to do something like this
app.use((req: Request, res: Response) => {
    const file = req.file;
    // upload file to firebase storage using admin sdk
});
Since the Firebase Admin SDK just wraps the Cloud SDK, you can use the Cloud Storage node.js API documentation as a reference to see what it can do.
You don't have to provide a local file. You can also upload using node streams. There is a method File.createWriteStream() which gets you a WritableStream to work with. There is also File.save() which accepts multiple kinds of things, including a Buffer. There are examples of using each method here.
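As a rough sketch (not the exact code from those docs), assuming multer with memory storage, an Express app, and a placeholder bucket name and route, the handler could look something like this:
const express = require('express');
const multer = require('multer');
const admin = require('firebase-admin');

admin.initializeApp(); // uses application default credentials

const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // keep the file in memory as a Buffer

app.post('/api/upload', upload.single('file'), async (req, res, next) => {
    try {
        // Hypothetical bucket name; admin.storage().bucket() also works if a default bucket is configured
        const bucket = admin.storage().bucket('my-project.appspot.com');
        const destination = bucket.file(`uploads/${req.file.originalname}`);

        // File.save() accepts a Buffer directly, so no temp file is needed
        await destination.save(req.file.buffer, {
            contentType: req.file.mimetype,
            resumable: false, // simpler for small files
        });

        res.status(200).send({ path: destination.name });
    } catch (err) {
        next(err);
    }
});

app.listen(3000);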
What you should do is use the built-in function.
Let's say you receive the file as imageDoc on the client side:
const imageDoc = e.target.files[0]
In Node, you can now get a URL path to the object as:
const imageDocUrl = URL.createObjectURL(imageDoc)
So your final code will be:
// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(imageDocUrl, {
    // Support for HTTP requests made with "Accept-Encoding: gzip"
    gzip: true,
    metadata: {
        // Enable long-lived HTTP caching headers
        // Use only if the contents of the file will never change
        // (If the contents will change, use cacheControl: 'no-cache')
        cacheControl: 'public, max-age=31536000',
    },
});

uploading raw data to a firebase storage bucket in a Google Cloud function?

I have a Google Cloud Function and I'm trying to save some data into Firebase storage. I'm using the firebase-admin package to interact with Firebase.
I'm reading through the documentation (https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-nodejs) and it seems to have clear instructions on how to upload files if the file is on your local computer.
// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, {
    // Support for HTTP requests made with `Accept-Encoding: gzip`
    gzip: true,
    metadata: {
        // Enable long-lived HTTP caching headers
        // Use only if the contents of the file will never change
        // (If the contents will change, use cacheControl: 'no-cache')
        cacheControl: 'public, max-age=31536000',
    },
});
In my case though, I have a Google Cloud Function which will be fed some data in the POST body, and I want to save that data to the Firebase bucket.
How do I do this? The upload method only seems to accept a file path and doesn't have a data parameter.
Use the save method:
storage
    .bucket('gs://myapp.appspot.com')
    .file('dest/path/in/bucket')
    .save('This will get stored in my storage bucket.', {
        gzip: true,
        contentType: 'text/plain'
    })
The API docs for Bucket.upload() state that it's just a wrapper around File.createWriteStream(). This method will create a WritableStream that you can use to upload data that's already in memory. You deal with this stream just like you would any other stream in Node. There is sample code in the API docs.
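A minimal sketch of that stream approach, with placeholder bucket and object names and the POST body data represented by a payload argument:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Hypothetical bucket/object names; `payload` stands in for the POST body data
function savePayload(payload) {
    return new Promise((resolve, reject) => {
        const file = storage.bucket('myapp.appspot.com').file('dest/path/in/bucket');
        const stream = file.createWriteStream({
            resumable: false,                        // small in-memory payloads don't need resumable uploads
            metadata: { contentType: 'text/plain' },
        });

        stream.on('error', reject);
        stream.on('finish', resolve);

        // end() both writes the buffered data and closes the stream
        stream.end(Buffer.from(payload));
    });
}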

Google Cloud Storage creating content links with inconsistent behavior

I'm working on a project using Google Cloud Storage to allow users to upload media files into a predefined bucket using Node.js. I've been testing with small .jpg files. I also used gsutil to set bucket permissions to public.
At first, all files generated links that downloaded the file. Upon investigation of the docs, I learned that I could explicitly set the Content-Type of each file after upload using the gsutil CLI. When I used this procedure to set the file type to 'image/jpeg', the link behavior changed to display the image in the browser. But this only worked if the link had not been clicked before updating the metadata with gsutil. I thought that this might be due to browser caching, but the behavior was duplicated in an incognito browser.
Using gsutil to set the mime type would be impractical at any rate, so I modified the code in my node server POST function to set the metadata at upload time using an npm module called mime. Here is the code:
app.post('/api/assets', multer.single('qqfile'), function (req, res, next) {
    console.log(req.file);
    if (!req.file) {
        return ('400 - No file uploaded.');
    }

    // Create a new blob in the bucket and upload the file data.
    var blob = bucket.file(req.file.originalname);
    var blobStream = blob.createWriteStream();
    var metadata = {
        contentType: mime.lookup(req.file.originalname)
    };

    blobStream.on('error', function (err) {
        return next(err);
    });

    blobStream.on('finish', function () {
        blob.setMetadata(metadata, function(err, response){
            console.log(response);
            // The public URL can be used to directly access the file via HTTP.
            var publicUrl = format(
                'https://storage.googleapis.com/%s/%s',
                bucket.name, blob.name);
            res.status(200).send({
                'success': true,
                'publicUrl': publicUrl,
                'mediaLink': response.mediaLink
            });
        });
    });

    blobStream.end(req.file.buffer);
});
This seems to work, from the standpoint that it does actually set the Content-Type on upload, and that is correctly reflected in the response object as well as the Cloud Storage console. The issue is that some of the links returned as publicUrl cause a file download, and others cause a browser load of the image. Ideally I would like to have both options available, but I am unable to see any difference in the stored files or their metadata.
What am I missing here?
Google Cloud Storage makes no assumptions about the content-type of uploaded objects. If you don't specify, GCS will simply assign a type of "application/octet-stream".
The command-line tool gsutil, however, is smarter, and will attach the right Content-Type to files being uploaded in most cases, JPEGs included.
Now, there are two reasons why your browser is likely to download images rather than display them. First, if the Content-Type is set to "application/octet-stream", most browsers will download the results as a file rather than display them. This was likely happening in your case.
The second reason is if the server responds with a 'Content-Disposition: attachment' header. This doesn't generally happen when you fetch GCS objects from the host "storage.googleapis.com" as you are doing above, but it can if you, for instance, explicitly specified a contentDisposition for the object that you've uploaded.
For this reason I suspect that some of your objects don't have an "image/jpeg" content type. You could go through and set them all with gsutil like so: gsutil -m setmeta 'Content-Type:image/jpeg' gs://myBucketName/**
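If you would rather fix them from Node.js than from gsutil, a rough sketch along these lines should work; the bucket name and the .jpg-only filter are assumptions:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Hypothetical bucket name; adjust the filter to match how your objects are named
async function fixContentTypes() {
    const [files] = await storage.bucket('myBucketName').getFiles();

    for (const file of files) {
        if (!file.name.toLowerCase().endsWith('.jpg')) continue;
        // Overwrite whatever content type the object was uploaded with
        await file.setMetadata({ contentType: 'image/jpeg' });
        console.log(`Updated ${file.name}`);
    }
}

fixContentTypes().catch(console.error);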
