After uploading a file to Firebase Storage with Cloud Functions for Firebase, I'd like to get the download URL of the file.
I have this:
...
return bucket
  .upload(fromFilePath, {destination: toFilePath})
  .then((err, file) => {
    // Get the download URL of the file
  });
The file object has many properties, including one named mediaLink. However, if I try to access this link, I get this error:
Anonymous users does not have storage.objects.get access to object ...
Can somebody tell me how to get the public download URL?
Thank you
You'll need to generate a signed URL using getSignedUrl via the @google-cloud/storage npm module.
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
// ...
const bucket = gcs.bucket(bucketName);
const file = bucket.file(fileName);
return file.getSignedUrl({
  action: 'read',
  expires: '03-09-2491'
}).then(signedUrls => {
  // signedUrls[0] contains the file's public URL
});
You'll need to initialize @google-cloud/storage with your service account credentials, as the application default credentials will not be sufficient.
UPDATE: The Cloud Storage SDK can now be accessed via the Firebase Admin SDK, which acts as a wrapper around @google-cloud/storage. The only way it will work is if you either (see the sketch after these two options):
Initialize the SDK with a dedicated service account, typically through a second, non-default instance.
Or, without a service account, give the default App Engine service account the "signBlob" permission.
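For illustration, a minimal sketch of that Admin SDK route (the file path is a placeholder, and the function's runtime service account is assumed to have the signBlob / Token Creator permission):
const admin = require('firebase-admin');
admin.initializeApp(); // default credentials; signing still requires the signBlob permission

async function getFileUrl() {
  // admin.storage().bucket() returns a @google-cloud/storage Bucket for the default bucket
  const file = admin.storage().bucket().file('path/to/my/file');
  const [signedUrl] = await file.getSignedUrl({ action: 'read', expires: '03-09-2491' });
  return signedUrl;
}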
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLs:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from the Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
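For reference, the equivalent call with the modular Web SDK (v9+) looks roughly like this; the storage path here is the example path from above:
import { getStorage, ref, getDownloadURL } from 'firebase/storage';

const storage = getStorage();
// Build a reference to the file, then ask Firebase Storage for its token download URL
const url = await getDownloadURL(ref(storage, 'Audio/English/United_States-OED-0/about.mp3'));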
I haven't found a way to get the token download URL when I write a file to Storage from a Cloud Function (because I haven't found a way to tell the front end that a file has been written to Storage). What works for me is to write the file to a publicly available URL, save that public URL to Firestore, and then, when my Angular front end gets the download URL from Firestore, also run getDownloadURL(), which returns the token URL. It compares the URL stored in Firestore to the token download URL and, if they don't match, replaces the public URL in Firestore with the token download URL. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
// Assumes the modular Web SDK: getStorage, ref, getDownloadURL from 'firebase/storage'; doc, updateDoc from 'firebase/firestore'
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
  for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in the array
    let url = await getDownloadURL(ref(storage, audioFile)); // get the download URL with token
    if (audioFile !== url) { // download URLs don't match
      audioFiles.push(url);
    } // end inner if
  } // end for-of loop
  if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
    await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
      audioFiles: audioFiles
    });
  }
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
function oedPromise() {
  return new Promise(function(resolve, reject) {
    // oedAudioURL, file, options, and config are defined elsewhere in this Cloud Function
    http.get(oedAudioURL, function(response) {
      response.pipe(file.createWriteStream(options))
        .on('error', function(error) {
          console.error(error);
          reject(error);
        })
        .on('finish', function() {
          file.getSignedUrl(config, function(err, url) {
            if (err) {
              console.error(err);
              reject(err); // reject so the promise settles on error
            } else {
              resolve(url);
            }
          });
        });
    });
  });
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and long signature. The documentation for the command line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make a single file public, or a directory, or your entire Storage bucket. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
  var options = {
    destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
    predefinedAcl: 'publicRead',
    contentType: 'audio/' + audioType,
  };
  synthesizeParams.accept = 'audio/webm';
  var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
  textToSpeech.synthesize(synthesizeParams)
    .then(function(audio) {
      audio.pipe(file.createWriteStream(options));
    })
    .then(function() {
      console.log("webm audio file written.");
      resolve();
    })
    .catch(error => console.error(error));
});
The result will look like this in your Cloud Storage Browser:
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work; it's tricky to get the bucket and file paths right, but a sketch is shown below.
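For reference, a minimal, untested sketch of makePublic() with the Admin SDK; the file path is a placeholder reusing the example file from earlier:
const admin = require('firebase-admin');
admin.initializeApp();

async function publishFile() {
  const bucket = admin.storage().bucket();
  const file = bucket.file('Audio/English/United_States-OED-0/about.mp3');
  // makePublic() grants allUsers read access to this single object
  await file.makePublic();
  // The file is then reachable at the standard public path
  return 'https://storage.googleapis.com/' + bucket.name + '/' + file.name;
}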
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You generated a token with the uuid Node module, then told Storage which token you wanted to use. Four lines of code and you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
"https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
  var options = {
    destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
    contentType: 'audio/' + audioType,
    metadata: {
      metadata: {
        firebaseStorageDownloadTokens: uuid,
      }
    }
  };
  synthesizeParams.accept = 'audio/webm';
  var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
  textToSpeech.synthesize(synthesizeParams)
    .then(function(audio) {
      audio.pipe(file.createWriteStream(options));
    })
    .then(function() {
      resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
    })
    .catch(error => console.error(error));
});
That's not a typo: you have to nest firebaseStorageDownloadTokens in two layers of metadata!
Here's an example of how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('#google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
Then call it with:
upload(localPath, remotePath).then(downloadURL => {
  console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to @martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as you normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get the error "Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled." If you follow the link in the error message and enable the IAM API, you'll get another error: "Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account." Adding the Token Creator role fixes this second permission issue.
You should avoid hardcoding the URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with the Cloud Storage Node.js SDK 1.6.x or later:
const options = {
  destination: yourFileDestination,
  predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
  const file = result[0];
  return file.getMetadata();
}).then(results => {
  const metadata = results[0];
  console.log('metadata=', metadata.mediaLink);
}).catch(error => {
  console.error(error);
});
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
    + encodeURIComponent(object.name)
    + '?alt=media&token='
    + object.metadata.firebaseStorageDownloadTokens;
console.log('URL', img_url);
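For context, a sketch of where a snippet like this might live: inside a storage onFinalize trigger, where object is the event payload. This wrapper is an assumption on my part, and it only works if the uploaded object already carries a firebaseStorageDownloadTokens entry in its custom metadata:
const functions = require('firebase-functions');

exports.logDownloadUrl = functions.storage.object().onFinalize((object) => {
  // object.bucket, object.name and object.metadata come from the finalize event payload
  const img_url = 'https://firebasestorage.googleapis.com/v0/b/' + object.bucket + '/o/'
      + encodeURIComponent(object.name)
      + '?alt=media&token='
      + object.metadata.firebaseStorageDownloadTokens;
  console.log('URL', img_url);
  return null;
});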
This is what I currently use, it's simple and it works flawlessly.
You don't need to do anything with Google Cloud. It works out of the box with Firebase.
// Save the base64 to storage.
const file = admin.storage().bucket('url found on the storage part of firebase').file(`profile_photos/${uid}`);
await file.save(base64Image, {
  metadata: {
    contentType: 'image/jpeg',
  },
  predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
const file = bucket.file(toFilePath);
const metaData = await file.getMetadata();
const trimUrl = metaData[0].mediaLink;
Update:
No need to make two different calls to get the metadata; the upload method already returns it:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metadata.mediaLink;
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go: just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download url from the metadata like we do in the Javascript SDK. Generating a url that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value to a key named firebaseStorageDownloadTokens in the metadata of the file after it finishes uploading and then assemble the download URL myself following the structure Firebase uses to generate these URLs, eg:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how much "safe" is to use this method (given that Firebase could change how it generates the download URLs in the future ) but it is easy to implement.
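A minimal sketch of that approach, assuming a Cloud Function using the Admin SDK and the uuid package (the import style, function name, and path are assumptions, not part of the original answer):
const admin = require('firebase-admin');
const { v4: uuidv4 } = require('uuid');
admin.initializeApp();

async function addTokenAndBuildUrl(remotePath) {
  const bucket = admin.storage().bucket();
  const file = bucket.file(remotePath);
  const token = uuidv4();
  // Attach the token as custom metadata after the file has finished uploading
  await file.setMetadata({ metadata: { firebaseStorageDownloadTokens: token } });
  // Assemble the download URL in the format Firebase uses
  return 'https://firebasestorage.googleapis.com/v0/b/' + bucket.name +
      '/o/' + encodeURIComponent(remotePath) + '?alt=media&token=' + token;
}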
Sorry, but I can't post a comment on your question above because I don't have enough reputation, so I will include it in this answer.
Do as stated above and generate a signed URL, but instead of using the service-account.json I think you have to use the serviceAccountKey.json, which you can generate at (replace YOURPROJECTID accordingly):
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucketName);
// ...
return bucket.upload(tempLocalFile, {
  destination: filePath,
  metadata: {
    contentType: 'image/jpeg'
  }
})
.then((data) => {
  let file = data[0];
  file.getSignedUrl({
    action: 'read',
    expires: '03-17-2025'
  }, function(err, url) {
    if (err) {
      console.error(err);
      return;
    }
    // handle url
  });
});
I can't comment on the answer James Daniels gave, but I think this is very important to read.
Giving out a signed URL like he did seems pretty bad for many cases and possibly dangerous.
According to the Firebase documentation, the signed URL expires after some time, so adding it to your database will leave you with an empty (expired) URL after a certain timeframe.
It may be that I misunderstood the documentation there and the signed URL doesn't expire, which would have some security issues as a result: the key seems to be the same for every uploaded file. This means that once you have the URL of one file, someone could easily access files that they are not supposed to access, just by knowing the file names.
If I misunderstood that, I would love to be corrected. Otherwise someone should probably update the above-named solution.
If you use the predefined access control lists value of 'publicRead', you can upload the file and access it with a very simple url structure:
// Upload to GCS
const opts: UploadOptions = {
  gzip: true,
  destination: dest, // 'someFolder/image.jpg'
  predefinedAcl: 'publicRead',
  public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
  if (!file) {
    throw err;
  }
  const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here: https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the @google-cloud/storage package directly, not through the Firebase SDK.
I had the same issue; however, I was looking at the code of the Firebase function example instead of the README, and the answers in this thread didn't help either.
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, find the App Engine default service account and add the Service Account Token Creator role to that member. This will allow your app to create signed public URLs to the images.
source: Automatically Generate Thumbnails function README
Your role for app engine should look like this:
This works if you just need a public file with a simple URL. Note that this may override your Firebase Storage rules.
bucket.upload(file, function(err, file) {
  if (!err) {
    // Make the file public
    file.acl.add({
      entity: 'allUsers',
      role: gcs.acl.READER_ROLE
    }, function(err, aclObject) {
      if (!err) {
        var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
        console.log(URL);
      } else {
        console.log("Failed to set permissions: " + err);
      }
    });
  } else {
    console.log("Upload failed: " + err);
  }
});
Without getSignedUrl(), using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
var bucket = admin.storage().bucket();

// --- [Above] for admin-related operations, [Below] for making a public URL from a GCS uploaded object
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
  console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming the file is in a folder
  return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
    return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
      return console.log('publicUrl', writeResult);
    });
  });
});
The answer by https://stackoverflow.com/users/269447/laurent works best:
const uploadOptions: UploadOptions = {
  public: true
};
const bucket = admin.storage().bucket();
const [ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this in the Admin Storage docs:
const options = {
  version: 'v4',
  action: 'read',
  expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};

// Get a v4 signed URL for reading the file
const [url] = await storage
  .bucket(bucketName)
  .file(filename)
  .getSignedUrl(options);

console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using the Firebase SDK and admin.initializeApp:
1 - Generate a private key and place it in the /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is there because I'm using an index.js that imports other files and creates one function per file. If you're using a single index.js file with all functions, you should be OK with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0 I was able to access Storage directly with the Admin SDK like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the firebase url.
This is the best I came up with. It is redundant, but it's the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = await bucket.file(uploadPath)
const meta = await f.getMetadata()
console.log(meta[0].mediaLink)
I already posted my answer at the URL below, where you can get the full code with the solution:
How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const express = require('express');
const bodyParser = require('body-parser');
const functions = require('firebase-functions');
const os = require('os');
const path = require('path');
const cors = require('cors')({ origin: true });
const Busboy = require('busboy');
const fs = require('fs');
var admin = require("firebase-admin");

var serviceAccount = {
  "type": "service_account",
  "project_id": "xxxxxx",
  "private_key_id": "xxxxxx",
  "private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
  "client_email": "xxxx@xxxx.iam.gserviceaccount.com",
  "client_id": "xxxxxxxx",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
};

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "xxxxx-xxxx" // use your storage bucket name
});

const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.post('/uploadFile', (req, response) => {
  response.set('Access-Control-Allow-Origin', '*');
  const busboy = new Busboy({ headers: req.headers });
  let uploadData = null;
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    const filepath = path.join(os.tmpdir(), filename);
    uploadData = { file: filepath, type: mimetype };
    console.log("-------------->>", filepath);
    file.pipe(fs.createWriteStream(filepath));
  });
  busboy.on('finish', () => {
    const bucket = admin.storage().bucket();
    bucket.upload(uploadData.file, {
      uploadType: 'media',
      metadata: {
        metadata: {
          firebaseStorageDownloadTokens: uuid,
          contentType: uploadData.type,
        },
      },
    })
    .catch(err => {
      response.status(500).json({
        error: err,
      });
    });
  });
  busboy.end(req.rawBody);
});

exports.widgets = functions.https.onRequest(app);
For those trying to use the token parameter to share a file and who would like to use the gsutil command, here is how I did it:
First you need to authenticate by running gcloud auth login.
Then run:
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$FILE_TOKEN" gs://$FIREBASE_REPO/$FILE_NAME
Then you can download the file with the following link:
https://firebasestorage.googleapis.com/v0/b/$FIREBASE_REPO/o/$FILE_NAME?alt=media&token=$FILE_TOKEN
From the Admin SDKs you cannot retrieve the download token that Firebase generates for an uploaded file, but you can set that token yourself when uploading by adding it to the metadata.
For those who are working with the Python SDK, this is the way to do it:
from firebase_admin import storage
from uuid import uuid4

bucket = storage.bucket()
blob = bucket.blob(path_to_file)
token = str(uuid4())  # Random ID
blob.metadata = {
    "firebaseStorageDownloadTokens": token
}
blob.upload_from_file(file)
You have now uploaded the file and have the URL token. You could save the token (or even the full download URL) in your database (e.g. Firestore), send it to the client when the file is requested, and let the client itself retrieve the file.
The full download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/{bucket_name}/o/{file_name}?alt=media&token={token}
If you are getting this error:
Google Cloud Functions: require(…) is not a function
try this:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({keyFilename: 'service-account-key.json'});
const bucket = storage.bucket(object.bucket);
const file = bucket.file(filePath);
.....
A related question, closed as a duplicate of "Get Download URL from file uploaded with Cloud Functions for Firebase":
I uploaded an image using the Firebase SDK in Flutter. On the Flutter side I can usually just call getDownloadUrl() on the reference. Now I want to get that URL in my Cloud Function trigger, so that when an image is uploaded to Firebase Storage I can get the download URL and post it to Firestore.
I have tried getting the metadata mediaLink, but it is not the same URL that I can find manually by browsing the Firebase Storage console.
exports.generateThumbnail = functions.storage.object().onFinalize(async (object) => {
  const db = admin.firestore();
  const filePath = object.name;
  const contentType = object.contentType;
  const fileDir = path.dirname(filePath);
  const fileName = path.basename(filePath);
  const jobId = fileName.replace("poster_", "").replace(".png", "");
  const bucket = admin.storage().bucket(object.bucket);
  const file = bucket.file(filePath);
  const posterMetadata = await file.getMetadata();
  const posterFileUrl = posterMetadata[0].mediaLink;
  return functions.logger.log('url: ' + posterFileUrl);
});
Using that URL I got an "Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object." error.
How do I get the generated link that we can find in the Firebase Storage console?
I understand that you want, in a Cloud Function triggered when a new file is uploaded to Cloud Storage, to get the signed URL of this file.
The following will do the trick:
exports.generateFileURL = functions.storage.object().onFinalize(async object => {
  try {
    const bucket = admin.storage().bucket(object.bucket);
    const file = bucket.file(object.name);
    const signedURLconfig = { action: 'read', expires: '01-01-2030' };
    const signedURLArray = await file.getSignedUrl(signedURLconfig);
    const url = signedURLArray[0];
    // Do whatever you want with the signed URL
    // e.g. save it to Firestore
    await admin.firestore().collection('signedURLs').add({ fileName: object.name, signedURL: url });
    return null;
  } catch (error) {
    console.log(error);
    return null;
  }
});
We use the getSignedUrl() method from the Cloud Storage Node.js Client API. You'll find in the documentation more details on the possible properties and values of the configuration object passed to the method (i.e. signedURLconfig in the above example).
const { Storage } = require("#google-cloud/storage");
const storage = new Storage({
keyFilename: "./xxx-path-to-key(locally).json",
projectId: "my project ID",
});
exports.get_data = functions.https.onRequest(async (req, res) => {
const pdf = async () => {
const doc = new PDFDocument();
doc.text("Hello, World!");
doc.end();
return await getStream.buffer(doc);
};
const pdfBuffer = await pdf();
const pdfBase64string = pdfBuffer.toString("base64"); //getting string which I want to send to storage like pdf file
const bucketName = "gs://path-to-storage.com/"
let filename = "test_file.pdf" //some locally pdf file (but can't save locally while using cloud functions)
const uploadFile = async () => {
// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, { //this one fine works for locally uploading
metadata: {
cacheControl: "public, max-age=31536000",
contentType: "application/pdf",
},
});
console.log(`${filename} uploaded to ${bucketName}.`);
};
uploadFile();
})
I need to create a PDF file using Cloud Functions and upload it to Firebase Storage. I've created pdfBase64string and just need to save that string to Storage as a PDF file, but I can't find information on how to do it. I tried many different ways but got stuck.
Notice that you are using the Cloud Storage library and not the Firebase Storage library. You can upload with:
// Create a root reference
var storageRef = firebase.storage().ref();
// Create a reference to 'mountains.pdf'
var ref = storageRef.child('mountains.pdf');
// Base64 formatted string
var message = '5b6p5Y+344GX44G+44GX44Gf77yB44GK44KB44Gn44Go44GG77yB';
ref.putString(message, 'base64').then(function(snapshot) {
console.log('Uploaded a base64 string!');
});
from the documentation
Now, to be able to access Firebase Storage from Cloud Functions, you need to add the Firebase Editor role to the Cloud Functions service account.
Alternatively you can use a Firebase Function to upload to Firebase Storage.
The Cloud Storage library is not the same as the Firebase Storage library.
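For completeness, here is a minimal sketch (not from the original answers) of writing the base64 PDF from the question above to Storage directly inside a Cloud Function with the Admin SDK. file.save() is used the same way as in a later answer in this thread; the function name, object path, and request field are placeholders:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.savePdf = functions.https.onRequest(async (req, res) => {
  // Assume the base64 PDF string is generated as in the question above
  // (or arrives in the request body).
  const pdfBase64string = req.body.pdf;

  const file = admin.storage().bucket().file('pdfs/test_file.pdf'); // placeholder path
  await file.save(Buffer.from(pdfBase64string, 'base64'), {
    metadata: { contentType: 'application/pdf' },
  });

  res.status(200).send('PDF saved to Storage');
});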
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLS:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
I can't find a way to get the token download URL when I write a file to Storage from a Cloud Function (because I can't find a way to tell the front end that a file has written to Storage), but what works for me is to write a file to a publicly available URL, write the publicly available URL to Firebase, then when my Angular front end gets the download URL from Firebase it also runs getDownloadURL(), which has the token, then compares the download URL in Firestore to the token download URL, and if they don't match then it updates the token download URL in place of the publicly available URL in Firestore. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in array
let url = await getDownloadURL(ref(storage, audioFile)); // get the download url with token
if (audioFile !== url) { // download URLs don't match
audioFiles.push(url);
} // end inner if
}; // end for of loop
if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
audioFiles: audioFiles
});
}
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
// http, oedAudioURL, file, options, and config are defined elsewhere in the
// author's code; config is the getSignedUrl options object, e.g.
// { action: 'read', expires: '03-17-2025' }.
function oedPromise() {
  return new Promise(function(resolve, reject) {
    http.get(oedAudioURL, function(response) {
      response.pipe(file.createWriteStream(options))
        .on('error', function(error) {
          console.error(error);
          reject(error);
        })
        .on('finish', function() {
          file.getSignedUrl(config, function(err, url) {
            if (err) {
              console.error(err);
              reject(err);
            } else {
              resolve(url);
            }
          });
        });
    });
  });
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and long signature. The documentation for the command line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
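To avoid that trap, here is a minimal sketch with an expiration inside the documented seven-day window; the bucket and path are placeholders, and the v4 option mirrors the signed-URL example later in this thread:
async function getTemporaryUrl() {
  const file = admin.storage().bucket().file('Audio/English/some-file.mp3'); // placeholder path
  const [signedUrl] = await file.getSignedUrl({
    version: 'v4',
    action: 'read',
    expires: Date.now() + 7 * 24 * 60 * 60 * 1000, // the documented maximum for v4 URLs
  });
  return signedUrl;
}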
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make one file public or a directory or your entire Storage database. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
  var options = {
    destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
    predefinedAcl: 'publicRead',
    contentType: 'audio/' + audioType,
  };
  synthesizeParams.accept = 'audio/webm';
  var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
  textToSpeech.synthesize(synthesizeParams)
    .then(function(audio) {
      // Resolve only once the write stream has actually finished.
      audio.pipe(file.createWriteStream(options))
        .on('finish', function() {
          console.log("webm audio file written.");
          resolve();
        })
        .on('error', reject);
    })
    .catch(error => {
      console.error(error);
      reject(error);
    });
});
The file will then show as publicly accessible in your Cloud Storage Browser.
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work; it's tricky to get the bucket and file paths right.
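For reference, a minimal sketch of makePublic() (not the answer author's code); it assumes the default bucket from the Admin SDK and reuses the file path from the example URL above:
const bucket = admin.storage().bucket(); // default bucket, e.g. my-project.appspot.com
const file = bucket.file('Audio/English/United_States-OED-0/system.mp3');
file.makePublic().then(() => {
  // The file is now readable at:
  // https://storage.googleapis.com/<bucket-name>/Audio/English/United_States-OED-0/system.mp3
});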
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
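A minimal sketch of those two ACL options; the paths and email are illustrative, and 'READER' is the same role the acl.add answer below uses via gcs.acl.READER_ROLE:
async function shareWithAcl(bucket, localPath) {
  // Anyone logged in with a Google account can read the object:
  await bucket.upload(localPath, {
    destination: 'Audio/English/United_States-OED-0/system.mp3',
    predefinedAcl: 'authenticatedRead',
  });

  // Or grant read access to one specific Google account instead:
  await bucket
    .file('Audio/English/United_States-OED-0/system.mp3')
    .acl.add({ entity: 'user-someone@example.com', role: 'READER' });
}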
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You generated a token with the uuid Node module and then told Storage which token you wanted it to use. Four lines of code and you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
  var options = {
    destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
    contentType: 'audio/' + audioType,
    metadata: {
      metadata: {
        firebaseStorageDownloadTokens: uuid,
      }
    }
  };
  synthesizeParams.accept = 'audio/webm';
  var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
  textToSpeech.synthesize(synthesizeParams)
    .then(function(audio) {
      // Resolve with the token download URL once the write stream has finished.
      audio.pipe(file.createWriteStream(options))
        .on('finish', function() {
          resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
        })
        .on('error', reject);
    })
    .catch(error => {
      console.error(error);
      reject(error);
    });
});
That's not a typo: you have to nest firebaseStorageDownloadTokens inside a double layer of metadata!
Here's an example on how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('#google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
then call with
upload(localPath, remotePath).then( downloadURL => {
console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to @martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as you normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get the error Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled. If you follow the link in the error message and enable the IAM API, you'll get another error: Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account. Adding the Token Creator role fixes this second permission issue.
You should avoid hardcoding the URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with Cloud Storage NodeJS 1.6.x or later:
const options = {
destination: yourFileDestination,
predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
const file = result[0];
return file.getMetadata();
}).then(results => {
const metadata = results[0];
console.log('metadata=', metadata.mediaLink);
}).catch(error => {
console.error(error);
});
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
+ encodeURIComponent(object.name)
+ '?alt=media&token='
+ object.metadata.firebaseStorageDownloadTokens;
console.log('URL',img_url);
This is what I currently use, it's simple and it works flawlessly.
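For context, a minimal sketch of where this stitching typically runs: a storage onFinalize trigger, as in the makePublic() answer further down. The token is only present if it was set on upload (see the uuid examples above), and the Firestore collection name is a placeholder:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.saveDownloadUrl = functions.storage.object().onFinalize(async (object) => {
  const token = object.metadata && object.metadata.firebaseStorageDownloadTokens;
  if (!token) return null; // no token was set on upload

  const img_url = 'https://firebasestorage.googleapis.com/v0/b/' + object.bucket + '/o/'
    + encodeURIComponent(object.name)
    + '?alt=media&token=' + token;

  // Placeholder Firestore location; adjust to your own schema.
  return admin.firestore().collection('downloadUrls').doc().set({ url: img_url });
});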
You don't need to do anything with Google Cloud. It works out of the box with Firebase.
// Save the base64 to storage.
const file = admin.storage().bucket('url found on the storage part of firebase').file(`profile_photos/${uid}`);
await file.save(base64Image, {
metadata: {
contentType: 'image/jpeg',
},
predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
file = bucket.file(toFilePath);
metaData = await file.getMetadata()
const trimUrl = metaData[0].mediaLink
Update: there is no need to make two different calls in the upload method to get the metadata:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metadata.mediaLink
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go: just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download url from the metadata like we do in the Javascript SDK. Generating a url that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value to a key named firebaseStorageDownloadTokens in the metadata of the file after it finishes uploading and then assemble the download URL myself following the structure Firebase uses to generate these URLs, eg:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how much "safe" is to use this method (given that Firebase could change how it generates the download URLs in the future ) but it is easy to implement.
Sorry, but I can't post a comment on your question above because of missing reputation, so I will include it in this answer.
Do as stated above by generating a signed URL, but instead of the service-account.json I think you have to use the serviceAccountKey.json, which you can generate at the following location (replace YOURPROJECTID accordingly):
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucket);
// ...
return bucket.upload(tempLocalFile, {
  destination: filePath,
  metadata: {
    contentType: 'image/jpeg'
  }
})
.then((data) => {
  let file = data[0];
  file.getSignedUrl({
    action: 'read',
    expires: '03-17-2025'
  }, function(err, url) {
    if (err) {
      console.error(err);
      return;
    }
    // handle url
  });
});
I can't comment on the answer James Daniels gave, but I think this is very important to read. Handing out a signed URL like he did seems bad for many cases and possibly dangerous. According to the Firebase documentation, the signed URL expires after some time, so adding it to your database will leave you with an empty URL after a certain timeframe. It may be that I misunderstood the documentation and the signed URL doesn't expire, which would have security issues of its own. The key seems to be the same for every uploaded file, which means that once someone has the URL of one file, they could easily access files they are not supposed to access just by knowing their names. If I misunderstood that, I would love to be corrected; otherwise someone should probably update the solution named above.
If you use the predefined access control list value 'publicRead', you can upload the file and access it with a very simple URL structure:
// Upload to GCS
const opts: UploadOptions = {
gzip: true,
destination: dest, // 'someFolder/image.jpg'
predefinedAcl: 'publicRead',
public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
if(!file) {
throw err;
}
const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here: https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the @google-cloud/storage package directly, not through the Firebase SDK.
I had the same issue; however, I was looking at the code of the Firebase function example instead of the README, and the answers on this thread didn't help either...
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, find the App Engine default service account and add the Service Account Token Creator role to that member. This will allow your app to create signed public URLs to the images.
source: Automatically Generate Thumbnails function README
After this change, the App Engine default service account will list the Service Account Token Creator role in IAM.
This works if you just need a public file with a simple URL. Note that this may override your Firebase Storage rules.
bucket.upload(file, function(err, file) {
if (!err) {
//Make the file public
file.acl.add({
entity: 'allUsers',
role: gcs.acl.READER_ROLE
}, function(err, aclObject) {
if (!err) {
var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
console.log(URL);
} else {
console.log("Failed to set permissions: " + err);
}
});
} else {
console.log("Upload failed: " + err);
}
});
Without signedURL() using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
var bucket = admin.storage().bucket();
// --- [Above] for admin related operations, [Below] for making a public url from a GCS uploaded object
const { Storage } = require('#google-cloud/storage');
const storage = new Storage();
exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming file is in folder
return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
return console.log('publicUrl', writeResult);
});
});
});
answer by https://stackoverflow.com/users/269447/laurent works best
const uploadOptions: UploadOptions = {
public: true
};
const bucket = admin.storage().bucket();
const [ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this on the admin storage doc
const options = {
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};
// Get a v4 signed URL for reading the file
const [url] = await storage
.bucket(bucketName)
.file(filename)
.getSignedUrl(options);
console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using the Firebase Admin SDK and admin.initializeApp:
1 - Generate a private key and place it in the /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is because I'm using a index.js that imports other files and creates one function to each file. If you're using a single index.js file with all functions, you should be ok with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0 I was able to access the storage directly with the admin like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the firebase url.
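Putting that together, a minimal sketch of the whole flow; the uuid usage and URL format mirror the earlier answers, and the paths are placeholders:
const admin = require('firebase-admin');
const { v4: uuidv4 } = require('uuid');

admin.initializeApp(); // no explicit service account needed here

async function uploadWithToken(localPath, destination) {
  const bucket = admin.storage().bucket();
  const token = uuidv4();

  await bucket.upload(localPath, {
    destination,
    metadata: {
      // double metadata nesting, as described in the answers above
      metadata: { firebaseStorageDownloadTokens: token },
    },
  });

  return 'https://firebasestorage.googleapis.com/v0/b/' + bucket.name
    + '/o/' + encodeURIComponent(destination)
    + '?alt=media&token=' + token;
}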
This is the best I came up with. It is redundant, but it was the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = await bucket.file(uploadPath)
const meta = await f.getMetadata()
console.log(meta[0].mediaLink)