After uploading a file to Firebase Storage with Functions for Firebase, I'd like to get the download URL of the file.
I have this:
...
return bucket
.upload(fromFilePath, {destination: toFilePath})
.then(([file]) => {
// Get the download url of file
});
The file object has a lot of properties, including one named mediaLink. However, if I try to access this link, I get this error:
Anonymous users does not have storage.objects.get access to object ...
Can somebody tell me how to get the public download URL?
Thank you
You'll need to generate a signed URL using getSignedUrl() via the @google-cloud/storage NPM module.
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
// ...
const bucket = gcs.bucket(bucketName); // your bucket's name, e.g. '<project-id>.appspot.com'
const file = bucket.file(fileName);
return file.getSignedUrl({
action: 'read',
expires: '03-09-2491'
}).then(signedUrls => {
// signedUrls[0] contains the file's public URL
});
You'll need to initialize @google-cloud/storage with your service account credentials, as the application default credentials will not be sufficient.
UPDATE: The Cloud Storage SDK can now be accessed via the Firebase Admin SDK, which acts as a wrapper around @google-cloud/storage. The only way it will work is if you either:
Init the SDK with a special service account, typically through a second, non-default instance.
Or, without a service account, by giving the default App Engine service account the "signBlob" permission.
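For the first of those options, here is a minimal sketch (the key file path and bucket name are placeholders, assuming you've exported a service account key from your Firebase project settings):
const admin = require('firebase-admin');
const serviceAccount = require('./service-account.json'); // placeholder path to your exported key

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: '<your-project-id>.appspot.com' // placeholder bucket name
});

// The Admin SDK wraps @google-cloud/storage, so getSignedUrl() is available on file objects.
function getReadUrl(filePath) {
  return admin.storage().bucket().file(filePath)
    .getSignedUrl({ action: 'read', expires: '03-09-2491' })
    .then(urls => urls[0]); // the signed URL is the first element
}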
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLs:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from the Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
I can't find a way to get the token download URL when I write a file to Storage from a Cloud Function (because I can't find a way to tell the front end that a file has been written to Storage). What works for me is to write the file to a publicly available URL and write that public URL to Firebase. Then, when my Angular front end gets the download URL from Firebase, it also runs getDownloadURL(), which returns the token URL, and compares the URL stored in Firestore to the token download URL; if they don't match, it replaces the publicly available URL in Firestore with the token download URL. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in array
let url = await getDownloadURL(ref(storage, audioFile)); // get the download url with token
if (audioFile !== url) { // download URLs don't match
audioFiles.push(url);
} // end inner if
}; // end for of loop
if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
audioFiles: audioFiles
});
}
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
function oedPromise() {
return new Promise(function(resolve, reject) {
http.get(oedAudioURL, function(response) {
response.pipe(file.createWriteStream(options))
.on('error', function(error) {
console.error(error);
reject(error);
})
.on('finish', function() {
file.getSignedUrl(config, function(err, url) {
if (err) {
console.error(err);
return;
} else {
resolve(url);
}
});
});
});
});
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and a long signature. The documentation for the command-line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
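If you do use signed URLs anyway, here is a small sketch of a safer pattern (assuming a file object from @google-cloud/storage): compute the expiry relative to now and stay inside the documented seven-day ceiling for v4 signed URLs.
const SEVEN_DAYS_IN_MS = 7 * 24 * 60 * 60 * 1000;

function getTemporaryReadUrl(file) {
  return file.getSignedUrl({
    version: 'v4',
    action: 'read',
    expires: Date.now() + SEVEN_DAYS_IN_MS // the longest lifetime a v4 signed URL allows
  }).then(urls => urls[0]);
}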
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make one file public, or a directory, or your entire Storage bucket. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
predefinedAcl: 'publicRead',
contentType: 'audio/' + audioType,
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
// resolve only after the write stream has actually finished
audio.pipe(file.createWriteStream(options))
.on('finish', function() {
console.log("webm audio file written.");
resolve();
})
.on('error', function(error) {
console.error(error);
reject(error);
});
})
.catch(error => console.error(error));
});
In your Cloud Storage Browser, the file will then show as publicly shared.
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work; it's tricky to get the bucket and file paths right.
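For reference, here is a minimal sketch of makePublic() (the bucket name and file path are placeholders; the file path is the object's full path inside the bucket, with no leading slash):
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function makeFilePublic(bucketName, filePath) {
  await storage.bucket(bucketName).file(filePath).makePublic();
  // once public, the file is reachable at the standard path
  return 'https://storage.googleapis.com/' + bucketName + '/' + filePath;
}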
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
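As a sketch of that ACL option (the upload parameters are placeholders), authenticatedRead can be set the same way as publicRead; note that it grants read access to anyone signed in with any Google account, not just your app's Firebase Auth users:
await bucket.upload(localPath, {
  destination: remotePath,
  predefinedAcl: 'authenticatedRead' // readable by any authenticated Google account
});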
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You told Storage the token you wanted to use. You then generated a token with the uuid Node module. Four lines of code and you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
contentType: 'audio/' + audioType,
metadata: {
metadata: {
firebaseStorageDownloadTokens: uuid,
}
}
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
// resolve with the assembled download URL only after the write stream has finished
audio.pipe(file.createWriteStream(options))
.on('finish', function() {
resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
})
.on('error', function(error) {
console.error(error);
reject(error);
});
})
.catch(error => console.error(error));
});
That's not a typo: you have to nest firebaseStorageDownloadTokens inside two layers of metadata!
Here's an example of how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('#google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
Then call it with:
upload(localPath, remotePath).then( downloadURL => {
console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to @martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as you normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get the error Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled. If you follow the link in the error message and enable the IAM API, you'll get another error: Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account. Adding the Token Creator role fixes this second permission issue.
You should avoid hardcoding the URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with the Cloud Storage Node.js client 1.6.x or later:
const options = {
destination: yourFileDestination,
predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
const file = result[0];
return file.getMetadata();
}).then(results => {
const metadata = results[0];
console.log('metadata=', metadata.mediaLink);
}).catch(error => {
console.error(error);
});
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
+ encodeURIComponent(object.name)
+ '?alt=media&token='
+ object.metadata.firebaseStorageDownloadTokens;
console.log('URL',img_url);
This is what I currently use; it's simple and it works flawlessly.
You don't need to do anything with Google Cloud. It works out of the box with Firebase.
// Save the base64 to storage.
const file = admin.storage().bucket('<your bucket name, shown in the Storage section of the Firebase console>').file(`profile_photos/${uid}`);
await file.save(base64Image, {
metadata: {
contentType: 'image/jpeg',
},
predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
const file = bucket.file(toFilePath);
const metaData = await file.getMetadata()
const trimUrl = metaData[0].mediaLink
Update:
There's no need to make two separate calls in the upload method to get the metadata:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metadata.mediaLink
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go: just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download URL from the metadata like we do in the JavaScript SDK. Generating a URL that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value on a key named firebaseStorageDownloadTokens in the file's metadata after it finishes uploading, and then assemble the download URL myself following the structure Firebase uses to generate these URLs, e.g.:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how safe it is to use this method (given that Firebase could change how it generates download URLs in the future), but it is easy to implement.
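A sketch of that approach with the Admin SDK bucket and the uuid package (the function and variable names here are just illustrative):
const { v4: uuidv4 } = require('uuid');

async function attachDownloadToken(bucket, filePath) {
  const token = uuidv4();
  const file = bucket.file(filePath);
  // store the token as custom metadata on the object
  await file.setMetadata({
    metadata: { firebaseStorageDownloadTokens: token }
  });
  // assemble the download URL in the format Firebase currently uses
  return 'https://firebasestorage.googleapis.com/v0/b/' + bucket.name +
    '/o/' + encodeURIComponent(filePath) + '?alt=media&token=' + token;
}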
Sorry, but I can't post a comment on your question above because I don't have enough reputation, so I will include it in this answer.
Do as stated above by generating a signed URL, but instead of using the service-account.json I think you have to use the serviceAccountKey.json, which you can generate at the following address (replace YOURPROJECTID accordingly):
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucketName); // your bucket's name
// ...
return bucket.upload(tempLocalFile, {
destination: filePath,
metadata: {
contentType: 'image/jpeg'
}
})
.then((data) => {
let file = data[0]
file.getSignedUrl({
action: 'read',
expires: '03-17-2025'
}, function(err, url) {
if (err) {
console.error(err);
return;
}
// handle url
})
});
I can't comment on the answer James Daniels gave, but I think this is very important to read.
Giving out a signed URL like he did seems pretty bad for many cases and possibly dangerous.
According to the Firebase documentation, the signed URL expires after some time, so adding it to your database will lead to an empty URL after a certain timeframe.
It may be that I misunderstood the documentation there and the signed URL doesn't expire, which would have some security issues as a result.
The key seems to be the same for every uploaded file. This means that once someone has the URL of one file, they could easily access files that they are not supposed to access, just by knowing the file names.
If I misunderstood that, I would love to be corrected.
Otherwise, someone should probably update the solution named above. I may be wrong there.
If you use the predefined access control list value 'publicRead', you can upload the file and access it with a very simple URL structure:
// Upload to GCS
const opts: UploadOptions = {
gzip: true,
destination: dest, // 'someFolder/image.jpg'
predefinedAcl: 'publicRead',
public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
if(!file) {
throw err;
}
const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here: https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the @google-cloud/storage package directly, not through the Firebase SDK.
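If you want the bucket-level version of that, here is a hedged sketch (same @google-cloud/storage client; 'bucket-name' is a placeholder):
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function makeBucketPublic() {
  // make the bucket and its existing files publicly readable,
  // so the publicUrl() links above resolve without auth
  await storage.bucket('bucket-name').makePublic({ includeFiles: true });
}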
I had the same issue; however, I was looking at the code of the Firebase function example instead of the README, and the answers on this thread didn't help either.
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, find the App Engine default service account and add the Service Account Token Creator role to that member. This will allow your app to create signed public URLs to the images.
source: Automatically Generate Thumbnails function README
Your App Engine service account's roles should then include Service Account Token Creator.
This works if you just need a public file with a simple URL. Note that this may bypass your Firebase Storage rules.
bucket.upload(file, function(err, file) {
if (!err) {
//Make the file public
file.acl.add({
entity: 'allUsers',
role: gcs.acl.READER_ROLE
}, function(err, aclObject) {
if (!err) {
var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
console.log(URL);
} else {
console.log("Failed to set permissions: " + err);
}
});
} else {
console.log("Upload failed: " + err);
}
});
Without getSignedUrl(), using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
var bucket = admin.storage().bucket();
// --- [Above] for admin related operations, [Below] for making a public url from a GCS uploaded object
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming file is in folder
return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
return console.log('publicUrl', writeResult);
});
});
});
The answer by https://stackoverflow.com/users/269447/laurent works best.
const uploadOptions: UploadOptions = {
public: true
};
const bucket = admin.storage().bucket();
const [ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this in the Admin Storage docs:
const options = {
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};
// Get a v4 signed URL for reading the file
const [url] = await storage
.bucket(bucketName)
.file(filename)
.getSignedUrl(options);
console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using the Firebase SDK and admin.initializeApp:
1 - Generate a private key and place it in the /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is because I'm using an index.js that imports other files and creates one function for each file. If you're using a single index.js file with all functions, you should be OK with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0, I was able to access the storage directly with the Admin SDK like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the firebase url.
This is the best I came up with. It is redundant, but it's the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = bucket.file(uploadPath) // file() is synchronous, so no await is needed
const meta = await f.getMetadata()
console.log(meta[0].mediaLink)
I already posted my answer at the URL below, where you can get the full code with the solution:
How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os')
const path = require('path')
const cors = require('cors')({ origin: true })
const Busboy = require('busboy')
const fs = require('fs')
const express = require('express') // used below for the app
const bodyParser = require('body-parser') // used below for app.use(...)
const functions = require('firebase-functions') // used below for functions.https.onRequest
var admin = require("firebase-admin");
var serviceAccount = {
"type": "service_account",
"project_id": "xxxxxx",
"private_key_id": "xxxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
"client_email": "xxxx#xxxx.iam.gserviceaccount.com",
"client_id": "xxxxxxxx",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
}
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "xxxxx-xxxx" // use your storage bucket name
});
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.post('/uploadFile', (req, response) => {
response.set('Access-Control-Allow-Origin', '*');
const busboy = new Busboy({ headers: req.headers })
let uploadData = null
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const filepath = path.join(os.tmpdir(), filename)
uploadData = { file: filepath, type: mimetype }
console.log("-------------->>",filepath)
file.pipe(fs.createWriteStream(filepath))
})
busboy.on('finish', () => {
const bucket = admin.storage().bucket();
bucket.upload(uploadData.file, {
uploadType: 'media',
metadata: {
contentType: uploadData.type,
metadata: { firebaseStorageDownloadTokens: uuid },
},
})
.then(() => {
// reply once the upload has finished so the request doesn't hang
response.status(200).json({ message: 'File uploaded.' })
})
.catch(err => {
response.status(500).json({
error: err,
})
})
})
busboy.end(req.rawBody)
});
exports.widgets = functions.https.onRequest(app);
For those trying to use the token parameter to share the file and who would like to use the gsutil command, here is how I did it:
First you need to authenticate by running: gcloud auth login
Then run:
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$FILE_TOKEN" gs://$FIREBASE_REPO/$FILE_NAME
Then you can download the file with the following link:
https://firebasestorage.googleapis.com/v0/b/$FIREBASE_REPO/o/$FILE_NAME?alt=media&token=$FILE_TOKEN
From the Admin SDKs, you cannot retrieve the download token generated by Firebase of an uploaded file, but you can set that token when uploading by adding it in the metadata.
For those who are working with the Python SDK, this is the way to do it:
from firebase_admin import storage
from uuid import uuid4
bucket = storage.bucket()
blob = bucket.blob(path_to_file)
token = str(uuid4()) # Random ID
blob.metadata = {
"firebaseStorageDownloadTokens": token
}
blob.upload_from_file(file)
You have now uploaded the file and got the URL token. You could now save the token (or even the full download URL) into your database (e.g. Firestore), send it to the client when the file is requested, and then have the client itself retrieve the file.
The full download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/{bucket_name}/o/{file_name}?alt=media&token={token}
If you are getting this error:
Google Cloud Functions: require(…) is not a function
try this:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({keyFilename: 'service-account-key.json'});
const bucket = storage.bucket(object.bucket);
const file = bucket.file(filePath);
.....
Related
After uploading a file in Firebase Storage with Functions for Firebase, I'd like to get the download url of the file.
I have this :
...
return bucket
.upload(fromFilePath, {destination: toFilePath})
.then((err, file) => {
// Get the download url of file
});
The object file has a lot of parameters. Even one named mediaLink. However, if I try to access this link, I get this error :
Anonymous users does not have storage.objects.get access to object ...
Can somebody tell me how to get the public download Url?
Thank you
You'll need to generate a signed URL using getSignedURL via the #google-cloud/storage NPM module.
Example:
const gcs = require('#google-cloud/storage')({keyFilename: 'service-account.json'});
// ...
const bucket = gcs.bucket(bucket);
const file = bucket.file(fileName);
return file.getSignedUrl({
action: 'read',
expires: '03-09-2491'
}).then(signedUrls => {
// signedUrls[0] contains the file's public URL
});
You'll need to initialize #google-cloud/storage with your service account credentials as the application default credentials will not be sufficient.
UPDATE: The Cloud Storage SDK can now be accessed via the Firebase Admin SDK, which acts as a wrapper around #google-cloud/storage. The only way it will is if you either:
Init the SDK with a special service account, typically through a second, non-default instance.
Or, without a service account, by giving the default App Engine service account the "signBlob" permission.
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLS:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
I can't find a way to get the token download URL when I write a file to Storage from a Cloud Function (because I can't find a way to tell the front end that a file has written to Storage), but what works for me is to write a file to a publicly available URL, write the publicly available URL to Firebase, then when my Angular front end gets the download URL from Firebase it also runs getDownloadURL(), which has the token, then compares the download URL in Firestore to the token download URL, and if they don't match then it updates the token download URL in place of the publicly available URL in Firestore. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in array
let url = await getDownloadURL(ref(storage, audioFile)); // get the download url with token
if (audioFile !== url) { // download URLs don't match
audioFiles.push(url);
} // end inner if
}; // end for of loop
if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
audioFiles: audioFiles
});
}
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
function oedPromise() {
return new Promise(function(resolve, reject) {
http.get(oedAudioURL, function(response) {
response.pipe(file.createWriteStream(options))
.on('error', function(error) {
console.error(error);
reject(error);
})
.on('finish', function() {
file.getSignedUrl(config, function(err, url) {
if (err) {
console.error(err);
return;
} else {
resolve(url);
}
});
});
});
});
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and long signature. The documentation for the command line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make one file public or a directory or your entire Storage database. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
predefinedAcl: 'publicRead',
contentType: 'audio/' + audioType,
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
console.log("webm audio file written.");
resolve();
})
.catch(error => console.error(error));
});
The result will look like this in your Cloud Storage Browser:
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work, it's tricky to get the bucket and file paths right.
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You told Storage the token you wanted to use. You then generated a token with the uuid Node module. Four lines of code and you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
contentType: 'audio/' + audioType,
metadata: {
metadata: {
firebaseStorageDownloadTokens: uuid,
}
}
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
})
.catch(error => console.error(error));
});
That's not a typo--you have to nest firebaseStorageDownloadTokens in double layers of metadata:!
Here's an example on how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('#google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
then call with
upload(localPath, remotePath).then( downloadURL => {
console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to #martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as your normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get an error Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled.. If you follow the link in the error message and enable the IAM API, you'll get another error: Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account. Adding the Token Creator role fixes this second permission issue.
You should avoid harcoding URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with Cloud Storage NodeJS 1.6.x or +:
const options = {
destination: yourFileDestination,
predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
const file = result[0];
return file.getMetadata();
}).then(results => {
const metadata = results[0];
console.log('metadata=', metadata.mediaLink);
}).catch(error => {
console.error(error);
});
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
+ encodeURIComponent(object.name)
+ '?alt=media&token='
+ object.metadata.firebaseStorageDownloadTokens;
console.log('URL',img_url);
This is what I currently use, it's simple and it works flawlessly.
You don't need to do anything with Google Cloud. It works out of the box with Firebase..
// Save the base64 to storage.
const file = admin.storage().bucket('url found on the storage part of firebase').file(`profile_photos/${uid}`);
await file.save(base64Image, {
metadata: {
contentType: 'image/jpeg',
},
predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
file = bucket.file(toFilePath);
metaData = await file.getMetadata()
const trimUrl = metaData[0].mediaLink
#update:
no need to make two different call in upload method to get the metadata:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metaData.mediaLink
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go. Just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download url from the metadata like we do in the Javascript SDK. Generating a url that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value to a key named firebaseStorageDownloadTokens in the metadata of the file after it finishes uploading and then assemble the download URL myself following the structure Firebase uses to generate these URLs, eg:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how much "safe" is to use this method (given that Firebase could change how it generates the download URLs in the future ) but it is easy to implement.
Sorry but i can't post a comment to your question above because of missing reputation, so I will include it in this answer.
Do as stated above by generating a signed Url, but instead of using the service-account.json I think you have to use the serviceAccountKey.json which you can generate at (replace YOURPROJECTID accordingly)
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('#google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucket);
// ...
return bucket.upload(tempLocalFile, {
destination: filePath,
metadata: {
contentType: 'image/jpeg'
}
})
.then((data) => {
let file = data[0]
file.getSignedUrl({
action: 'read',
expires: '03-17-2025'
}, function(err, url) {
if (err) {
console.error(err);
return;
}
// handle url
})
I can't comment on the answer James Daniels gave, but I think this is very Important to read.
Giving out a signed URL Like he did seems for many cases pretty bad and possible Dangerous.
According to the documentation of Firebase the signed url expires after some time, so adding that to your databse will lead to a empty url after a certain timeframe
It may be that misunderstood the Documentation there and the signed url doesn't expire, which would have some security issues as a result.
The Key seems to be the same for every uploaded file. This means once you got the url of one file, someone could easily access files that he is not suposed to access, just by knowing their names.
If i missunderstood that then i would lvoe to be corrected.
Else someone should probably Update the above named solution.
If i may be wrong there
If you use the predefined access control lists value of 'publicRead', you can upload the file and access it with a very simple url structure:
// Upload to GCS
const opts: UploadOptions = {
gzip: true,
destination: dest, // 'someFolder/image.jpg'
predefinedAcl: 'publicRead',
public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
if(!file) {
throw err;
}
const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the #google-cloud/storage package directly not through the Firebase SDK.
I had the same issue, however, I was looking at the code of the firebase function example instead of the README. And Answers on this thread didn't help either...
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, Find the App
Engine default service account and add the Service Account Token
Creator role to that member. This will allow your app to create signed
public URLs to the images.
source: Automatically Generate Thumbnails function README
Your role for app engine should look like this:
This works if you just need a public file with a simple URL. Note that this may overrule your Firebase storage rules.
bucket.upload(file, function(err, file) {
if (!err) {
//Make the file public
file.acl.add({
entity: 'allUsers',
role: gcs.acl.READER_ROLE
}, function(err, aclObject) {
if (!err) {
var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
console.log(URL);
} else {
console.log("Failed to set permissions: " + err);
}
});
} else {
console.log("Upload failed: " + err);
}
});
Without signedURL() using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
var bucket = admin.storage().bucket();
// --- [Above] for admin related operations, [Below] for making a public url from a GCS uploaded object
const { Storage } = require('#google-cloud/storage');
const storage = new Storage();
exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming file is in folder
return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
return console.log('publicUrl', writeResult);
});
});
});
answer by https://stackoverflow.com/users/269447/laurent works best
const uploadOptions: UploadOptions = {
public: true
};
const bucket = admin.storage().bucket();
[ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this on the admin storage doc
const options = {
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};
// Get a v4 signed URL for reading the file
const [url] = await storage
.bucket(bucketName)
.file(filename)
.getSignedUrl(options);
console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using Firebase SDK andadmin.initializeApp:
1 - Generate a Private Key and place in /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is because I'm using a index.js that imports other files and creates one function to each file. If you're using a single index.js file with all functions, you should be ok with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0 I was able to access the storage directly with the admin like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the firebase url.
This is the best I came up. It is redundant, but the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = await bucket.file(uploadPath)
const meta = await f.getMetadata()
console.log(meta[0].mediaLink)
I already post my ans... in below URL Where you can get full code with solution
How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os')
const path = require('path')
const cors = require('cors')({ origin: true })
const Busboy = require('busboy')
const fs = require('fs')
var admin = require("firebase-admin");
var serviceAccount = {
"type": "service_account",
"project_id": "xxxxxx",
"private_key_id": "xxxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
"client_email": "xxxx#xxxx.iam.gserviceaccount.com",
"client_id": "xxxxxxxx",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
}
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "xxxxx-xxxx" // use your storage bucket name
});
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.post('/uploadFile', (req, response) => {
response.set('Access-Control-Allow-Origin', '*');
const busboy = new Busboy({ headers: req.headers })
let uploadData = null
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const filepath = path.join(os.tmpdir(), filename)
uploadData = { file: filepath, type: mimetype }
console.log("-------------->>",filepath)
file.pipe(fs.createWriteStream(filepath))
})
busboy.on('finish', () => {
const bucket = admin.storage().bucket();
bucket.upload(uploadData.file, {
uploadType: 'media',
metadata: {
metadata: { firebaseStorageDownloadTokens: uuid,
contentType: uploadData.type,
},
},
})
.catch(err => {
res.status(500).json({
error: err,
})
})
})
busboy.end(req.rawBody)
});
exports.widgets = functions.https.onRequest(app);
For those trying to use the token parameter to share the file and would like to use gsutil command, here is how I did it:
First you need to authenticate by running: gcloud auth
Then run:
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$FILE_TOKEN" gs://$FIREBASE_REPO/$FILE_NAME
Then you can download the file with the following link:
https://firebasestorage.googleapis.com/v0/b/$FIREBASE_REPO/o/$FILE_NAME?alt=media&token=$FILE_TOKEN
From the Admin SDKs, you cannot retrieve the download token generated by Firebase of an uploaded file, but you can set that token when uploading by adding it in the metadata.
For those who are working on Python SDK. This is the way to do it:
from firebase_admin import storage
from uuid import uuid4
bucket = storage.bucket()
blob = bucket.blob(path_to_file)
token = str(uuid4()) # Random ID
blob.metadata = {
"firebaseStorageDownloadTokens": token
}
blob.upload_from_file(file)
You have now uploaded the file and got the URL token. You could now save the token (or even the full download URL) into your database (e.g. Firestore) and send it to the client when the file is requested and then making the client itself retrieve the file.
The full download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/{bucket_name}/o/{file_name}?alt=media&token={token}
If you are getting error:
Google Cloud Functions: require(…) is not a function
try this:
const {Storage} = require('#google-cloud/storage');
const storage = new Storage({keyFilename: 'service-account-key.json'});
const bucket = storage.bucket(object.bucket);
const file = bucket.file(filePath);
.....
I'm building a web app in which I need to let users upload documents to their account and also read all the documents they have uploaded. In addition, I would like to allow users to upload a profile photo as well. To handle file storage, I chose AWS S3.
However, I'm having a lot of trouble with the SDK (v3). Bear in mind I never used the previous version (v2). I installed two packages via npm, @aws-sdk/client-s3 and @aws-sdk/s3-request-presigner.
I'm having trouble finding proper documentation for all the functionality I need. The docs I have come across are not exactly beginner-friendly and don't go into a lot of detail explaining all the functionality. For example, in the case of GetObjectCommand, I am able to get a response but I'm unsure how to actually tap into the Body and use the contents.
I'm also unsure about whether I should be using GetObjectCommand or getSignedUrl for my use case. For context, I'm using Express to build my server.
My questions -
Is there any easier way to interact with S3 for my app rather than using the SDK? By easier I just mean properly documented.
Am I looking at the wrong documentation? Are there other resources that make this simpler?
What are the situations where one would use getSignedUrl over GetObjectCommand to read and then render stored files for a web app?
I will be extremely grateful for any and all help.
See response to question 2.
https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/s3-example-creating-buckets.html - this documentation is perhaps more beginner-friendly
Depends on your use case. GetObjectCommand is the straightforward method, but you'll most likely run into permission issues. A presigned URL is a URL that you can provide to your users to grant temporary access to a specific S3 object.
Here's code for GetObjectCommand using getSignedUrl (I've also updated the doc.)
const {
S3,
CreateBucketCommand,
PutObjectCommand,
GetObjectCommand,
DeleteObjectCommand,
DeleteBucketCommand,
} = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");
const fetch = require("node-fetch");
// Set parameters
// Create random names for the Amazon Simple Storage Service (Amazon S3) bucket and key.
const params = {
Bucket: `test-bucket-${Math.ceil(Math.random() * 10 ** 10)}`,
Key: `test-object-${Math.ceil(Math.random() * 10 ** 10)}`,
Body: "BODY",
Region: "REGION"
};
// Create an Amazon S3 client object.
const s3Client = new S3({ region: params.Region });
const run = async () => {
// Create an Amazon S3 bucket.
try {
console.log(`Creating bucket ${params.Bucket}`);
const data = await s3Client.send(
new CreateBucketCommand({ Bucket: params.Bucket })
);
console.log(`Waiting for "${params.Bucket}" bucket creation...\n`);
} catch (err) {
console.log("Error creating bucket", err);
}
// Put the object in the Amazon S3 bucket.
try {
console.log(`Putting object "${params.Key}" in bucket`);
const data = await s3Client.send(
new PutObjectCommand({
Bucket: params.Bucket,
Key: params.Key,
Body: params.Body,
})
);
} catch (err) {
console.log("Error putting object", err);
}
// Create a presigned URL.
try {
// Create the command.
const command = new GetObjectCommand(params);
// Create the presigned URL.
const signedUrl = await getSignedUrl(s3Client, command, {
expiresIn: 3600,
});
console.log(
`\nGetting "${params.Key}" using signedUrl with body "${params.Body}" in v3`
);
console.log(signedUrl);
const response = await fetch(signedUrl);
console.log(
`\nResponse returned by signed URL: ${await response.text()}\n`
);
}
catch (err) {
console.log("Error creating presigned URL", err);
}
};
run();
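Regarding the part of the question about how to tap into the Body of a GetObjectCommand response: in the v3 SDK the Body is a readable stream in Node.js, so you have to consume it yourself. Here is a minimal sketch (not part of the official example above; the streamToString helper and the bucket/key parameters are just illustrative):
const { S3, GetObjectCommand } = require("@aws-sdk/client-s3");

// Collect a Node.js readable stream into a UTF-8 string.
const streamToString = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", (chunk) => chunks.push(chunk));
    stream.on("error", reject);
    stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf-8")));
  });

const readObject = async (s3Client, bucket, key) => {
  const response = await s3Client.send(
    new GetObjectCommand({ Bucket: bucket, Key: key })
  );
  // response.Body is a readable stream in Node.js environments.
  return streamToString(response.Body);
};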
Please help. I receive images from the client and save them on my server's file system, then process them; after that I need to upload them to Firebase Storage.
I'm trying to upload an image file to Firebase Storage from Node.js in my async function:
const path = process.cwd() + '/my_image.jpg';
const file = readFileSync(path);
await firebase.storage().ref().child('my_image.jpg').put(file);
...
But I get this error:
The first argument must be of type string or an instance of Buffer. Received an instance of Uint8Array
Okay, so I tried base64 encoding:
const path = process.cwd() + '/my_image.jpg';
const file = readFileSync(path, { encoding: 'base64' });
await firebase.storage().ref().child('my_image.jpg').putString(file, 'base64');
...
And I get this error:
Firebase Storage: String does not match format 'base64': Invalid character found"
I've already tried a lot of things, but nothing works!
What am I doing wrong?
You can use this code right here
var admin = require("firebase-admin");
const uuid = require('uuid-v4');
// CHANGE: The path to your service account
var serviceAccount = require("path/to/serviceAccountKey.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "<BUCKET_NAME>.appspot.com"
});
var bucket = admin.storage().bucket();
var filename = "path/to/image.png"
async function uploadFile() {
const metadata = {
metadata: {
// This line is very important. It's to create a download token.
firebaseStorageDownloadTokens: uuid()
},
contentType: 'image/png',
cacheControl: 'public, max-age=31536000',
};
// Uploads a local file to the bucket
await bucket.upload(filename, {
// Support for HTTP requests made with `Accept-Encoding: gzip`
gzip: true,
metadata: metadata,
});
console.log(`${filename} uploaded.`);
}
uploadFile().catch(console.error);
To successfully run this code, you will need to:
Add the Firebase Admin SDK to Your Server
Install uuid-v4
Replace "path/to/serviceAccountKey.json" with the path to your own service account. Here is a guide to get yours.
Replace <BUCKET_NAME> with the name of your default bucket. You can find this name in the Storage section of your Firebase Console. The bucket name must not contain gs:// or any other protocol prefixes. For example, if the bucket URL displayed in the Firebase Console is gs://bucket-name.appspot.com, pass the string bucket-name.appspot.com to the Admin SDK.
Replace "path/to/image.png" with the path to your own image.
If needed, adjust the contentType in the metadata accordingly.
Just to let you know, whenever you upload an image using the Firebase Console, an access token is automatically generated. However, if you upload an image using any Admin SDK or gsutil, you will need to generate this access token manually yourself. That's why the uuid part is so important.
Firebase Support says that this is being fixed, but I think anyone having this problem should go this way instead of waiting for Firebase to fix this.
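If you want to confirm that the token really ended up on the uploaded object, here is a minimal sketch (reusing the bucket and filename variables from the snippet above) that reads the metadata back and assembles the same download URL format used elsewhere in this thread:
async function verifyDownloadToken() {
  // The custom metadata set during upload holds the token.
  const [metadata] = await bucket.file(filename).getMetadata();
  const token = metadata.metadata.firebaseStorageDownloadTokens;

  const downloadUrl = "https://firebasestorage.googleapis.com/v0/b/" + bucket.name +
    "/o/" + encodeURIComponent(filename) + "?alt=media&token=" + token;
  console.log(downloadUrl);
}

verifyDownloadToken().catch(console.error);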
For Node.js there is a library called @google-cloud/storage:
const {Storage} = require('@google-cloud/storage');
// Creates a client
const storage = new Storage();
const bucket = storage.bucket("my-bucket.appspot.com");
await bucket.upload(
'/my_image_path.jpg',
{
destination: 'my_uploaded_image.jpg',
metadata: {
cacheControl: "public, max-age=315360000",
contentType: "image/jpeg"
}
});
https://www.npmjs.com/package/@google-cloud/storage
You probably need to authenticate your Node.js client with a service account key.
See https://cloud.google.com/docs/authentication/getting-started
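For example, a minimal sketch (the key file path is a placeholder) showing the two usual ways to authenticate the Node.js client:
const { Storage } = require('@google-cloud/storage');

// Option 1: point the client at a service account key file explicitly.
const storage = new Storage({ keyFilename: '/path/to/serviceAccountKey.json' });

// Option 2: rely on Application Default Credentials instead, e.g. by setting
//   export GOOGLE_APPLICATION_CREDENTIALS="/path/to/serviceAccountKey.json"
// before starting the process, and then simply:
// const storage = new Storage();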
Maybe uploading a Uint8Array to Storage was not available a few months ago, but now you can do it. The only thing is you have to add the content type in a metadata object, this way:
const file = new Uint8Array(...)
const metadata = { contentType: 'image/jpeg; charset=utf-8' }
const storageRef = firebase.storage().ref().child('path/to/image.jpg')
await storageRef.put(file, metadata).then((snapshot) => {
console.log('Uploaded an array!', snapshot)
})
Maybe your Node version does not support the readFileSync function with the { encoding: 'base64' } option.
Try the original way of converting a buffer to a string:
const file = readFileSync(path).toString('base64');
// now file is a base64 string
await firebase.storage().ref().child('my_image.jpg').putString(file, 'base64');
Using firebase-admin with a service account lets you upload files to Firebase Storage:
const admin = require('firebase-admin')
var serviceAccount = require("/pathTOServiceAccount.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount)
});
var bucket = admin.storage().bucket('<BUCKET_NAME>.appspot.com') // bucket name, not a folder path
bucket.upload('pathOfThe/FileIn/Backend', {
destination: 'path/in/firebase/storage',
metadata: {
cacheControl: "public, max-age=315360000",
contentType: "image/jpeg"
}
})