In the web SDK of Firebase Storage you can upload an image from Blob data. I have a Node.js app and want to upload images from a URL to my storage bucket. The docs recommend using the Admin SDK when running in a Node server environment, but I cannot find this feature in the Firebase Storage Admin documentation.
Here's my code:
const admin = require('firebase-admin');
const serviceAccount = require(`./credentials/my-keyfile.json`);
const app = admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});
const storage = app.storage();
// Get Blob data from an external ImageUrl
const axios = require("axios");
const getBlobFromUrl = async (url) => {
  const response = await axios.get(url, { responseType: "blob" });
  return response.data;
}
const blobData = await getBlobFromUrl("https://images.unsplash.com/photo-1591476315349-faa1c8e6f567?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max")
// MY QUESTION -> how can I upload that blob data to my storage?
I was recently trying to do something similar, writing an in-memory file from a Node server (with the Firebase Admin SDK) to Firebase Storage.
I solved it by realizing that I needed to use Google Cloud Storage's Node API to do so (mentioned in the docs). This is the relevant doc page to do streaming transfers with Google Cloud Storage: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs.
Here are the basic steps:
1. Set the appropriate storageBucket address (e.g. gs://APP-NAME.appspot.com/), which you can get from your Firebase Storage console. Also get the appropriate Firebase storage and bucket JS objects, following the example at https://firebase.google.com/docs/storage/admin/start
2. Get the stream of byte data from your blob/file: https://developer.mozilla.org/en-US/docs/Web/API/Blob/stream
3. Use Google Cloud Storage's API to pipe/write that stream to a specified location in your bucket: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs
So, following the docs above, you'd do something like:
// Create a reference to a file object.
// myBucket is the bucket obtained in step 1 above
const file = myBucket.file("directoryName/imageName.png");
// step 2: get a Node ReadableStream from the Blob
// (Blob.stream() returns a web stream, so convert it to get .pipe)
const { Readable } = require("stream");
const stream = Readable.fromWeb(blobData.stream());
// step 3: asynchronously pipe/write to file in cloud
async function streamFileUpload() {
  return new Promise((resolve, reject) => {
    stream.pipe(file.createWriteStream())
      .on('finish', resolve) // the file upload is complete
      .on('error', reject);
  });
}
streamFileUpload().catch(console.error);
After this you'll see your file uploaded to Firebase Storage under directoryName/imageName.png.
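As a variation on the above, axios can also hand back a Node readable stream directly, which pipes straight into the bucket without materializing a Blob first. A rough sketch under the same setup (the bucket object, URL, and destination path are placeholders):

```javascript
// Sketch: stream a remote image straight into Cloud Storage without
// buffering it as a Blob. Assumes a `bucket` object from the Admin SDK;
// url and destPath are hypothetical.
const uploadFromUrl = async (url, destPath, bucket) => {
  const axios = require("axios");
  // responseType "blob" is only meaningful in browsers; in Node,
  // ask axios for a stream instead.
  const response = await axios.get(url, { responseType: "stream" });
  return new Promise((resolve, reject) => {
    response.data
      .pipe(bucket.file(destPath).createWriteStream({ resumable: false }))
      .on("finish", resolve)
      .on("error", reject);
  });
};
```

Setting resumable: false skips the resumable-upload session handshake, which is a reasonable trade-off for small images.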
The first link you posted, on uploading an image with Blob data, covers how to do it with the client SDK.
The Admin SDK is intended for backend use cases, for example Cloud Functions; the code you have shown in your post targets that use case.
For examples of uploading an image with the client SDK, check out some of the Quickstart guides as well as the codelabs.
If you're working primarily with Blobs, you can check out libraries like busboy and send HTTP requests instead.
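For instance, a minimal sketch of the busboy approach (assuming the busboy package; the bucket variable and handler wiring are hypothetical):

```javascript
// Sketch: accept a multipart HTTP upload and pipe each file part
// straight into a Cloud Storage bucket, with no temp file on disk.
// `bucket` is assumed to come from the Admin SDK.
function makeUploadHandler(bucket) {
  return (req, res) => {
    const busboy = require("busboy")({ headers: req.headers });
    busboy.on("file", (fieldname, fileStream, info) => {
      // Stream the incoming part directly to the bucket.
      fileStream.pipe(bucket.file(info.filename).createWriteStream());
    });
    busboy.on("close", () => res.end("upload complete"));
    req.pipe(busboy);
  };
}
```

The handler can then be mounted on a plain http server or an Express route that receives the client's POST.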
Related
I am working on an admin app to be able to manage the content of multiple Firebase projects, and I am stuck on working with Firebase Storage.
The flow is:
fill in the form in the admin app,
select the picture to be uploaded,
invoke an HTTPS callable function from Firebase Functions,
send the image to the function,
the function initializes the particular app,
and uploads the file to that Firebase app's bucket.
It has to be done via an HTTPS callable function because I am checking user authorization, so the Cloud Function code starts like this:
export const uploadFile = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    return { error: "You are not logged in" };
  }
  if (context.auth.uid !== adminId) {
    return { error: "This user has no access to data" };
  }
  const app = admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
    databaseURL: DBURLS[`${data.appId}-${data.env}`],
    storageBucket: STORAGEURLS[`${data.appId}-${data.env}`],
  }, `${data.appId}-${data.env}`);
  const bucket = getStorage(app).bucket();
  // What's next ???
})
I am a bit confused about the best way to send the file to the Cloud Function: as a base64 string, as a Blob, or is it better to open a stream?
The file will mostly be an image, but in some cases it can also be a video, which can be a large file of over 1 GB.
The front end is an Ionic/React app using React-Dropzone for file uploads.
What would be the best way, including some code samples?
Any help appreciated
I'd advise using the JS SDK to upload the image to Cloud Storage, in a bucket dedicated to admin users, and then calling a Cloud Function that initializes the particular app and moves the image file to that Firebase app's bucket.
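Sketched out, the Cloud Function side of that flow could look something like this (the file path and both app handles are hypothetical; the apps are assumed to be initialized as in the question):

```javascript
// Sketch: copy a staged image from the admin-only bucket into the
// target project's bucket, then remove the original. filePath is a
// placeholder for the object path the client reports after uploading.
async function moveToTargetApp(adminApp, targetApp, filePath) {
  const { getStorage } = require("firebase-admin/storage");
  const srcFile = getStorage(adminApp).bucket().file(filePath);
  const destBucket = getStorage(targetApp).bucket();
  // copy() works across buckets; deleting the source turns it into a move.
  await srcFile.copy(destBucket.file(filePath));
  await srcFile.delete();
}
```

This keeps the large payload out of the callable function entirely; the function only receives the object path and does a server-side copy.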
I'm using Node.js to download some files (mp3) from Firebase Storage, which I will then send to the client.
I want to get the file as a blob and then send it to the client.
I read the docs, and Firebase uses the refFromURL method to get a downloadable URL.
But when I start the script, it throws an error at refFromURL:
const firebase = require('firebase-admin')
var serviceAccount = require('./api/admin.json')
firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccount)
})
var storageSongs = firebase.storage()
let linkSong = storageSongs.refFromURL('FILE_URL')
It's not an authentication problem because I use the same options for Realtime Database and it works well.
refFromURL is a method provided by the Firebase JavaScript web client SDK. It's not available in the Cloud Storage Node.js SDK for backends.
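With the Admin SDK you address an object by its path inside the bucket instead. A sketch (assuming the default bucket was configured at initialization; the song path is a placeholder):

```javascript
// Sketch: fetch an mp3 from the default bucket as a Buffer that can
// then be sent to the client. 'songs/track.mp3' is a hypothetical path.
async function downloadSong(app, songPath = "songs/track.mp3") {
  const file = app.storage().bucket().file(songPath);
  // download() resolves to an array whose first element is a Buffer.
  const [contents] = await file.download();
  return contents;
}
```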
I need to upload and share my documents using Google Cloud and Node.js, but I'm not sure exactly how to do it.
The best approach is to check the documentation to find out more about Google Cloud Storage and how to upload your files or documents to it.
Regarding uploading a CSV file to Google Cloud Storage, you can follow this link, which covers the different ways to upload files to Google Cloud Storage.
For your use case, here is a code sample to upload a file using Node.js. Just change the filename constant to the path of your file, with a .csv extension instead of .txt.
/**
* TODO(developer): Uncomment the following lines before running the sample.
*/
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
// Creates a client
const storage = new Storage();
async function uploadFile() {
  // Uploads a local file to the bucket
  await storage.bucket(bucketName).upload(filename, {
    // Support for HTTP requests made with `Accept-Encoding: gzip`
    gzip: true,
    // By setting the option `destination`, you can change the name of the
    // object you are uploading to a bucket.
    metadata: {
      // Enable long-lived HTTP caching headers
      // Use only if the contents of the file will never change
      // (If the contents will change, use cacheControl: 'no-cache')
      cacheControl: 'public, max-age=31536000',
    },
  });
  console.log(`${filename} uploaded to ${bucketName}.`);
}
uploadFile();
Check out this GitHub repository; it has examples for Cloud Storage with Node.js.
I am using React Native and want to upload an image to GCP Storage.
I am not using App Engine.
According to the GCP documentation for Node:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({
  projectId: 'id',
  keyFilename: 'private-key'
});
const bucketName = 'buckt';
// Uploads a local file to the bucket
storage
  .bucket(bucketName)
  .upload('filename')
  .then(() => {
    console.log(`File uploaded to ${bucketName}.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
To use this code on the client, I would need to ship the private key, which I don't want to do.
According to this blog post https://medium.com/google-cloud/uploading-resizing-and-serving-images-with-google-cloud-platform-ca9631a2c556, there is a get_serving_url() function in Google App Engine.
I gather that for Google push notifications, Google provides each client a token so that the client communicates directly with the server.
Is there any way I can upload an image directly from the client?
You can use the Firebase Cloud Storage service. It provides secure uploads to GCP Storage from the client, based on Firebase Auth credentials. See: https://firebase.google.com/docs/storage/
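A rough sketch of what that client-side upload looks like with the modular Firebase web SDK (the config object and storage path are placeholders):

```javascript
// Sketch: upload a file from the client without shipping any private
// key. Firebase Security Rules on the bucket decide whether the
// signed-in user is allowed to write to this path.
async function uploadFromClient(fileBlob) {
  const { initializeApp } = require("firebase/app");
  const { getStorage, ref, uploadBytes } = require("firebase/storage");
  const app = initializeApp({ /* your Firebase web config */ });
  const imageRef = ref(getStorage(app), "images/photo.jpg");
  return uploadBytes(imageRef, fileBlob);
}
```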
I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly (example in the screenshot below). Yet I was able to retrieve a file without supplying any service account keys or authentication tokens, from a local server using Node.js.
I can't access the files from the browser via these URL formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from Node.js without credentials, it surprisingly downloaded the file to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS variable or any PEM keys, and even ran gcloud auth revoke --all on the command line to make sure I was logged out, and I was still able to download the file. Does this mean that the files in my GCS bucket are not properly secured? Or am I somehow authenticating with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!
// Imports the Google Cloud client library
const { Storage } = require('@google-cloud/storage');
// Your Google Cloud Platform project ID
const projectId = [projectId];
// Creates a client
const storage = new Storage({
projectId: projectId
});
// The name for the new bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);
app.get('/getFile', function(req, res){
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err){
    if (err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate to Google APIs. So when you don't explicitly select a specific service account via GOOGLE_APPLICATION_CREDENTIALS, the library will fall back to the default credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you can always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether the variable points at a service account's key file.
Create a new service account in GCP for the project and download the JSON key file. Then set the environment variables as follows (PowerShell syntax):
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"
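The $env: syntax above is PowerShell; on macOS or Linux the equivalent (paths are placeholders) would be:

```shell
# Same variables in bash/zsh; replace with your project ID and key path.
export GCLOUD_PROJECT="YOUR_PROJECT_ID"
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
```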