I am following a tutorial to resize images via Cloud Functions on upload and am experiencing two major issues that I can't figure out:
1) If a PNG is uploaded, it generates the correctly sized thumbnails, but their previews won't load in Firebase Storage (the loading spinner shows indefinitely). It only shows the image after I click on "Generate new access token" (none of the generated thumbnails has an access token initially).
2) If a JPEG or any other format is uploaded, the MIME type shows as "application/octet-stream", and I'm not sure how to correctly extract the extension to put into the filenames of the newly generated thumbnails.
// Imports this snippet relies on (`fs` is fs-extra, for ensureDir/remove;
// `gcs` is assumed to be a Cloud Storage client)
import * as functions from 'firebase-functions';
import { Storage } from '@google-cloud/storage';
import { tmpdir } from 'os';
import { join, dirname } from 'path';
import * as sharp from 'sharp';
import * as fs from 'fs-extra';

const gcs = new Storage();

export const generateThumbs = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const bucketDir = dirname(filePath);

    const workingDir = join(tmpdir(), 'thumbs');
    const tmpFilePath = join(workingDir, 'source.png');

    if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }

    // 1. Ensure thumbnail dir exists
    await fs.ensureDir(workingDir);

    // 2. Download source file
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });

    // 3. Resize the images and define an array of upload promises
    const sizes = [64, 128, 256];

    const uploadPromises = sizes.map(async size => {
      const thumbName = `thumb#${size}_${fileName}`;
      const thumbPath = join(workingDir, thumbName);

      // Resize source image
      await sharp(tmpFilePath)
        .resize(size, size)
        .toFile(thumbPath);

      // Upload to GCS
      return bucket.upload(thumbPath, {
        destination: join(bucketDir, thumbName)
      });
    });

    // 4. Run the upload operations
    await Promise.all(uploadPromises);

    // 5. Cleanup: remove the tmp/thumbs dir from the filesystem
    return fs.remove(workingDir);
  });
Would greatly appreciate any feedback!
I just had the same problem: for some unknown reason, Firebase's Resize Images extension purposely removes the download token from the resized images.
To stop it from deleting the Download Access Tokens:
Go to https://console.cloud.google.com
Select Cloud Functions from the left menu
Select ext-storage-resize-images-generateResizedImage
Click EDIT
In the Inline Editor, go to the file functions/lib/index.js
Add // before this line: delete metadata.metadata.firebaseStorageDownloadTokens;
Comment out the same line in functions/src/index.ts too
Press DEPLOY and wait until it finishes
Note: both the original and the resized images will have the same token.
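For reference, after the edit the relevant line (the one quoted above) should simply be commented out:

// delete metadata.metadata.firebaseStorageDownloadTokens;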
I just started using the extension myself. I noticed that I can't access the image preview from the Firebase console until I click on "create access token".
I guess that you have to create this token programmatically before the image is available.
I hope it helps.
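If you do need to create the token yourself, a minimal sketch (assuming the uuid package, and that you control the upload code; localPath and remotePath are placeholders) looks like this:

const { v4: uuidv4 } = require('uuid');

// Inside an async function / Cloud Function handler:
// generate a token and attach it under the metadata key Firebase looks for.
const token = uuidv4();
await bucket.upload(localPath, {
  destination: remotePath,
  metadata: {
    metadata: {
      firebaseStorageDownloadTokens: token, // token the console/URLs will use
    },
  },
});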
November 2020
In connection to @Somebody's answer, I can't seem to find ext-storage-resize-images-generateResizedImage in GCP Cloud Functions.
A better way to do it is to reuse the original file's firebaseStorageDownloadTokens.
This is how I did mine:
functions
  .storage
  .object()
  .onFinalize((object) => {
    // some image optimization code here

    // get the original file's access token
    const downloadtoken = object.metadata?.firebaseStorageDownloadTokens;

    return bucket.upload(tempLocalFile, {
      destination: file,
      metadata: {
        metadata: {
          optimized: true, // other custom flags
          firebaseStorageDownloadTokens: downloadtoken, // reuse the original access token
        },
      },
    });
  });
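For what it's worth, that token is what the standard Firebase download URL is built from; if you ever need to construct the URL by hand, it follows this pattern (a sketch; bucketName, filePath, and token are placeholders):

// Sketch: the public download URL format used by Firebase Storage.
const url = `https://firebasestorage.googleapis.com/v0/b/${bucketName}` +
  `/o/${encodeURIComponent(filePath)}?alt=media&token=${token}`;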
I'm trying to upload a file to my bucket after it's written, but I'm not sure how to do it.
I can confirm that the code that writes the file is OK, as I tested it locally and it works normally.
bucket.upload doesn't seem to work, as the file is only saved locally.
bucket.file().save() is also not working.
The file is saved at "./public/fileName.xlsx".
When I use:
storage.bucket("bucketName").file("bucketFileName").save("./public/fileName.xlsx")
there is indeed a file uploaded to the storage, but its content is the path string that I'm passing inside .save().
So, to summarize, my question is: how do I write a file and then upload it to my bucket?
PS: the file is an Excel worksheet.
If you confirmed that the file is saved locally and just want to upload it to the bucket, you may refer to the sample code below:

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Change to your bucket name
const bucketName = 'bucket-name';

async function uploadFile(path, filename) {
  // Path where to save the file in Google Cloud Storage.
  const destFileName = `public/${filename}`;

  // For a brand-new destination object, 0 is the generation-match value to use.
  const generationMatchPrecondition = 0;

  const options = {
    destination: destFileName,
    // Optional:
    // Set a generation-match precondition to avoid potential race conditions
    // and data corruptions. The request to upload is aborted if the object's
    // generation number does not match your precondition. For a destination
    // object that does not yet exist, set the ifGenerationMatch precondition to 0.
    // If the destination object already exists in your bucket, set instead a
    // generation-match precondition using its generation number.
    preconditionOpts: {ifGenerationMatch: generationMatchPrecondition},
  };

  // The `path` here is the location of the file that you want to upload.
  await storage.bucket(bucketName).upload(path, options);
  console.log(`${path} uploaded to ${bucketName}`);
}

uploadFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);
Added some comments on the sample code.
For more information, you may check this documentation.
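As a side note on why your bucket.file().save() attempt uploaded the path string: save() takes the contents to write, not a file path. A minimal sketch (assuming Node's built-in fs module) of using it correctly:

const fs = require('fs');

async function saveFile() {
  // Read the local file into memory first; save() writes whatever bytes you pass it.
  const contents = fs.readFileSync('./public/fileName.xlsx');
  await storage.bucket('bucketName').file('public/fileName.xlsx').save(contents);
}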
I'm following up on this article to download objects from GCP Cloud storage bucket: https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-nodejs
const {Storage} = require('@google-cloud/storage');
// Creates a client
const storage = new Storage();
async function downloadIntoMemory() {
  // Downloads the file into a buffer in memory.
  const contents = await storage.bucket(bucketName).file(fileName).download();
  return contents;
}
downloadIntoMemory().catch(console.error);
I'm currently getting buffer data in contents. I have this code hooked up to an API on a Node.js backend, and I'm using React with TypeScript on the frontend. Calling the API gives me the data buffer. How can I use it to download the file instead of just receiving the data buffer?
I tried the above method, explicitly providing a file destination, but I'm still getting the following error: EISDIR: illegal operation on a directory, open '{file_path_which_i_was_set}. Err: -21
As rightly pointed out by @John Hanley, you are referring to the documentation, where the code sample downloads an object into a buffer in memory. If you want to download an object from a bucket to a file, refer to this code sample, where the options parameter has to be passed to the download() method.
The code goes like this :
// The ID of your GCS bucket
const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
const fileName = 'your-file-name';

// The path to which the file should be downloaded
const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: destFileName,
  };

  // Downloads the file to the destination file path
  await storage.bucket(bucketName).file(fileName).download(options);

  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);
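On the second part of your question (triggering a real download in the browser instead of handling a raw buffer): one common approach, sketched here under the assumption that your backend uses Express and that /download and the filename are placeholders, is to send the buffer with a Content-Disposition header and point the frontend at that endpoint:

const express = require('express');
const app = express();

app.get('/download', async (req, res) => {
  // download() resolves to an array whose first element is the file's Buffer
  const [contents] = await storage.bucket(bucketName).file(fileName).download();

  // These headers tell the browser to save the response as a file
  res.setHeader('Content-Disposition', `attachment; filename="${fileName}"`);
  res.setHeader('Content-Type', 'application/octet-stream');
  res.send(contents);
});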
I am trying to validate the image dimensions of the posts which are in my Firebase Storage, using a Cloud Function. What I am trying to do is:
1: Let the user upload an image
2: When it is uploaded, the backend checks its dimensions
2.1: If it has good dimensions, don't remove it
2.2: Else, remove it from the storage
The code looks like:
// Validate image dimensions
exports.validateImageDimensions = functions
  .region("us-central1")
  .runWith({ memory: "1GB", timeoutSeconds: 120 })
  .storage.object()
  .onFinalize(async (object) => {
    // Get the bucket which contains the image
    const bucket = gcs.bucket(object.bucket);

    // Get the file path
    const filePath = object.name;

    // Check if the file is an image
    const isImage = object.contentType.startsWith("image/");

    // Check if the image has valid dimensions
    const hasValidDimensions = true; // TODO: How to get the image dimensions?

    // Do nothing if it is an image and has valid dimensions
    if (isImage && hasValidDimensions) {
      return;
    }

    try {
      await bucket.file(filePath).delete();
      console.log(
        `The image ${filePath} has been deleted because it has invalid dimensions.`
      );
      // TODO: Remove the image's document in Firestore
    } catch (err) {
      console.log(`Error deleting invalid file ${filePath}: ${err}`);
    }
  });
But I don't know how to get the object's dimensions. I have checked the documentation but haven't found an answer.
Any ideas?
Use sharp; here's the documentation that covers it:
const metadata = await sharp('image.jpg').metadata();
const width = metadata.width;
const height = metadata.height;
functions.logger.log(`width: `, width);
functions.logger.log(`height: `, height);
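To wire that into the validation function above, sharp first needs the image bytes. One way (a sketch: it assumes the sharp package is available to the function, and the 1024x1024 limit is just a made-up example) is to download the object into a buffer inside the onFinalize handler:

const sharp = require("sharp");

// Inside the onFinalize handler, after computing `filePath`:
const [buffer] = await bucket.file(filePath).download();
const { width, height } = await sharp(buffer).metadata();

// Hypothetical rule: reject anything larger than 1024x1024
const hasValidDimensions = width <= 1024 && height <= 1024;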
I'm trying to archive PDF files from remote websites to Google Cloud Storage, using a Google Cloud Function triggered by a Firebase write.
The code below works. However, this function copies the remote file to the bucket root.
I'd like to copy the pdf to this path in the bucket: library-xxxx.appspot.com/Orgs/${params.ukey}.
How can I do this?
exports.copyFiles = functions.database
  .ref('Orgs/{orgkey}/resources/{restypekey}/{ukey}/linkDesc/en')
  .onWrite(event => {
    const snapshot = event.data;
    const params = event.params;
    const filetocopy = snapshot.val();
    if (validFileType(filetocopy)) {
      const pth = 'Orgs/' + params.orgkey;
      const bucket = gcs.bucket('library-xxxx.appspot.com');
      return bucket.upload(filetocopy)
        .then(res => {
          console.log('res', res);
        }).catch(err => {
          console.log('err', err);
        });
    }
  });
Let me begin with a brief explanation of how the GCS file system works: as explained in the documentation of Google Cloud Storage, GCS is a flat namespace where the concept of directories does not exist. If you have an object like gs://my-bucket/folder/file.txt, this means that there is an object called folder/file.txt stored in the root of gs://my-bucket, i.e. the object name includes / characters. It is true that the GCS UI in the Console and the gsutil CLI tool give the illusion of a hierarchical file structure, but this is only to provide more clarity for the user: those directories do not exist, and everything is stored in a "flat" namespace.
That being said, as described in the reference for the storage.bucket.upload() method, you can specify an options parameter containing the destination field, where you can specify a string with the complete filename to use.
Just as an example (note the options parameter difference between both calls):
var bucket = storage.bucket('my-sample-bucket');

var options = {
  destination: 'somewhere/here.txt'
};

bucket.upload('sample.txt', function(err, file) {
  console.log("Created object gs://my-sample-bucket/sample.txt");
});

bucket.upload('sample.txt', options, function(err, file) {
  console.log("Created object gs://my-sample-bucket/somewhere/here.txt");
});
So in your case you can build a string containing the complete name that you want to use (containing also the "directory" structure you have in mind).
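Applied to your function, that means passing an options object whose destination includes your pth prefix; a sketch (fileName here is a hypothetical name derived from the source URL):

const fileName = filetocopy.split('/').pop(); // hypothetical: last URL segment
return bucket.upload(filetocopy, {
  destination: pth + '/' + fileName
});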
Here filepath is the path of the file on the local machine:

await bucket.upload(filepath, {
  public: true,
  gzip: true,
  metadata: {
    cacheControl: "public, max-age=31536000",
  },
});
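As a note on those options: public: true makes the uploaded object publicly readable, gzip: true stores it gzip-compressed, and the cacheControl metadata lets browsers and CDNs cache it for up to a year; drop whichever options don't fit your use case.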
I need to read the content of a password-protected archive (zip is preferred) in a Node.js app, without writing the protected content to a file.
In addition, the app is cross-platform, so a solution such as this one doesn't help.
I also looked here, but there is no code in the answer.
The only library I can find that supports encryption is https://github.com/rf00/minizip-asm.js. Unfortunately, it isn't well maintained.
This solution reads the file into a buffer, which you can get from base64 or by reading the zip file; after that, unzipping and opening the password-protected file is done in memory. I hope this helps:
const unzipper = require("unzipper");

const unzipAndUnlockZipFileFromBuffer = async (zippedFileBase64, password) => {
  try {
    const zipBuffer = Buffer.from(zippedFileBase64, "base64"); // Change base64 to buffer
    const zipDirectory = await unzipper.Open.buffer(zipBuffer); // unzip a buffered file
    const file = zipDirectory.files[0]; // find the file you want
    // if you want to find a specific file by path
    // const file = zipDirectory.files.find((f) => f.path === "filename");
    const extracted = await file.buffer(password); // unlock the file with the password
    console.log(extracted.toString()); // file content
  } catch (e) {
    console.log(e);
  }
};

const zippedFileBase64 = "{{BASE64}}";
const password = "1234";

unzipAndUnlockZipFileFromBuffer(zippedFileBase64, password);
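If the archive is on disk rather than base64-encoded, unzipper can open it directly and the extraction still happens in memory (a sketch; archive.zip is a placeholder path, and this would run inside an async function):

const unzipper = require("unzipper");

// Open.file reads the zip's central directory from disk without extracting to disk.
const zipDirectory = await unzipper.Open.file("archive.zip");
const file = zipDirectory.files[0];
const extracted = await file.buffer("1234"); // same password-unlock call as above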