Problem with file naming when downloading the file - node.js

This is my first Stack Overflow post, sorry if it's a little blurry :/
Basically I have an Angular project with Firestore behind it. I have a Cloud Function which generates an .xlsx file and uploads it to my Firebase Storage:
const path = 'hellothere/excels';
return workBook.xlsx.writeFile(`/tmp/myExcel.xlsx`).then(() => {
    return storageFb.upload(`/tmp/myExcel.xlsx`, {
        destination: path + '/myExcel.xlsx',
    });
}).then(() => path);
Where storageFb is the bucket of my storage.
Actually it works: it uploads my .xlsx file under /hellothere/excels/ with the name myExcel.xlsx. But when I download it (from the admin panel or my Angular client), it is named hellothere_excels_myExcel.xlsx.
Here is my client code:
this.fireStorage.ref('hellothere/excels/myExcel.xlsx').getDownloadURL().subscribe((url) => {
    window.open(url, '_blank');
});
return Promise.resolve();
I know the code is messy, but I'm testing every solution I can find, so I'll clean it up afterwards.
(Screenshots: admin panel path, my file name)
So I'm stuck, since I don't know why those files won't download with just the 'myExcel' name.
If anyone has a clue you'll save my week ahah! Thanks!

You need to set the Content-Disposition header to define the filename. Try this:
const path = 'hellothere/excels';
return workBook.xlsx.writeFile(`/tmp/myExcel.xlsx`).then(() => {
    return storageFb.upload(`/tmp/myExcel.xlsx`, {
        destination: path + '/myExcel.xlsx',
        // Content-Disposition is object metadata, so it belongs under `metadata`
        metadata: {
            contentDisposition: 'attachment; filename="myExcel.xlsx"',
        },
    });
}).then(() => path);
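If the object is already uploaded, you can also patch its metadata in place instead of re-uploading. A minimal sketch, assuming storageFb is the same bucket as above:
// Sketch: set Content-Disposition on an already-uploaded object.
storageFb.file('hellothere/excels/myExcel.xlsx')
    .setMetadata({ contentDisposition: 'attachment; filename="myExcel.xlsx"' })
    .then(() => console.log('metadata updated'))
    .catch(console.error);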

Related

Uploading data from firebase functions to firebase storage?

I have a website running with node.js, with the backend running on Firebase Functions. I want to store a bunch of JSON to Firebase Storage. The below snippet works just fine when I'm running on localhost, but when I deploy it to Firebase Functions, it fails with Error: EROFS: read-only file system, open 'export-stock-trades.json'. Anyone know how to get around this?
fs.writeFile(fileNameToReadWrite, JSON.stringify(jsonObjToUploadAsFile), function(err) {
    bucket.upload(fileNameToReadWrite, {
        destination: destinationPath,
    });
    res.send({success: true});
});
I can't tell for sure, since much of the context of your function is missing, but it looks like your function is attempting to write a file to local disk first (fs.writeFile), then upload it (bucket.upload).
On Cloud Functions, code you write only has write access to /tmp, which is os.tmpdir() in Node. Read more about that in the documentation:
The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.
This is probably what's causing your code to fail.
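A minimal fix, assuming the rest of your handler stays the same, is to build the local path from os.tmpdir() instead of the read-only deploy directory:
const os = require('os');
const path = require('path');

// /tmp is the only writable mount point on Cloud Functions; os.tmpdir() resolves to it.
const fileNameToReadWrite = path.join(os.tmpdir(), 'export-stock-trades.json');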
Incidentally, if the data you want to upload is in memory, you don't have to write it to a file first as you're doing now. You can instead use file.save() for that.
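For example, a minimal sketch, assuming bucket, destinationPath, and jsonObjToUploadAsFile are the same as in your snippet:
// Sketch: stream the in-memory JSON straight to the bucket, no local file needed.
bucket.file(destinationPath)
    .save(JSON.stringify(jsonObjToUploadAsFile), {
        metadata: { contentType: 'application/json' },
    })
    .then(() => res.send({success: true}))
    .catch(err => res.status(500).send({error: err.message}));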
Another way this could work is to convert the JSON into a Buffer and then upload it with something like the snippet below. I wrote an article on how to do this using Google Cloud Storage, but it works fine with Firebase Storage; the only thing you need to change is the "service-account-key.json" file.
The link to the article can be found here: Link to article on medium
const { format } = require('util')
const gc = require('./config/')
const bucket = gc.bucket('all-mighti') // should be your bucket name

/**
 * @param { File } file - the file object that will be uploaded
 * @description - This function does the following
 * - It uploads a file to the image bucket on Google Cloud
 * - It accepts an object as an argument with the
 *   "originalname" and "buffer" as keys
 */
export const uploadImage = (file) => new Promise((resolve, reject) => {
    const { originalname, buffer } = file
    const blob = bucket.file(originalname.replace(/ /g, '_'))
    const blobStream = blob.createWriteStream({
        resumable: false
    })
    blobStream.on('finish', () => {
        const publicUrl = format(
            `https://storage.googleapis.com/${bucket.name}/${blob.name}`
        )
        resolve(publicUrl)
    })
    .on('error', () => {
        reject(`Unable to upload image, something went wrong`)
    })
    .end(buffer)
})
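For reference, a hypothetical call site, assuming an Express route using multer's memoryStorage (so req.file carries originalname and buffer):
// Hypothetical usage: multer's memoryStorage puts the raw bytes on req.file.buffer.
const multer = require('multer')
const upload = multer({ storage: multer.memoryStorage() })

app.post('/upload', upload.single('image'), async (req, res) => {
    const publicUrl = await uploadImage(req.file) // { originalname, buffer }
    res.send({ url: publicUrl })
})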

Trouble downloading mp3 files from S3 using Amplify/Node

I'm quite confused on how to use the Amplify library to actually download an mp3 file stored in my s3 bucket. I am able to list the bucket contents and parse it all out into a tree viewer for users to browse the various files, but once I select a file I can't get it to trigger a download.
I'm confident my Amplify configuration is correct, since I can see all my expected directories, and when I select the file I want to download, I see the response size being correct (screenshot):
You can see it takes 2+ seconds and appears to be downloading the data/mp3 file, but the user is never prompted to save the file and it's not in my Downloads folder.
Here is a capture of my file metadata setup from my bucket (screenshot):
And the method I'm calling:
getFile (fileKey) {
    Storage.get(fileKey, {download: true})
}
Without the download: true configuration, I get the verified URL back in the response. I'd like to avoid making a second request using that URL to download the file, if possible. Anything else I may have missed? Is it better for S3 operations to go back to the standard aws-sdk? Thanks in advance!
I ended up using a combination of this answer:
https://stackoverflow.com/a/36894564
and this snippet:
https://gist.github.com/javilobo8/097c30a233786be52070986d8cdb1743
So the file gets downloaded in the response data (result). I added more metadata tags to the files to get the file name and title. Finally, adding a link to the DOM and executing a click() on it saves the file with the correct name. Full solution below:
getFile (fileKey) {
    Storage.get(fileKey, {download: true}).then(result => {
        console.log(result)
        let mimeType = result.ContentType
        let fileName = result.Metadata.filename
        if (mimeType !== 'audio/mp3') {
            throw new TypeError("Unexpected MIME Type")
        }
        try {
            let blob = new Blob([result.Body], {type: mimeType})
            // downloading the file depends on the browser:
            // IE handles it differently than chrome/webkit
            if (window.navigator && window.navigator.msSaveOrOpenBlob) {
                window.navigator.msSaveOrOpenBlob(blob, fileName)
            } else {
                let objectUrl = URL.createObjectURL(blob)
                let link = document.createElement('a')
                link.href = objectUrl
                link.setAttribute('download', fileName)
                document.body.appendChild(link)
                link.click()
                document.body.removeChild(link)
                URL.revokeObjectURL(objectUrl) // release the blob URL once the click has fired
            }
        } catch (exc) {
            console.log("Save Blob method failed with the following exception.")
            console.log(exc)
        }
    })
}

Internal server error on Azure when writing file from buffer to filesystem

Context
I am working on a proof of concept for an accounting bot. Part of the solution is the processing of receipts: the user takes a picture of a receipt, the bot asks some questions about it and stores it in the accounting solution.
Approach
I am using the BotFramework Node.js example 15.handling attachments, which loads the attachment into an arraybuffer and stores it on the local filesystem, ready to be picked up and sent to the accounting software's API.
async function handleReceipts(attachments) {
    const attachment = attachments[0];
    const url = attachment.contentUrl;
    const localFileName = path.join(__dirname, attachment.name);
    try {
        const response = await axios.get(url, { responseType: 'arraybuffer' });
        if (response.headers['content-type'] === 'application/json') {
            response.data = JSON.parse(response.data, (key, value) => {
                return value && value.type === 'Buffer' ? Buffer.from(value.data) : value;
            });
        }
        fs.writeFile(localFileName, response.data, (fsError) => {
            if (fsError) {
                throw fsError;
            }
        });
    } catch (error) {
        console.error(error);
        return undefined;
    }
    return (`success`);
}
Running locally it all works like a charm (also thanks to mdrichardson - MSFT). Deployed on Azure, I get:
There was an error sending this message to your bot: HTTP status code InternalServerError
I narrowed the problem down to the second part of the code: the part that writes to the local filesystem (fs.writeFile). Small files and big files result in the same error on Azure. fs.writeFile seems unable to find the file.
What is happening according to the stream logs:
Attachment uploaded by user is saved on Azure
{ contentType: 'image/png', contentUrl: 'https://webchat.botframework.com/attachments//0000004/0/25753007.png?t=< a very long string>', name: 'fromClient::25753007.png' }
localFileName (the destination of the attachment) resolves to:
localFileName: D:\home\site\wwwroot\dialogs\fromClient::25753007.png
Axios loads the attachment into an arraybuffer. Its response:
response.headers.content-type: image/png
This is interesting because locally it is 'application/octet-stream'
fs throws an error:
fsError: Error: ENOENT: no such file or directory, open 'D:\home\site\wwwroot\dialogs\fromClient::25753007.png'
Some assistance really appreciated.
Removing the fromClient:: prefix from attachment.name solved it. As @Sandeep mentioned in the comments, the special characters were probably the issue. Not sure what its purpose is; I will mention it in the BotFramework sample library GitHub repository.
[update] The team will fix this. It was caused by the Direct Line service.
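In the meantime, a minimal sketch of sanitizing the name before building the path (the regex is my assumption; ':' is illegal in Windows file names, and D:\home\site\wwwroot is a Windows path):
// Sketch: strip the "fromClient::" prefix and replace any other characters
// that are illegal in Windows file names before writing to disk.
const safeName = attachment.name.replace(/^.*::/, '').replace(/[<>:"\/\\|?*]/g, '_');
const localFileName = path.join(__dirname, safeName);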

How to download Word Document File in ionic 3 using Sharepoint File URL

I have pdf, xls, doc, etc. files stored in a list, and I have the actual URL of each file. I am creating a documents gallery in my app with a download option: when I click the download icon, I have to download that particular document.
It works fine for PDF, XLS and XLSX files (tested), but Word documents (.doc, .docx) are not downloaded properly.
The alert says success, but when opening the file it shows:
Unable to open the document. File appears to be corrupted.
I have tried the File Transfer plugin, but I can't get it to work. Please help me solve this problem.
Here is the code I tried:
this.download("Sample Document.docx", "https://abcd.sharepoint.com/samplesite/Shared Documents/Sample Document.docx");

download(fileName: string, filePath: any) {
    const url = encodeURI(filePath);
    const fileTransfer: FileTransferObject = this.transfer.create();
    fileTransfer.download(url, this.file.externalRootDirectory + fileName, true).then((entry) => {
        // show toast message as success
    }, (error) => {
        // show toast message as error
    });
}
Here is my output (screenshot).
Please give me some idea how to download a Word document file. Is there any other way available to download a file using a URL in Ionic 3?
I have modified the code.
Please check and let me know if you have any issues.
this.download("sample.docx", "https://abcd.sharepoint.com/samplesite/Shared Documents/Sample Document.docx");

download(fileName: string, filePath: any) {
    const fileTransfer: FileTransferObject = this.transfer.create();
    fileTransfer.download(filePath, this.file.externalRootDirectory + fileName, true).then((entry) => {
        // show toast message as success
        console.log('download complete: ' + entry.toURL());
    }, (error) => {
        // show toast message as error
    });
}
NOTE: If the error message still appears (The file appears to be corrupted), try with some other files in the SharePoint list, or add more files and try with them. This is working code, and the issue may be on the SharePoint side.

Copy file from one AWS S3 Bucket to another bucket with Node

I am trying to copy a file from one AWS S3 bucket to another bucket using Node. If the file name has no white space, for example "abc.csv", it works fine.
But if the file name I want to copy has white space in it, for example "abc xyz.csv", it throws the error below.
"The specified key does not exist."
"NoSuchKey: The specified key does not exist.
at Request.extractError (d:\Projects\Other\testproject\s3filetoarchieve\node_modules\aws-sdk\lib\services\s3.js:577:35)
Below is the code provided.
return Promise.each(files, file => {
    var params = {
        Bucket: process.env.CR_S3_BUCKET_NAME,
        CopySource: `/${ process.env.CR_S3_BUCKET_NAME }/${ prefix }${ file.name }`,
        Key: `${ archieveFolder }${ file.name }`
    };
    console.log(params);
    return new Promise((resolve, reject) => {
        s3bucket.copyObject(params, function(err, data) {
            if (err) {
                console.log(err, err.stack);
                reject(err);
            } else {
                console.log(data);
                resolve(data);
            }
        });
    });
}).then(result => {
    debugger
});
Any help would be highly appreciated. Thank you.
I think the problem is exactly that space in the filename.
The CopySource value must be URL-encoded, as it needs to be accessible in URL form.
There are packages that help with URL formatting, like speakingurl, or you can write something of your own, maybe simply replacing spaces (\s) with underscores or dashes (_ or -) if you want to keep the names friendly.
If you don't mind about that, you can simply use encodeURIComponent(file.name).
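For example, a minimal sketch of that applied to the question's params, assuming file.name itself is the only part containing spaces:
// Sketch: URL-encode only the file name portion of CopySource.
// The destination Key is passed as-is; the SDK encodes it for you.
var params = {
    Bucket: process.env.CR_S3_BUCKET_NAME,
    CopySource: `/${ process.env.CR_S3_BUCKET_NAME }/${ prefix }${ encodeURIComponent(file.name) }`,
    Key: `${ archieveFolder }${ file.name }`
};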
Hope it helps!
