I'm downloading a zip file from the internet. I receive it via an XHR request (using node-webkit), which means the content of the zip arrives as a string in xhr.responseText. I now want to save this file to disk, but I can't seem to get it saved as a non-corrupted zip archive.
I have basically tried fs.writeFile, fs.write, and fs.createWriteStream, but I can't seem to get it right.
I am using a node module named AdmZip which accepts a file buffer that can then be saved as a zip archive. So, I guess, this could be one way to go, but how do I make a buffer out of the string that I receive?
btw: I can't use the http module to receive the file from the internet due to a bug in node.js, so I'm using the XHR request instead.
So, I found a solution: first and foremost, set xhr.responseType = 'arraybuffer' and then turn the response into a Uint8Array. From there I converted the Uint8Array to a Node.js Buffer, which I could then save.
var fs = require('fs');

// xhr.responseType = 'arraybuffer' must be set before the request is sent
var arrayBuffer = xhr.response,
    byteArray = new Uint8Array(arrayBuffer);

// copy the bytes into a Node.js Buffer
// (on modern Node, Buffer.from(byteArray) does this in one step)
var buffer = new Buffer(byteArray.length);
for (var i = 0; i < byteArray.length; i++) {
    buffer.writeUInt8(byteArray[i], i);
}
fs.writeFileSync(fname, buffer);
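Since the question mentions AdmZip: once you have a proper Buffer, you should (as far as I know) be able to hand it to AdmZip directly, instead of or in addition to writing it to disk. A minimal sketch, with the extraction path made up for the example:

var AdmZip = require('adm-zip');

var zip = new AdmZip(buffer); // AdmZip accepts a Buffer of raw zip data
zip.extractAllTo('path/to/extract', /*overwrite*/ true);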
I'm fairly new to Node and streaming, and I'm having an issue when attempting to stream a large amount of data to a file in the client browser.
So, for example, if on the server I have a large file, test.txt, I can easily stream this to the client browser by setting the attachment header and piping the file to the request response, as follows.
res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');
fs.createReadStream('./test.txt')
.pipe(res);
When the user clicks the button, the download begins, and we see the data getting streamed to the download file. The stream takes several minutes, but during this time the client is not blocked and they can continue to do other things while the file is downloaded by the browser.
However, my data is not stored in a file; I need to retrieve it one string at a time from another server. So I'm attempting to create my own read stream and push my data chunk by chunk, but it does not work when I do something like this:
var Readable = require('stream').Readable;

var s = new Readable();
s.pipe(res);
for (let i = 0; i <= total; i++) {
    dataString = // code here to get next string needed to push
    s.push(dataString);
}
s.push(null);
With this code, when the user requests the download, the client is blocked once the download begins and cannot perform any other actions until it completes. Also, if the data takes more than 30 seconds to stream, we hit the server timeout and the download fails. With the file stream this is not an issue.
How do I get this to act like a file stream and not block the client from making other requests while it downloads? Any recommendations on the best way to implement this would be appreciated.
I was able to resolve this issue by doing something similar to what's described here:
How to call an asynchronous function inside a node.js readable stream
My basic code is as follows; it does not block the client or time out the request, as the data is continuously piped to the file download on the client side.
var Readable = require('stream').Readable;
var zlib = require('zlib');

res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');
res.setHeader('Content-Encoding', 'gzip'); // so the browser decompresses the gzipped stream

function MyStream() {
    var rs = new Readable();
    var hitsadded = 0;
    rs._read = function() {}; // needed to avoid "Not implemented" exception
    getResults(queryString, function getMoreUntilDone(err, res) { // note: this res shadows the HTTP response
        if (err) {
            logger.logError(err);
        }
        rs.push(res.data);
        hitsadded += res.records;
        if (res.recordsTotal > hitsadded) {
            // more pages remain: fetch the next one, reusing this callback
            getNextPage(query, getMoreUntilDone);
        } else {
            // all records pushed: signal end-of-stream
            rs.push(null);
        }
    });
    return rs;
}

MyStream().pipe(zlib.createGzip()).pipe(res);
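As a side note, on newer Node versions (12+) the same pull-as-you-go behaviour can be written with an async generator and stream.Readable.from. This is only a hedged sketch; getPage is a hypothetical async function that fetches one page of results:

const { Readable } = require('stream');
const zlib = require('zlib');

// each iteration awaits the next page before yielding it, so the loop never
// monopolizes the event loop the way a synchronous push loop does
async function* pages(query) {
    let page = 0;
    let done = false;
    while (!done) {
        const result = await getPage(query, page++); // hypothetical async fetch
        yield result.data;
        done = result.lastPage;
    }
}

res.setHeader('Content-Encoding', 'gzip');
Readable.from(pages(queryString)).pipe(zlib.createGzip()).pipe(res);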
For some reason I need to store some files (mostly images or PDFs) in my database (PG 9.2.20).
Those files are uploaded by users, and when I download them back, they are corrupted.
I'm working with Node.js.
The column type I store the file in is BYTEA.
This is how I store them:
const { files, fields } = await asyncBusboy(ctx.req);
const fileName = files[0].filename;
const mimeType = files[0].mimeType;
const bufferedFile = fs.readFileSync(files[0].path, { encoding: 'hex' });
const fileData = `\\x${bufferedFile}`;
//Just a basic insert into with knex.raw
const fileId = await storageModel.create(fields.name, fields.description, fileName, mimeType, fileData, ctx.user);
And this is how I retrieve my file :
const file = await storageModel.find(ctx.params.fileId, ctx.user);
ctx.body = Buffer.from(file.file_bin, 'hex');
ctx.set('Content-disposition', `attachment; filename=${file.file_name}`);
The file is corrupted, and of course, if I look closely, the uploaded file and the one I downloaded are different.
See hex screenshot, there is some additional data at the start of the downloaded one : http://imgur.com/a/kTRAB
After some more testing I can tell the problem lies in the Koa part, when I put the buffer into ctx.body. It gets corrupted (???).
EDIT: I was working with Swagger UI: https://github.com/swagger-api/swagger-ui/issues/1605
You should not treat bytea as a regular text string. Pass in a Buffer directly, and let the driver escape it for you correctly.
Not sure which driver you are using, but for example...
pg-promise does it automatically, see the example
node-postgres is supposed to do it automatically, which it does mostly, but I know there were issues with arrays, recently fixed here.
massive.js - based on pg-promise since v3.0, so the same story - it just works.
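For illustration, here is a minimal sketch with node-postgres, where the table and column names are made up and a surrounding async handler is assumed; the point is that the Buffer itself is passed as a query parameter, with no hex string and no \x prefix:

const { Pool } = require('pg');
const fs = require('fs');

const pool = new Pool();

// read the upload without an encoding, so we get a Buffer rather than a string
const fileData = fs.readFileSync(files[0].path);

// the driver serializes the Buffer parameter to bytea by itself
await pool.query(
  'INSERT INTO files (name, mime_type, file_bin) VALUES ($1, $2, $3)',
  [fileName, mimeType, fileData]
);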
I have an audio file saved locally that I want to read, upload to a server via ajax and then store on the server. Somewhere along this process the file gets corrupted such that the file that's saved on the server cannot be played.
I'll list simplified bits of code that show the process I'm going through so hopefully it'll be evident where I'm going wrong.
1) After audio is recorded (using getUserMedia and MediaRecorder), a local file is saved:
var audioData = new Blob(chunks, { type: 'audio/webm' });
var fileReader = new FileReader();
fileReader.onloadend = function() {
    var buffer = this.result,
        uint8Array = new Uint8Array(buffer);
    // write the raw bytes; fs.writeFile's option is 'flag' (singular) and it expects a callback
    fs.writeFile('path/to/file.webm', uint8Array, { flag: 'w' }, function(err) {
        if (err) console.error(err);
    });
};
fileReader.readAsArrayBuffer(audioData);
2) Later this local file is read and sent to a server (using the axios library to send the ajax request)
fs.readFile('path/to/file.webm', 'binary', (err, data) => {
var formData = new FormData();
formData.append('file', new Blob([data], {type: 'audio/webm'}), 'file.webm');
axios.put('/upload', formData);
});
3) The server then handles this request and saves the file
[HttpPut]
public IActionResult Upload(IFormFile file)
{
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        file.CopyTo(fileStream);
    }
    return Ok();
}
The local audio file can be played successfully however the audio file on the server does not play.
I'm not sure if this is helpful information, but when I open the local file and the server copy in a text editor (Notepad++), the first few lines are similar but not identical (the screenshots are not reproduced here).
So kind of the same... but different. I've tried encoding it a myriad of different ways, but everything seems to fail. Fingers crossed someone can point me in the right direction here.
The problem was with how I was passing the file contents from fs.readFile. If I passed a base64-encoded raw buffer from fs.readFile via JSON, converted that to a byte array on the server, and saved that, then I could successfully play it on the server.
fs.readFile('path/to/file.webm', (err, data) => {
    // no encoding argument: data is a Buffer, which encodes cleanly to base64
    axios.put('/upload', { audioData: data.toString('base64') });
});
[HttpPut]
public async Task<IActionResult> Upload([FromBody]UploadViewModel upload)
{
    var audioDataBytes = Convert.FromBase64String(upload.AudioData);
    using (var memoryStream = new MemoryStream(audioDataBytes))
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        await memoryStream.CopyToAsync(fileStream);
    }
    return Ok();
}
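As a footnote, base64 inside JSON inflates the payload by roughly a third. A hedged alternative sketch, assuming the server were set up to read the raw request body instead of a JSON model, would be to send the Buffer itself:

fs.readFile('path/to/file.webm', (err, data) => {
    // no encoding argument: data stays a Buffer, and axios sends the raw bytes
    axios.put('/upload', data, {
        headers: { 'Content-Type': 'audio/webm' }
    });
});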
Actually, this is a problem of character encoding. You are probably mixing UTF-8 and ISO-8859, which causes the file to be corrupted.
You should probably set the charset in the HTML page to the one expected on the server, or perform preliminary checks on the server if you do not know the charset of the data you will receive.
Converting to base64 solves the issue because base64 output only uses characters in the ASCII range.
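A quick way to see this effect in Node, with a made-up four-byte example:

const orig = Buffer.from([0xff, 0xfe, 0x00, 0x80]);

// round-trip through a UTF-8 string: the invalid byte sequences are replaced
// with U+FFFD, so the bytes change
const viaUtf8 = Buffer.from(orig.toString('utf8'), 'utf8');

// round-trip through base64: pure ASCII on the wire, so the bytes survive
const viaBase64 = Buffer.from(orig.toString('base64'), 'base64');

console.log(orig.equals(viaUtf8));   // false
console.log(orig.equals(viaBase64)); // true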
I am writing a sails.js app with an API that accepts a file and encrypts it.
var file = req.body.myFile; // the uploaded file's contents
var fileBuffer = convertToBuffer(file);
How do I convert a file to buffer?
That looks like you've got a string which represents the body of your file.
You just have to make a new buffer with it.
var fileBuffer = Buffer.from(file)
If your encoding is NOT utf8 you can specify an alternate encoding as a second optional argument.
var fileBuffer = Buffer.from(file, 'base64')
If the file is actually on disk, this is even easier, since by default the fs.readFile operation returns a buffer.
fs.readFile(file, function(err, buffer){})
If you're on a really old version of Node, Buffer.from doesn't exist and you have to use the very memory-unsafe new Buffer constructor. Please consider upgrading your Node instance so it supports Buffer.from.
I have a web service that takes a base64-encoded string representing an image, creates a thumbnail of that image using the imagemagick library, and then stores both of them in MongoDB. I am doing this with the following code (approximately):
var buf = new Buffer(req.body.data, "base64"); //original image
im.resize({ srcData: buf, width: 256 }, function(err, stdout, stderr) {
this.thumbnail = new Buffer(stdout, "binary");
//store buf and stdout in mongo
});
You will notice that I am creating a Buffer object using the "binary" encoding, which the docs say not to do:
'binary' - A way of encoding raw binary data into strings by using
only the first 8 bits of each character. This encoding method is
deprecated and should be avoided in favor of Buffer objects where
possible. This encoding will be removed in future versions of Node.
First off, I'm not sure what they are saying there. I'm trying to create a Buffer object, and they seem to imply I should already have one.
Secondly, the source of the problem appears to be that the imagemagick resize method returns a string containing binary data. Running typeof(stdout) returns "string", and printing it to the screen certainly appears to show a bunch of non-character data.
So what do I do here? I can't change how imagemagick works. Is there another way of doing what I'm trying to do?
That's how I am doing the same thing with success, storing images in MongoDB.
//original ---> base64
var thumbnail = new Buffer(req.body.data).toString('base64');
//you can store this string value in a mongoose model property, and save to mongodb
//base64 ---> image
var buffer = new Buffer(thumbnail, "base64");
I am not sure if storing images as base64 is the best way to do it
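On that last point, a hedged sketch of the alternative: Mongoose can store a raw Buffer directly in a Buffer-typed field, which avoids the roughly 33% size overhead of base64 strings. The schema below is made up, and the callback-style save assumes an older Mongoose version:

var mongoose = require('mongoose');

// a Buffer-typed field is persisted as BSON binary data
var Image = mongoose.model('Image', new mongoose.Schema({
    data: Buffer,
    contentType: String
}));

new Image({ data: buffer, contentType: 'image/png' }).save(function(err) {
    if (err) console.error(err);
});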
Please try this, as your base64 might not be pre-handled:
// strip any data-URI prefix so only the raw base64 payload remains
var imgRawData = req.body.images[0].replace(/^data:image\/png;base64,|^data:image\/jpeg;base64,|^data:image\/jpg;base64,|^data:image\/bmp;base64,/, "");
var yourBuffer = new Buffer(imgRawData, "base64");
Then save yourBuffer into MongoDB.
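On current Node versions the deprecated constructor would be swapped for Buffer.from:

var yourBuffer = Buffer.from(imgRawData, "base64");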