Upload a binary encoded audio file via ajax and save - node.js

I have an audio file saved locally that I want to read, upload to a server via ajax and then store on the server. Somewhere along this process the file gets corrupted such that the file that's saved on the server cannot be played.
I'll list simplified bits of code that show the process I'm going through so hopefully it'll be evident where I'm going wrong.
1) After audio is recorded (using getUserMedia and MediaRecorder), a local file is saved:
var audioData = new Blob(chunks, { type: 'audio/webm' });
var fileReader = new FileReader();
fileReader.onloadend = function () {
    var buffer = this.result,
        uint8Array = new Uint8Array(buffer);
    fs.writeFile('path/to/file.webm', uint8Array, { flag: 'w' }, function (err) {
        if (err) console.error(err);
    });
};
fileReader.readAsArrayBuffer(audioData);
2) Later this local file is read and sent to a server (using the library axios to send the ajax request)
fs.readFile('path/to/file.webm', 'binary', (err, data) => {
    var formData = new FormData();
    formData.append('file', new Blob([data], { type: 'audio/webm' }), 'file.webm');
    axios.put('/upload', formData);
});
3) The server then handles this request and saves the file
[HttpPut]
public IActionResult Upload(IFormFile file)
{
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        file.CopyTo(fileStream);
    }
    return Ok();
}
The local audio file can be played successfully; however, the audio file on the server does not play.
I'm not sure if this is helpful information, but when I compare the first few lines of the local file and the server file in a text editor (Notepad++), they look kinda the same... but different. I've tried encoding this a myriad of different ways but everything seems to fail. Fingers crossed someone can point me in the right direction here.

The problem was with how I was passing the file contents from fs.readFile. If I pass the raw buffer from fs.readFile as a base64-encoded string via JSON, convert that to a byte array on the server and save that, then the file plays successfully on the server.
fs.readFile('path/to/file.webm', (err, data) => {
    axios.put('/upload', { audioData: data.toString('base64') });
});
[HttpPut]
public async Task<IActionResult> Upload([FromBody] UploadViewModel upload)
{
    var audioDataBytes = Convert.FromBase64String(upload.AudioData);
    using (var memoryStream = new MemoryStream(audioDataBytes))
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        await memoryStream.CopyToAsync(fileStream);
    }
    return Ok();
}

Actually, this is a character-encoding problem. You are probably mixing UTF-8 and ISO-8859-1, which corrupts the file.
You should set the charset in the HTML page to the one expected on the server, or perform preliminary checks on the server if you do not know the charset of the data you will receive.
Converting to base64 solves the issue because the payload then only uses characters in the ASCII range.
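For reference, the multipart upload from step 2 can also work without base64 if the 'binary' encoding argument is dropped, so fs.readFile hands back a Buffer instead of a latin1 string that gets re-encoded when wrapped in a Blob. A minimal sketch, assuming the same Electron-style environment as the question (Node's fs plus the browser Blob/FormData APIs) and the existing IFormFile controller:
fs.readFile('path/to/file.webm', (err, data) => {
    if (err) return console.error(err);
    // data is a Buffer (a Uint8Array subclass), so the Blob keeps the bytes intact
    var formData = new FormData();
    formData.append('file', new Blob([data], { type: 'audio/webm' }), 'file.webm');
    axios.put('/upload', formData);
});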

Related

Getting Failed - Network Error while downloading PDF file from Amazon S3

Goal: Try to download a pdf file from Amazon S3 to my local machine via a NodeJS/VueJS application without creating a file on the server's filesystem.
Server: Node.js (v18.9.0), Express (4.17.1)
Middleware function that retrieves the file from S3 and converts the stream into a base64 string and sends that string to the client:
const filename = 'lets_go_to_the_snackbar.pdf';
const s3 = new AWS.S3(some access parameters);
const params = {
    Bucket: do_not_kick_this_bucket,
    Key: `yellowbrickroad/${filename}`
}
try {
    const data = await s3
        .getObject(params)
        .promise();
    const byte_string = Buffer.from(data.Body).toString('base64');
    res.send(byte_string);
} catch (err) {
    console.log(err);
}
Client: VueJS (v3.2.33)
A function in the component receives the byte string via an axios (v0.26.1) GET call to the server. The code to download is as follows:
getPdfContent: async function (filename) {
    const resp = await AxiosService.getPdf(filename) // Get request to server made here.
    const uriContent = `data:application/pdf;base64,${resp.data}`
    const link = document.createElement('a')
    link.href = uriContent
    link.download = filename
    document.body.appendChild(link) // Also tried omitting this line along with...
    link.click()
    link.remove() // ...omitting this line
}
Expected Result(s):
Browser opens a window to allow a directory to be selected as the file's destination.
Directory Selected.
File is downloaded.
Ice cream and mooncakes are served.
Actual Results(s):
Browser opens a window to allow a directory to be selected as the file's destination
Directory Selected.
Receive Failed - Network Error message.
Lots of crying...
Browser: Chrome (Version 105.0.5195.125 (Official Build) (x86_64))
Read somewhere that Chrome will balk at files larger than 4MB, so I checked the S3 bucket and according to Amazon S3 the file size is a svelte 41.7KB.
After doing some reading, a possible solution was presented that I tried to implement. It involved making a change to the VueJs getPdfContent function as follows:
getPdfContent: async function (filename) {
    const resp = await AxiosService.getPdf(filename) // Get request to server made here.
    /**** This is the line that was changed ****/
    const uriContent = window.URL.createObjectURL(new Blob([resp.data], { type: 'application/pdf' }))
    const link = document.createElement('a')
    link.href = uriContent
    link.download = filename
    document.body.appendChild(link) // Also tried omitting this line along with...
    link.click()
    link.remove() // ...omitting this line
}
Actual Results(s) for updated code:
Browser opens a window to allow a directory to be selected as the file's destination
Directory Selected.
PDF file downloaded.
Trying to open the file produces the message:
The file “lets_go_to_the_snackbar.pdf” could not be opened.
It may be damaged or use a file format that Preview doesn’t recognize.
I am able to download the file directly from S3 using the AWS S3 console with no problems opening the file.
I've read through similar postings and tried implementing their solutions, but found no joy. I would be highly appreciative if someone can
Give me an idea of where I am going off the path towards reaching the goal
Point me towards the correct path.
Thank you in advance for your help.
After doing some more research I found the problem was how I was returning the data from the server back to the client. I did not need to modify the data received from the S3 service.
Server Code:
let filename = req.params.filename;
const params = {
    Bucket: do_not_kick_this_bucket,
    Key: `yellowbrickroad/${filename}`
}
try {
    const data = await s3
        .getObject(params)
        .promise();
    /* Here I did not modify the information returned */
    res.send(data.Body);
    res.end();
} catch (err) {
    console.log(err);
}
On the client side my VueJS component receives a Blob object as the response
Client Code:
async getFile (filename) {
    let response = await AuthenticationService.downloadFile(filename)
    const uriContent = window.URL.createObjectURL(new Blob([response.data]))
    const link = document.createElement('a')
    link.setAttribute('href', uriContent)
    link.setAttribute('download', filename)
    document.body.appendChild(link)
    link.click()
    link.remove()
}
In the end the goal was achieved; a file on S3 can be downloaded directly to a user's local machine without the application storing a file on the server.
I would like to mention Sunpun Sandaruwan's answer which gave me the final clue I needed to reach my goal.
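One detail the answer leaves implicit is the axios configuration for the request itself: for new Blob([response.data]) to preserve the PDF bytes, the response should be requested as a blob (or arraybuffer) rather than the default text/JSON. A hypothetical sketch of what AuthenticationService.downloadFile might look like (the service is not shown in the original post, and the URL is a placeholder):
// Assumed shape of the service call used above; not shown in the original post
import axios from 'axios'

export default {
    downloadFile (filename) {
        // responseType: 'blob' tells axios to hand the binary body back untouched
        return axios.get('/api/download/' + filename, { responseType: 'blob' })
    }
}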

How to send File through Websocket along with additional info?

I'm developing a Web application to send images, videos, etc. to two monitors from an admin interface. I'm using ws in Node.js for the server side. I've implemented selecting images available on the server and external URLs and sending them to the clients, but I also wanted to be able to directly send images selected from the device with a file input. I managed to do it using base64 but I think it's pretty inefficient.
Currently I send a stringified JSON object containing the client to which the resource has to be sent, the kind of resource and the resource itself, parse it in the server and send it to the appropriate client. I know I can set the Websocket binaryType to blob and just send the File object, but then I'd have no way to tell the server which client it has to send it to. I tried using typeson and BSON to accomplish this, but it didn't work.
Are there any other ways to do it?
You can send raw binary data through the WebSocket.
It's quite easy to manage.
One option is to prepend a "magic byte" (an identifier that marks the message as non-JSON). For example, prepend binary messages with the B character.
All the server has to do is test the first character before collecting the binary data (if the magic byte isn't there, it's probably the normal JSON message).
A more serious implementation will attach a header after the magic byte (e.g., file name, total length, position of the data being sent, etc.).
This allows the upload to be resumed after disconnections (send just the parts that weren't acknowledged as received).
Your server will need to split the data into magic byte, header and binary data before processing, but it's easy enough to accomplish.
Hope this helps someone.
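A minimal sketch of one possible framing, assuming the ws library on the server and the browser WebSocket/File APIs on the client; the exact header layout (a 2-byte length followed by a small JSON header) is my own choice here, not something prescribed by the answer:
// Client: build one binary frame = magic byte + header length + JSON header + file bytes
function sendFile(ws, file, targetClient) {
    const header = new TextEncoder().encode(
        JSON.stringify({ to: targetClient, name: file.name, size: file.size })
    );
    file.arrayBuffer().then((buf) => {
        const frame = new Uint8Array(1 + 2 + header.length + buf.byteLength);
        frame[0] = 'B'.charCodeAt(0);                            // magic byte
        new DataView(frame.buffer).setUint16(1, header.length);  // header length (big-endian)
        frame.set(header, 3);                                    // JSON header
        frame.set(new Uint8Array(buf), 3 + header.length);       // file bytes
        ws.send(frame);
    });
}
// Server (ws): check the magic byte before assuming the message is JSON
wss.on('connection', (socket) => {
    socket.on('message', (msg) => {
        const data = Buffer.from(msg);
        if (data[0] === 'B'.charCodeAt(0)) {
            const headerLen = data.readUInt16BE(1);
            const header = JSON.parse(data.slice(3, 3 + headerLen).toString('utf8'));
            const fileBytes = data.slice(3 + headerLen);
            // route fileBytes to the client named in header.to ...
        } else {
            const message = JSON.parse(data.toString('utf8')); // normal JSON message
        }
    });
});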
According to the socket.io documentation, you can send a string, a Buffer, or a mix of both.
On the client side:
function uploadFile(e, socket, to) {
    let file = e.target.files[0];
    if (!file) {
        return
    }
    if (file.size > 10000000) {
        alert('File should be smaller than 10MB')
        return
    }
    var reader = new FileReader();
    var rawData = new ArrayBuffer();
    reader.onload = function (e) {
        rawData = e.target.result;
        socket.emit("send_message", {
            type: 'attachment',
            data: rawData
        }, (result) => {
            alert("Server has received the file!")
        });
        alert("The file has been transferred.")
    }
    reader.readAsArrayBuffer(file);
}
On the server side:
socket.on('send_message', async (data, cb) => {
    if (data.type == 'attachment') {
        console.log('Found binary data')
        cb("Received file successfully.")
        return
    }
    // Process other business...
});
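The acknowledgement-only handler above can be extended to actually persist the upload; a hedged sketch (the uploads path and file name are placeholders I chose, and with socket.io an ArrayBuffer sent from the browser arrives on the Node side as a Buffer):
const fs = require('fs');
const path = require('path');

socket.on('send_message', (data, cb) => {
    if (data.type == 'attachment') {
        // data.data is a Buffer on the server side
        fs.writeFile(path.join(__dirname, 'uploads', 'attachment.bin'), data.data, (err) => {
            if (err) return cb("Failed to save file.");
            cb("Received file successfully.");
        });
        return;
    }
    // Process other business...
});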
I am using a pure WebSocket without socket.io, where you cannot mix content in one message - it is either a string or binary. My working solution looks like this:
CLIENT:
import { serialize } from 'bson';
import { Buffer } from 'buffer';
const reader = new FileReader();
let rawData = new ArrayBuffer();
ws = new WebSocket(...)
reader.onload = (e) => {
    rawData = e.target.result;
    const bufferData = Buffer.from(rawData);
    const bsonData = serialize({ // whatever js Object you need
        file: bufferData,
        route: 'TRANSFER',
        action: 'FILE_UPLOAD',
    });
    ws.send(bsonData);
}
Then on the Node server side, the message is caught and parsed like this:
const dataFromClient = deserialize(wsMessage, { promoteBuffers: true }) // edited
fs.writeFile(
    path.join('../server', 'yourfiles', 'yourfile.txt'),
    dataFromClient.file, // edited
    'binary',
    (err) => {
        console.log('ERROR!!!!', err);
    }
);
The key is the promoteBuffers option in the deserialize function.

Cannot get node to pipe data to a download file correctly

I'm fairly new to node and streaming, and I am having an issue when attempting to stream a large amount of data to a file on the client browser.
So, for example, if on the server I have a large file, test.txt, I can easily stream it to the client browser by setting the attachment header and piping the file to the response as follows.
res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');
fs.createReadStream('./test.txt')
.pipe(res);
When the user clicks the button, the download begins, and we see the data getting streamed to the download file. The stream takes several minutes, but during this time the client is not blocked and they can continue to do other things while the file is downloaded by the browser.
However, my data is not stored in a file; I need to retrieve it one string at a time from another server. So I'm attempting to create my own readable stream and push my data chunk by chunk, but it does not work when I do something like this:
var s = new Readable();
s.pipe(res);
for (let i = 0; i <= total; i++) {
    dataString = // code here to get next string needed to push
    s.push(dataString);
}
s.push(null);
With this code, once the download begins, the client is blocked and cannot perform any other actions until the download completes. Also, if the data takes more than 30 seconds to stream, we hit the server timeout and the download fails. With the file stream this is not an issue.
How do I get this to act like a file stream and not block the client from making other requests while it downloads? Any recommendations on the best way to implement this would be appreciated.
I was able to resolve this issue by doing something similar to what is described here:
How to call an asynchronous function inside a node.js readable stream
My basic code is as follows, and this is not blocking the client or timing out on the request as the data is continuously piped to the file download on the client side.
res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');
function MyStream() {
    var rs = new Readable();
    var hitsadded = 0;
    rs._read = function () {}; // needed to avoid "Not implemented" exception
    getResults(queryString, function getMoreUntilDone(err, res) {
        if (err) {
            logger.logError(err);
        }
        rs.push(res.data);
        hitsadded += res.records;
        if (res.recordsTotal > hitsadded) {
            getNextPage(query, getMoreUntilDone);
        } else {
            rs.push(null);
        }
    });
    return rs;
}
MyStream().pipe(zlib.createGzip()).pipe(res);
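One thing worth double-checking if the gzip step is kept: the browser only decompresses the download transparently when it is told the body is compressed; otherwise the saved myfile.text will contain raw gzip bytes. A hedged addition, set alongside the other headers:
res.setHeader('Content-Encoding', 'gzip');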

How to save pdf to android file system and then view PDF - react-native

I am using react-native-fs and I am trying to save a base64-encoded PDF file to my Android emulator's file system.
I receive the base64-encoded PDF from the server.
I then prefix the base64 string with a data URI header:
var pdfBase64 = 'data:application/pdf;base64,'+base64Str;
saveFile() function
saveFile(filename, pdfBase64) {
    // create a path you want to write to
    var path = RNFS.DocumentDirectoryPath + '/' + filename;
    // write the file
    RNFS.writeFile(path, pdfBase64, 'base64').then((success) => {
        console.log('FILE WRITTEN!');
    })
    .catch((err) => {
        console.log("SaveFile()", err.message);
    });
}
Error
When I try saving the pdfBase64 the saveFile() function catches the following error:
bad base-64
Question
Can anyone tell me where or what I am doing wrong?
Thanks.
For anyone having the same problem, here is the solution.
Solution
react-native-pdf-view must be given the file path of the saved PDF, not the base64 string itself.
First, I used react-native-fetch-blob to request the PDF base64 from the server (because the RN fetch API did not support blobs at the time).
I also discovered that react-native-fetch-blob has a FileSystem API that is far better documented and easier to understand than the 'react-native-fs' library (check out its FileSystem API documentation).
Receiving base64 pdf and saving it to a file path:
var RNFetchBlob = require('react-native-fetch-blob').default;
const DocumentDir = RNFetchBlob.fs.dirs.DocumentDir;
getPdfFromServer: function (uri_attachment, filename_attachment) {
    return new Promise((RESOLVE, REJECT) => {
        // Fetch attachment
        RNFetchBlob.fetch('GET', config.apiRoot + '/app/' + uri_attachment)
            .then((res) => {
                let base64Str = res.data;
                let pdfLocation = DocumentDir + '/' + filename_attachment;
                // write the plain base64 string (no data URI prefix) to disk
                RNFetchBlob.fs.writeFile(pdfLocation, base64Str, 'base64');
                RESOLVE(pdfLocation);
            })
            .catch((error) => {
                // error handling
                console.log("Error", error);
                REJECT(error);
            });
    });
}
What I was doing wrong was that, instead of saving the plain base64 string to the file location as in the example above, I was saving it with the data URI prefix attached:
var pdf_base64 = 'data:application/pdf;base64,' + pdf_base64Str;
which was wrong.
Populate PDF view with file path:
<PDFView
    ref={(pdf) => { this.pdfView = pdf; }}
    src={pdfLocation}
    style={styles.pdf}
/>
There is a new package to handle the fetching (based on react-native-fetch-blob) and displaying of the PDF via URL: react-native-pdf.
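A minimal sketch of that package's basic usage (the URL here is just a placeholder, not from the original post):
import Pdf from 'react-native-pdf';

<Pdf
    source={{ uri: 'http://example.com/test.pdf', cache: true }}
    style={{ flex: 1 }}
/>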
Removing the application type prefix from the base64 string worked for me. Change:
var pdfBase64 = 'data:application/pdf;base64,' + base64Str;
to:
var pdfBase64 = base64Str;

hitting a multipart url in nodejs

I have client code using the form-data module to hit a URL that returns a content-type of image/jpeg. Below is my code:
var FormData = require('form-data');
var fs = require('fs');
var form = new FormData();
//form.append('POLICE', "hello");
//form.append('PAYSLIP', fs.createReadStream("./Desert.jpg"));
console.log(form);
//https://fbcdn-profile-a.akamaihd.net/hprofile-ak-xfp1/v/t1.0- 1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a4703494c6c1&oe=55ADC7C8&__gda__=1436921313_bf58cbf91270adcd7b29241838f7d01a
form.submit({
    protocol: 'https:',
    host: 'fbcdn-profile-a.akamaihd.net',
    path: '/hprofile-ak-xfp1/v/t1.0-1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a3494c6c1&oe=55ADCC8&__gda__=1436921313_bf58cbf91270adcd7b2924183',
    method: 'get'
}, function (err, res) {
    var data = "";
    res.on("data", function (chunks) {
        data += chunks;
    });
    res.on("end", function () {
        console.log(data);
        console.log("Response Headers - " + JSON.stringify(res.headers));
    });
});
I'm getting some chunked data, and the response headers I received were:
{"last-modified":"Thu, 12 Feb 2015 09:49:26 GMT","content-type":"image/jpeg","timing-allow-origin":"*","access-control-allow-origin":"*","content-length":"1443","cache-control":"no-transform, max-age=1209600","expires":"Thu, 30 Apr 2015 07:05:31 GMT","date":"Thu, 16 Apr 2015 07:05:31 GMT","connection":"keep-alive"}
I am now stuck on how to process the response I received into a proper image. I tried base64 decoding, but it seemed to be the wrong approach. Any help will be much appreciated.
I expect that data, once the file has been completely downloaded, contains a Buffer.
If that is the case, you should write the buffer as is, without any decoding, to a file:
fs.writeFile('path/to/file.jpg', data, function onFinished (err) {
// Handle possible error
})
See fs.writeFile() documentation - you will see that it accepts either a string or a buffer as data input.
Extra awesomeness by using streams
Since the res object is a readable stream, you can simply pipe the data directly to a file, without keeping it in memory. This has the added benefit that if you download a really large file, Node.js will not have to keep the whole file in memory (as it does now), but will write it to the filesystem continuously as it arrives.
form.submit({
    // ...
}, function (err, res) {
    // res is a readable stream, so let's pipe it to the filesystem
    var file = fs.createWriteStream('path/to/file.jpg')
    res.pipe(file) // Send the incoming file to the filesystem
    file.on('finish', function writeDone () {
        // File is saved
    })
})
The chunk you got is the raw image. Do whatever it is you want with the image, save it to disk, let the user download it, whatever.
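If you prefer the accumulate-then-write approach from the question over piping, the chunks need to be collected as Buffers instead of being concatenated onto a string (the string concatenation is what corrupts the image). A minimal sketch of that change, assuming the same form.submit call as above:
form.submit({
    // ... same options as above
}, function (err, res) {
    var chunks = [];
    res.on('data', function (chunk) {
        chunks.push(chunk); // keep each chunk as a Buffer
    });
    res.on('end', function () {
        var imageBuffer = Buffer.concat(chunks); // the raw JPEG bytes
        fs.writeFile('path/to/file.jpg', imageBuffer, function (err) {
            // Handle possible error
        });
    });
});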
So if I understand your question clearly, you want to download a file from an HTTP endpoint and save it to your computer, right? If so, you should look into using the request module instead of using form-data.
Here's a contrived example for downloading things using request:
var fs = require('fs');
var request = require('request')
request('http://www.example.com/picture.jpg')
.pipe(fs.createWriteStream('picture.jpg'))
Where 'picture.jpg' is the location to save to disk. You can open it up using a normal file browser.
