Get mp3 file from server by fetch - node.js

I have a server written in node.js. The client sends a GET request via fetch with the URL of an mp3 file stored on the server. My goal is to send the mp3 file back to the client so that it can be played. I wrote something like this:
if (req.url.indexOf(".mp3") != -1) {
  fs.readFile(__dirname + decodeURI(req.url), function (error, data) {
    if (error) {
      // bail out instead of writing undefined data when the read fails
      res.writeHead(404);
      res.end();
      return;
    }
    res.setHeader("Access-Control-Allow-Origin", "*");
    res.writeHead(200, {
      "Content-type": "audio/mpeg",
    });
    res.write(data);
    res.end();
  });
}
but I get this error: Uncaught (in promise) SyntaxError: Unexpected token I in JSON at position 0
Also, here it is client side:
fetch("http://localhost:3000/static/mp3/" + value, { method: "get" })
  .then((response) => response.json())
  .then((data) => (this.song = data));
document.getElementById("audio_src").src =
  "http://localhost:3000/" + this.song;

In the client, you're calling response.json(), but the response you're getting back is NOT json. The data you're getting back is binary. Perhaps you should be calling response.blob()?
But then you're trying to put the binary data into a URL as text. And you're not handling the asynchronous nature of fetch() properly either. No, this is not the way to do things. You could create a data-encoded URL, but there's really no point in doing it that way, since whatever audio element you're using in the HTML page can fetch the MP3 from the URL by itself.
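If you really did want to fetch the audio yourself, a minimal sketch (assuming the same endpoint) would read the body as a Blob and hand the element an object URL instead of parsing JSON:
fetch("http://localhost:3000/static/mp3/" + value)
  .then((response) => response.blob()) // read the body as binary, not JSON
  .then((blob) => {
    // point the element at a temporary in-memory URL for the blob
    document.getElementById("audio_src").src = URL.createObjectURL(blob);
  });
But there's rarely a reason to do this when the element can load the URL directly.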
I might suggest something simpler in the client:
document.getElementById("audio_src").src = "http://localhost:3000/static/mp3/" + value;
And let the browser's HTML tag go get the MP3 for you. I'm assuming that the element represented by audio_src is something that knows how to play MP3 audio sources on its own. If so, that means you just give it the URL and it will fetch and play it on its own.
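A minimal sketch of that approach, assuming audio_src refers to an <audio> element (the markup here is an assumption, not from the original question):
// Assumes the page contains: <audio id="audio_src" controls></audio>
const audio = document.getElementById("audio_src");
audio.src = "http://localhost:3000/static/mp3/" + value; // the browser fetches the MP3 itself
audio.play(); // optional; most browsers require a prior user gesture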

Related

express.js and request.js - incomplete PDF transfer when using callback syntax

Simplified question
why, when using express.js & request.js, do the following two examples:
request.get(url)
  .on('response', (requestjsResponse) => {
    requestjsResponse.pipe(res);
  })
and
request.get(url, (err, requestjsResponse, requestjsBody) => {
  res.send(requestjsBody)
})
tend not to produce the same results, even when requestjsBody contains the expected content?
Detailed question
I have two express.js versions of a route handler that handles file proxying for multiple file types. The code uses standard express.js req/res/next notation. The important non-code background for this issue is that the two most commonly returned types are handled as follows:
PDF: shall be opened within the browser; their size is usually no less than 18K (according to the content-length header)
EML: shall be downloaded; their size is usually smaller than 16K (according to the content-length header)
Both handler versions use request.js, one with the
get(url: string, callback: (Error, Response, Body) => void)
form, which I'll refer to as the callback form, where the entire body is expected inside the callback. In this case, the response is sent to the user with plain express.js res.send(Body). The other one uses the form
get(url: string).on(event: 'response', callback: (request.Response) => void)
which I'll refer to as the event/pipe form; it transfers the response to the end user by piping it with request.Response.pipe(res) inside the 'response' handler. Details are provided in the code listing.
I'm unable to find the difference between those two forms, but:
In the case of .eml files (MIME message/rfc822; you can treat them as fancy HTML), both versions work exactly the same way: the file is downloaded nicely.
In the case of .pdf, when using the event/pipe form get(url).on('response', callback), I'm able to successfully transfer the PDF document to the client. When I use the callback form (i.e. get(url: string, callback: (Error, Response, Body) => void)), even though the body I peek at in the debugger seems to be a complete PDF (it contains the PDF header, the EOF marker, etc.), the client receives only some strange preamble declaring HTML:
<!doctype html><html><body style='height: 100%; width: 100%; overflow: hidden; margin:0px; background-color: rgb(82, 86, 89);'><embed style='position:absolute; left: 0; top: 0;'width='100%' height='100%' src='about:blank' type='application/pdf' internalid='FD93AFE96F19F67BE0799686C52D978F'></embed></body></html>
but no PDF document is received afterwards. Chrome claims that it was unable to load the document.
Please see code:
Non-working callback version:
request.get(url, (err, documentResponse, documentBody) => {
  if (err) {
    logger.error('Document Fetch error:');
    logger.error(err);
  } else {
    const documentResponseContentLength = Number.parseInt(documentResponse.headers['content-length'], 10);
    if (documentResponseContentLength === 0 || Number.isNaN(documentResponseContentLength)) {
      logger.warn('No content provided for requested document or length header malformed');
      res.redirect(get404Navigation());
    }
    if (mimetype === 'application/pdf') {
      logger.info(' overwriting Headers (PDF)');
      res.set('content-type', 'application/pdf');
      // eslint-disable-next-line max-len, prefer-template
      res.set('content-disposition', 'inline; filename="someName.pdf"');
      logger.info('Document Download Headers (overridden):', res.headers);
    }
    if (mimetype === 'message/rfc822') {
      logger.info(' overwriting Headers (message/rfc822)');
      res.set('content-type', 'message/rfc822');
      // eslint-disable-next-line max-len, prefer-template
      res.set('content-disposition', 'attachment; filename="someName.eml"');
      logger.info('Document Download Headers (overridden):', res.headers);
    }
    res.send(documentBody) /* Sending message to client */
  }
})
  .on('data', (d) => {
    console.log('We are debugging here')
  })
Working event based/piped version:
const r = request
  .get(url)
  .on('response', (documentsResponse) => {
    if (Number.parseInt(documentsResponse.headers['content-length'], 10) !== 0) {
      // Overwrite headers for PDF and TIFF; these occasionally arrive incomplete
      if (mimetype === 'application/pdf') {
        logger.info(' overwriting Headers (PDF)');
        res.set('content-type', 'application/pdf');
        res.set('content-disposition', 'inline; filename="someName.pdf"')
        logger.info('Document Download Headers (overridden):', documentsResponse.headers);
      }
      if (mimetype === 'message/rfc822') {
        logger.info(' overwriting Headers (message/rfc822)');
        res.set('content-type', 'message/rfc822');
        res.set('content-disposition', 'attachment; filename="someName.eml"');
        logger.info('Document Download Headers (overridden):', res.headers);
      }
      r.pipe(res); /* Response is piped to client */
    } else {
      res.redirect(get404Navigation());
    }
  })
  .on('data', (d) => {
    console.log('We are debugging here')
  })
Even though the part with r.pipe(res) seems extra suspicious (see where r is declared and where it is used), this is the version that works correctly for both cases.
I assumed the issue might be caused by the nature of sending multipart content, so I added the additional on('data', (d) => {}) callbacks and set breakpoints to see when the response is ended/piped vs. when the data handler is called, and the results match my expectations:
In the request(url, (err, response, body)) case, the data handler is called twice before the callback executes, and the entire body is accessible inside the callback, so it's even more obscure to me that I'm unable to simply res.send it.
In the request.get(url).on('response') case, piping to res happens first, then the data handler is called twice. I believe the internal guts of the node.js HTTP engine are doing the asynchronous trick and pushing responses one after another as each response chunk is received.
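One difference between the two forms that may be worth ruling out (an assumption on my part, based on the request documentation rather than on the code above): in the callback form, request decodes the body into a UTF-8 string unless encoding: null is passed in the options, which silently corrupts binary payloads such as PDFs. A minimal sketch of the callback form with that option:
request.get({ url, encoding: null }, (err, documentResponse, documentBody) => {
  // with encoding: null, documentBody is a Buffer rather than a decoded
  // string, so binary content such as PDF survives res.send() unmangled
  res.set('content-type', 'application/pdf');
  res.send(documentBody);
});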
I'll be glad for any explanation of what I'm doing wrong and what I can change to make my callback version work as expected for the PDF case.
Epilogue:
Why is such code used? Our backend retrieves PDF data from an external server that is not exposed to the public internet, but due to legacy reasons some headers are set incorrectly (mainly Content-Disposition), so we intercept them and act as a kind of alignment proxy between the data source and the client.

Axios request with chunked response stream from Node

I have a client app in React, and a server in Node (with Express).
At the server side, I have an endpoint like the following (it's not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Transfer-Encoding': 'chunked'
  });
  for (let x = 0; x < 1000; x++) {
    res.write(some_string + '\n');
    wait(a_couple_of_seconds); // just to make the process slower for testing purposes
  }
  res.end();
}
This works perfectly; when I call this endpoint, I receive the whole stream with all 1,000 rows.
The thing is that I cannot manage to get this data in chunks (per 'write', or per bunch of 'writes') in order to show it on the frontend as soon as I receive it (think of a table that shows the rows as soon as I get them from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
  const { data, headers } = await Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    responseType: 'stream',
    timeout: 0,
  });
  // this next line doesn't work. it says that 'on' is not a function
  data.on('data', chunk => console.log('chunk', chunk));
  // data actually holds the whole response data (all the rows)
  return Promise.resolve();
}
The thing is that the Axios call returns the whole data object only after res.end() is called on the server, but I need to get the data as soon as the server starts sending the chunks with the rows (on each res.write, or whenever the server thinks it's ready to send a bunch of chunks).
I have also tried skipping await and reading the value of the promise in the then() of the axios call, but the behavior is the same: the 'data' value arrives with all the 'writes' together once the server calls res.end().
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like websockets to solve it.
Any help would be greatly appreciated, because I have read a lot but couldn't get a working solution yet.
For anyone interested in this, what I ended up doing is the following:
On the client side, I used the Axios onDownloadProgress handler, which allows handling of progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
  return Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    onDownloadProgress: progressEvent => {
      const dataChunk = progressEvent.currentTarget.response;
      // dataChunk contains the data that has been obtained so far (the whole data so far)..
      // So here we do whatever we want with this partial data..
      // In my case I'm storing it in a redux store that is used to
      // render a table, so now table rows are rendered as soon as
      // they are obtained from the endpoint.
    }
  }).then(({ data }) => Promise.resolve(data));
}
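Since progressEvent.currentTarget.response holds the cumulative body rather than only the newest chunk, here is a sketch of how to process just the new rows (assuming newline-delimited rows, as in the endpoint above; handleProgress is a hypothetical name):
let consumed = 0; // characters already handed off to the table

const handleProgress = progressEvent => {
  const full = progressEvent.currentTarget.response; // everything received so far
  const lastNewline = full.lastIndexOf('\n');
  if (lastNewline < consumed) return; // no complete new row yet
  const freshRows = full.slice(consumed, lastNewline).split('\n').filter(Boolean);
  consumed = lastNewline + 1;
  freshRows.forEach(row => {
    // e.g. dispatch each row to the redux store that renders the table
    console.log('new row', row);
  });
};
Passing handleProgress as the onDownloadProgress option then handles each complete row exactly once.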

Send an image as the body of a request, image recived with a request from outside

Yeah, I kinda didn't know how to type the title well...
I have a node server which receives an image via a POST form. I then want to send this image to Microsoft Vision and the equivalent Google service in order to gather information from both, do some stuff, and return a result to the user that has accessed my server.
My problem is: how do I send the actual data?
This is the actual code that takes care of that:
const microsofComputerVision = require("microsoft-computer-vision");
module.exports = function(req, res)
{
  var file;
  if (req.files)
  {
    file = req.files.file;
    // Everything went fine
    microsofComputerVision.analyzeImage(
    {
      "Ocp-Apim-Subscription-Key": vision_key,
      "content-type": "multipart/form-data",
      "body": file.data.toString(),
      "visual-features": "Tags, Faces",
      "request-origin": "westcentralus"
    }).then((result) =>
    {
      console.log("A");
      res.write(result);
      res.end();
    }).catch((err) =>
    {
      console.log(err);
      res.writeHead(400, {'Content-Type': 'application/json'});
      res.write(JSON.stringify({error: "The request must contain an image"}));
      res.end();
    });
  }
  else
  {
    res.writeHead(400, {'Content-Type': 'application/octet-stream'});
    res.write(JSON.stringify({error: "The request must contain an image"}));
    res.end();
  }
}
If instead of calling analyzeImage I do the following:
res.set('Content-Type', 'image/jpg')
res.send(file.data);
res.end();
The browser renders the image correctly, which made me think file.data contains the actual file (considering it's of type Buffer).
But apparently Microsoft does not agree, because when I send the request to Computer Vision I get the following response:
"InvalidImageFormat"
The only examples I found are here, and the "data" used in that example comes from a file system read, not straight from a request. But saving the file just to load it and then delete it looks like a horrible workaround to me, so I'd rather know in what form and how I should work on the "file" I have, in order to send it correctly for the API calls.
Edit: if I use file.data (which I thought was the most correct approach, since it would send the raw image as the body) I get an error saying that I must use a string or a buffer as content. So apparently file.data is not a buffer in the way "body" requires. I'm honestly not understanding this.
Solved, and the error was quite stupid. The request asked for a buffer, and file.data is indeed a buffer, so passing file.data as the body (without toString()) was correct all along; every attempt that used toString() on file.data was rejected. After checking the type of file.data every possible way, I started looking for other problems, and the actual error was, forgive my being stupid, too stupid to be evident: the result is JSON, and res.write does not accept a JSON object as an argument.
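Put together, a sketch of the working combination the notes above describe: pass the Buffer through untouched and stringify the JSON result before writing it (the application/octet-stream content type is an assumption based on the Azure docs for raw-bytes uploads, not part of the original post):
microsofComputerVision.analyzeImage(
{
  "Ocp-Apim-Subscription-Key": vision_key,
  "content-type": "application/octet-stream", // raw bytes, assumed per Azure docs
  "body": file.data, // the Buffer itself, no toString()
  "visual-features": "Tags, Faces",
  "request-origin": "westcentralus"
}).then((result) =>
{
  res.writeHead(200, {'Content-Type': 'application/json'});
  res.write(JSON.stringify(result)); // res.write rejects plain objects
  res.end();
});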
This is how I did it with the Amazon Rekognition image classifier. I know it's not the same service you're using - hoping this helps a little, though:
const fs = require('fs');

const imagePath = `./bat.jpg`;
const bitmap = fs.readFileSync(imagePath); // read the image into a Buffer
const params = {
  Image: { Bytes: bitmap },
  MaxLabels: 10,
  MinConfidence: 50.0
};

route.post('/', upload.single('image'), (req, res) => {
  rekognition.detectLabels(params, function(err, data) {
    if (err) {
      console.log('error');
    } else {
      console.log(data);
      res.json(data);
    }
  });
});

Node.js - Stream Binary Data Straight from Request to Remote server

I've been trying to stream binary data (PDF, images, other resources) directly from a request to a remote server but have had no luck so far. To be clear, I don't want to write the document to any filesystem. The client (browser) will make a request to my node process which will subsequently make a GET request to a remote server and directly stream that data back to the client.
var request = require('request');
app.get('/message/:id', function(req, res) {
  // db call for specific id, etc.
  var options = {
    url: 'https://example.com/document.pdf',
    encoding: null
  };
  // First try - unsuccessful
  request(options).pipe(res);
  // Second try - unsuccessful
  request(options, function (err, response, body) {
    var binaryData = body.toString('binary');
    res.header('content-type', 'application/pdf');
    res.send(binaryData);
  });
});
Putting both body and binaryData in a console.log shows that the proper data is there, but the PDF that is subsequently downloaded is corrupt. I can't figure out why.
Wow, never mind. I found out that Postman (Chrome App) was hijacking the request and response somehow. The // First try example in my code excerpt works properly in the browser.
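For reference, a minimal sketch of the working pipe approach, with the content type forwarded from the remote response (the header forwarding is an addition of mine, not part of the original excerpt):
var request = require('request');

app.get('/message/:id', function(req, res) {
  request('https://example.com/document.pdf')
    .on('response', function(remote) {
      // forward the upstream content type so the browser renders the PDF
      res.setHeader('content-type', remote.headers['content-type'] || 'application/pdf');
    })
    .pipe(res);
});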

Node.js response.write buffer limit restrictions

I am using Node.js with some additional modules to do web page scraping and media item identification from a set of websites.
The node server basically throws back a JSON markup of all the items identified on the page and their associated metadata. The JSON data is generated correctly, as I can see it in the server logs; however, when I write it to the client, the JSON response is terminated for some reason.
I tested this with all browsers and with REST clients, and it seems to point to an issue with response.write(response, 'utf-8'), which may not be sending the whole data, or the connection may be getting closed for some reason.
I verified that there is no chunking involved for my test cases, so there is no question of the connection being aggressively closed by the client while it's still waiting for the next chunk of data; i.e. response.write in this case returns true, which implies that all the data has been written to the client.
Any pointers as to what could be causing the connection to be terminated or the response to be truncated? For JSON responses of smaller sizes the response is received correctly by the client.
Code:
return parseDOM(page, url, function(err, response) {
  if (err) {
    res.writeHeader(200, {'Content-Type': 'application/json'});
    res.end('Error Parsing DOM from ' + url);
    e.message = 'Error Parsing DOM';
    callback(e, req, res, targetUrl);
    return;
  }
  else {
    if (response) {
      res.writeHeader(200, {'Content-Type': 'application/json', 'Content-Length': response.length});
      console.log(response);
      res.write(response, 'UTF-8');
      res.end();
      callback(null, req, res, targetUrl);
      return;
    }
  }
});
Sorry. My bad. I see that the content length is wrong. I identified the solution via this issue:
Node.js cuts off files when serving over HTTPS
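In other words, Content-Length must be the byte length of the encoded payload, not the JavaScript string length. A one-line sketch of the fix:
res.writeHead(200, {
  'Content-Type': 'application/json',
  // string .length counts UTF-16 code units; multi-byte characters make the
  // UTF-8 body longer, so the response was truncated at the declared length
  'Content-Length': Buffer.byteLength(response, 'utf-8')
});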
