I'm querying a data service with a RESTful URL. The server (also Node) sends about 955k of JSON data.
1) I can curl the data and get the correct result, i.e., it passes JSON.parse().
2) From Node, I can exec("curl ..."); and also get the correct result.
3) Using both Request and Axios, I get about 600k of data. The precise number of characters changes each time.
4) Using Axios, I streamed the data and got many 'data' events, which I concatenated into a file. The result was also incorrect.
5) It works fine with a smaller payload.
Experts Unite!! I am at your mercy. I will supplicate and offer praise and thanks for your aid.
Without your help, I will have a production application that uses CURL from NodeJS and evil will win.
Sincerely,
TQ White II
UPDATE: I was asked for a code snippet. Here it is:
const datGetterWORKS_FOR_SMALL_DATA_LOADS = (element, next) => {
    const localCallback = sendToTransformerCallback(element, next);
    const { url, headers } = networkSpecs.connection;
    axios.get(url + element.urlSegment, {
        method: 'get',
        responseType: 'json',
        headers: headers,
        maxContentLength: 6000000,
    })
        .then(function (response) {
            localCallback('', response, response.data);
        });
};
Note that this is given to a require('async').each() process.
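For reference, here is a minimal sketch of the streaming variant described in point 4, assuming the same url, headers, and element as in the snippet above; piping the response avoids concatenating 'data' events by hand (the output path is hypothetical):

```
const fs = require('fs');

axios.get(url + element.urlSegment, { responseType: 'stream', headers })
    .then((response) => {
        const out = fs.createWriteStream('payload.json'); // hypothetical output path
        response.data.pipe(out);
        out.on('finish', () => console.log('download complete'));
    });
```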
I have a use case where I need to check whether an HTTP resource is available (returns HTTP 200), but I don't want to download the response body because it might be large and my monthly transfer is limited.
In C#'s HttpClient there is an option, HttpCompletionOption.ResponseHeadersRead, that does exactly that.
I'm wondering if this can be done in Node.js, preferably using the axios library. My only idea is to set maxContentLength:
axios({
    method: 'GET',
    url: 'https://<any-url>/',
    maxContentLength: 1000,
})
But I don't like this idea: it's a hack and semantically wrong, and sizing the maxContentLength parameter may turn a regular HTTP 404 into a "content too large" error, so my code would incorrectly assume that the resource is available but too large.
var apiUrl = 'https://api.domain.com/v1/products/';
axios.head(apiUrl)
    .then((response) => {
        console.log('status', response.status);
    })
    .catch((e) => {
        console.log('error', e);
    });
Instead of a GET request, you can use a HEAD request; you will get the necessary status code along with header information such as content-length and content-type.
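If a server does not allow HEAD, one fallback sketch (assuming Node.js, where responseType: 'stream' yields an http.IncomingMessage) is to request a stream and destroy it as soon as the headers arrive, so the body is never downloaded:

```
async function isAvailable(url) {
    const response = await axios.get(url, {
        responseType: 'stream',
        validateStatus: () => true, // resolve on any status instead of throwing
    });
    response.data.destroy(); // drop the connection before the body downloads
    return response.status === 200;
}
```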
I have a client app in React, and a server in Node (with Express).
At the server side, I have an endpoint like the following (it's not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/plain',
        'Transfer-Encoding': 'chunked'
    });
    for (let x = 0; x < 1000; x++) {
        res.write(some_string + '\n');
        wait(a_couple_of_seconds); // just to make the process slower for testing purposes
    }
    res.end();
}
This works perfectly: when I call this endpoint, I receive the whole stream with all 1,000 rows.
The thing is that I cannot manage to get this data in chunks (for each 'write', or for a bunch of 'writes') in order to show it on the frontend as soon as I receive it (think of a table that shows the rows as soon as I get them from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
    const { data, headers } = await Axios({
        url: `http://the.api.url/endpoint`,
        method: 'GET',
        responseType: 'stream',
        timeout: 0,
    });
    // this next line doesn't work: it says that 'on' is not a function
    data.on('data', chunk => console.log('chunk', chunk));
    // data actually holds the whole response (all the rows)
    return Promise.resolve();
}
The thing is that the Axios call returns the whole data object only after 'res.end()' is called on the server, but I need to get the data as soon as the server starts sending chunks (on each res.write, or whenever the server decides to flush a bunch of chunks).
I have also tried dropping the await and reading the promise's value in the 'then()' of the axios call, but the behavior is the same: the 'data' value arrives with all the 'writes' together once the server calls 'res.end()'.
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like WebSockets to solve it.
Any help would be much appreciated, because I've read a lot but couldn't get a working solution yet.
For anyone interested in this, what I ended up doing is the following:
At the client side, I used the Axios onDownloadProgress handler, which allows handling of progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
    return Axios({
        url: `http://the.api.url/endpoint`,
        method: 'GET',
        onDownloadProgress: progressEvent => {
            const dataChunk = progressEvent.currentTarget.response;
            // dataChunk contains the data obtained so far (the whole response so far),
            // so here we do whatever we want with this partial data.
            // In my case I'm storing it in a redux store that is used to
            // render a table, so table rows are rendered as soon as
            // they are obtained from the endpoint.
        }
    }).then(({ data }) => Promise.resolve(data));
}
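An alternative sketch for the same problem, assuming the same endpoint and a browser that supports response streaming: the fetch API exposes the response body as a ReadableStream, so each chunk can be processed as it arrives instead of re-reading the accumulated text on every progress event.

```
async function getDataFromStream() {
    const response = await fetch('http://the.api.url/endpoint');
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        // value is a Uint8Array holding just the newly arrived bytes
        console.log('chunk', decoder.decode(value, { stream: true }));
    }
}
```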
I am trying to implement a REST API using React and Node. How do I take the JSON produced by the front end (React) when dragging and dropping images into a template (e.g. "https://www.canva.com/templates/") and store that JSON in MongoDB using Node.js?
Thanks in advance.
You can use the fetch API to call a particular route and send data along with it to the Node.js backend.
You just need to do something like this:
async function sendData(payload) {
    let res = await fetch(url, {
        method: 'POST',
        mode: 'cors', // must be lowercase; 'CORS' is not a valid mode
        body: JSON.stringify(payload), // the JSON object you want to send, serialized
        headers: { 'Content-Type': 'application/json' }, // plus any others required
    });
}
Hope this helps!
Since you asked how to send JSON to Node.js, I'm assuming you do not yet have an API that your front end can use.
To send data to the back end you need to create an API that accepts data.
You can do this quickly and easily using express.js.
Once the server is running and has an endpoint to send data to, you can create a request (when sending data, it should be a POST request).
This can be done in many different ways, although I would suggest trying axios; a sketch of both sides follows below.
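A minimal sketch of that setup, assuming express and mongoose are installed; the /templates route and the Template model are hypothetical names for illustration:

```
const express = require('express');
const app = express();

app.use(express.json()); // parse incoming JSON request bodies

app.post('/templates', async (req, res) => {
    // Template is a hypothetical mongoose model; req.body is the JSON
    // sent from the React front end.
    const saved = await Template.create(req.body);
    res.status(201).json(saved);
});

app.listen(3000);
```

The matching client-side call would then be something like axios.post('/templates', templateJson).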
Hope this helped.
Check this example, which gets the JSON value so you can update your state:
axios.get('https://jsonplaceholder.typicode.com/todos/' + this.props.id + '/')
    .then((res) => {
        this.setState({
            // do some action
        });
    })
    .catch(function (error) {
        console.log(error);
    });
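To then persist the update mentioned above, a follow-up request along these lines should work (updatedTodo is a hypothetical object holding the edited fields):

```
axios.put('https://jsonplaceholder.typicode.com/todos/' + this.props.id + '/', updatedTodo)
    .then((res) => {
        console.log('updated', res.status);
    })
    .catch(function (error) {
        console.log(error);
    });
```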
PhantomJS's web server does not support multipart requests, so I'm trying to send a single-part request from Node.js.
Unfortunately the Node.js example looks to be multipart. Is there any way of doing this with Node.js?
http://nodejs.org/api/http.html#http_http_request_options_callback
EDIT:
In the Node.js docs it mentions:
Sending a 'Content-Length' header will disable the default chunked encoding.
But unfortunately it's still multipart, just not multi-multipart :P
EDIT 2: It's a bit hard to show a distilled example, but here goes:
Node.js code (it's TypeScript):
```
// send our POST body (our clientRequest)
var postBody = "hello";
var options: __node_d_ts.IRequestOptions = {
    host: host,
    port: port,
    path: "/",
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        "Content-length": postBody.length
    }
};
//logger.assert(false);
var clientRequest = http.request(options, (response: http.IncomingMessage) => {
    // callback stuff here
});
clientRequest.on("error", (err) => {
    thisObj.abort("error", "error, request error", err);
});
//clientRequest.write();
clientRequest.end(postBody);
```
When I read the results from PhantomJS, the post/postRaw fields are null.
When I use a tool like the Chrome "Advanced REST Client" extension to send a POST body, PhantomJS gets it no problem.
I don't have a network sniffer, but as described here, PhantomJS doesn't work with multipart, so I think that's a good guess: How can I send POST data to a phantomjs script
EDIT 3:
Indeed, here's the request PhantomJS gets from my Chrome extension (a valid post):
//cookie, userAgent, and Origin headers removed for brevity
{"headers":{"Accept":"*/*","Accept-Encoding":"gzip,deflate,sdch","Accept-Language":"en-US,en;q=0.8,ko;q=0.6","Connection":"keep-alive","Content-Length":"5","Content-Type":"application/json","DNT":"1","Host":"localhost:41338", "httpVersion":"1.1","method":"POST","post":"hello","url":"/"}
And here's the request PhantomJS gets from the Node.js code shown above:
//full request, nothing omitted!
{"headers":{"Connection":"keep-alive","Content-Type":"application/json","Content-length":"5","Host":"10.0.10.15:41338"},"httpVersion":"1.1","method":"POST","url":"/"}
I am trying to get data from the Bing search API, and since the existing libraries seem to be based on old, discontinued APIs, I thought I'd try it myself using the request library, which appears to be the most common library for this.
My code looks like this:
var SKEY = "myKey....",
    ServiceRootURL = 'https://api.datamarket.azure.com/Bing/Search/v1/Composite';

function getBingData(query, top, skip, cb) {
    var params = {
            Sources: "'web'",
            Query: "'" + query + "'",
            '$format': "JSON",
            '$top': top,
            '$skip': skip
        },
        req = request.get(ServiceRootURL).auth(SKEY, SKEY, false).qs(params);
    request(req, cb);
}

getBingData("bookline.hu", 50, 0, someCallbackWhichParsesTheBody);
Bing returns some JSON and I can work with it sometimes, but if the response body contains a large amount of non-ASCII characters, JSON.parse complains that the string is malformed. I tried switching to an ATOM content type, but there was no difference; the XML was invalid. Inspecting the response body as available in the request() callback actually shows mangled characters.
So I tried the same request with some Python code, and that appears to work fine all the time. For reference:
r = requests.get(
    'https://api.datamarket.azure.com/Bing/Search/v1/Composite?Sources=%27web%27&Query=%27sexy%20cosplay%20girls%27&$format=json',
    auth=HTTPBasicAuth(SKEY, SKEY))
stuffWithResponse(r.json())
I am unable to reproduce the problem with smaller responses (e.g. by limiting the number of results) and unable to identify a single result that causes the issue (by stepping up the offset).
My impression is that the response gets read in chunks, transcoded somehow, and reassembled badly, which means the JSON/ATOM data becomes invalid if some multibyte character gets split; this happens on larger responses but not small ones.
Being new to Node, I am not sure if there is something I should be doing (setting the encoding somewhere? Bing returns UTF-8, so this doesn't seem needed).
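For illustration, a tiny snippet showing the suspected failure mode: decoding each chunk separately corrupts a UTF-8 character that straddles a chunk boundary, while concatenating the raw Buffers first does not:

```
var buf = Buffer.from('é', 'utf8'); // two bytes: 0xC3 0xA9
// decoding the halves separately yields two replacement characters
var bad = buf.slice(0, 1).toString('utf8') + buf.slice(1).toString('utf8');
// concatenating the raw bytes first decodes correctly
var good = Buffer.concat([buf.slice(0, 1), buf.slice(1)]).toString('utf8');
console.log(bad, good); // garbage, then 'é'
```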
Does anyone have any idea of what is going on?
FWIW, I'm on OS X 10.8; node is v0.8.20 installed via MacPorts, request is v2.14.0 installed via npm.
I'm not sure about the request library, but the default Node.js one works well for me. It also seems a lot easier to read than your library, and the response does indeed come back in chunks.
http://nodejs.org/api/http.html#http_http_request_options_callback
or, for https (like your request): http://nodejs.org/api/https.html#https_https_request_options_callback (the same really, though)
A little tip for the options: use url.parse:
var url = require('url');
var params = '{}';
var dataURL = url.parse(ServiceRootURL);
var post_options = {
    hostname: dataURL.hostname,
    port: dataURL.port || 80,
    path: dataURL.path,
    method: 'GET',
    headers: {
        'Content-Type': 'application/json; charset=utf-8',
        'Content-Length': params.length
    }
};
Obviously params needs to be the data you want to send.
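Reading the response by collecting raw Buffers and decoding once at the end avoids splitting a multibyte UTF-8 character across chunk boundaries; a sketch built on the post_options above (stuffWithResponse is the handler from the question):

```
var https = require('https');

// note: for https the port should default to 443, not 80
var req = https.request(post_options, function (res) {
    var chunks = [];
    res.on('data', function (chunk) { chunks.push(chunk); });
    res.on('end', function () {
        var body = Buffer.concat(chunks).toString('utf8');
        stuffWithResponse(JSON.parse(body));
    });
});
req.on('error', function (err) { console.error(err); });
req.end();
```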
I think your request authentication is incorrect. Authentication has to be provided before request.get.
See the documentation for request HTTP authentication. qs is an object that has to be passed in the request options, just like url and auth.
Also, you are reusing the same req for a second request. You should know that request.get returns a stream for the GET of the given url, so your next request using req will go wrong.
If you only need HTTP Basic Auth, this should also work:
// remove req = request.get and the subsequent request
request.get('http://some.server.com/', {
    'auth': {
        'user': 'username',
        'pass': 'password',
        'sendImmediately': false
    }
}, function (error, response, body) {
});
The callback gets 3 arguments. The first is an error, when applicable (usually from the http.Client option, not the http.ClientRequest object). The second is an http.ClientResponse object. The third is the response body, a String or Buffer.
The second object is the response stream. To use it you must listen for its 'data', 'end', 'error' and 'close' events.
Be sure to use the arguments correctly.
You have to pass the option { json: true } to enable JSON parsing of the response, as in the sketch below.
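For example, a sketch reusing the names from the question:

```
var request = require('request');

request.get({
    url: ServiceRootURL,
    json: true, // the response body arrives already parsed
    auth: { user: SKEY, pass: SKEY, sendImmediately: false }
}, function (error, response, body) {
    if (error) return console.error(error);
    console.log(body); // already a JavaScript object, no JSON.parse needed
});
```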