I am trying to implement a REST API using React and Node. How do I get JSON from the front end (React), where users drag and drop images into a template (e.g. "https://www.canva.com/templates/"), and store that JSON in MongoDB using Node.js?
Thanks in advance
You can use the Fetch API to call a particular route and send data along with it to the Node.js backend.
It would look something like this:
async function sendData(data) {
  const res = await fetch(url, {
    method: 'POST',
    mode: 'cors',
    body: JSON.stringify(data), // the JSON object you want to send
    headers: { 'Content-Type': 'application/json' }, // plus any other headers, if required
  });
  return res.json();
}
Hope this helps!
Since you asked how to send JSON to node.js I'm assuming you do not yet have an API that your front end can use.
To send data to the back end you need to create an API that accepts data.
You can do this quickly and easily using express.js.
Once the server is running and has an endpoint to send data to, you can make a request to it (when sending data it should be a POST request).
This can be done in many different ways, although I would suggest trying axios; a rough sketch is below.
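A minimal sketch of what that could look like (the /api/templates route, the port, and the MongoDB comment are placeholders, not something from your project):

// server.js (Node + Express)
const express = require('express');
const app = express();

app.use(express.json()); // parse JSON request bodies

app.post('/api/templates', (req, res) => {
  console.log(req.body); // the template JSON sent from React
  // ...save req.body to MongoDB here, e.g. via a Mongoose model...
  res.status(201).json({ received: true });
});

app.listen(8080);

// client side (React), using axios
import axios from 'axios';

async function saveTemplate(templateJson) {
  const res = await axios.post('http://localhost:8080/api/templates', templateJson);
  return res.data;
}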
Hope this helped.
Check this example, which fetches the JSON value and then updates state:
axios.get('https://jsonplaceholder.typicode.com/todos/' + this.props.id + '/')
  .then((res) => {
    this.setState({
      // do some action
    });
  })
  .catch(function (error) {
    console.log(error);
  });
This is a function on my front-end that makes the request.
function postNewUser(){
  fetch(`http://12.0.0.1:8080/users/test`, {
    method: 'POST',
    body: {nome: name, email: "test#test.com.br", idade: 20}
  })
}
This is my back-end code to receive the request.
router.post('/users/:id', koaBody(), ctx => {
  ctx.set('Access-Control-Allow-Origin', '*');
  users.push(ctx.request.body)
  ctx.status = 201
  ctx.body = ctx.params
  console.log(users)
})
For some unknown reason I receive nothing, not even a single error message. The console.log() on the back-end is also not triggered, so my theory is that the problem is on the front-end.
Edit
As suggested by gnososphere, I tested with Postman, and it worked. So now I know the problem must be in the front-end code.
You can try your backend functionality with Postman. It's a great service for testing.
If the problem is on the frontend, double check your fetch method by posting to a website that will return data and logging that in your app.
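For example, a quick sanity check (just a sketch; jsonplaceholder is a public test API that echoes back what you post):

async function checkFetch() {
  const res = await fetch('https://jsonplaceholder.typicode.com/posts', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ nome: 'test', idade: 20 }),
  });
  console.log(await res.json()); // if this logs a response, the fetch setup itself works
}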
I have a client app in React, and a server in Node (with Express).
On the server side, I have an endpoint like the following (not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Transfer-Encoding': 'chunked'
  });
  for (let x = 0; x < 1000; x++) {
    res.write(some_string + '\n');
    wait(a_couple_of_seconds); // just to make the process slower for testing purposes
  }
  res.end();
}
This works fine: when I call this endpoint, I receive the whole stream with all 1,000 rows.
The problem is that I cannot get this data in chunks (for each 'write', or a batch of 'writes') so I can show it on the frontend as soon as I receive it (think of a table that renders rows as soon as they arrive from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
  const { data, headers } = await Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    responseType: 'stream',
    timeout: 0,
  });
  // this next line doesn't work. it says that 'on' is not a function
  data.on('data', chunk => console.log('chunk', chunk));
  // data actually contains the whole response data (all the rows)
  return Promise.resolve();
}
The problem is that the Axios call only resolves with the whole data object after 'res.end()' is called on the server, but I need to get the data as soon as the server starts sending the chunks with the rows (on each res.write, or whenever the server decides to flush a batch of chunks).
I have also tried not using await and reading the value in the 'then()' of the Axios call, but the behaviour is the same: 'data' arrives with all the 'writes' together once the server calls 'res.end()'.
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like WebSockets to solve it.
Any help would be much appreciated, because I have read a lot but couldn't find a working solution yet.
For anyone interested in this, what I ended up doing is the following:
On the client side, I used the Axios onDownloadProgress handler, which allows handling progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
  return Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    onDownloadProgress: progressEvent => {
      const dataChunk = progressEvent.currentTarget.response;
      // dataChunk contains the data that has been obtained so far (the whole data so far)..
      // So here we do whatever we want with this partial data..
      // In my case I'm storing it in a redux store that is used to
      // render a table, so now table rows are rendered as soon as
      // they are obtained from the endpoint.
    },
  }).then(({ data }) => data);
}
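If you only need the newly arrived part of the response on each progress event (rather than the whole accumulated string), one rough approach is to track how much you have already consumed. A small sketch; the slicing logic below is my own assumption, not part of the original answer:

let consumed = 0;

function handleProgress(progressEvent) {
  const full = progressEvent.currentTarget.response; // everything received so far
  const newPart = full.slice(consumed);              // only the part not yet processed
  consumed = full.length;
  // parse/append newPart here, e.g. split on '\n' to get the new rows
}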
I want to send form data coming from my AngularJS form to Node.js, and then send that data from Node.js to my service API. But the data is not arriving at my service API, and I cannot understand what is going wrong. Please help me to overcome this problem.
requestMethodDelete: function (url, form_data, header) {
  return new Promise((resolve, reject) => {
    console.log("FORM DATA IN DELETE REQUEST METHOD");
    console.log(form_data);
    // SET ALL THESE PARAMETERS TO MAKE THE REQUEST
    request.delete({url: url, form: form_data, headers: header},
      function (error, response, body) {
        resolve(body);
      }
    );
  });
},
I want to send the form data and headers, but they are not received at the service API. Please tell me what I can do to get the expected result.
In the request.delete function, the form option does not work for sending form_data. To send form data to the service API you need to use the qs option rather than the form option. That's it; this solved my problem, and it should solve yours if you are facing the same issue.
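The change is roughly this (a sketch based on the requestMethodDelete above, with only the form option swapped for qs):

request.delete({url: url, qs: form_data, headers: header},
  function (error, response, body) {
    resolve(body);
  }
);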
I am trying to update multiple records through a single PUT request using the Angular HTTP service, which in turn consumes a Node.js Express API that handles the PUT request. So far, the examples I have seen on the internet refer to updating a single record per PUT request. Instead, I want to pass an array of objects into the PUT request from the Angular HTTP service, and the Node.js API should be able to read that collection. Until now I have been passing one single object as part of the request and reading its properties via "req.body.propertyname". Can it read the whole array I want to pass?
Let's say this is my code on the Angular side to update a single book through a PUT request:
updateBook(updatedBook: Book): Observable<any> {
  return this.http.put(`/api/books/${updatedBook.bookID}`, updatedBook, {
    headers: new HttpHeaders({
      'Content-Type': 'application/json'
    })
  });
}
On the Node.js side, it is able to read the book object passed from the client (Angular) like this:
.put(function(req, res) {
  var data = getBookData();
  var matchingBooks = data.filter(function(item) {
    return item.bookID == req.params.id;
  });
  if(matchingBooks.length === 0) {
    res.sendStatus(404);
  } else {
    var bookToUpdate = matchingBooks[0];
    bookToUpdate.title = req.body.title;
    bookToUpdate.author = req.body.author;
    bookToUpdate.publicationYear = req.body.publicationYear;
    saveBookData(data);
    res.sendStatus(204);
  }
});
My question is whether I could pass the collection of books at once so that all of them get updated with a single request?
updateBook(updatedBooks: Book[]): Observable<any> {
  return this.http.put(`/api/books`, updatedBooks, {
    headers: new HttpHeaders({
      'Content-Type': 'application/json'
    })
  });
}
If yes, how would Node.js read this array passed from the client? Will req.body contain the passed array?
Yes, since you are passing data using the PUT method, you can send the data to be updated as an array of objects. If you are using MongoDB as your database, you can update the data using the array update operators.
My suggestion:
There is not much to be done on the front end. If you need to update only a single record, pass that single record as an object inside the array. If you want to update multiple records, pass every record as an object inside the array to the backend.
On the backend side, you can receive the data via req.body.books (or whatever name you give the property passed from the frontend). If you are using MongoDB, refer to its documentation on updating data stored in arrays.
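A rough sketch of what the backend could look like with Express and MongoDB (the express.json() setup, the Book model, and sending the array directly as the request body instead of wrapping it in a books property are assumptions on my part):

const express = require('express');
const app = express();
app.use(express.json()); // so req.body is the parsed JSON array

app.put('/api/books', async (req, res) => {
  const updatedBooks = req.body; // the array of book objects sent from Angular
  try {
    for (const book of updatedBooks) {
      // Book is a hypothetical Mongoose model
      await Book.updateOne(
        { bookID: book.bookID },
        { $set: { title: book.title, author: book.author, publicationYear: book.publicationYear } }
      );
    }
    res.sendStatus(204);
  } catch (err) {
    res.status(500).send(err);
  }
});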
I'm also new to Node.js, but I tried updating multiple values through Postman by hitting '/api/books'. The code below worked for me and updated all the values at once in my database using a PUT call, with MongoDB and Node.js.
app.put('/api/books', (req, res) => {
  var updatedData = "";
  var headersAgain = false;
  for (let i = 0; i < req.body.length; i++) {
    DataBaseName.findByIdAndUpdate(req.body[i]._id)
      .then(val => {
        val.url = req.body[i].title
        val.position = req.body[i].author
        val.auto_scroll = req.body[i].publicationYear
        updatedData = val + updatedData
        val.save((err, updatedObject) => {
          console.log('inside save.....', updatedData)
          if (err) {
            return res.status(500).send(err)
          } else {
            if (!headersAgain) {
              headersAgain = true
              return res.status(201).send(updatedData)
            }
          }
        })
      })
  }
})
I'm looking into putting a REST layer (using Express) on top of a GraphQL server (Apollo Server v2) to support some legacy apps. To share as much logic as possible, the REST endpoint should ideally wrap a GraphQL query that I'm sending to the GraphQL server, and be able to do small modifications to the response before sending the response to the client.
I'm having trouble figuring out the best way to query the apollo server from the Express routing middleware. So far I've explored two different solutions:
1. Modify the request in the REST endpoint so that req.body is a valid GraphQL query, change req.url to /graphql, and call next(). The problem with this is that I cannot modify the result before it is sent to the client, which I need to do.
2. Call the /graphql endpoint with axios from the routing middleware, and modify the response before sending it to the client. This works, but feels a bit hacky to me.
Do you have other suggestions, or maybe even an example?
I believe solution 2 is fine to implement.
I've made a similar implementation, but in my case one GraphQL service fetches data from another (or several) GraphQL service(s).
And somewhere down the line I did something like this:
export type serviceConnectionType = {
  endpoint: string
  queryType: {
    query: Object // gql Object Query
    variables: {
      input: Object // query arguments (can be null)
    }
  }
}
export async function connectService(params: serviceConnectionType) {
  const response = await fetch(params.endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(params.queryType),
  })
  if (response.status === 404) {
    console.warn('404 not found')
  }
  return response.json()
}
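To use this from the REST layer (solution 2 above), an Express route could call connectService and tweak the result before responding. A hedged sketch: the route, the GraphQL endpoint URL, and the query shape are placeholders rather than anything from the original setup, and the query is passed as a plain string instead of a gql object:

app.get('/rest/books/:id', async (req, res) => {
  const result = await connectService({
    endpoint: 'http://localhost:4000/graphql',
    queryType: {
      query: 'query ($input: ID!) { book(id: $input) { id title } }',
      variables: { input: req.params.id },
    },
  });
  // small modification of the GraphQL response before sending it to the legacy client
  res.json(result.data ? result.data.book : { errors: result.errors });
});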