Fetching in nuxt(3) - frontend

I have an async fetch function that doesn't seem to be called in Nuxt. Here's what I have:
async fetch() {
  console.log("TEST")
  this.stay = await this.$axios.$get('myURL')
}
TEST doesn't show up anywhere in the logs and my backend is unresponsive, so I don't think this function is even being called. I've tried a couple of different ways to fetch this data, including http and axios. Following the example from https://v3.nuxtjs.org/getting-started/data-fetching/#usefetch,
const {data: stay} = await useFetch('myURL')
hasn't helped either. I know my backend is working alright because I can curl it.
I'm pretty lost; does anyone know what's going on, or can anyone give me ideas?

Related

NodeJS fetch returns ECONNREFUSED error when executed in fs.writeFile callback

I have some code where I have generated some JSON data; then I would like to write it to a file, then hit an API endpoint (a GET request).
The issue I am running into is: when I execute fetch after using fs.writeFile, I get an ECONNREFUSED error. If I do not write the file, my GET request to the endpoint is successful.
I am putting my fetch call in the callback to the writeFile function. I have also tried fs.writeFileSync, which gives me the same results. My full code requires writeFile to come first.
I noticed that if I wrap fetch in a setTimeout with 10000 ms, then fetch will work. It seems as if writeFile isn't waiting long enough before executing the callback function.
Am I doing something incorrectly? Or how do I correctly write a file and then subsequently fetch API data?
I boiled my code down to the most minimal example that reproduces this behavior, while still allowing Node to correctly return the error messages. (I'm using a fake URL for this example, as the real URL isn't publicly accessible.)
const fetch = require('node-fetch');
const fs = require('fs');

try {
  // Write the JSON file, then fire the GET request from the write callback.
  fs.writeFile('./example.json', JSON.stringify(['test', 'one', 'two']), () => {
    fetch('http://www.example.com/api_endpoint?q=test')
      .then(console.info)
      .catch(console.error);
  });
} catch (e) {
  console.info(e);
}
I'm running this in nodejs v10.15.1 on Linux Debian 8.11 (jessie)
I found out the problem; it's real silly...
My script is in the same repo as my API server. When running the server in dev mode (adonis serve --dev), fs.writeFile triggers the file watcher to reload (of course), which temporarily disconnects the server. It is very obvious now why it's not working.
The solution was to have the file watcher ignore the folder I am writing the json file to.
In my case (working with AdonisJS) that is adonis serve --dev -i scripts
Oddly enough, this is a project that worked a month ago and I didn't have this issue then. I guess something changed in how I'm running it between then and now.

Send request progress to client side via nodejs and express

I am using this library (contentful-export) in my Express app like so:
const express = require('express');
const app = express();
...
app.get('/export', (req, res, next) => {
  const contentfulExport = require('contentful-export');
  const options = {
    ...
  }
  contentfulExport(options).then((result) => {
    res.send(result);
  });
})
Now this does work, but the method takes a bit of time and sends status/progress messages to the Node console. I would like to keep the user updated too. Is there a way I can send the Node console progress messages to the client?
This is my first time using Node/Express, so any help would be appreciated. I'm not sure if this already has an answer, since I'm not entirely sure what to call it.
Looking at the documentation for contentful-export, I don't think this is possible. The way this usually works in Node is that you have an object (contentfulExport in this case), you call a method on this object, and the same object is also an EventEmitter. This way you'd get a hook to react to fired events.
// pseudo code
someLibrary.on('someEvent', (event) => { /* do something */ })
someLibrary.doLongRunningTask()
  .then(/* ... */)
This is not documented for contentful-export, so I assume there is no way to hook into the log messages that are sent to the console.
Your question has another tricky angle, though. In the code you shared you have a single endpoint (/export). If you would like to display updates or show progress, you'd probably need a second endpoint giving information about the progress of your long-running task (which you cannot get from contentful-export, though).
The way this is usually handled is that you kick off a long-running task via one HTTP endpoint and then use another endpoint that serves info about it via polling or a WebSocket connection.
Sorry that I can't give a proper solution, but due to the limitations of contentful-export I don't think there is a clean/easy way to show the progress of the exported data.
Hope that helps. :)

Fetch gives empty response while waiting for long period

I have an app made with React and Node. The React app needs to make an API call to the Node app, which is running on port 5100. I am facing a problem where I get net::ERR_EMPTY_RESPONSE in the console after waiting a long period of time. The thing is, my API takes 200s to get the response from the server.
When I hit
http://localhost:5100/api/users/wait-ip
I get a response after 200 seconds. But when I hit this in the React app
fetch('/api/users/wait-ip')
I get the following error in the console:
GET http://localhost:3000/api/users/wait-ip net::ERR_EMPTY_RESPONSE
This is my function for the API:
router.get('/api/users/wait-ip', (req, res) => {
  // Other things happen here
  setTimeout(() => {
    return res.json({
      data: 1
    })
  }, 150000)
})
This is the response I get when hitting it directly in the browser after 150 seconds.
Any help on how to solve this will be appreciated
Using a Node.js API with React is a common use case. I think the reason you are facing this issue getting the response is that you are making the fetch call synchronously. Always use async/await for it.
async function getUsers() {
  let response = await fetch('/api/users/wait-ip');
  let users = await response.json();
  //...
  return users;
}
Using the function:
getUsers().then(result => {console.log(JSON.stringify(result));});
Hope that helps.
For me the problem was on both the client and the server.
I have read some posts on how to fix the timeout on the server. One of them was:
const server = http.listen(port, () => console.log(`Server running on port ${port}`));
server.timeout = 200000;
Well, this worked, but only for the direct browser call.
For the asynchronous call I needed to set it for each route where I wanted it, like this:
router.get('/wait-ip', (req, res) => {
  req.timeout = 160000;
  setTimeout(() => {
    return res.json({
      data: 1
    })
  }, 150000)
})
And on the client side it didn't work properly with the proxy, so what I did was use the full URL:
fetch('http://localhost:5100/api/users/wait-ip')
I hope this helps other people too.
It may be because of the Cross-Origin Resource Sharing (CORS) headers. When calling from a browser, there is usually an "OPTIONS" call that is made, followed by, in this case, the "GET".
I would try
fetch('/api/users/wait-ip', {
  mode: 'no-cors' // 'cors' by default
})
If this fixes the problem, either you force the client side not to use CORS, or the server should manage it. Another option is to allow the proxy to set these headers.
ref: https://developers.google.com/web/ilt/pwa/working-with-the-fetch-api
section: Cross-origin requests
This is an API problem. I'd set breakpoints in your API and make sure the right fields are populated for the response.
Node.js Debugging Guide...

Not able to trace all HTTP requests in async parallel with Zipkin in Node API

I am new to Node.js and was trying to integrate Zipkin with my Node API using the appmetrics-zipkin npm package. Zipkin works fine, except that when there are multiple HTTP calls in an async parallel method, it gives the trace of only the first HTTP call that finished. I need traces for all the API calls in async parallel. Please help.
Well, without seeing any code, I can only give you a sample of how you could achieve this. An HTTP call, for example with node-fetch or axios, will return a promise. To wait for promises in parallel, you can do the following:
const fetch = require('node-fetch');

// urlOne, urlTwo, urlThree are placeholder endpoints.
async function myParallelRequests() {
  // Start all three requests without awaiting, so they run concurrently.
  const requestOne = fetch(urlOne);
  const requestTwo = fetch(urlTwo);
  const requestThree = fetch(urlThree);
  // Wait until all of them have resolved.
  const [responseOne, responseTwo, responseThree] = await Promise.all([
    requestOne,
    requestTwo,
    requestThree,
  ]);
}
Note that I use the fetch API here, provided in Node by the node-fetch package. fetch returns a Promise. Then I call Promise.all(promises), where promises is an array of Promises. You can then do whatever you'd like with the three responses, and your requests were made in parallel.
Hope this helps, good luck!

How to trigger background-processes, how to receive intermediate results?

I have a NodeJS / background-process issue that I don't know how to solve in an 'elegant', straightforward, right way.
The user submits some URLs (~10 or more) via a textarea, and then they should be processed asynchronously. [A screenshot has to be taken with puppeteer, some information gathered, the screenshot processed with sharp, and the result persisted in MongoDB: the screenshot via GridFS and the URL in its own collection with a reference to the screenshot.]
While this async process runs in the background, the page should be updated whenever a URL has been processed.
There are so many ways to do this, but which one is the most correct/straightforward/resource-saving way?
Browserify, and I do it in the browser? No, too much stuff on the client side. AJAX/Axios posts that wait for the URLs to be processed and reflect the results on the page? Trigger the process before the response gets sent back to the client, or let the client start the processing?
So, I made a workflow engine of sorts that supports long-running jobs, and I followed this tutorial: https://farazdagi.com/2014/rest-and-long-running-jobs/
Which boils down to this: when a request is created you just return a status code, and when the jobs are completed you log them somewhere and use that.
For this I used an EventEmitter inside a promise. It's only my solution, maybe not elegant, maybe outright wrong. I made a little POC for you.
const express = require('express');
const events = require('events');

const app = express();
const emitter = new events.EventEmitter();

// Simulate the long-running job with a 1-second timer.
const actualWork = function() {
  return new Promise((res, rej) => {
    setTimeout(res, 1000);
  });
};

emitter.on('workCompleted', function(event) {
  // log somewhere, e.g. persist event.id together with its result
});

app.get('/someroute', (req, res) => {
  // Respond immediately; the actual work continues in the background.
  res.json({ msg: 'request initiated', id: 'some_id' });
  actualWork()
    .then(() => {
      emitter.emit('workCompleted', { id: 'some_id' });
    });
});

app.get('/someroute/:id/status', (req, res) => {
  // get the log for req.params.id and return the job's status here
});
