I have a route where a user can post information about an external webpage they want to save, like the page title, URL, and an image URL for the page, similar to Pinterest.
When my server receives this information (title, URL, and image URL), it makes a GET request for the image URL to the external site. When the image data comes back, the server crops it, stores it on AWS, saves the resulting link in a new Mongo document along with the other web page info, and sends that document back to the client in the response.
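Roughly, the route currently looks like this (a simplified sketch, not my exact code; the crop/upload helpers and the Page model are placeholders):

```js
// Simplified sketch of the current route (helper and model names are placeholders).
const express = require('express');
const axios = require('axios');
const router = express.Router();

router.post('/pages', async (req, res) => {
  try {
    const { title, url, imageUrl } = req.body;

    // Fetch the external image inside the request, with a 6 second timeout.
    const imageRes = await axios.get(imageUrl, {
      responseType: 'arraybuffer',
      timeout: 6000,
    });

    // Crop the image and upload it to AWS (details omitted).
    const croppedBuffer = await cropImage(imageRes.data); // placeholder helper
    const awsImageUrl = await uploadToS3(croppedBuffer);  // placeholder helper

    // Save the page info plus the AWS image link in Mongo and respond.
    const doc = await Page.create({ title, url, image: awsImageUrl });
    res.json(doc);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

module.exports = router;
```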
As my production web app grows, these image requests sometimes cause an H13 error on Heroku. Even with a 6-second timeout set on the image request, I'm still getting a "connection closed without response" error in production, perhaps because other parts of the request are slow too, or because it's a particularly busy moment on the site.
I feel like fetching the image inside the request and waiting for it to load before sending a response to the client is the wrong way to go about this. Is there a way to handle this request that will hold up better as my web app scales?
I'm not sure whether this question belongs on Stack Overflow, since it might be considered opinion-based, but I don't know if there is a standard way to go about this and I'm just a newb.
I've been trying to get Steam logins working with a small website project I'm making. I've been looking through some resources, but I've hit a roadblock (or rather a question) about how this actually gets used.
For example, in this article: https://medium.com/geekculture/sign-in-through-steam-using-nodejs-e3202d4719
The article walks through all of the setup, and when the user logs in, they're sent to the Steam login page; if it succeeds, they're redirected back to the Node.js server's port (e.g. 3001) plus an API path, which in the article is "localhost:XXXX/". My question is: how does this get used on the actual website/frontend? How does the website know when to grab data from this API? How does it know whether the login was successful or not? Would it be a useEffect that checks the localhost:XXXX/ API every time the page loads, to see whether the API returned valid data or just NULL?
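To make the question more concrete, this is roughly what I'm imagining on the frontend (just a sketch; the /api/auth/user endpoint name is made up):

```js
// React sketch of what I'm imagining (the /api/auth/user endpoint name is made up).
import { useEffect, useState } from 'react';

function App() {
  const [user, setUser] = useState(null);

  useEffect(() => {
    // Ask the Node.js backend whether the Steam login succeeded.
    fetch('/api/auth/user', { credentials: 'include' })
      .then((res) => (res.ok ? res.json() : null))
      .then((data) => setUser(data))
      .catch(() => setUser(null));
  }, []);

  return user ? <p>Logged in as {user.username}</p> : <p>Not logged in</p>;
}

export default App;
```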
I've been working on the front end so far; now I'm going to create my first full-stack application. I want to use Node.js, Express, and AWS for this.
At the design stage I've already run into a few problems, so I have a few questions and would appreciate some help:
Can I send a message (simple JSON or a database value) from the server to all clients who have already opened my home page, in a simple and cheap way?
I'm not talking about logged-in users, but everyone who has downloaded the main page (GET, '/').
Using the admin panel ('www.xxxxxxxxx/admin'), I want to send a message to the server once a day. Then I want to change the HTML to display this message. I was thinking of using EJS for this and fetching the message from the database.
Can I do this better? If someone visits my home page (GET, '/'), EJS will fetch the message from the database every time, even though its value stays the same for 24 hours. Can I fetch the value once and then reuse it until it changes? How should I store the message? As JSON on the server? Or maybe in the .env file?
If the user refreshes the page, do I have to pay for calling all the AWS functions that build the page each time, even if nothing in the files has changed?
How do I check whether the page has new content and send only that to the user, instead of re-sending the unchanged page files (.html, .js, .css, etc.)?
Can I send the user only the changed, dynamically created HTML file, and not send the unchanged .js and .css files again?
Does every user who opens the home page (GET, '/') create a new connection to the server using WebSocket / socket.io?
I will try to answer some of your questions:
Can I send a message (simple JSON or database value) from the server to all clients who have already opened my home page in a simple and cheap way? I'm not talking about logged in users, but all who downloaded the main page (GET, '/')?
I guess you mean sending push notifications from the server to the user. This can be done with different services, depending on what you are trying to build.
If you are planning to use GraphQL, you get GraphQL subscriptions out of the box. If you are using AWS, go for AppSync, which is the AWS service for GraphQL.
If you are using REST and a web app (not a mobile app), go for AWS IoT with Lambdas. Here is a good resource using the Serverless Framework (API Gateway + Lambdas + IoT) for unauthenticated users: https://www.serverless.com/blog/serverless-notifications-on-aws
If you are planning to use notifications in a mobile app, you can go for SNS, the "de facto" service for push notifications in the AWS world.
Using the admin panel ('www.xxxxxxxxx/admin'), I want to send a message to the server once a day. Then I want to change the HTML to display this message. I was thinking of using EJS for this and fetching the message from the database. Can I do this better? If someone visits my home page (GET, '/'), EJS will fetch the message from the database every time, even though its value stays the same for 24 hours. Can I fetch the value once and then reuse it until it changes? How should I store the message? As JSON on the server? Or maybe in the .env file?
Yes, this is the way it's expected to work. The HTML is updated dynamically by frontend JavaScript, which makes calls to the backend (using axios, for example) every time the user hits a path such as "/". You can store this data in frontend variables, or even use state management on the frontend with Redux, Vuex, etc. Remember that the frontend code always runs in your users' browsers, not on your servers!
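For example, a minimal sketch of the frontend side (the /api/daily-message endpoint name is an assumption, just to illustrate the pattern):

```js
// Frontend sketch: fetch the daily message when the page loads
// (the /api/daily-message endpoint is an assumed name for illustration).
import axios from 'axios';

let dailyMessage = null; // or keep this in Redux/Vuex state instead

async function loadDailyMessage() {
  const res = await axios.get('/api/daily-message');
  dailyMessage = res.data.message;
  document.querySelector('#daily-message').textContent = dailyMessage;
}

loadDailyMessage();
```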
If the user refreshes the page, do I have to pay for calling all AWS functions to build the page each time? Even if nothing has changed in the files?
What you can do is store all your HTML, CSS, and JavaScript in an S3 bucket and serve it from there (this is super cheap, even free up to a certain limit). If you want to use server-side rendering (SSR), then yes, you'll need to serve your users every time they make a GET request. If you use Lambda, the first million requests per month are free. If you serve your content from an EC2 instance, a t2.micro is also covered by the free tier. If you need more than that, you'll have to pay.
How to check if the page has new content and then send it to the user, instead of sending the unchanged page files: .html, .js, .css, etc.?
I think you need to understand how JavaScript (or frameworks like React, Vue, or Angular) handles this. Basically, the client downloads the JS code once, and that JS takes care of updating the frontend and talking to the backend. To connect the frontend with the backend, use axios, for example.
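On the server side, a rough sketch of that split, assuming Express and a hypothetical getMessageFromDb() helper: the static .js/.css files are served with cache headers, and the frontend only calls a small JSON endpoint for the data that actually changes.

```js
// Backend sketch (Express): static assets get cached by the browser,
// only the small JSON payload changes between visits.
// getMessageFromDb() is a hypothetical helper for your database read.
const express = require('express');
const app = express();

// Serve .html/.js/.css with cache headers so unchanged files aren't re-downloaded.
app.use(express.static('public', { maxAge: '1d' }));

// The frontend calls this (with axios, for example) to get just the data that changes.
app.get('/api/daily-message', async (req, res) => {
  const message = await getMessageFromDb(); // hypothetical helper
  res.json({ message });
});

app.listen(3000);
```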
Can I send the user only the changed, dynamically created html file, and not send again unchanged .js and .css files?
See the answer above. Using a framework like React or Vue will help you a lot here.
Does every user who opens the home page (GET, '/') create a new connection to the server using WebSocket / socket.io?
That depends on what you code. By default, the user simply makes a new GET request every time they access your domain, and that's it; no persistent connection is established unless your code explicitly opens one.
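If you do want a persistent connection for every visitor, you have to set it up explicitly, for example with socket.io. A minimal sketch of the server side:

```js
// Server sketch: a WebSocket connection only exists because we set it up explicitly.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

io.on('connection', (socket) => {
  console.log('a visitor connected:', socket.id);
});

// Broadcast a message to every connected visitor (e.g. from your admin route).
function broadcast(message) {
  io.emit('daily-message', message);
}

server.listen(3000);
```

On the client, the connection is only opened when you load the socket.io client script and call io(); a plain GET of your page never does that by itself.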
Hope this helps!! Happy coding!
A bit of background: I have an Express server backend. In it, I have an axios request that fetches a Twitter timeline. It gets the data from Twitter, then sends JSON to the client side. The client side GETs it and displays it in the UI.
My problem is that the HTTP request only gets fired when the app is deployed on Heroku. When I post a new tweet, the client side doesn't show it; only when I re-deploy the app does the new tweet show up.
Why is this?
Without any code it's hard to pin down the exact reason, but it sounds like you're making the Twitter call when the app starts, rather than when a request is made to your Node service.
You should make the call to Twitter when your client side issues the GET request, so the flow might look like this:
Client side makes GET request to your node service
Your service makes the call to the twitter API
Your service transforms the data that has come back from Twitter and returns it as the response to the GET request.
This means it'll call Twitter every time you hit your API, not just once when the application starts up.
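A rough sketch of that, assuming Express and axios; the Twitter endpoint and token handling are placeholders, so reuse whatever call you currently run at startup:

```js
// Sketch: call Twitter inside the route handler, so it runs on every request.
const express = require('express');
const axios = require('axios');
const app = express();

app.get('/api/timeline', async (req, res) => {
  try {
    // Placeholder URL and auth: reuse whatever Twitter endpoint and bearer token you already use.
    const twitterRes = await axios.get('https://api.twitter.com/2/users/YOUR_USER_ID/tweets', {
      headers: { Authorization: `Bearer ${process.env.TWITTER_BEARER_TOKEN}` },
    });

    // Transform or trim the Twitter payload here if needed, then return it to the client.
    res.json(twitterRes.data);
  } catch (err) {
    res.status(502).json({ error: 'Failed to fetch timeline' });
  }
});

app.listen(process.env.PORT || 3000);
```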
I'm trying to set up a Facebook share on https://donate.mozilla.org/en-US/thunderbird/share/
The og:url points to just /thunderbird, which is the URL I would want shared. As best I can tell, the og tags are all there.
When I try to update the data on https://developers.facebook.com/tools/debug/og/object/ and fetch new scrape information, I get one of two errors. Initially it takes a long time and then responds with "Curl Error : OPERATION_TIMEOUTED Operation timed out after 10000 milliseconds with {some number less than 10000} bytes received"; subsequent fetch attempts respond with "Curl Error : PARTIAL_FILE transfer closed with 17071 bytes remaining to read".
We're using AWS CloudFront and Node.js with hapi.
It responds with a 206 Partial Content, which should be fine; the og tags are all at the beginning of the file.
I found this: docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/RangeGETs.html
There it says a range request is used to get the file in chunks, not to fetch just part of the file and give up. So maybe that's causing unexpected behavior: maybe CloudFront is sending the page back in chunks, and Facebook stops listening after the first response? I don't know; I'm just trying to find a theory that fits the facts.
We already have a working share for donate.mozilla.org/en-US/share/, but that might be old data from before we switched from Express to hapi; I don't think Express was supporting range requests, and it would return a 200 instead.
I'm mostly a front end dev, so a lot of this is out of my comfort zone but I have already learned a lot :)
Edit: I also want to point out that we use Heroku for hosting. If I set up a test with just Heroku and without CloudFront (donate.mofostaging.net/en-US/thunderbird/), it fetches the tags successfully. So I suspect it's a bug in how Facebook and hapi interact with CloudFront.
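A quick check along these lines (just a sketch; the Range size is an arbitrary guess at what Facebook's scraper asks for, and I'm assuming https for both URLs) could confirm whether the ranged-request behaviour really differs between the CloudFront URL and the Heroku-only URL:

```js
// Sketch: compare how each origin answers a ranged request (range size is an arbitrary guess).
const https = require('https');

function check(url) {
  https.get(url, { headers: { Range: 'bytes=0-16383' } }, (res) => {
    console.log(url, '->', res.statusCode, res.headers['content-range'] || '(no content-range)');
    res.resume(); // discard the body
  });
}

check('https://donate.mozilla.org/en-US/thunderbird/');     // via CloudFront + hapi
check('https://donate.mofostaging.net/en-US/thunderbird/'); // Heroku only, no CloudFront
```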
I'm trying to build a Node.js REST API project based on a so-called "microservice architecture" (basically multiple smaller Node.js projects that can run totally independently, but at the same time work together).
Currently users are able to upload images from the app, and my NodeJS backend then processes and saves them appropriately.
Now, what I want to do is the following:
The user selects an image to upload from the app -> the app makes a request to the "Main API" endpoint -> the Main API forwards this request to the "Image Service" -> once the Image Service (which is a totally different server) has finished successfully, it returns the URL where the image is stored to the Main API, which then returns that info back to the app.
My question is: how do I forward the image upload request from one server to another? Ideally, I don't want the Main API to store the image temporarily and then make a separate request to the Image Service.
What I'd like is to forward the data the Main API receives straight to the Image Service server. I guess you could say I want to "stream" the data from one place to the other without temporarily storing it on disk or in memory; I literally just want it to "tunnel" from one server to another.
Is this possible, and is it an efficient approach? I just want one central point for the app to access; I don't want it to know about the Image Service server. I'd like the app to only ever make requests to the Main API, which will then call my other little services as required.
I'm using Node.js, Express, Multer (for image uploads), and DigitalOcean hosting (if that makes any difference at all).
What you would basically be doing is setting up a proxy that passes requests straight through to another machine and back. There are a few libraries out there to help with this, and this article explains how to set it up: http://blog.vanamco.com/proxy-requests-in-node-js/ . They're really just trying to get around HTTPS issues, but the same concept applies here.
In short, you receive the file upload POST and immediately make that same request to the other server; when the response comes back, you immediately return it to the front end. Your entry point can act as a hub, and you can proxy requests through to other servers or even handle them on the same server when that makes sense.
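A minimal sketch of that idea with plain Node streams, assuming the Image Service exposes a POST /upload endpoint on a host like image-service.internal:4000 (a placeholder), and that this Main API route does not run Multer or any body parser, so the raw request stream is still available to pipe:

```js
// Main API sketch: stream the upload straight through to the Image Service.
// image-service.internal:4000 is a placeholder; no body parser/Multer may run on this route,
// otherwise the request stream is already consumed and there is nothing left to pipe.
const express = require('express');
const http = require('http');

const app = express();

app.post('/upload', (req, res) => {
  const proxyReq = http.request(
    {
      host: 'image-service.internal', // placeholder host
      port: 4000,
      path: '/upload',
      method: 'POST',
      headers: req.headers, // forwards content-type (multipart boundary) and content-length
    },
    (proxyRes) => {
      // Stream the Image Service's response (e.g. the stored image URL) back to the app.
      res.writeHead(proxyRes.statusCode, proxyRes.headers);
      proxyRes.pipe(res);
    }
  );

  proxyReq.on('error', () => res.status(502).json({ error: 'Image service unavailable' }));

  // Pipe the incoming upload directly to the Image Service: nothing is buffered to disk.
  req.pipe(proxyReq);
});

app.listen(3000);
```

Libraries like node-http-proxy (httpProxy.createProxyServer) wrap this same pattern if you'd rather not hand-roll the request forwarding.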