If I am building a social media app, how do I go about storing the likes of a post? - node.js

I am building a social media clone following this tutorial
https://www.youtube.com/watch?v=U7uyolAHLc4&list=PLB97yPrFwo5g0FQr4rqImKa55F_aPiQWk&index=30
However, storing likes this way seems infeasible if a post has a million likes or so. Can anyone suggest a more efficient way of going about this?

You could try Redis. The content of a post is effectively a string, so you could first write likes to memory and then persist them to disk. Redis has high performance and can easily handle 10,000 connections per second, but for a million likes you would want some optimization.
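As a rough sketch of that idea (assuming the `redis` v4 Node client and hypothetical post/user IDs, not anything from the tutorial): keep each post's likes as a Redis set of user IDs, so a like is idempotent and the count is an O(1) SCARD.

```javascript
// Minimal sketch of per-post like storage in Redis.
// Assumes a local Redis instance and the `redis` v4 npm client.
const { createClient } = require('redis');

const client = createClient(); // defaults to localhost:6379
client.on('error', (err) => console.error('Redis error', err));

async function likePost(postId, userId) {
  // SADD ignores duplicates, so double-liking is harmless
  await client.sAdd(`post:${postId}:likes`, String(userId));
}

async function likeCount(postId) {
  // SCARD returns the set size without loading the members
  return client.sCard(`post:${postId}:likes`);
}

async function main() {
  await client.connect();
  await likePost(42, 'alice');
  await likePost(42, 'bob');
  console.log(await likeCount(42)); // 2
  await client.quit();
}

main().catch(console.error);
```

You can later persist the counts to your primary database in batches if you need durability beyond Redis's own persistence.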

Related

Parsing a very long JSON file in NodeJS

I'm using a Nuxt app that has its own Express-based API. It looks to be working fine, but I'm scared of overusing someone else's services, and I found a JSON file that has everything I need to deliver beforehand.
The problem is that it is 4MB in size, and I don't think retrieving and adding data from it would be a very efficient process.
If I want to efficiently parse a huge JSON and use it for the server (as in, serve parts of it according to the requests I receive), how would you go about it? Any ideas?
It's only 4MB; you should just load it into memory. Anything else you might want to do is probably overkill: literally fs.readFile, then JSON.parse, and that's it, it's in memory.
Trying to come up with a more sophisticated, "efficient" approach in this context is probably not worth the trouble, and may not even be possible: if you end up using another service just to store and manage those 4MB of data, the I/O needed for that is orders of magnitude more than just keeping it in RAM.
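A minimal sketch of that approach, assuming a hypothetical data.json file containing an array of objects with an `id` field, and an Express server: load the file once at startup, keep it in memory, and serve slices of it per request.

```javascript
// Load the JSON once at startup and serve parts of it on demand.
const fs = require('fs');
const express = require('express');

const data = JSON.parse(fs.readFileSync('./data.json', 'utf8')); // one-time load

const app = express();

// Serve a single item by id, assuming `data` is an array of { id, ... }
app.get('/items/:id', (req, res) => {
  const item = data.find((d) => String(d.id) === req.params.id);
  if (!item) return res.status(404).json({ error: 'not found' });
  res.json(item);
});

app.listen(3000, () => console.log('listening on 3000'));
```

The synchronous read is fine here because it happens once, before the server starts accepting requests.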

Blur photos for unauthorized users

I want to create a simple feature on my site (React, Node, MongoDB). Users can upload their photos, and I want to show their faces blurred to unauthorized visitors. What is the best way of developing this functionality: saving blurred images separately in the DB, calling an API to blur images every time before responding from the backend, or blurring images on the frontend? How do I make it fast and safe? Any help is appreciated, thank you in advance.
Every approach has pros and cons.
1. Upload one photo and, using a tag in the data such as the user object (or better yet, inside an auth token), apply a blur filter to the image on the frontend. The downside: if someone is clever enough they can get the real picture, e.g. by intercepting the download.
2. Upload one photo and, using a tag in the backend data models or user session, reduce the quality of the image on download. The downside: pulling images down will be slower, as there has to be image manipulation before it's sent to the frontend.
3. Upload two images, one normal and one low quality. The downside: a longer initial upload, and you are now taking up more space in your image bucket, which will cost you more money.
There will be more approaches, but each will have a trade-off between speed, security, and cost/space. I would personally go with number three; if cost is not an issue, you use good compression, and you don't get snowballed with users, the cost difference should not be that much.
It depends on your use case, but blurring images on the frontend after calling an API to verify whether the user is authorized is the least secure. Saving two images on upload seems like a good idea, but it's a bit wasteful since you're storing the same image twice. I would go with blurring images on the backend.
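A hedged sketch of the "blur on the backend" approach, using the `sharp` image library with Express. The auth check (`req.user`) and the uploads directory are assumptions; adapt them to your own middleware and storage.

```javascript
// Serve the original image to authorized users, a blurred copy to everyone else.
const path = require('path');
const express = require('express');
const sharp = require('sharp');

const app = express();
// ...your auth middleware would populate req.user here...

app.get('/photos/:name', async (req, res) => {
  // path.basename prevents directory traversal via the :name param
  const file = path.join(__dirname, 'uploads', path.basename(req.params.name));
  try {
    if (req.user) {
      // Authorized: send the original file
      return res.sendFile(file);
    }
    // Unauthorized: blur before sending (the sigma value controls strength)
    const blurred = await sharp(file).blur(15).jpeg().toBuffer();
    res.type('jpeg').send(blurred);
  } catch (err) {
    res.status(404).end();
  }
});

app.listen(3000);
```

If CPU cost becomes an issue, the same `sharp` call can run once at upload time instead, which is effectively option three with the blurring automated.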

How to store and handle big data string (around 2mb) in node js

I have a frontend in Angular and the API is in Node.js. The frontend sends me an encrypted file (around 2MB), and right now I am storing it in MongoDB. But when I send this file back to the frontend, the call sometimes breaks. Please suggest how I can solve this broken-call issue.
Your question isn't particularly clear. As I understand it, you want to send a large file from the Node backend to the client. If this is correct, then read on.
I had a similar issue whereby a long-running API call took several minutes to compile the data and send the single large response back to the client. The issue I had was a timeout, and I couldn't extend it.
With Node you can use 'streams' to stream the data as it is available to the client. This approach worked really well for me as the server was streaming the data, the client was reading it. This got round the timeout issue as there is frequent 'chatter' between the server and client.
Using Streams did take a bit of time to understand and I spent a while reading various articles and examples. That said, once I understood, it was pretty straightforward to implement.
This article on Node Streams on the freeCodeCamp site is excellent. It contains a really useful example where it creates a very large text file, which is then 'piped' to the client using streams. It shows how you can read the text file in 'chunks', make transformations to those chunks, and send them to the client. What this article doesn't explain is how to read this data in the client...
For the client side, the approach is different from the typical fetch().then() approach. I found another article that shows a working example of reading such streams from the back-end. See Using Readable Streams on the Mozilla site. In particular, look at the large code example that uses the 'pump' function. This is exactly what I used to read the data streamed from the back-end.
Sorry I can't give a specific answer to your question, but I hope the links get you started.
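To make the server side of that concrete, here is a minimal sketch of streaming a large file to the client instead of buffering it, assuming Express and a hypothetical bigfile.json on disk (not the asker's actual setup):

```javascript
// Stream a large file to the client in chunks instead of one big response.
const fs = require('fs');
const express = require('express');

const app = express();

app.get('/download', (req, res) => {
  res.type('application/json');
  const stream = fs.createReadStream('./bigfile.json');
  stream.on('error', () => res.status(500).end());
  // pipe() sends the file chunk by chunk, keeping the connection "chatty"
  // so intermediaries are less likely to time it out
  stream.pipe(res);
});

app.listen(3000);
```

For data coming out of MongoDB rather than a file, the same pattern applies with a cursor stream in place of fs.createReadStream.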

Web application (API and Front-end) - routes design

I suppose this type of topic always comes up, but I'd like specific opinions for my case.
For the past month or two I've been thinking about making a listing web application for my daily life (shopping, dues, etc.).
I started out by defining my object model like this (a very simple design):
[Models image]
So I decided to create a Node.js API for the back-end and Angular 7 for the front-end. Developing the application and the API is not a technical problem for me, but the design is, particularly the route design.
My first suggestion for the API routes is:
User :
/users
/users/:id
List :
/lists
/lists/:id
Element :
/elements
/elements/:id
Technically it's OK, but I'm not sure it follows good practice.
Since User contains List and List contains Element, wouldn't it be better to have routes like this:
/users/:id
/users/:id/list
/users/:id/list/:id
/users/:id/list/:id/element
/users/:id/list/:id/element/:id
Thanks for your answers or suggestions!
PS: If you have any websites / videos / topics etc. to suggest, do not hesitate.
I'd say you got it right in the first place; the second approach is messy, as you can end up with huge routes and you're sending a lot of unnecessary data. Why do you need the user id to get an element? An element is an entity by itself, and it will probably grow; you may need to get related elements, filter them... it's better to just have /elements.
What you can do is find simple relations, like:
/users/:id/lists
/lists/:id/elements
I'd recommend reading Build APIs You Won't Hate :)
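As an illustration of that layout (a sketch only; the handlers are stubs and the data access is assumed), the flat resources plus a couple of relation routes map cleanly onto Express:

```javascript
// Flat resource routes with a few relation routes layered on top.
const express = require('express');
const app = express();

// Placeholder handler so every route is wired up and the file runs as-is
const stub = (name) => (req, res) => res.json({ route: name, params: req.params });

app.get('/users', stub('users'));
app.get('/users/:id', stub('user'));
app.get('/users/:id/lists', stub('listsForUser'));      // relation route

app.get('/lists', stub('lists'));
app.get('/lists/:id', stub('list'));
app.get('/lists/:id/elements', stub('elementsForList')); // relation route

app.get('/elements', stub('elements'));
app.get('/elements/:id', stub('element'));

app.listen(3000);
```

Each resource stays addressable on its own, and the nesting never goes more than one level deep.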
Firstly, you are on absolutely the correct path for defining routes in Angular; at the same time, you should use the lazy-loading concept of routing.
I would recommend the Pluralsight course by Deborah Kurata. I'm not trying to promote or advertise anything, but for your current situation that course would be the right guidance. It would provide all the things you need to build enterprise-ready apps.
Alternatively, CoreUI Angular provides some good designs which already implement Angular routing; lazy loading and the other routing pieces are in place, and all you need to do is understand them.
Hope this helps.
Principles:
as short as possible
easy to read
user-friendly to type when the user enters the URL
Examples:
User list: /users
User detail: /user/:id
Add user: /user/new
User's functional page: /user/:id/tel

Best way to request API and store every minute

I have an app that is hitting the rate limit for an API, which is hurting the user experience. I have an idea to solve this, but no idea whether it is what should ideally be done. Does this idea make sense, and is it a good way to solve the issue? And how should I go about implementing it? I'm using React Native and Node.js.
Here is the idea:
My app will request the data from a "middleman" API that I make. The middleman API will request data once per minute from the main API that I am having the rate-limit problem with (this should solve the rate-limit issue), then store it for the minute until it updates again. I was thinking the best way to do this is to spin up a server on AWS that requests from the other API every minute (is this the easiest way to make a request every minute?), then store the result on a bare middleman webpage (or do I need to store it in a database like MongoDB?). Then my app will call that middleman webpage/API.
Your idea is good.
Your middleman would be a caching proxy. It would act just as you stated. Have a look at https://github.com/active-video/caching-proxy; it does almost what you want. It creates a server that receives requests for URLs, fetches and caches them, and serves the cached version from then on.
The only downside is that it does not have a lifetime (TTL) option for the cache. You could either fork it to add the option, or run a daemon that deletes files that are too old, forcing a re-fetch.
EDIT:
A very interesting addition to the caching proxy would be a HEAD request to find out whether the result has changed. While this is not provided by all APIs, it could become useful if yours exposes such info. Only if HEAD requests do not count toward your API limits, though...
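If you don't need a full caching proxy, the middleman can be very small. A minimal sketch, assuming Node 18+ (for the global fetch) and a placeholder UPSTREAM_URL standing in for the rate-limited API:

```javascript
// Poll the upstream API once per minute, keep the latest response in
// memory, and serve that cached copy to every client request.
const express = require('express');

const UPSTREAM_URL = 'https://api.example.com/data'; // placeholder upstream
let cached = null;

async function refresh() {
  try {
    const res = await fetch(UPSTREAM_URL); // global fetch, Node 18+
    cached = await res.json();
  } catch (err) {
    console.error('refresh failed', err); // keep serving the stale copy
  }
}

refresh();                       // warm the cache at startup
setInterval(refresh, 60 * 1000); // then refresh once per minute

const app = express();
app.get('/data', (req, res) => {
  if (!cached) return res.status(503).json({ error: 'cache warming up' });
  res.json(cached);
});

app.listen(3000);
```

With one upstream request per minute, the rate limit is respected no matter how many app users hit the middleman; a database like MongoDB only becomes necessary if you need the cache to survive restarts or span multiple servers.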
