This might seem like a simple question for you, but I have spent the whole day on the internet searching for a good answer. What I am creating is a news web app with a ReactJS front-end, a NodeJS/Express backend, and MongoDB. I am stuck on how to store the articles, with images inside the articles if required (very important). One option is to store them as documents, but then how do I update them whenever required? (Help out this silly beginner, please)
Assuming you have decided that storing images inside the database is the best approach, I would base64 encode each image and store it in the database, and when you retrieve it, decode and serve the image.
Without that assumption, though, putting images in the database makes your database grow larger, backups take longer, etc. And you get very few benefits - it's not like sorting or indexing pictures is a thing.
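To make that concrete, here is a rough sketch of the base64 approach; the Article schema, field names, and route are illustrative, and it assumes an Express app and a Mongoose connection are already set up:

const mongoose = require('mongoose');

const articleSchema = new mongoose.Schema({
  title: String,
  body: String,
  images: [{
    contentType: String,  // e.g. 'image/png'
    data: String          // base64-encoded image bytes
  }]
});
const Article = mongoose.model('Article', articleSchema);

// Serve one embedded image, decoding the base64 string back into bytes
app.get('/articles/:id/images/:index', async (req, res) => {
  const article = await Article.findById(req.params.id);
  const image = article.images[req.params.index];
  res.set('Content-Type', image.contentType);
  res.send(Buffer.from(image.data, 'base64'));
});

Updating an article (or swapping one of its images) is then just a normal document update, e.g. Article.findByIdAndUpdate(id, { $set: { 'images.0.data': newBase64 } }).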
Hey guys, I'm a beginner and I'm asking if I could save uploaded files in a folder on the server and then save the path in the database.
Note that I'm building a short video sharing app
Here's what I can tell you from my personal experience + research.
You should definitely put videos on a server and store the URL in the DB.
Make sure you use a compression library inside your app to compress videos before uploading! Think of WhatsApp and Messenger; they compress before uploading.
When retrieving said videos, store them in the app cache and try to clear the cache when the video is out of scope or not in use. An app like this requires multithreading for optimal performance. I know that there is a lot to take in, but try to experiment and you'll understand why these concepts are important.
For images, it may vary. I'm mentioning pictures because you may have thumbnails, etc. If they are small images and you're able to compress them, you can store them as a Blob in your SQL DB, although most people wouldn't recommend it.
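If you go the files-on-the-server route, a rough sketch with multer looks like this (the Video model and route names are made up for the example, and it assumes an Express app and Mongoose are already wired up):

const multer = require('multer');
const upload = multer({ dest: 'uploads/' });  // uploaded files land in ./uploads on the server

app.post('/videos', upload.single('video'), async (req, res) => {
  // store only the path (or URL) in the database, not the file itself
  const video = await Video.create({
    title: req.body.title,
    path: req.file.path
  });
  res.json({ id: video._id, url: '/uploads/' + req.file.filename });
});

// serve the uploaded files statically
app.use('/uploads', express.static('uploads'));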
What I'm trying to do is a website for a photography contest.
I'm using Node.js, Express, MongoDB and Mongoose. I've already managed to allow users to register using MongoDB.
What I am missing now is the space to store the photographs (file sizes ranging from 5 MB to 50 MB), and I am looking for a preferably free way of doing this.
I thought of uploading to Google Drive via the API (due to the free 15 GB), but I want something automatic: I don't want the user to enter credentials or anything else; I want the server to take care of everything and send back only a confirmation that everything went well. From what I understand, with the Google API it would always be a matter of requesting authorization and access to Google, which I do not want.
I don't know if I misunderstood and there is a way to do it through Google in the way I mean, but if there isn't, any online storage would be fine.
Sorry for my poor English. Thank you.
For what you're trying to do, I think Firebase Storage may be the best fit:
https://firebase.google.com/pricing
You get 10 GB of free storage, and it's probably the easiest way to do it. Here's some documentation to get started:
https://firebase.google.com/docs/storage
Best of luck!
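For reference, a server-side upload with the Firebase Admin SDK can look roughly like this; since the server holds the service-account key, your users never have to authorize anything, which matches what you described. The bucket name, paths, and key file are placeholders:

const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccountKey.json');  // downloaded from the Firebase console

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: 'your-project-id.appspot.com'
});

async function uploadPhoto(localPath, userId) {
  const bucket = admin.storage().bucket();
  const destination = 'photos/' + userId + '/' + Date.now() + '.jpg';
  await bucket.upload(localPath, { destination });
  return destination;  // save this path in MongoDB next to the user record
}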
I've just started trying to use NodeJS and socket.io to create a simple multiplayer online game (similar idea to online chess). I apologise if the answer to my question is really obvious; I have tried googling around, but I think I am missing some key bit of understanding.
Basically, I need to store a few things on the server while the application is running. For example:
I need to store which socket connections are hosts, and which are players.
I need to store the current state of each game (e.g. in the case of chess, where the pieces are and whose turn it is)
It would also be nice to be able to store all the socket.io "rooms".
Feel free to answer the question at this point, information below is for extra reference.
There are a few things that I have tried or seen online:
When I google something with "persistence", I get results about saving to a database or similar, which I don't think is what I want.
I have tried just adding variables at the top of the NodeJS file, like I would with global variables in an ordinary JS file. This seems to work, but it just feels wrong to me; if someone could explain how this works, that would be great.
I have also seen things called session variables, I think this might be what I want.
I have seen applications that do this by just passing the information back and forth between the client and server, but I would prefer that the client couldn't just edit the information to "hack" the game.
Any help or explanation appreciated.
Nothing wrong with saving to a database. If your server crashes and restarts a few seconds later, you don't really want everyone's data to just be obliterated. I think you're assuming that databases are always long-term and slow, but really there are DB technologies that are great for this type of thing and often used with socket.io.
The one I'd probably opt for is Redis, which is super fast and stores data in-memory. This means that it's not constantly writing to disk, and it's a bit of a halfway house between having full persistent storage like with MySQL, and the slightly dodgy method of just keeping it in Node memory via variables.
When reddit created "Place", that massive multiplayer drawing with a tonne of concurrent users, they used Redis and Cassandra together. You can read a bit about it here.
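As a rough sketch of what that looks like with socket.io (key names, events, and the game-state shape are all made up for the example, using the node redis v4 client):

const { createClient } = require('redis');
const redis = createClient();
redis.connect();  // returns a promise; await it during startup in a real app

io.on('connection', (socket) => {
  socket.on('join', async ({ gameId, role }) => {
    socket.join(gameId);
    // remember whether this socket is a host or a player
    await redis.hSet('game:' + gameId + ':members', socket.id, role);
  });

  socket.on('move', async ({ gameId, move }) => {
    // load, update, and persist the current game state
    const state = JSON.parse(await redis.get('game:' + gameId + ':state') || '{}');
    // ...apply the move to state here...
    await redis.set('game:' + gameId + ':state', JSON.stringify(state));
    io.to(gameId).emit('state', state);
  });
});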
I am working on a page action extension and would like to store information that all users of the extension can access. The information will be key:value pairs, where the key is a web url and the value is an array of links.
I have to be able to update the database without redeploying the extension to the chrome store. What is it that I should look into using? The storage APIs seem oriented towards user data rather than data stored by the app and updated by the developer.
If you want something to be updated without deploying an updated version through CWS, you'll need to host the data yourself somewhere and have the extension query it.
Using chrome.storage.local as a cache for said data would be totally appropriate.
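The pattern is roughly: fetch from a URL you control, then cache. A small sketch (the URL, storage key, and data shape are placeholders):

// background script / service worker
async function refreshLinkData() {
  const response = await fetch('https://example.com/extension-data.json');
  const pairs = await response.json();  // e.g. { "https://some.url": ["link1", "link2"], ... }
  await chrome.storage.local.set({ linkData: pairs, lastSync: Date.now() });
}

async function getLinksFor(url) {
  const { linkData } = await chrome.storage.local.get('linkData');
  return (linkData && linkData[url]) || [];
}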
The question is pretty broad, so I'll give you some ideas I've used before.
Since you say you don't want to republish when the DB changes, you need to store the data for the DB yourself. This doesn't mean you need to store an actual DB, just a way for the extension to get the data.
Ideally, you are only adding new pairs. If so, an easy way is to store your pairs in a public Google spreadsheet. The extension then remembers the last row synced and uses the row feed to get data incrementally.
There are a few tricks to getting the spreadsheet sync right. Take a look at my GitHub project "Plus for Trello" for a full implementation.
This is a good way to sync incrementally, though if the DB isn't huge you could just host a CSV file and fetch it periodically from the extension.
Now that you can get the data into the extension, decide how to store it. chrome.storage.local or IndexedDB should be fine, though IndexedDB is usually best for later querying more complex things than just a hash table.
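A sketch of the simple hosted-file variant, polled from the background with chrome.alarms (requires the "alarms" permission; the URL, alarm name, and "url,link1,link2" CSV layout are assumptions):

chrome.alarms.create('syncPairs', { periodInMinutes: 60 });

chrome.alarms.onAlarm.addListener(async (alarm) => {
  if (alarm.name !== 'syncPairs') return;
  const text = await (await fetch('https://example.com/pairs.csv')).text();
  const pairs = {};
  for (const line of text.trim().split('\n')) {
    const [url, ...links] = line.split(',');
    pairs[url] = links;
  }
  await chrome.storage.local.set({ pairs });
});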
I am new to Redis and would like to store web analytics for a website, both globally and per-user activity.
Below is what I am stuck with.
// to get all unique ips
client.sadd('visitors',ip);
// to records hits per ip
client.hincrby('hits',ip,1);
The above works fine so far, and I do get the number of distinct IPs and a hit counter per IP.
The problem comes when storing the activities made by each IP, i.e. storing the links they clicked and the searches they made, with a datetime.
Can someone please shed some light on how best to manage this?
Thanks
The problem comes when storing the activities made by each IP.
You will need a separate structure for storing these.
The simplest rational structure is to have a "list of actions by session". Take a look at the sorted set commands, which provide a basic framework for creating a list of actions within a session.
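A minimal sketch of that idea, matching the callback style of your snippet: one sorted set per IP, scored by the timestamp so the actions stay in chronological order (key names and the event shape are illustrative):

function logActivity(ip, action, detail) {
  const event = JSON.stringify({ action: action, detail: detail, at: new Date().toISOString() });
  client.zadd('activity:' + ip, Date.now(), event);
}

logActivity(ip, 'click', '/articles/42');
logActivity(ip, 'search', 'redis analytics');

// later: fetch everything this IP did, oldest first
client.zrange('activity:' + ip, 0, -1, function (err, events) {
  console.log(events.map(e => JSON.parse(e)));
});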
This will get you something quickly. However, this is probably not what you really want; in fact, Redis is probably not useful for this at all.
If you want to re-trace an entire site visit, you really want to connect to some sort of true analytics framework. There are dozens of website tracking tools that provide this type of functionality, so it's not really clear that building one yourself is very efficient.