Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
I'm making a website that will handle video upload and encoding. My idea was to have the main server handle both client requests and video processing. But from my understanding, video encoding is CPU-intensive, so I'm not sure whether it's a good idea to have one server do all the work or to have a separate server for the processing. I want to future-proof myself a bit in case I ever get high volumes of traffic, which would mean more processing work for the server.
So my question: is it overkill these days to have a separate server for video encoding, or am I going about this all wrong?
P.S. I'm using Node.js.
It will be overkill for someone starting out. As you mentioned, you don't yet know how much traffic to expect, and it's difficult to project the growth of your web app, since it might grow gradually or take off immediately and hammer your server.
I would approach this in a way that lets me separate and queue the video-processing work away from the main website. This allows you to scale the video-processing portion of your app without having to run the entire website on the same machines.
With a queuing system you can also control how many videos you're processing at any point in time. So if one server can handle 5 encoding jobs at once, any new request has to wait until a previous job finishes, and so on. It's almost a microservice-style architecture.
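To make that concrete, here is a minimal sketch of that kind of split in Node, assuming BullMQ with a local Redis instance; the queue name, the upload handler shape and the ffmpeg arguments are placeholders for illustration, not a recommendation of a specific stack.

    // Web process: accept the upload, enqueue a job, return immediately.
    // Express-style handler assumed; req.file.path would come from an upload middleware.
    const { Queue } = require('bullmq');
    const connection = { host: '127.0.0.1', port: 6379 };
    const encodeQueue = new Queue('video-encoding', { connection });

    async function handleUpload(req, res) {
      await encodeQueue.add('encode', { inputPath: req.file.path });
      res.status(202).send('Upload accepted, encoding queued');
    }

    // Worker process: can run on the same box today and on a dedicated
    // encoding server later, without touching the website code.
    const { Worker } = require('bullmq');
    const { execFile } = require('child_process');

    new Worker('video-encoding', async (job) => {
      const { inputPath } = job.data;
      await new Promise((resolve, reject) => {
        execFile('ffmpeg', ['-i', inputPath, `${inputPath}.mp4`], (err) =>
          err ? reject(err) : resolve()
        );
      });
    }, { connection, concurrency: 5 }); // at most 5 encodes at once, as above

The point is only the separation: the web process never encodes anything itself, it just drops work onto the queue, and you scale the workers independently.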
Hope this gives you some ideas.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I've got a web app where I use the plain file system for my custom logs: a lot of small files. I don't want to put them into a DB; this works quite well for me. But now I need to scale my app by putting a load balancer in front, so I also need to keep those logs in sync between servers. Is there any reliable solution for such cases? I know I could sync them by some OS means or by scripting, but I wonder if there is a better solution for this scenario. Is this a case for MongoDB (or something more modern), or is it better to keep the logs on the file system as plain files?
This question is going to get you some heat, since you're essentially asking for our opinion. I'll be frank, though, and won't argue with anyone, since it's just my opinion. With web apps, in my humble opinion, it's always better to keep your data in a DB, both for scalability and for analytical research. I know little about what your app does, but it's easier to write third-party data apps that tell you how many of X or Y there are when the data is centrally stored in a DB, since the app that reads that data can then run anywhere. I know I've probably wasted your time with an argument, but hey, I hope I helped a bit.
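Purely as an illustration of the "central DB" idea, here is a rough sketch using the official Node `mongodb` driver; the database name, collection name and log document shape are all assumptions made for the example.

    // Every server behind the load balancer writes to the same collection,
    // so there is nothing on disk to keep in sync.
    const { MongoClient } = require('mongodb');

    async function main() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const logs = client.db('myapp').collection('logs');

      await logs.insertOne({ event: 'login', userId: 42, at: new Date() });

      // "How many of X or Y" becomes a query instead of grepping files on N hosts.
      const midnight = new Date(new Date().setHours(0, 0, 0, 0));
      const loginsToday = await logs.countDocuments({ event: 'login', at: { $gte: midnight } });
      console.log(`logins today: ${loginsToday}`);

      await client.close();
    }

    main().catch(console.error);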
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
At the risk of this question being closed, I will ask anyway.
I have been looking at the different JavaScript frameworks, as most job roles seem to want:
Angular.js
Knockout.js
Node.js
Whilst I can see that Angular.js and Knockout.js provide an MVC construct for the markup pages (though I'm still not sure which one is best to use), I cannot see what the case for Node.js is.
Whilst I appreciate that Node.js is good for real-time comms, so is SignalR, as they can both use long-polling.
At present I use SignalR to update images on my clients.
Is there any purpose in swapping this out for Node.js?
Like I said, this question could be voted closed as it may seem to be asking for an opinion, and that would be an answer in itself, as it would come down to developer choice. But is there a DEFINITIVE reason to use Node.js over SignalR?
Thanks
One reason to use Node.js is code reuse: both the server and the client run the same language, so they can share part of the codebase, which potentially means less to write. With libraries like Browserify this process can be made a lot more transparent, and writing the client side can become almost indistinguishable from server-side development. Another opportunity this opens up is combined client- and server-side rendering + MVC setups with, for example, rendr.js, so you can have both the fast initial loads of server-side rendering and the responsiveness of client-side rendering. Whether any of this will be useful naturally depends on what you are developing.
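As a small, hypothetical illustration of that sharing (the module name and validation rule are invented for the example), the very same file can be required on the server and bundled for the browser with Browserify:

    // validate.js -- shared module
    module.exports = function isValidUsername(name) {
      return typeof name === 'string' && /^[a-z0-9_]{3,20}$/i.test(name);
    };

    // Both server.js and client.js can then do:
    //   const isValidUsername = require('./validate');
    // and the browser bundle is produced with `browserify client.js -o bundle.js`.

One validation rule, written once, enforced in both places.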
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
This is not a holy-war question; I'm just asking which framework would be the better choice in terms of performance for my specific project.
I'm writing a REST API and choosing between Node.js and Sinatra. One method of the API will be used very frequently (roughly 100k requests per day).
This request is very simple: select one row from a database, make a few calculations, update one row in a database.
But, as I said, it will be called frequently and I need to choose a framework that will perform better in this case.
This is a simple app and in this case I don't care which framework is easier or "better", just interested in the performance. I already wrote a prototype in Sinatra, the whole app is less than 150 lines of code.
I read about Node.js, but never created a real app with it.
Will Node.js be a significantly better choice for this project in terms of performance and scalability?
100k requests a day is roughly one request per second (100,000 / 86,400 ≈ 1.2), assuming a flat distribution of requests during the day. Both solutions will probably serve that without a problem. You're probably falling into the premature optimisation trap.
That being said, JavaScript, because of its asynchronous nature, is significantly better at high I/O than Ruby (Sinatra is just a simple web framework; Node is just how you run JavaScript on a server).
As for "what should I do": I suspect most people would tell you to use the prototype you already have working, and keep using it until it's no longer good enough, if that ever happens. Seeing as it's such a small app, it shouldn't be a problem to rewrite it later in Node anyway!
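For a rough idea of what the Node version of that endpoint could look like, here is a minimal sketch assuming Express and the `pg` PostgreSQL driver; the table, the columns and the "few calculations" are invented for the example.

    const express = require('express');
    const { Pool } = require('pg');

    const app = express();
    const pool = new Pool({ connectionString: process.env.DATABASE_URL });

    app.post('/counters/:id/hit', async (req, res) => {
      try {
        // select one row
        const { rows } = await pool.query(
          'SELECT value FROM counters WHERE id = $1',
          [req.params.id]
        );
        if (rows.length === 0) return res.status(404).end();

        // make a few calculations
        const next = rows[0].value + 1;

        // update one row
        await pool.query('UPDATE counters SET value = $1 WHERE id = $2', [next, req.params.id]);
        res.json({ value: next });
      } catch (err) {
        res.status(500).json({ error: err.message });
      }
    });

    app.listen(3000);

At roughly one request per second, either this or the Sinatra prototype will sit mostly idle.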
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
When playing with Node.js, this question came up, since one can now put code on either the client side or the server side, even using the same language.
E.g. for a small game app, I can do the computation on the client side when the user interacts (via some onclick function); alternatively, I can send a request to the server and do the computation there.
After more investigation, I found the terminology for my question: client-side vs. server-side rendering. Now there's a lot of material I can find.
It's basically a trade-off; it depends on the use case, server capacity, etc.
The best philosophy for deciding what is left to the client and what is left to the server is often to leave as much as possible up to the client. While this often does not apply to very complex applications, most applications can apply it with no negative effects.
The logic here is that one dedicated computer (the client) can handle its own needs (such as images, video, gameplay) much more easily than one or a few servers can handle the needs of thousands of clients.
However, some things require an external application (the server). Good examples of these are sessions, leaderboards, user authentication, and social media integration.
The only downside is that it may increase your application's initial load time. For small applications this may only be milliseconds. For larger applications that take more than 2-3 seconds to load, I would say add a loading bar.
Cheers
-Nick
Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
We are looking for any browser-based file upload solution, commercial or free, that manages to survive internet connection interruptions and continues the upload process once the connection is re-established.
The scenario: a website used in areas where the users experience very unstable internet connections and yet need to upload files up to 3 MB (which sounds low but can really be a problem).
There are various jQuery- and Flash-based solutions around, like CuteUpload, Ajax Uploader and so on, but none of them has so far implemented a mechanism that helps in such a scenario. I am aware that the HTTP protocol does not handle connection resets in a way that allows continuing a POST.
A solution is conceivable if the client software knows how much has been uploaded already and is able to slice the upload into chunks, while the server side is smart enough to glue them back together. Or if client and server agree on chunk sizes beforehand, enumerate them, keep the session open and make sure every little piece gets shipped. Possible, but probably not easy to write. We are working on .NET, but the server platform doesn't really matter.
Does anyone have a hint where to look?
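A very rough, hypothetical sketch of the client side of that chunk-and-reassemble idea, in browser JavaScript (the endpoints, chunk size and retry policy are all assumptions, and the server still has to track and reassemble the chunks by index):

    const CHUNK_SIZE = 256 * 1024; // 256 KB per chunk

    async function uploadResumable(file, uploadId) {
      // Ask the server how many chunks it already has, so we can resume.
      const status = await fetch(`/upload/${uploadId}/status`);
      const { receivedChunks } = await status.json();

      const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
      for (let i = receivedChunks; i < totalChunks; i++) {
        const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
        // Retry the same chunk until it makes it through the flaky connection.
        for (;;) {
          try {
            await fetch(`/upload/${uploadId}/chunk/${i}`, { method: 'PUT', body: chunk });
            break;
          } catch (e) {
            await new Promise((r) => setTimeout(r, 2000)); // wait, then retry
          }
        }
      }
    }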
There are no really popular or well-known solutions for this problem, and I really hope that future versions of HTML will support this out of the box.
But for now you can look at http://upload.thinfile.com/upload/thin.php which is a paid tool but you can try the demo for free.
Also, in the rails world, there is a FOSS gem https://github.com/stakach/Resumable-Uploads.
The method / approach they use is quite sound and cross browser compatible.
Also, when googling, don't search for "internet connection interruptions"; search for "resumable file uploads" :)