Best solution to host a (command line) Windows application? - azure

I have a Windows application that does some calculations and is called from command line. On my Windows machine, I have a PHP script running under Apache that executes the application and shows the output.
Is there any hosting solution that I can use to do the same? I can't figure out whether EC2 or Azure is the right solution. Basically, I need a web server plus the ability to execute my application.
Suggestions? Thanks.

You can host your application on AppHarbor, the .NET Platform-as-a-Service. You can either port your web frontend to .NET or try to get your PHP stuff working with Phalanger. AppHarbor is working on Background Tasks, which might be a good match for your workload.

I would just run the PHP script you already have under IIS in a Windows Azure web role.

If it is a Windows application and you have the source code, I would go with an Azure Worker Role. The advantage of using a PaaS (such as Azure) instead of an IaaS (such as Amazon EC2) is that you won't have to bother with keeping the server up to date.
The real investment in time will be in rewriting your application to work as a Worker Role. How long that takes depends on how your application works right now. If it uses a lot of disk access it might be difficult, and perhaps an Amazon server would be better. But if it only crunches numbers in memory, an Azure Worker Role is a very good candidate.
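To give a feel for the shape of that work, here is a minimal sketch of a Worker Role using the classic cloud-service SDK, under the assumption that the calculation can be pulled out into a method; the class and method names are hypothetical.

    using System;
    using System.Net;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // Minimal worker-role skeleton. CrunchNumbers() is a hypothetical
    // stand-in for the calculation logic ported from the Windows app.
    public class CalculationWorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            while (true)
            {
                CrunchNumbers();                        // in-memory work ported from the command-line app
                Thread.Sleep(TimeSpan.FromMinutes(1));  // pause between runs
            }
        }

        public override bool OnStart()
        {
            ServicePointManager.DefaultConnectionLimit = 12; // boilerplate from the SDK template
            return base.OnStart();
        }

        private void CrunchNumbers()
        {
            // Port of the command-line application's calculation code goes here.
        }
    }

The Run method takes the place of the command-line entry point; Azure keeps the instance alive and recycles the process if Run ever returns.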
The real advantage of using an Amazon server is that you probably won't need to do any porting work at all, apart from maintaining the server.

As described in the question both Azure and EC2 will do the job very well. This is the kind of task both systems are designed for.
So the question becomes really: which is best? That depends on two things: what the application needs to do and your own experience and preference.
As it's a Windows application there should probably be a leaning towards Azure. While EC2 supports Windows, the tooling and support resources for Azure are probably deeper at this point.
If cost is a factor, then a (somewhat outdated) comparison is here: http://blog.mccrory.me/2010/10/30/public-cloud-hourly-cost-comparison/. The conclusion is that, by and large, Azure and Amazon are roughly similar for compute charges.

Steve Marx has a blog post that describes how to run another web server (i.e. not IIS) on Azure.
This potentially has everything you need: you can deploy Apache and your executable and run them in exactly the same way.
Alternatively, you can deploy your executable alongside a bit of code in a worker role that runs the application periodically, depending on your exact requirements.
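If shipping the executable unchanged fits your requirements, a rough sketch of that worker-role approach might look like the following; the executable name, arguments, path convention, and schedule are all placeholders.

    using System;
    using System.Diagnostics;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class LauncherWorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // Assumes the .exe is packaged with the role content; %RoleRoot%\approot
            // is the usual deployment location for cloud-service roles.
            string exePath = Environment.GetEnvironmentVariable("RoleRoot")
                             + @"\approot\MyCalculator.exe";   // hypothetical name

            while (true)
            {
                var psi = new ProcessStartInfo(exePath, "--input data.txt") // placeholder arguments
                {
                    UseShellExecute = false,
                    RedirectStandardOutput = true
                };

                using (var process = Process.Start(psi))
                {
                    string output = process.StandardOutput.ReadToEnd(); // capture what the app prints
                    process.WaitForExit();
                    Trace.TraceInformation(output);                     // surface results via diagnostics
                }

                Thread.Sleep(TimeSpan.FromMinutes(5)); // run periodically
            }
        }
    }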

Related

node.js on shared server space

I am playing with node.js and Angular for the first time. I use shared hosting server space, and I am trying to get some node.js tests running.
cPanel seems to provide an interface to deploy Node applications. Example:
application url: myurl.com
application root: node-hello-world
application startup file: app.js
This seems to create a directory and some artifacts in
/home/myurl/nodevenv/node-hello-world/6/bin
I have (limited?) shell access through the cPanel terminal emulation; however, I get an error on the source command:
source activate
This returns the error: jailshell: fork: Cannot allocate memory
Does this mean node.js is installed and ready to run? Do I have to upload my project as well? Where to? I'm trying to find more info on the process of deploying to this type of server, if possible.
Sorry for the noob question.
Googling your error message, I came across this thread, which admittedly is very old, but it's from cPanel and has the following comment from an administrator at the time:
Jailshell is a constrained environment by design. It is not meant to be a replacement for a full-featured, unrestricted shell environment, such as is provided by Bash. If your users need such full-featured environments then perhaps they need full shell access, or another method whereby they can accomplish their goal.
That answer was given in 2006 (yes, 13 years ago) but I have to imagine the spirit of that response is still true.
To be perfectly honest, I'd be afraid to use any shared hosting provider that would give you more than a very limited shell: it opens the door to many security vulnerabilities, and if multiple customers are in the same runtime (i.e. shared hosting) it could be catastrophic. Maybe your host does allow this, or maybe what you're describing isn't actually the same thing I'm referring to; you didn't offer a lot of details on this point.
Back to your question: does this mean node.js is installed and ready to run? Do I have to upload the project as well? Where to?
If I had to guess, Node probably isn't installed (it isn't with most shared hosting providers), but I can't say for sure based on the information you provided. My recommendation would be to call their customer support. Or pay for a dedicated hosting account where you get root access. Or just use something like Heroku.

When creating a new Azure Function App, in what scenario do I select an operating system other than Windows?

We created and tested several Azure Function Apps hosted on Windows. When creating a new Azure Function App, in what scenario would I select an OS other than Windows, meaning Linux or Docker?
I created test instances for all three OS options, and the basic settings of each appear to be very similar.
Linux or Docker is useful if your functions have dependencies that only work on Linux/Docker. For example, some node.js native libraries only work on Linux, and will never work on Windows.
If you don't need Linux for anything specific, then I suggest sticking to Windows since that is currently (at the time of writing) the best and most supported environment for running Azure Functions.
The Azure Functions 2.0 runtime is based on .NET Core, so it is cross-platform. If you choose Linux/Docker, the Functions runtime will be deployed on Linux.
At the time of writing, 2.0 is still in preview, so Linux/Docker is not supported in production yet. For now, the Consumption Plan (pay per call) is not supported on Linux.
See The Azure Functions on Linux Preview. Quote:
Functions on Linux can be hosted in a dedicated App Service tier in 2 different modes:
You bring the Function App code and we provide and manage the container, no specific Docker related knowledge required.
You bring your own Docker container including the Azure Functions runtime 2.0, specific dependencies, and Function App code.
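To illustrate the cross-platform point, here is a minimal sketch of a 2.0-style C# HTTP-triggered function; the same code should run unchanged on a Windows or Linux host. The function name and query parameter are just examples.

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Extensions.Logging;

    public static class HelloFunction
    {
        // Functions 2.0 is .NET Core based, so this compiles and runs on either OS.
        [FunctionName("Hello")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Hello function triggered.");
            string name = req.Query["name"];
            return new OkObjectResult($"Hello, {name ?? "world"}");
        }
    }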
For Consumption mode, the cold start varies a little bit between the operating systems.
It looks like, although the average time is very close between Windows and Linux, the best and worst cases are much better for Linux... which kind of makes sense.
Check this as a good reference: https://mikhail.io/serverless/coldstarts/azure/
Now, if you are deploying to a dedicated App Service Plan, the OS plays a bigger role. Linux plans are cheaper than Windows plans due to the OS licensing cost.

Scheduler on Azure

I need some type of scheduling service within Windows Azure, but which option is the best and most resilient?
Currently I have a Windows Service running Quartz, which works okay, but it runs on a Windows server. I need this to run in the cloud.
The tasks read from and write to a database, and some will send emails.
I've looked over all the possible solutions on Stack Overflow, but they appear to be old and not updated for the latest Azure platform.
Any suggestions or pointers?
The most suitable solution might be a worker role; Microsoft has a tutorial for exactly what you're looking for: http://www.windowsazure.com/en-us/develop/net/tutorials/multi-tier-web-site/4-worker-role-a/
This would definitely be a less expensive solution than spinning up a virtual machine, but it might require some work.
I ended up using Azure Mobile Services and the Scheduler that comes with it, which works a treat.
I run a Worker Role using Quartz.NET to schedule things. Works great!
https://github.com/quartznet/quartznet
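For what it's worth, a rough sketch of wiring Quartz.NET into a Worker Role's Run method might look like this, assuming the Quartz.NET 2.x API that was current for worker roles; the job class, identity, and cron expression are just examples.

    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Quartz;
    using Quartz.Impl;

    // Example job: reads/writes the database and sends emails, per the question.
    public class EmailJob : IJob
    {
        public void Execute(IJobExecutionContext context)
        {
            // Database work and email sending go here.
        }
    }

    public class SchedulerWorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
            scheduler.Start();

            IJobDetail job = JobBuilder.Create<EmailJob>().WithIdentity("emailJob").Build();
            ITrigger trigger = TriggerBuilder.Create()
                .WithCronSchedule("0 0/15 * * * ?")    // every 15 minutes (example schedule)
                .Build();

            scheduler.ScheduleJob(job, trigger);

            Thread.Sleep(Timeout.Infinite);             // keep the role alive; Quartz runs on its own threads
        }
    }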
Obviously, that would be difficult to do in the cloud, since you won't be able to install services or anything else that runs in the background. A less-than-perfect solution would be to have a workstation under your control handle the scheduling and send updates to the web server, which would then write them to the DB server. Otherwise, you would have to self-host the website and application.

Why can't node.js run on shared hosting? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
First thing: I searched all the well-known web hosting companies for shared hosting of node.js, but I didn't find any. Then I came to learn that node.js cannot run on a shared hosting system. I want to know why.
Second thing: I am a normal guy with a normal budget. Choosing a VPS, a dedicated server, or cloud hosting would let Node run, but it's outside my pocket-money range compared to PHP shared hosting services, so should I learn node.js?
Theoretically it can, but practically it depends on the hosting provider having such infrastructure in place.
Node, compared to classic web platforms, is a self-sufficient platform. PHP, for example, runs on top of Apache or nginx (or another web server); PHP itself is just a scripting language with some libraries, it does little apart from implementing your logic, and it requires a web server. The web server creates a socket to listen for traffic on specific ports, does its own magic, and executes PHP to process requests.
Node.js, by contrast, creates its own socket and binds it to its own port. That gives it much more low-level access; it is the web server itself. You can't bind two applications to one port, which already makes it unshareable.
There are web servers that let you set up a proxy to route traffic to your node.js process, but that is not as efficient in some cases, and shared hosting does not provide such functionality.
As node.js is still fairly young, as well as, well, different, it has not yet reached the majority of shared hosting services. There are some services available online that host node.js applications in a 'shared' manner.
Additionally, you can run an EC2 Micro instance on AWS for free (Free Tier) for one year, which gives you plenty of possibilities and time to try and test different things. You get a semi-dedicated system where you can do pretty much anything (install software, modify OS configuration, and much more) that shared hosting would not allow you to do.
Look into Heroku. For simple low-traffic apps, they are free and can easily be scaled for more traffic (for an added cost). Additionally, you use Git to deploy, so it's really simple to get things updated.
There are other ways to deploy node.js apps.
You can use PaaS services, such as OpenShift, Heroku, AppFog, Paastor, dotCloud, etc.
Other great node app hosting options include Joyent's SmartOS and Microsoft Azure. Both have a free trial period.
Azure can be a great learning platform for node.js as you can host your node app in Windows Server, Ubuntu Linux, or Azure's special "web site" shared deployment scheme.
http://www.windowsazure.com/en-us/develop/nodejs/tutorials/create-a-website-(mac)/
Another cost-effective solution for node app hosting is Azure's "Web Site" approach, at about $10 per month. The downside is that you have to use their shared environment, which hosts your node app via IIS. In practice, this worked well for me, but you are limited in that you can't use certain Linux functions from Node when it's running on Windows, and you won't learn how to configure the node service yourself, which may or may not be important to you. (Note: Azure's Git deployment process works great if you want to deploy your app from a local Git repository. Also note that the IIS host will stop your node app when it's not in use for a certain period of time, and it auto-starts again when a request for your app comes in.)
Joyent's SmartOS platform is an operating system optimized for hosting your node.js app. They have impressive reliability and performance, as well as great diagnostic tools.
http://wiki.joyent.com/wiki/display/jpc2/Developing+a+Node.js+Application
The most cost effective solution I have found so far is DigitalOcean, a great new hosting solution where you can host a full Linux VM for only $5/month! I have had great luck hosting Node apps there so far: https://www.digitalocean.com/pricing
A2 Hosting allows Node.js on shared hosting, but I don't have experience with it; I found it through a web search.
Update: use DigitalOcean, a private VPS.
Node doesn't work like most servers. With IIS and Apache, there is one server running multiple sites, which lends itself to shared environments. With Node, you're running your own server, so instead you tend to share a machine's resources.
I can't tell you whether it's worth learning Node because I don't know your motivation, but it can expand your career opportunities if you choose to go there, and it will expand your skill set.
Here are a couple of hosting options in the low price range.
http://nodester.com/
https://www.nodejitsu.com/

Windows Azure and a third-party Windows Service

I am developing a website that I intend to run within Windows Azure using a single Web Role. The site will make use of the Sphinx Search engine, which will need to run as a Windows Service. So my question is this: is it possible to install the Sphinx Search Windows Service inside of a Web Role?
From my initial research into Azure, I am thinking "yes", because the Web Role is a VM running IIS. Therefore I should be able to remote in, install the service, and it should work. :)
Does this sound right?
Installing software via RDP is not a viable solution with Web/Worker role instances, as these changes won't persist. You need to install it either from a startup script or from OnStart(). Since you want to install it as a service, that implies a startup script, because the installation needs elevated permissions. Note: the installer must support unattended mode, where all parameters are specified via the command line with no human interaction.
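To make the OnStart() option concrete, here is a hedged sketch. The installer file name and silent switches are placeholders, it assumes the installer supports a fully unattended install, and it only works if the role process is configured to run elevated; otherwise the install belongs in an elevated startup task as described above.

    using System;
    using System.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class SearchWebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Placeholder path and switches; requires elevation, e.g. by marking the
            // role's runtime as elevated in ServiceDefinition.csdef.
            var installer = new ProcessStartInfo
            {
                FileName = Environment.GetEnvironmentVariable("RoleRoot")
                           + @"\approot\bin\sphinx-installer.exe",   // hypothetical installer
                Arguments = "/quiet /install-as-service",            // hypothetical unattended switches
                UseShellExecute = false
            };

            using (var process = Process.Start(installer))
            {
                process.WaitForExit(); // block until the service is installed
            }

            return base.OnStart();
        }
    }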
What about scalability? If you have more than one instance of your web role running, can Sphinx run across two instances? From what I read, it supports ODBC-compliant databases, and you might be able to use it against Windows Azure SQL Database. If that's the case, can two Sphinx engines running on two different machines access the same data store? If so, this sounds like a viable solution.
If installation cannot be automated, or you need something additional like MySQL, you may want to consider placing the Sphinx search engine inside a Virtual Machine (new in June 2012). There you can spin up a Windows Server 2008 instance, RDP into it, and configure it exactly how you want.
Strictly speaking, yes, you could do that. However, this assumes that you would be running a single VM instance and that the instance would never need restarting.
You should consider looking at Azure worker roles for any functionality that would normally exist as a Windows service.
After reading your answers, and thinking about it a bit more, I think dropping the idea of installing a service is the best course of action. I've been looking at the API for Lucene.NET (this may be the same for Sphinx) and it's possible to encapsulate the writing/managing of indexes, etc., within code, so there is no need for a service.
For Azure, there is a library for managing index files using both local and Azure storage which could be of use. Scenarios I've read about show that it's possible to have a Web Role that processes HTTP requests and performs the searches, and a Worker Role that accepts DB changes via a queue and writes them to the indexes.
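As a rough illustration of that pattern (not that library's actual API), the Worker Role side might look something like this, assuming the classic Azure storage client and Lucene.NET 3.x; the queue name, connection string, index location, and message format are all hypothetical.

    using System;
    using System.IO;
    using System.Threading;
    using Lucene.Net.Analysis.Standard;
    using Lucene.Net.Documents;
    using Lucene.Net.Index;
    using Lucene.Net.Store;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Queue;

    public class IndexWorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // Queue the web tier pushes "document changed" messages onto (name is an example).
            var account = CloudStorageAccount.Parse("<storage connection string>");
            CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("index-updates");
            queue.CreateIfNotExists();

            // Local directory for illustration; the Azure index library mentioned above
            // could swap this for blob-backed storage.
            var directory = FSDirectory.Open(new DirectoryInfo(@"C:\Resources\index"));
            var analyzer = new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);

            using (var writer = new IndexWriter(directory, analyzer, IndexWriter.MaxFieldLength.UNLIMITED))
            {
                while (true)
                {
                    CloudQueueMessage message = queue.GetMessage();
                    if (message == null)
                    {
                        Thread.Sleep(TimeSpan.FromSeconds(10)); // nothing to do; poll again later
                        continue;
                    }

                    // Message body is assumed to carry the changed record's content.
                    var doc = new Document();
                    doc.Add(new Field("body", message.AsString, Field.Store.YES, Field.Index.ANALYZED));
                    writer.AddDocument(doc);
                    writer.Commit();

                    queue.DeleteMessage(message);
                }
            }
        }
    }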
