OnStart vs Startup Script for batch file? - azure

I have a Ruby on Rails application that needs to find a home in an Azure Worker Role.
I currently automate the deployment of the application with a batch file - a file that takes the Apache and Ruby installers, runs them, and then drops the RoR app into the appropriate directory. After the batch script finishes, Apache is serving the application on port 80.
I'm new to Azure and trying to figure out how to do this.
From my understanding, I have two options here: OnStart with the installation files in Blob Storage, or a startup script. I'm not sure how to do the latter, but I have located the OnStart method within the WorkerRole.vb file in the new Azure project I just created.
My question: Is it recommended to use OnStart to deploy the application (using the batch script)? If so, how would I go about integrating the script into the project? And - how do I get started with storing and referencing the files in blob storage?
I know these are super high-level questions. Any input or suggested reading would be super helpful. I have tried to google / search for relevant resources but haven't been able to find much. Thank you for your time!

Inside the OnStart() method it is better to do role configuration work (IP/endpoint binding, etc.); if you want to install a runtime, download an application ZIP, or configure role-specific settings, it is best to use a startup task. Please visit my blog post Windows Azure: Startup task or OnStart(), which to choose? to learn more about it.
Now in your case it is best to use a startup task. What you can do is the following:
1. Package your RoR app as a ZIP and place it in Windows Azure Blob Storage
2. Create a command batch file which will:
2.1 Download the ZIP
2.2 Unzip the ZIP contents to a specific location
2.3 Report status back to Azure Blob Storage (optional)
3. In your OnStart() method, configure the RoR endpoint
The code will look like the following if you have a TCP endpoint named "RORWeb80" set to use port 80:
// Bind a listener to this instance's "RORWeb80" input endpoint (port 80)
TcpListener RoRPortListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["RORWeb80"].IPEndpoint);
RoRPortListener.Start();
I have written a sample app for a Tomcat/Java-based worker role which does exactly the same thing, so you can simply replace the Tomcat ZIP file with your RoR ZIP and reuse the code as-is.

As long as you don't need admin-level access (e.g. modifying the registry, installing MSIs, etc.) you can do your setup from OnStart(), including launching your script. Just include the startup script in your project (don't forget to set Copy Local to true).
Same goes for a startup script: you call your cmd file, which then executes the sequence for you. And if you give it elevated permissions, you can run installers, modify registry settings, install custom perf counters, whatever.
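For reference, a minimal sketch of how an elevated startup task is declared in ServiceDefinition.csdef (the script name is a placeholder):

<!-- Inside the <WorkerRole> element of ServiceDefinition.csdef -->
<Startup>
  <!-- executionContext="elevated" grants admin rights; taskType="simple"
       blocks role startup until the script exits -->
  <Task commandLine="setup.cmd" executionContext="elevated" taskType="simple" />
</Startup>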
In either case, you can keep your Apache ZIP, Ruby installers, etc. in blob storage and, at startup, download them to local storage. This saves you from bundling everything within the deployment package, which gives you a few advantages (being able to update Ruby/Apache without redeploying, reduced package size, etc.).
There's a sample app on codeplex that demonstrates the basics of setting up Tomcat via a startup script. For one more example, you can look at the scripts installed via the Eclipse Windows Azure plugin for Java. These scripts are quite similar. The key is to have some way of downloading files from blob storage and then unzipping them. The codeplex project I referred to points to a sample app that does simple blob downloading. The Eclipse packaging provides similar functionality in a .vbs app. Here's a snippet of one of my scripts from an Eclipse-based project:
SET SERVER_DIR_NAME=apache-tomcat-7.0.25
SET WAR_NAME=myapp.war
:: Re-create a short junction named after the role, pointing at the role's approot
rd "\%ROLENAME%"
mklink /D "\%ROLENAME%" "%ROLEROOT%\approot"
cd /d "\%ROLENAME%"
:: Unzip the JRE and Tomcat packages into the current directory
cscript /NoLogo util\unzip.vbs jre7.zip "%CD%"
cscript /NoLogo util\unzip.vbs tomcat7.zip "%CD%"
:: Drop the webapp into Tomcat and start it with the freshly unzipped JRE
copy %WAR_NAME% "%SERVER_DIR_NAME%\webapps\%WAR_NAME%"
cd "%SERVER_DIR_NAME%\bin"
set JAVA_HOME=\%ROLENAME%\jre7
set PATH=%PATH%;%JAVA_HOME%\bin
cmd /c startup.bat
The codeplex project has a similar-looking script.
Don't forget: you'll need to set up an Input Endpoint for your role (part of the role properties).
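As a sketch, the endpoint declaration in ServiceDefinition.csdef looks roughly like this (the name matches the "RORWeb80" example above):

<!-- Inside the <WorkerRole> element of ServiceDefinition.csdef -->
<Endpoints>
  <InputEndpoint name="RORWeb80" protocol="tcp" port="80" />
</Endpoints>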
To get blobs into blob storage, there are both free tools (like ClumsyLeaf CloudXplorer) and paid tools (such as Cerebrata's Cloud Storage Studio).
To download blobs to local storage, you can either write a few lines of .net code (from OnStart) or just use the utility pointed to in the codeplex project.
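If you go the .net route, a minimal sketch using the StorageClient library of the era might look like this (the connection string, container/blob names, and the "Setup" local storage resource are placeholders):

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public static class SetupDownloader
{
    public static void DownloadSetupFiles()
    {
        // Placeholder connection string; in a real role, read it from the service configuration.
        var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>");
        var blobClient = account.CreateCloudBlobClient();

        // "deploy/apache.zip" is an assumed container/blob holding the Apache package.
        var blob = blobClient.GetBlobReference("deploy/apache.zip");

        // "Setup" is an assumed LocalStorage resource declared in the service definition.
        var local = RoleEnvironment.GetLocalResource("Setup");
        blob.DownloadToFile(Path.Combine(local.RootPath, "apache.zip"));
    }
}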

Related

How to develop foxx services using arangodb web interface

I am creating Foxx services. Right now I am doing it in VS Code and uploading the ZIP file in the services section with a mount point in DEVELOPMENT mode. Now I want to quick-edit the Foxx service in the web interface itself. I have read that this is possible, but for some reason I do not get an option to edit it in the web interface. Am I missing some configuration/setting or something?
One way to do rapid development with Foxx is to use an IDE that automatically uploads modified files to a local 'deployment' location.
For example, if you use the WebStorm IDE and edit the files in a directory under Git, you can check your code out and in as usual.
WebStorm (and other IDEs as well) has a feature where it monitors edited files and automatically copies them to a destination location.
You can set it up so that when you save a file, rather than zipping it and deploying it via the web UI, it simply copies the files to the directory that Foxx is using as the source of your web service.
If you have the Foxx service running in 'Development' mode, it recompiles on every invocation, so it will pick up the newly edited changes that were just copied in.
You need to find the target directory that your Foxx service runs out of; when you enable Development mode, the web UI will tell you the path.
Not sure if you can do that with VS Code, but if you can, that's the easiest way to do it.

Upgrade Service Fabric Application

Is there a way to copy only modified files to Service Fabric?
I have a Service Fabric application containing an ASP.NET 5 application as a service. Whenever I change a JavaScript file inside my ASP.NET 5 service, I have to copy the entire Service Fabric application package again. Is there a command that allows copying only the modified file?
The best way to accomplish this is to use diff packaging and app upgrade. See this link for more info: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-application-upgrade-advanced/. Diff packaging allows you to define an application package that only contains the package parts that you wish to upgrade. However, it only applies to a component of an application package, such as a Service or Code package for example. You can't create a diff package at the file level.
So if you've only changed a single file in your code package, you must include that file along with every other file that belongs to the code package. You can't just include the single file that changed. But the benefit of diff packaging is that you'd only need to include that single code package. You wouldn't need to provide other services' code packages, for example, assuming they haven't been changed.
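As a rough sketch (all names here are placeholders), a diff package is laid out like a normal application package, but unchanged code/config/data packages are omitted, while the manifests are always included:

MyApplicationTypeDiff\
    ApplicationManifest.xml          (always included, version bumped)
    MyServicePkg\
        ServiceManifest.xml          (always included, version bumped)
        Code\                        (included only because this code package changed)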
Service Fabric SDK 2.5 brings a preview feature called "Refresh Application".
Using this feature you can get quicker feedback on your code changes.
To enable it, set the following in the project properties:
Application Debug Mode = Refresh Application.
More details and limitations can be found here:
https://sharepointforum.org/threads/speed-up-service-fabric-development-with-the-new-refresh-application-debug-mode.111162/
In Fabric Explorer you need to find the node where your web application is running. In my case that is _Node_0.
By SF SDK design, locally published SF applications live under C:\SfDevCluster\Data\_App\. In my environment, the website file path is C:\SfDevCluster\Data\_App\_Node_0\Application1Type_App1\Web1Pkg.Code.1.0.0\wwwroot\
So you can also find your HTML, CSS, JS and other static resources under a path of the form: C:\SfDevCluster\Data\_App\[node_id]\[application_type_and_instance_name]\[service_type_and_version]\
You can just modify the files in this folder and the change will immediately show up in your local test web browser. Note that if your service runs on several nodes, you may need to modify the files on every node, because the load balancer may route any request to any instance.

Setting Up Continuous Deployment of a WPF Desktop Application

For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You mention ClickOnce, which you investigated but can't use. Have you already tried an alternative, Squirrel? With Squirrel you can specify which files should be part of the installation, allowing you to explicitly include files even if you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating an auto-updating application and can work with Azure Blob Storage hosting (and also a CDN if you need to scale up).
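As a minimal sketch (the releases URL is a placeholder), the client-side update check with Squirrel.Windows looks roughly like this:

using System.Threading.Tasks;
using Squirrel;

public static class Updater
{
    public static async Task CheckForUpdatesAsync()
    {
        // Point at wherever the Releases folder is hosted, e.g. a blob container.
        using (var mgr = new UpdateManager("https://myaccount.blob.core.windows.net/releases"))
        {
            // Downloads and applies any newer release; it takes effect on restart.
            await mgr.UpdateApp();
        }
    }
}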

Will Autoupdate Startup task work in azure application?

I have built a startup task for an Azure application containing an EXE file (which runs periodically at some interval), and now I would like to make it auto-update every week, as I asked before here.
However, even if I add logic to replace that file from within the EXE (startup task), the new file does not take effect. I have concluded that a new startup task takes effect only if the Azure project is upgraded/recreated with the new file. (Correct me if I have misunderstood something.)
So is there any way to make my logic work by rebooting the instance (from the EXE/startup task)?
I suspect it will still pick up the original file (the one added to the startup task when the application was created/upgraded) instead of the new file!
Is it possible at all?
This is a very unreliable solution. If an Azure instance crashes or is taken down for updates, you will get a new instance started from the original service package. All the state of the modified instance will be lost.
A much more reliable way would be to have the volatile executable stored somewhere like Azure Blob storage. You upload a new version to the blob storage and the role somehow sees that (either by polling the storage or by some user-invoked operation - doesn't matter), downloads the new version and replaces the existing version with the new one.
This way if your role crashes it will reliably fetch the newest version from the persistent storage on startup.
After studying your problem, I can propose a very simple solution, which I have used before for a Tomcat/Java sample:
Prepare your EXE to reboot the VM, alongside your original logic:
In your EXE, create a method that looks for a specific XML file in Azure storage at a certain interval; also add retry logic around accessing the XML
Parse the XML for a specific value and, if that value is set, reboot the machine (see the sketch after this list)
Package your EXE as a ZIP and place it in your Azure storage
Be sure to place the XML in the cloud with the reboot value set to false
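A minimal sketch of that polling logic (the control URL, the XML shape, and the polling interval are all assumptions for illustration):

using System;
using System.Diagnostics;
using System.Net;
using System.Threading;
using System.Xml.Linq;

public static class RebootWatcher
{
    // Poll a control XML in blob storage and reboot the VM when the flag is set.
    public static void WatchForRebootFlag()
    {
        // Placeholder URL; point it at the control XML you uploaded to blob storage.
        const string controlUrl = "https://myaccount.blob.core.windows.net/config/control.xml";
        using (var web = new WebClient())
        {
            while (true)
            {
                try
                {
                    var doc = XDocument.Parse(web.DownloadString(controlUrl));
                    // Assumed XML shape: <control><reboot>true</reboot></control>
                    if ((string)doc.Root.Element("reboot") == "true")
                    {
                        // Reboot the VM; on boot the startup task re-downloads the (new) EXE.
                        Process.Start("shutdown", "/r /t 0");
                        return;
                    }
                }
                catch (WebException)
                {
                    // Transient failure reading the XML; retry on the next tick.
                }
                Thread.Sleep(TimeSpan.FromMinutes(5));
            }
        }
    }
}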
What to do in the startup task:
Create a startup task that downloads the ZIP containing your EXE from Azure storage
After the download, unzip the file and place the EXE in a specific folder
Launch the EXE
What to do when you want to update the EXE:
Update your EXE, package it into a ZIP, and place it at the same location in Azure storage, with the same name
Update your XML to enable the reboot
How the update will occur:
The EXE looks for the XML at the designed interval
Once it sees that the reboot value is set, it reboots the VM
After the reboot, the startup task runs again, and your new EXE is downloaded to the Azure VM and takes effect. Be sure that the download and the update use the same folder.
Take a look at the startup task in the sample below, which uses a similar method:
http://tomcatazure.codeplex.com/

Setting read / write / execute privilege on "cgi-bin" folder in Windows Azure webrole

We're talking about a simple webapp.
So I have a file called "modulev2.cgi" which is part of a trusted third-party online payment company. This file has to be put in a folder named "cgi-bin". For a Windows IIS environment the file is renamed "modulev2.exe" and put in the same directory. This is what the documentation says.
The module is called like this:
<form action="../cgi-bin/modulev2.exe" method="post">
with a bunch of parameters. When called it should of course execute, not be offered as a download.
And indeed it works on my dedicated server, provided the "cgi-bin" folder and the file in it have the "Execute" permission level set in IIS.
So, to the point: would I be able to set execute rights on this file in Windows Azure? If yes, how would I script such a process?
Any help greatly appreciated.
Thanks !
The best way to do this is to script it out locally against your own IIS using appcmd.exe. You want to add your CGI handler programmatically. By default, IIS in Windows Azure is already running CGI/FastCGI, so you don't have to install it; it should be ready. I think you need to add it to the CGI restriction list and add your handler mappings.
http://technet.microsoft.com/en-us/library/cc732851(WS.10).aspx
Once you have a .cmd file that will correctly configure your local IIS settings, you can use that as the basis for a Startup task in Windows Azure to bootstrap the role.
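A rough sketch of what such a .cmd might contain, adapted from the appcmd documentation linked above (the site path, executable path, and handler name are placeholders to verify against your own setup):

@echo off
set APPCMD=%windir%\system32\inetsrv\appcmd.exe
set CGI_EXE=%ROLEROOT%\approot\cgi-bin\modulev2.exe

REM Allow the CGI executable in the ISAPI/CGI restriction list
%APPCMD% set config -section:system.webServer/security/isapiCgiRestriction /+"[path='%CGI_EXE%',allowed='True']" /commit:apphost

REM Map requests to the executable onto the CGI module for the cgi-bin app
%APPCMD% set config "Default Web Site/cgi-bin" -section:system.webServer/handlers /+"[name='ModuleV2-CGI',path='*.exe',verb='*',modules='CgiModule',resourceType='File']" /commit:apphost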
http://channel9.msdn.com/Shows/Cloud+Cover/Cloud-Cover-Episode-31-Startup-Tasks-Elevated-Privileges-and-Classic-ASP
http://channel9.msdn.com/Shows/Cloud+Cover/Cloud-Cover-Episode-34-Advanced-Startup-Tasks-and-Video-Encoding
