Will an auto-updating startup task work in an Azure application?

I have built a startup task for an Azure application that contains an exe file (running periodically at some time interval), and now I would like to make it auto-update every week, as I asked before here.
However, even if I add logic to replace that file through the exe (startup task), the new file does not take effect. I have concluded that a new startup task takes effect only if we upgrade/create the Azure project with the new file. (Correct me if I have understood something wrong.)
So is there any way to make my logic work by rebooting the instance (via the exe/startup task)?
I think it will also use the original file (the one added to the startup task at the time of upgrading/creating the application) instead of the new file!
Is it possible in any way?

This is a very unreliable solution. If an Azure instance crashes or is taken down for updates you will have a new instance started from the original service package. All the state of the modified instance will be lost.
A much more reliable way would be to have the volatile executable stored somewhere like Azure Blob storage. You upload a new version to the blob storage and the role somehow sees that (either by polling the storage or by some user-invoked operation - doesn't matter), downloads the new version and replaces the existing version with the new one.
This way if your role crashes it will reliably fetch the newest version from the persistent storage on startup.
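That startup-time fetch can be sketched roughly as follows. This is a minimal Python sketch only: the two directory arguments stand in for the blob container and the role's local folder, and the version.txt marker file is an assumption; a real role would use the Azure storage client to download the blob rather than copy a local file.

```python
import os
import shutil

def update_if_newer(remote_dir, local_dir, exe_name="worker.exe"):
    """Replace the local copy of the executable when the published
    version (tracked in a simple version.txt marker) has changed.

    remote_dir stands in for the blob container here; a real role
    would download the blob instead of copying a local file.
    """
    marker = "version.txt"
    remote_version = open(os.path.join(remote_dir, marker)).read().strip()
    local_marker = os.path.join(local_dir, marker)
    local_version = (
        open(local_marker).read().strip() if os.path.exists(local_marker) else ""
    )
    if remote_version != local_version:
        # Overwrite the running copy and record which version we now have.
        shutil.copy(os.path.join(remote_dir, exe_name),
                    os.path.join(local_dir, exe_name))
        with open(local_marker, "w") as f:
            f.write(remote_version)
        return True   # updated
    return False      # already current
```

Because the check runs on every startup, a rebuilt instance converges on the newest published version instead of reverting to whatever was in the original service package.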

After studying your problem, I can propose a very simple solution, as below, which I have done before for a Tomcat/Java sample:
Prepare your EXE to reboot the VM along with your original code:
In your EXE, create a method that looks for a specific XML file in Azure storage at a certain interval; also add retry logic for accessing the XML
Parse the XML for a specific value, and if that value is set, reboot the machine
Package your EXE in ZIP format and place it in your Azure storage
Be sure to place the XML in the cloud with the reboot flag set to false
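The polling-and-parse step above might look roughly like this. It is a Python sketch (the EXE's language isn't specified in the question), the `<control><reboot>` XML layout is an assumption, and a local file path stands in for the blob read:

```python
import time
import xml.etree.ElementTree as ET

def reboot_requested(xml_path, retries=3, delay=1.0):
    """Read the control XML and report whether the reboot flag is set.

    Retries a few times so a transient failure reading the file
    (e.g. a storage hiccup) does not abort the check immediately.
    """
    for attempt in range(retries):
        try:
            root = ET.parse(xml_path).getroot()
            # Expects something like: <control><reboot>false</reboot></control>
            return root.findtext("reboot", "false").strip().lower() == "true"
        except (OSError, ET.ParseError):
            if attempt == retries - 1:
                raise
            time.sleep(delay)
```

The EXE would call this on its timer and trigger the machine reboot when it returns True; after publishing a new ZIP you flip the flag to true in storage.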
What to do in the startup task:
Create a startup task that downloads the ZIP containing your EXE from Azure storage
After the download, unzip the file and place the EXE in a specific folder
Launch the EXE
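The unzip step of that startup task can be sketched like this. This is a Python stand-in for what the actual .cmd startup task would do; the download from blob storage and the final launch of the EXE are omitted:

```python
import os
import zipfile

def unpack_exe(zip_path, target_dir):
    """Unzip the downloaded package into the folder the EXE runs from.

    Extraction always goes to the same fixed folder, so a newer
    package simply overwrites the previous files on the next startup.
    """
    os.makedirs(target_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as z:
        z.extractall(target_dir)
    return sorted(os.listdir(target_dir))
```

Keeping the extraction folder fixed is what makes the reboot trick work: the restarted task re-downloads and unpacks into the same place the EXE is launched from.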
What to do when you want to update the EXE:
Update your EXE, package it into a ZIP, and place it in the same location in Azure storage with the same name
Update your XML to enable the reboot flag
How the update will occur:
The EXE will look for the XML at the designed interval
Once it sees that the reboot flag is set, it will reboot the VM
After the reboot, the startup task will run again and your new EXE will be downloaded to the Azure VM, replacing the old one. Be sure that the download and update use the same folder.
Take a look at the startup task in the sample below, which uses a similar method:
http://tomcatazure.codeplex.com/

Related

Azure - Process Message Files in real time

I am working on the Azure platform and use Python 3.x for data integration (ETL) activities using Azure Data Factory v2. I have a requirement to parse message files in .txt format in real time, as and when they are downloaded from blob storage to a Windows virtual machine under the path D:/MessageFiles/.
I wrote a Python script to parse the message files; because they are fixed-width files, it parses all the files in the directory and generates the output. Once a file is successfully parsed, it is moved to an archive directory. This script runs well on the local disk in ad-hoc mode whenever I need it.
Now, I would like to make this script run continuously in Azure so that it watches the directory D:/MessageFiles/ for incoming message files at all times and performs the processing as and when it sees new files in the path.
Can someone please let me know how to do this? Should I use a Stream Analytics application to achieve this?
Note: I don't want to use a timer option in the Python script. Instead, I am looking for an option in Azure that uses the Python logic only for file parsing.
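Stream Analytics is designed for event streams (Event Hubs, IoT Hub, blob input), not for watching a local VM directory, so if the watching does end up staying in Python after all (for example run as an always-on process under a service wrapper or scheduled job), a minimal polling watcher could look like the sketch below. The inbox/archive paths and the `parse` callback are placeholders for the real paths and parser:

```python
import os
import shutil
import time

def watch_and_process(inbox, archive, parse, poll_seconds=5.0, max_cycles=None):
    """Continuously poll `inbox` for .txt files, run `parse` on each,
    then move the parsed file to `archive`.

    max_cycles is only there to make the loop finite for testing;
    leave it as None to run forever.
    """
    os.makedirs(archive, exist_ok=True)
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        for name in sorted(os.listdir(inbox)):
            if name.endswith(".txt"):
                path = os.path.join(inbox, name)
                parse(path)                              # your fixed-width parser
                shutil.move(path, os.path.join(archive, name))
        cycles += 1
        if max_cycles is None or cycles < max_cycles:
            time.sleep(poll_seconds)
```

The same structure also works with an event-driven watcher (e.g. the third-party watchdog package) in place of the polling loop; only the outer loop changes, the parse-then-archive step stays identical.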

access certain folder in azure cloud service

In my code (which has a worker role) I need to specify a path to a directory (a third-party library requires it). Locally I've included the folder in the project and just give the full path to it. However, after deployment I of course need a new path. How do I confirm that the whole folder has been deployed, and how do I determine its new path?
Edit:
I added folder to the role node in visual studio and accessed it like this: Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), "my_folder");
Will this directory be used for reading and writing? If yes, you should use a LocalStorage resource. https://azure.microsoft.com/en-us/documentation/articles/cloud-services-configure-local-storage-resources/ shows how to use this.
If the directory is only for reading (i.e. you have binaries or config files there), then you can use the %RoleRoot% environment variable to identify the path where your package was deployed to, then just append whatever folder you referenced in your project (i.e. %RoleRoot%\Myfiles).
I'd take a slightly different approach. Place the 3rd party package into Windows Azure blob storage, then during role startup, you can download/extract it and place the files into the available Local storage (giving it whatever permissions the app needs). Then leverage that location from your application via the same local storage configuration entry.
This should help you reduce the size of your deployment package as well as give you the ability to update the 3rd party components without completely redeploying your solution. And by leveraging it on startup, you can guarantee that the files will be there in case the role instance gets torn down and rebuilt.

OnStart vs Startup Script for batch file?

I have a Ruby on Rails application that needs to find a home in an Azure Worker Role.
I currently automate the deployment of the application with a batch file that runs the Apache and Ruby installers and then drops the RoR app in the appropriate directory. After the batch script finishes, Apache serves the application on port 80.
I'm new to Azure and trying to figure out how to do this.
From my understanding, I have two options here: OnStart with the installation files in blob storage, or a startup script. I'm not sure how to do the latter, but I have located the OnStart() method within the WorkerRole.vb file in the new Azure project I just created.
My question: Is it recommended to use OnStart to deploy the application (using the batch script)? If so, how would I go about integrating the script into the project? And - how do I get started with storing and referencing the files in blob storage?
I know these are super high-level questions. Any input or suggested reading would be super helpful. I have tried to google / search for relevant resources but haven't been able to find much. Thank you for your time!
When you are inside the OnStart() function, it is better to do role-configuration things, i.e. IP binding, etc. However, if you want to install a runtime, download an application ZIP, or configure role-specific settings, it is best to use a startup task. Please visit my blog post Windows Azure: Startup task or OnStart(), which to choose? to learn more about it.
Now in your case it is best to use a startup task. What you can do is as below:
Create your RoR package as a ZIP and place it in Windows Azure blob storage
Create a command batch file which will:
2.1 Download the ZIP
2.2 Unzip the ZIP content to a specific location
2.3 Update the status back to Azure blob storage (optional)
In your OnStart() function you just need to configure RoR
The code will look as below if you have a TCP endpoint named "RORWeb80" set to use port 80:
TcpListener RoRPortListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["RORWeb80"].IPEndpoint);
RoRPortListener.Start();
I have written a sample app for a Tomcat/Java-based worker role that does exactly the same thing. So you can just replace the Tomcat ZIP file with your RoR ZIP and reuse the code exactly.
As long as you don't need admin-level access (e.g. modifying registry, installing msi's, etc.) you can do your setup from OnStart(), including launching your script. Just include the startup script with your project (don't forget to set Copy Local to true).
Same goes with startup script: you call your cmd file, which then executes the sequence for you. And if you give it elevated permissions, you can run installers, modify registry settings, install custom perf counters, whatever.
In either case: you can keep your apache zip, ruby installers, etc. in blob storage and, at startup, download them to local storage. This saves you from bundling everything within the deployment, which gives you a few advantages (being able to update ruby / apache without redeploy, reduced package size, etc.).
There's a sample app on codeplex that demonstrates the basics of setting up Tomcat via a startup script. For one more example, you can look at the scripts installed via the Eclipse Windows Azure plugin for Java. These scripts are quite similar. The key is to have some way of downloading files from blob storage and then unzipping them. The codeplex project I referred to points to a sample app that does simple blob downloading. The Eclipse packaging provides similar functionality in a .vbs app. Here's a snippet of one of my scripts from an Eclipse-based project:
SET SERVER_DIR_NAME=apache-tomcat-7.0.25
SET WAR_NAME=myapp.war
rd "\%ROLENAME%"
mklink /D "\%ROLENAME%" "%ROLEROOT%\approot"
cd /d "\%ROLENAME%"
cscript /NoLogo util\unzip.vbs jre7.zip "%CD%"
cscript /NoLogo util\unzip.vbs tomcat7.zip "%CD%"
copy %WAR_NAME% "%SERVER_DIR_NAME%\webapps\%WAR_NAME%"
cd "%SERVER_DIR_NAME%\bin"
set JAVA_HOME=\%ROLENAME%\jre7
set PATH=%PATH%;%JAVA_HOME%\bin
cmd /c startup.bat
The codeplex project has a similar-looking script.
Don't forget: you'll need to set up an Input Endpoint for your role (part of the role properties).
To get blobs into blob storage, there are both free tools (like ClumsyLeaf CloudXplorer) and paid tools (such as Cerebrata's Cloud Storage Studio).
To download blobs to local storage, you can either write a few lines of .net code (from OnStart) or just use the utility pointed to in the codeplex project.

How Can I Update My Web Site Automatically Every Day?

The tasks I do manually for updating my web site:
Stop IIS 7
Copy source files from a folder to the virtual directory of my web site
Start IIS 7
There are many ways to approach this, but here is one way.
I am assuming you don't want every single file in your source repository to exist on your destination server. The best way to reliably extract what you need from your source on a regular basis is through a build file. Two options for achieving this are NAnt and MSBuild.
Once you have the set of files you want to deploy, you need a way to distribute them to your destination server and to stop and start IIS. Again, there are options, but I would personally recommend PowerShell (with the IIS snap-in) for this.
If you want this to happen regularly, consider a batch file executed by some timer, such as a scheduled task, or better yet a CI solution such as TeamCity.
For a full rundown, there are examples within my PowerUp project that do this.
It depends where you are updating from, but you could have your virtual directory point to a local read-only working copy of your source code and create a task that runs a batch file/PowerShell script/etc. every day to update that working copy (via an svn update, git pull, etc.).
That supposes that you have a branch that always contains the latest releasable code.
You have to create a batch file with the following content:
Stop WWW publishing service
Delete your old files
Copy the new files
Start WWW publishing service
You can start/stop services like this:
net stop "World Wide Web Publishing Service"
Once you have your batch file, you can create a task in the Task Scheduler that calls it at a regular interval (e.g. each day).
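The same four steps can also be sketched in Python rather than a batch file. This is a sketch only: it assumes the site files can be swapped while the service is stopped, and the injectable `run` parameter exists purely so the sequence can be exercised without touching real services:

```python
import shutil
import subprocess

def update_site(source, target, run=subprocess.run):
    """Mirror the batch file's steps: stop IIS, swap the site's files,
    then start IIS again (even if the copy fails)."""
    service = "World Wide Web Publishing Service"
    run(["net", "stop", service], check=True)
    try:
        shutil.rmtree(target, ignore_errors=True)   # delete the old files
        shutil.copytree(source, target)             # copy the new ones
    finally:
        run(["net", "start", service], check=True)  # always restart the service
```

The try/finally ensures the service comes back up even if the file copy throws, which a straight-line batch file would not guarantee.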

How to use exe in SharePoint on itemAdded?

I need to convert any document that gets uploaded to an image.
I downloaded the exe (with all the DLLs) to my local machine (no installation required).
export.exe sourcefile.doc destinationfile.gif (this syntax works from my local DOS prompt).
How do I use the same syntax "export.exe exampledoc.doc exampledoc.gif" when an item is added to a SharePoint document library?
And do I need to put the folder (with the exe and DLLs) on the SharePoint front-end server so it's accessible? If yes, where should this folder reside? Do the folder and files need SharePoint service account access?
I am totally new to this, and I would really appreciate it if someone could shed some light (step by step if possible).
Thanks
Justin...
In order to do this from a SharePoint event handler, each WFE on the farm would need to have your conversion application available. Your event handler code would need to place the uploaded file in a temporary location on disk, invoke the conversion application (look at the .NET Process class for this), cancel the addition of the original, unconverted document, add the output of your processed file to the library (use the DisableEventFiring() method of the event handler so it does not trigger itself when adding the new file), and then clean up after itself.
Having said that, this operation seems like something you really wouldn't want to do in real time on a web server that gets any real traffic. You may want to look into batching the jobs to run daily during traffic lulls, handled by another system or by one dedicated resource on the farm.
