Azure web role recycle - would local data be lost?

I'm trying to host my org's NuGet repository in an Azure web role. That means the packages are configured to be stored on a local drive (say D:\NuGetPackages) on the cloud machine.
The hosting works perfectly as expected; I'm able to push packages and download from it. However, my concern is: if the VM is recycled (either manually or automatically), will my local package repository be preserved? Or is the whole VM recreated on recycling, with only the web role restored, wiping out any folders outside the scope of the web role?
I can't have this folder as part of the web role folders (the default NuGet package location is ~\packages, which would translate to the IIS folder in the web role) as it would be overwritten on every deployment. So this packages folder has to live somewhere outside. Would Azure File storage help? (I'll need to check, though, whether a NuGet package path can accept a file storage path.)
Thanks

Yes, any data on D:\ can and will at some point vanish.
Your options are either Azure Storage (including the new File service) or the older Azure CloudDrive (I don't know if that's even still around, but it wasn't a good option anyway).
I fear I don't really understand your use case, but wouldn't an Azure VM (IaaS) be a better choice here? It feels like the statelessness of a PaaS role is your main problem, so PaaS wouldn't really be a good fit.
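If you stay on the web role, persisting the package folder to blob storage is straightforward with the storage client library. A minimal sketch, assuming a placeholder container name, connection string, and the D:\NuGetPackages path from the question:

    using System.IO;
    using Microsoft.WindowsAzure.Storage;        // classic storage client library
    using Microsoft.WindowsAzure.Storage.Blob;

    // Copy every pushed .nupkg into a blob container so it survives a reimage.
    // "nuget-packages" and storageConnectionString are placeholders.
    static void BackUpPackages(string storageConnectionString)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("nuget-packages");
        container.CreateIfNotExists();

        foreach (var file in Directory.GetFiles(@"D:\NuGetPackages", "*.nupkg"))
        {
            var blob = container.GetBlockBlobReference(Path.GetFileName(file));
            using (var stream = File.OpenRead(file))
                blob.UploadFromStream(stream);   // copy the package up to durable storage
        }
    }

    // On role start, do the reverse: list the container and download the packages
    // back into D:\NuGetPackages before the NuGet server starts serving requests.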

Related

Patching website on Azure web roles

Sometimes in our website, which is deployed on Azure web roles, issues come up related to small bugs in JavaScript and HTML. We go to all instances of the web roles and fix these JS and HTML files on the machines.
But I was looking for some automated way of doing this: downloading the files to patch from some central location and replacing the files on all Azure web roles. I am using ASP.NET MVC for the website.
It is possible to redeploy the website with the patch in the package, but we don't want to wait for the long deployment time. Please let me know if this is possible via some internal web API which replaces the content on all Azure web roles.
There are two ways to deploy a new web role:
redeploy
in-place update
The first one is the slowest, meaning new VMs are booted.
With an in-place update (https://azure.microsoft.com/en-us/documentation/articles/cloud-services-update-azure-service/), the new application package is mounted on a new drive (usually F: instead of E:) and the IIS website is swapped over to the new drive.
You can try this by going to the old portal and uploading a new application package. In just a few seconds/minutes the update is done.
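One practical consequence of that drive swap: don't hard-code E:\ paths anywhere in the role. A small sketch (nothing here beyond the standard RoleRoot environment variable) that resolves the application folder on whichever drive the current package is mounted:

    // %RoleRoot% points at the drive the current package is mounted on
    // (E: on first deployment, typically F: after an in-place update).
    // It usually has no trailing slash, hence the plain concatenation.
    string roleRoot = Environment.GetEnvironmentVariable("RoleRoot");
    string appRoot  = roleRoot + @"\approot";   // the deployed role binaries live under here

    // Resolve any files the role needs relative to appRoot instead of "E:\...",
    // otherwise paths silently go stale after an in-place upgrade.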
After digging through many things on Stack Overflow, I crafted my own solution: create a topic and subscribe to it in code when the website starts. When I want to patch the web app, I send a message to the topic to start patching; each machine in the web role then gets a notification from the topic and starts patching itself. The patching itself is very easy: go to web storage, download the files from there, and replace the files in approot.
When Azure maintenance happens this patching may be wiped out, so for that situation I made the patching run at website startup too.
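Roughly, the subscription side looks something like this (the topic name, subscription naming, and the ApplyPatch() helper are placeholders rather than the exact code):

    using Microsoft.ServiceBus;
    using Microsoft.ServiceBus.Messaging;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // Each role instance gets its own subscription so every machine sees the
    // "patch now" message. serviceBusConnectionString is a placeholder.
    string topic = "patch-topic";
    // derived from the instance id; may need shortening to meet Service Bus naming rules
    string subscription = RoleEnvironment.CurrentRoleInstance.Id;

    var ns = NamespaceManager.CreateFromConnectionString(serviceBusConnectionString);
    if (!ns.SubscriptionExists(topic, subscription))
        ns.CreateSubscription(topic, subscription);

    var client = SubscriptionClient.CreateFromConnectionString(
        serviceBusConnectionString, topic, subscription);

    client.OnMessage(message =>
    {
        ApplyPatch();        // download files from storage and overwrite them in approot
        message.Complete();  // remove the message from this instance's subscription
    });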
Cloud service deployments tend to be slow since the package is basically a recipe for how to build and configure your deployment. The deployment not only puts the recipe out in Azure (so it can be used again if Azure needs to move your machine), but also follows the recipe to build out a VM for your cloud service. (Web roles/worker roles are platform as a service, so you don't have to worry about the OS and infrastructure level like you would with the Virtual Machines product, but they do still run in VMs on physical hardware.)
What you are looking to do is something that will update the recipe (your cloud service package) and your deployment after it is out and running already ... there is no simple way to do that in Cloud Services.
However, yes, you could create a startup script that pulls the site files from blob storage or some other centralized location - this is comparable to how applications (Fiddler, for example) look for updates and then know how to update and replace themselves. For that sort of feature you will likely need to run code as an elevated user - one nice thing about startup tasks is that they can run elevated, so they can do just about anything you need done on the machine (but they require you to restart the instance in order to run). Basically you would need to write some code that allows your site to update itself. This link may help: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-startup-tasks/
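As a very rough illustration of the "pull from a central location" idea (the container name and connection string are placeholders, and you would want error handling and some versioning around this):

    using System.IO;
    using System.Linq;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    // Pull patched files out of a blob container and lay them over the deployed
    // files. "site-patches" and storageConnectionString are placeholders.
    string appRoot = Environment.GetEnvironmentVariable("RoleRoot") + @"\approot";
    // (for a full-IIS web role the site content may live under sitesroot instead)

    var container = CloudStorageAccount.Parse(storageConnectionString)
        .CreateCloudBlobClient()
        .GetContainerReference("site-patches");

    foreach (var blob in container.ListBlobs(useFlatBlobListing: true).OfType<CloudBlockBlob>())
    {
        // blob names like "Scripts/site.js" map onto folders under appRoot
        string target = Path.Combine(appRoot, blob.Name.Replace('/', '\\'));
        Directory.CreateDirectory(Path.GetDirectoryName(target));
        using (var fileStream = File.Create(target))
            blob.DownloadToStream(fileStream);
    }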
If you have the ability to migrate to WebApps and WebJobs, I would recommend looking into that since that compute product solves your problem really well.
Here is a useful answer of the differences between WebApps and Cloud Services: What is the difference between an Azure Web Site and an Azure Web Role

Configure NuGet server to get packages from a specific Azure location

I need a NuGet server for our company's internal usage. I read about how to create a NuGet server, followed the steps, and published my server to Azure. I am able to upload my packages to the server through the command prompt. Everything is working fine up to now, as expected.
Now, as far as I know, the packages will be stored in Azure where my website is deployed. If the location of the cloud storage where my website is deployed changes, there is a possibility that I will lose all the packages pushed until then. So, to avoid this issue, I would like to know whether I can configure the NuGet server to look for packages in a cloud storage location which I know for sure will always be there. Can this be done?
I am new to publishing websites to Azure, so if I was wrong in some of my assumptions above, please let me know.
I think it's tricky. There was a package around for this, but it's out of date, and using a UNC share with authentication from a web app is a pain. The best option may be a VM with plenty of disk.

Worker Role third-party software

We have third-party software hosted in Azure Virtual Machines. This software hosts a service which is consumed by one of our Cloud Service Web Roles.
The issue is that the network latency between the VM and the Web Role is significantly affecting the performance of our application.
A solution would be to publish this third-party software in the same Cloud Service (in a Worker Role).
VM Role sounds like a good implementation for the above problem. Unfortunately this is a deprecated service!
One idea would be to package the relevant installation scripts and files into a Visual Studio project and configure the ServiceDefinition to set up the software accordingly. The concern here is that the installation files are over 1 GB.
Is there currently any Azure service that can support my problem? Is there a replacement to the VM Role?
Though it's a bit old, you may want to take a look at the Azure Bootstrapper on CodePlex. From the project description page:
The Windows Azure Bootstrapper is a command line tool meant to be used by your running Web and Worker roles in Windows Azure. This tool allows you to easily download resources (either public resources or ones in your blob storage), extract them if necessary, and launch them. Since you don't want to always download and run during restarts, it will also help track those dependencies and only launch an installer one time! In addition, there are some very useful features that make it a great tool to package with your roles.
Yet another idea (though I have not tried it) would be to make use of the Azure File service. You could upload the installers to an Azure File share, then mount the share in your Cloud Service VMs and use it as a drive. You should be able to install the software that way.
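Something like this from role startup code might do the mounting (the storage account name, share name, and drive letter are placeholders, and the account key should come from configuration rather than being hard-coded):

    using System.Diagnostics;

    // Mount the file share as a drive letter using the standard SMB syntax
    // for the Azure File service. All names below are placeholders.
    var mount = new ProcessStartInfo("net.exe",
        @"use Z: \\<storageaccount>.file.core.windows.net\installers " +
        @"/u:<storageaccount> <storage-account-key>")
    {
        UseShellExecute = false,
        CreateNoWindow = true
    };
    Process.Start(mount).WaitForExit();

    // The installers are now reachable under Z:\ and can be run from there,
    // e.g. msiexec /i Z:\ThirdParty.msi /quiet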
You're right that bundling 3rd-party software inside the cspkg can be problematic, size-wise.
It's common practice to download needed software from either a startup command file (.cmd) or from OnStart(). These downloads can be sourced from anywhere that you have access to: Azure blob storage, the actual vendors themselves (e.g. download from their public download link), etc. In your startup script, you'd need to handle the downloading (and potential unzipping) into a local folder, then installing as necessary.
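For instance, the OnStart() variant might look something along these lines (the URL, file name, and msiexec switches are illustrative; the vendor's installer will have its own silent-install flags, and installing software generally requires the role to run elevated):

    using System.Diagnostics;
    using System.IO;
    using System.Net;

    // Download the third-party installer at role startup and install it silently.
    // The download URL is a placeholder for the vendor's real link.
    string installer = Path.Combine(Path.GetTempPath(), "thirdparty.msi");
    using (var web = new WebClient())
        web.DownloadFile("https://vendor.example.com/downloads/thirdparty.msi", installer);

    Process.Start("msiexec.exe", "/i \"" + installer + "\" /quiet /norestart").WaitForExit();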

Windows Azure Cloud Service: recover lost source code

I have a cloud service (WCF role) published on Azure. The source code has been lost. Is there any way to download the deployment package back from Azure? Or any other way to get the DLLs back?
Perhaps. If you have RDP enabled, or at least configured, in your service definition on the role you can RDP into the instance and retrieve the DLLs that way.
If you deployed using Visual Studio then a copy of the package is in one of your storage accounts, because it uploads the package there before deploying it. Check each of your storage accounts for a vsDeploy container in your blob storage. I think a few other deployment mechanisms use this as well. If you find it you can download the cspkg file, rename it to .zip and open it up just like a zip file. Inside, for each role, you'll see a cssx file. Rename that to .zip as well and extract it. The extracted folder will show you the code that was deployed to your instance.
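If you'd rather script the unpacking than rename things by hand, something like this works when the package isn't one of the older encrypted ones (paths are placeholders):

    using System.IO;
    using System.IO.Compression;   // reference System.IO.Compression.FileSystem

    // A .cspkg is a zip archive; each role inside it is another zip (.cssx).
    ZipFile.ExtractToDirectory(@"C:\recover\MyService.cspkg", @"C:\recover\package");

    foreach (var cssx in Directory.GetFiles(@"C:\recover\package", "*.cssx"))
        ZipFile.ExtractToDirectory(cssx, cssx + ".extracted");

    // Look under the extracted folders for approot (or sitesroot) to find the
    // DLLs that were deployed.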
Regardless of how you perform your deployments, I highly recommend keeping the cspkg files you publish so that you can roll back or know what went out. I'd also recommend having RDP at least configured in your service definition, but perhaps disabled, for when you need to troubleshoot. Turning it on and off is a configuration update, though that can have its own side effects.
If all else fails and you have a Windows Azure support level of some kind above free, you can put in a ticket to see if they will retrieve the DLLs for you, I guess. I've not tried that.
Update: I didn't know about the operation to get package that Gaurav indicated. That should be your answer to retrieve your code.
Windows Azure Service Management API has an operation for that: http://msdn.microsoft.com/en-us/library/windowsazure/jj154121.aspx. I suggest you take a look at it.
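If it helps, calling any of these Service Management operations boils down to a certificate-authenticated HTTPS request. A minimal sketch (the operation URI and the required x-ms-version value come from the linked MSDN page; the certificate thumbprint is a placeholder):

    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    // Locate the management certificate that is uploaded to the subscription.
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    var cert = store.Certificates
        .Find(X509FindType.FindByThumbprint, "<management-cert-thumbprint>", false)[0];
    store.Close();

    // operationUri is the Get Package URI documented on the linked MSDN page
    // (it identifies the subscription, cloud service, slot, and the blob
    // container the package should be copied into).
    var request = (HttpWebRequest)WebRequest.Create(operationUri);
    request.Method = "POST";
    request.ContentLength = 0;
    request.Headers.Add("x-ms-version", "2012-03-01");  // check the page for the minimum version
    request.ClientCertificates.Add(cert);

    using (var response = (HttpWebResponse)request.GetResponse())
    {
        // on success the package is copied into the blob container you specified
    }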

When should I repackage my Azure compute role?

When setting up an Azure Web / Worker Role for the first time I need to 'Package' the project and upload it via the Azure portal. After doing this I can 'Publish' the application from Visual Studio.
Under which circumstances do I need to 'Package' the project again and update it via the Azure portal?
In other words - which changes require the project to be re-packaged?
Note: I need to 'Package' the project in order to upload it via the Azure portal. When I create a Compute Role in Azure, I must upload a package in order to make the Compute Role operational.
From Azure portal:
You have nothing deployed to the production environment.
UPLOAD A NEW PRODUCTION DEPLOYMENT.
The Cloud Service package contains the role definitions, configuration settings, runtime bits, and other static content bundled with your app. Visual Studio (or PowerShell) creates an encrypted package (actually a zip file that you can look into, when building for emulator) for upload to the named slot you created via the portal.
In the future, there are certain things you can do without rebuilding the package, such as changing instance count and other configuration settings. Also: if you move your static content (such as your CSS, images, etc.) to blob storage, you can then update it directly without ever needing to recreate / redeploy the package (you may need to send some type of signal to your running app to reload some resources, but that's going to be app-specific). If you have specific EXEs or MSIs that get installed as part of your startup scripts, you can move these to blob storage as well, since they can easily be downloaded as your role startup code executes (and this cuts down on package size).
If you change anything defined exclusively in the service definition file (e.g. if you add a role or change a role size), you will have to repackage/redeploy (but you can deploy as an update, which won't take your service down [assuming you have 2 or more instances] or replace your assigned IP address).
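A side note on the configuration settings point above: values in the .cscfg can be picked up by the running role without a repackage, along these lines (the setting name is a placeholder):

    using System.Linq;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // Read a value from the service configuration (.cscfg). Changing it in the
    // portal is a configuration update, not a redeploy.
    string value = RoleEnvironment.GetConfigurationSettingValue("MySetting");

    // Decide whether a config change should recycle the instance or be applied in place.
    RoleEnvironment.Changing += (sender, e) =>
    {
        e.Cancel = false;   // false = apply the change without restarting the instance
    };

    // React once the new values are live.
    RoleEnvironment.Changed += (sender, e) =>
    {
        foreach (var change in e.Changes.OfType<RoleEnvironmentConfigurationSettingChange>())
        {
            string updated = RoleEnvironment.GetConfigurationSettingValue(change.ConfigurationSettingName);
            // refresh whatever caches or clients depend on this setting
        }
    };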
I don't think you must package your project the first time. You can publish your Azure project the first time as well. I'm not sure what prevents you from publishing. Could you explain a bit more?
In fact, publishing is very similar to packaging. Visual Studio just packages the project and uploads it to Azure on your behalf.
