I have a cloud service (WCF role) published on Azure. The source code has been lost. Is there any way to download the deployment package back from Azure, or any other way to get the DLLs back?
Perhaps. If you have RDP enabled (or at least configured in your service definition for the role), you can RDP into the instance and retrieve the DLLs that way.
If you deployed using Visual Studio, then a copy of the package is in one of your storage accounts, because Visual Studio uploads the package there before deploying it. Check each of your storage accounts for a vsDeploy container in BLOB storage; I think a few other deployment mechanisms use this as well. If you find it, you can download the .cspkg file, rename it to .zip, and open it up just like a zip file. Inside you'll see a .cssx file for each role. Extract that and rename it to .zip as well; opening the extracted folder will show you the code that was deployed to your instance.
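For illustration, that unpacking step can be scripted; here's a minimal sketch in Python (file and folder names are placeholders):

```python
import zipfile
from pathlib import Path

# A .cspkg is just a zip archive, and each role's .cssx payload
# inside it is a zip archive as well.
package = Path("MyService.cspkg")   # placeholder file name
out_dir = Path("unpacked")

with zipfile.ZipFile(package) as pkg:
    pkg.extractall(out_dir)

# Extract each role's payload; the result contains approot\ with
# the binaries that were deployed to the instance.
for cssx in out_dir.glob("*.cssx"):
    with zipfile.ZipFile(cssx) as role:
        role.extractall(out_dir / cssx.stem)
```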
Regardless of how you perform your deployments, I highly recommend keeping the .cspkg files you publish so that you can roll back or know exactly what went out. I'd also recommend having RDP at least configured in your service definition, though perhaps disabled, for when you need to troubleshoot. Turning it on and off is a configuration update, though that can have its own side effects.

If all else fails and you have a Windows Azure support plan above the free tier, you could put in a ticket to see if they will retrieve the DLLs for you. I've not tried that myself.
Update: I didn't know about the Get Package operation that Gaurav indicated. That should be your answer for retrieving your code.
The Windows Azure Service Management API has an operation for exactly that: http://msdn.microsoft.com/en-us/library/windowsazure/jj154121.aspx. I suggest you take a look at it.
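If you want to script that call, here's a rough sketch (the URL shape and version header follow the MSDN page above; the subscription ID, service name, and certificate file names are placeholders, and you authenticate with your management certificate):

```python
import requests

SUBSCRIPTION_ID = "<subscription-id>"      # placeholder
SERVICE_NAME = "<cloud-service-name>"      # placeholder
# Blob container the package gets copied into (URL-encode in real use):
CONTAINER_URI = "https://<account>.blob.core.windows.net/retrieved-packages"

url = (
    f"https://management.core.windows.net/{SUBSCRIPTION_ID}"
    f"/services/hostedservices/{SERVICE_NAME}"
    f"/deploymentslots/production/package"
    f"?containerUri={CONTAINER_URI}&overwriteExisting=true"
)

# The Service Management API authenticates with a management certificate,
# assumed here to be exported as a local PEM cert/key pair.
resp = requests.post(
    url,
    headers={"x-ms-version": "2012-03-01"},
    cert=("management_cert.pem", "management_key.pem"),
)
resp.raise_for_status()  # the operation is asynchronous; expect 202 Accepted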
Sometimes on our website, which is deployed on Azure web roles, issues come up from small bugs in the JavaScript and HTML. We go to all instances of the web roles and fix these JS and HTML files on the machines.

But I was looking for some automated way of doing this: downloading the files to patch from some central location and replacing them on all Azure web roles. I am using ASP.NET MVC for the website.

It is possible to redeploy the website with the patch in the package, but we don't want to wait through a long deployment. Please let me know if it is possible via some internal web API that replaces the content on all Azure web roles.
There are two ways to deploy a new web role:

redeploy

in-place update

The first one is the slowest, because new VMs are booted.

With an in-place update (https://azure.microsoft.com/en-us/documentation/articles/cloud-services-update-azure-service/), the new application package is mounted on a new drive (usually F: instead of E:) and the IIS website is swapped over to the new drive.
You can try this by going to the old portal and uploading a new application package; in just a few seconds to minutes the update is done.
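You can also trigger an in-place update programmatically through the Service Management API's Upgrade Deployment operation. A rough sketch follows (the request shape is from the MSDN docs as I remember them, so verify it before relying on it; all names are placeholders):

```python
import base64
import requests

SUBSCRIPTION_ID = "<subscription-id>"      # placeholder
SERVICE_NAME = "<cloud-service-name>"      # placeholder

# The new package must already be uploaded to blob storage; the
# configuration and label are sent base64-encoded per the docs.
package_url = "https://<account>.blob.core.windows.net/packages/MyService.cspkg"
with open("ServiceConfiguration.cscfg", "rb") as f:
    config_b64 = base64.b64encode(f.read()).decode()
label_b64 = base64.b64encode(b"hotfix").decode()

body = f"""<UpgradeDeployment xmlns="http://schemas.microsoft.com/windowsazure">
  <Mode>Auto</Mode>
  <PackageUrl>{package_url}</PackageUrl>
  <Configuration>{config_b64}</Configuration>
  <Label>{label_b64}</Label>
  <Force>false</Force>
</UpgradeDeployment>"""

url = (
    f"https://management.core.windows.net/{SUBSCRIPTION_ID}"
    f"/services/hostedservices/{SERVICE_NAME}"
    f"/deploymentslots/production/?comp=upgrade"
)
resp = requests.post(
    url,
    data=body,
    headers={"x-ms-version": "2012-03-01", "Content-Type": "application/xml"},
    cert=("management_cert.pem", "management_key.pem"),  # management certificate PEM pair
)
resp.raise_for_status()
```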
After digging through many things on Stack Overflow, I crafted my own solution: the website creates a topic and subscribes to it in code when it starts. When I want to patch the web app, I send a message to the topic to start patching; each machine in the web role then gets a notification from the topic and starts patching itself. The patching itself is very easy: go to web storage, download the files from there, and replace the files in approot.

When Azure maintenance happens this patching may be undone, so for that situation I made the patching work run at website startup too.
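A minimal sketch of the listening side of this approach, using the Python Service Bus SDK purely for illustration (the original solution would have used the .NET client; the topic and subscription names are hypothetical):

```python
from azure.servicebus import ServiceBusClient

SB_CONN = "<service-bus-connection-string>"   # placeholder

def apply_patch():
    # Download the patch files from web storage and overwrite the
    # copies under approot (omitted here; see the description above).
    ...

# Each instance listens on its own subscription to a shared topic,
# so every web role instance receives the "patch now" message.
with ServiceBusClient.from_connection_string(SB_CONN) as client:
    receiver = client.get_subscription_receiver(
        topic_name="patch-notifications",        # hypothetical topic
        subscription_name="<this-instance-id>",  # one per instance
    )
    with receiver:
        for msg in receiver:
            apply_patch()
            receiver.complete_message(msg)
```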
Cloud service deployments tend to be slow because the deployment package is basically a recipe for how to build and configure your deployment. Deploying not only puts the recipe out in Azure (so it can be used again if Azure needs to move your machine), it also follows the recipe to build out a VM for your cloud service. (Web roles and worker roles are platform as a service, so you don't have to worry about the OS and infrastructure level the way you would with the Virtual Machine product, but they do still run in VMs on physical hardware.)
What you are looking to do is update the recipe (your cloud service package) and your running deployment after they are already out there, and there is no simple way to do that in Cloud Services.
However, yes, you could create a startup script that pulls the site files from blob storage or some other centralized location; this is comparable to how applications (Fiddler, for example) check for updates and then know how to update and replace themselves. For that sort of feature you will likely need to run code as an elevated user. One nice thing about startup scripts is that they can run as an elevated user, so they can do just about anything you need done on a machine (though they require you to restart the instance in order to run). Basically, you would need to write some code that allows your site to update itself. This link may help: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-startup-tasks/
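As a sketch of what that self-update code might do (the container name and local path are hypothetical, the Python blob SDK stands in for whatever client your site actually uses, and the approot location varies by role):

```python
from pathlib import Path
from azure.storage.blob import ContainerClient

STORAGE_CONN = "<storage-connection-string>"  # placeholder
SITE_ROOT = Path(r"E:\approot")               # actual drive/path varies by role

# Pull every file from a central "site-patches" container and overwrite
# the local copy, preserving the blobs' relative paths.
container = ContainerClient.from_connection_string(
    STORAGE_CONN, container_name="site-patches"
)
for blob in container.list_blobs():
    dest = SITE_ROOT / blob.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(container.download_blob(blob.name).readall())
```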
If you have the ability to migrate to Web Apps and WebJobs, I would recommend looking into that, since that compute product solves your problem really well.

Here is a useful answer on the differences between Web Apps and Cloud Services: What is the difference between an Azure Web Site and an Azure Web Role
I'm trying to host my org's NuGet repository in an Azure web role. That means the packages are configured to be stored on a local drive (say D:\NuGetPackages) on the cloud machine.

The hosting works perfectly as expected; I'm able to push packages and download from it. However, my concern is what happens if the VM is recycled (either manually or automatically): would that preserve my local repository of packages? Or is the whole VM recreated and only the web role restored, wiping out any folders outside the scope of the web role?

I can't have this folder as part of the web role folders (the default package location for NuGet is ~\packages, which would translate to the IIS folder in the web role), as it would be overwritten on every deployment. So this packages folder has to live somewhere outside. Would File storage help? (I'll need to check, though, whether a NuGet package path can accept a file storage path.)
Thanks
Yes, any data on D:\ can and will at some point vanish.
Your options are either Azure Storage (including the new File service) or the older Azure CloudDrive (I don't know if that's even still around, but it wasn't a good option anyway).
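For instance, the File service could hold the package folder. A rough sketch with the Python file-share SDK, purely for illustration (the share name and paths are placeholders):

```python
from pathlib import Path
from azure.storage.fileshare import ShareClient

STORAGE_CONN = "<storage-connection-string>"  # placeholder
LOCAL_REPO = Path(r"D:\NuGetPackages")        # the instance-local folder

# Copy each .nupkg from the instance's local disk into an Azure file
# share, which survives instance recycles (the local D:\ does not).
share = ShareClient.from_connection_string(STORAGE_CONN, share_name="nuget-packages")
for pkg in LOCAL_REPO.glob("*.nupkg"):
    share.get_file_client(pkg.name).upload_file(pkg.read_bytes())
```

The File service also speaks SMB, so depending on OS support you may be able to mount the share and point the package path straight at it.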
I fear I don't really understand your use case, but wouldn't an Azure VM (IaaS) be a better choice here? It feels like the statelessness of a PaaS role is your main problem, so PaaS wouldn't really be a good fit.
I have a project deployed on a Windows Azure cloud service. Sometimes I have to update just an image file or change some CSS styling. Is there a way I can copy only the required files to Azure using RDP, without losing the changes when the service is refreshed? I don't want to do a full deployment just to reflect a one-line change.

Any help with this would be appreciated.
Thanks
Full deployment is really the way Azure Cloud Services are engineered to work. However, you can take these content files, move them to blob storage, and reference them from there. That way they can be updated outside of the regular deployment.
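For example, a small sketch of pushing such content to a public blob container (the names are placeholders, and the Python blob SDK is used for illustration):

```python
from pathlib import Path
from azure.storage.blob import ContainerClient, ContentSettings

STORAGE_CONN = "<storage-connection-string>"  # placeholder

# Upload CSS (and similar static assets) to a public container so they
# can be swapped out without redeploying the cloud service package.
container = ContainerClient.from_connection_string(STORAGE_CONN, container_name="static")
for css in Path("Content").glob("*.css"):
    container.upload_blob(
        name=css.name,
        data=css.read_bytes(),
        overwrite=True,
        content_settings=ContentSettings(content_type="text/css"),
    )
# Pages would then reference e.g. https://<account>.blob.core.windows.net/static/site.css
```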
I have created a cloud app with a distributed package that makes Azure download the content from a web URL (Composite C1).

Now I would like to take a backup of my application, possibly make some changes to the source code, and upload it again.

How can I get a backup of the files on the cloud app?
You aren't very clear about whether you need a backup of your application code or of the files that are being downloaded (though I think you somewhat clarify this in your follow-up comment).
When you deploy a Cloud Services app, the code is packaged into the .cspkg file and sent up to be deployed. Some of the deployment tools (like Visual Studio, the PowerShell cmdlets, etc.) use a storage account to upload the package file to BLOB storage, which means you'd have a copy of the packages you've deployed. If you don't use a tool that stages the package in a storage account, I highly recommend you keep a copy of each package you deploy yourself, just in case you need to roll back to one.
Now, if you are changing code in your application, you make the change locally, test it out, and then redeploy. You have several options for this. You can delete the previous deployment and redeploy the new one (which will cause downtime for your app). Another option is an in-place upgrade, where your deployment is updated with the new package a few machines at a time (if you are running multiple instances). The other option is a VIP swap, where you load your new code into the staging slot and then swap the bits between staging and production. I'd suggest researching these deployment options on MSDN to understand them (they all have benefits and drawbacks, and some of them can't be used depending on the changes you are making to your code).
From your comment it seems that you are more interested in getting a backup of the files that are in your BLOB storage account after your application pulls them down. You have a couple of options here as well:
Manually download them using an Azure storage explorer tool like Cerebrata Cloud Storage Studio, AzureXplorer, etc. (I use the Cerebrata tool)
Write some code that pulls down the data. You can do this using any of the client libraries (.NET, PHP, etc.), the PowerShell cmdlets, or even a tool from a vendor like Red Gate, who has cmdlets designed for backup/restore; see the sketch below for the general idea.
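A minimal download-everything sketch, using the current Python blob SDK purely for illustration (this answer predates it, and the connection string is a placeholder):

```python
from pathlib import Path
from azure.storage.blob import BlobServiceClient

STORAGE_CONN = "<storage-connection-string>"  # placeholder
backup_root = Path("backup")

# Mirror every container in the storage account to a local folder,
# recreating each blob's virtual directory structure on disk.
service = BlobServiceClient.from_connection_string(STORAGE_CONN)
for container_props in service.list_containers():
    container = service.get_container_client(container_props.name)
    for blob in container.list_blobs():
        dest = backup_root / container_props.name / blob.name
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(container.download_blob(blob.name).readall())
```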
This might not be so much of a programming question..but still..
I need to get a site that is currently hosted in Azure down to a local development environment. Is there any way to do that? Any tools or such?
Thanks in advance!
Not currently. Once the cloud service deployment package has been handed over to the Azure Fabric controller, there is no way to reclaim it, even if you submit a support ticket. The closest you can get is to either upload your packages to Windows Azure blob storage first and deploy from there, or enable remote desktop and copy the files from inside the VM to an external storage account.
My suggestion would be to do one of the following:
If you have RDP enabled, you can remote in and grab the files
Otherwise, I would suggest creating a support case and having Microsoft help you get the files out: https://support.microsoft.com/oas/default.aspx?&c1=501&gprid=14928&&st=1&wfxredirect=1&sd=gn