I have a project deployed as a Windows Azure cloud service. Sometimes I only need to update an image file or change some CSS styling. Is there a way to copy just the required files to Azure using RDP, without losing the changes when the service is refreshed? I don't want to do a full deployment just to reflect a one-line change.
Any help would be appreciated.
Thanks
Full deployment is really the way Azure Cloud Services are engineered to work. However, you could move these content files to blob storage and reference them from there. That way they can be updated independently of the deployment.
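To illustrate the blob-storage approach, here is a minimal sketch of a helper that maps a site-relative static path to its public blob URL, so pages can reference assets hosted in a container instead of files packaged with the deployment. The account and container names are placeholders, not values from the question:

```python
# Hypothetical helper: map a site-relative static path to a blob-storage URL.
# "mystorageacct" and "static" are placeholder names -- substitute your own
# storage account and container.

BLOB_ACCOUNT = "mystorageacct"
BLOB_CONTAINER = "static"

def blob_url(relative_path: str) -> str:
    """Return the public blob URL for a static asset such as 'css/site.css'."""
    path = relative_path.lstrip("/")
    return f"https://{BLOB_ACCOUNT}.blob.core.windows.net/{BLOB_CONTAINER}/{path}"
```

Updating an image or stylesheet then becomes a blob upload (for example with a storage explorer tool) rather than a redeployment.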
I have a website hosted as an Azure web app. It's an ASP.NET website in a VS solution. One folder of that website contains my product's documentation, all of it static resources (HTML and images). These static resources are located in a folder in another VS solution (the actual product's solution). Both solutions are TFS-based in VSO.
As of now, I have a WebJob running in the context of the website that basically does a "tfs get" on the documentation folder and places its contents into the documentation folder on the website. This works, but the VMs the website runs on change quite frequently, and the mechanism for creating a workspace is bound to the machine, not to the disk drive. Thus I cannot get only the changes; I must always get all the content, which currently takes about 20 minutes and creates unnecessary load on the website. (This is why I only run this WebJob once a week.)
Now I'm looking for a better way to do this. I would like to get only the files that have changed, making this a lot faster and less CPU/disk intensive.
I did not find a way to create a workspace on the web server that doesn't vanish each time the web server's VM changes. (If it were possible to somehow attach the workspace to the drive instead of to the machine name, that would solve the problem.)
I was also looking at the continuous build definition I have running for the product's solution. As part of that, I think it's possible to create a deployment where the documentation folder is copied to the app service's documentation folder. This way I could get rid of the "special" WebJob, but I'd still copy all the docs files each time. (Also, the build agent for that runs on-premises, so I'd have to copy those files from on-premises up to the cloud when they're actually already there inside VSO.) So basically, I don't think this option is of much use for my case.
Obviously, if I moved the static docs resources from the product's solution to the website's solution, I could simply use the automatic deployment that is available for website projects from VSO to an Azure web app. Unfortunately, for various other reasons (one of which being that the static resources are partially generated automatically from the .cs sources in the product's solution), I simply cannot move the docs folder from the product's solution to the website's solution.
So does anybody have a suggestion for a method by which I could update the documentation folder in the web app based on changes in the corresponding VSO folder?
You can upload the updated files to the Azure app service by using the Kudu API.
Simple steps:
Create a Continuous Integration build
Check the Allow Scripts to Access OAuth Token option in the Options tab
Add a PowerShell step/task to check for the changes with the REST API. (Refer to Calling VSTS APIs with PowerShell.)
Add an Azure PowerShell step/task to upload files to the app service by using the Kudu API. (Refer to Remove files and folders on Azure before a new deploy from VSTS.)
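The upload step in the list above can be sketched against Kudu's VFS endpoint, which accepts a `PUT` per file under `site/wwwroot`. This is a minimal illustration, not the exact script from the referenced articles; the site name and deployment credentials are placeholders you would supply yourself:

```python
import base64
import urllib.request

SCM_SITE = "mywebapp"  # placeholder: the app's Kudu (scm) site name

def kudu_vfs_url(path: str) -> str:
    """Build the Kudu VFS URL for a file under site/wwwroot."""
    return f"https://{SCM_SITE}.scm.azurewebsites.net/api/vfs/site/wwwroot/{path.lstrip('/')}"

def upload_file(local_path: str, remote_path: str, user: str, password: str) -> int:
    """PUT a single file to the app service via the Kudu VFS API."""
    with open(local_path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(kudu_vfs_url(remote_path), data=body, method="PUT")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("If-Match", "*")  # overwrite regardless of the file's current ETag
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Because each changed file maps to one `PUT`, only the files your REST query identified as changed need to be transferred.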
Here is what I ended up doing:
Created a ServiceHook in VSO that is wired to "Web Hooks". The hook is called upon each check-in and filtered based on the directory I want. (All of this can be done using existing functionality in VSO.)
The hook calls an Azure function app (which is easy to do, because function apps have an "HttpTrigger" mechanism that fits in nicely here).
The hook passes the ID of the check-in's changeset to the function app.
The function app puts that ID into an Azure (Storage) queue.
This triggers an Azure WebJob which listens on that queue. The WebJob uses the changeset ID to get the changes from VSO and acts on the change type for each change (e.g. downloads or deletes a file).
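The last step, acting on the change type of each item in the changeset, can be sketched as a small dispatch function. This is a hypothetical simplification (the change-type names mirror common TFS version-control change types, and the actual WebJob would perform the download/delete against VSO and the docs folder):

```python
# Hypothetical sketch of the WebJob's per-change dispatch: given the changes
# in a changeset as (change_type, server_path) pairs, decide what to do with
# the corresponding file in the web app's documentation folder.

def plan_actions(changes):
    """Return a list of (action, path) tuples: 'download' or 'delete'."""
    actions = []
    for change_type, path in changes:
        if change_type in ("add", "edit", "rename"):
            actions.append(("download", path))
        elif change_type == "delete":
            actions.append(("delete", path))
    return actions
```

Keeping the decision logic separate from the VSO and file-system calls makes it easy to test which files a given changeset would touch.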
I am using VS Team Services to build and deploy my ASP.NET MVC application to Azure. While the application is running, users can upload files to a directory.
The problem is that when I run a new build and deploy task in Team Services it overwrites or deletes the files that were uploaded by my users.
It seems like Team Services erases the target location before it does the deployment.
Is there a way in Team Services to tell it not to delete a specific directory when it is deploying an update to the application?
If I can't do that, then is there a way to automate copying the original files out before the deployment and writing them back afterwards?
I've been searching Google most of the day and can't seem to find an answer.
Thanks
Tony
I believe there is a better approach. Your user data should be saved to Azure blob storage, not uploaded to the web deployment location.
Here is a good getting started tutorial on using Azure storage: https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-blobs
It is not recommended to store user files in the same directory as the web site. Even if you simply keep them in file storage on your VM, they should be in their own location.
There is a Remove Additional Files at Destination option under Additional Deployment Options in the Azure App Service Deploy step/task.
If you check this option, the deployment will delete additional files on the Azure RM web app; otherwise it won't delete them.
I have a cloud service (WCF role) published on Azure. The source code has been lost. Is there any way to download the deployment package back from Azure? Or any other way to get the DLLs back?
Perhaps. If you have RDP enabled, or at least configured, in your service definition on the role you can RDP into the instance and retrieve the DLLs that way.
If you deployed using Visual Studio, then a copy of the package is in one of your storage accounts, because Visual Studio uploads the package there before deploying it. Check each of your storage accounts for a vsDeploy container in your BLOB storage. I think a few other deployment mechanisms use this as well. If you find it, you can download the cspkg file, rename it to .zip and open it up just like a zip file. Inside, for each role, you'll see a cssx file. Extract that and rename it to .zip as well. Opening the extracted folder will show you the code that was deployed to your instance.
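Since both cspkg and cssx files are plain zip archives, the rename step is only for convenience; a zip library can open them directly. A minimal sketch:

```python
import zipfile

def list_package(path):
    """List the entries in a .cspkg (or .cssx) file -- both are zip archives."""
    with zipfile.ZipFile(path) as z:
        return z.namelist()

def extract_package(path, dest):
    """Extract a .cspkg/.cssx archive without renaming it to .zip first."""
    with zipfile.ZipFile(path) as z:
        z.extractall(dest)
```

You would run `extract_package` once on the downloaded cspkg, then again on each role's cssx file it contains.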
Regardless of how you perform your deployments, I highly recommend keeping the cspkg files you publish so that you can roll back or know what went out. I'd also recommend having RDP at least configured in your service definition, but perhaps disabled, for when you need to troubleshoot. Turning it on and off is a configuration update, though that can have its own side effects.
If all else fails and you have a Windows Azure Support level of some kind above free you can put in a ticket to see if they will retrieve the DLLs for you I guess. I've not tried that.
Update: I didn't know about the operation to get package that Gaurav indicated. That should be your answer to retrieve your code.
Windows Azure Service Management API has an operation for that: http://msdn.microsoft.com/en-us/library/windowsazure/jj154121.aspx. I suggest you take a look at it.
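For orientation, the Get Package operation is a POST against the management endpoint that copies the package into a blob container you specify. The sketch below only builds the request URL; actually calling it also requires a management certificate and an `x-ms-version` header, and all the identifier values shown are placeholders:

```python
MANAGEMENT_BASE = "https://management.core.windows.net"

def get_package_url(subscription_id, service_name, slot, container_uri, overwrite=True):
    """Build the Get Package request URL for a cloud service deployment slot.
    The request itself must be a POST authenticated with a management
    certificate; containerUri names the blob container to copy the package to."""
    return (f"{MANAGEMENT_BASE}/{subscription_id}/services/hostedservices/"
            f"{service_name}/deploymentslots/{slot}/package"
            f"?containerUri={container_uri}&overwriteExisting={str(overwrite).lower()}")
```

The copied cspkg then lands in your own storage account, from which you can download and unpack it.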
I have created a cloud app with a deployment package that makes Azure download the content from a web URL (Composite C1).
Now I would like to take a backup of my application, possibly make some changes to the source code, and upload it again.
How can I get a backup of the files on the cloud app?
You aren't very clear about whether you need a backup of your application code or of the files that are being downloaded (though I think you somewhat clarify this in your follow-up comment).
When you deploy a Cloud Services app the code is packaged into the cspkg file and sent up to be deployed. Some of the deployment tools (like Visual Studio, the PowerShell Cmdlets, etc.) will use a storage account to upload the package file to BLOB storage. This means that you'd have a copy of the packages you've deployed. If you don't use a tool that uses the storage account to deploy from I highly recommend you also keep a copy of the packages you deploy just in case you need to roll back to one.
Now, if you are changing code in your application, you make the change locally, test it out and then redeploy. You have several options for this. You can delete the previous deployment and redeploy the new one (which will cause downtime for your app). Another option is an in-place upgrade, where your deployment is updated with the new package a few machines at a time (if you are running multiple instances). The third option is a VIP swap, where you load your new code into the staging slot and then swap the bits between staging and production. I'd suggest researching these deployment options on MSDN to understand them (they all have benefits and drawbacks, and some of them can't be used depending on the changes you are making to your code).
In your comment it seems that you are more interested in getting a backup of the files that are in your BLOB storage account after your application pulls them down. You have a couple of options here as well:
Manually download them using an Azure storage explorer tool like Cerebrata Cloud Storage Studio, AzureXplorer, etc. (I use the Cerebrata tool)
Create some code that pulls down the data. You can do this using any of the Client libraries (.NET, PHP, etc.), the PowerShell cmdlets, or even use a tool by a vendor like Red Gate who has Cmdlets designed for backup/restore.
I am trying to deploy a file for download to Azure together with my web application.
The file is part of my project, in a folder in my VS2010 solution. In the file properties, I tried both setting the Build Action to "Content" and setting "Copy to Output Directory" to "Copy always". After deployment, the file is not there; I get a 404. Any ideas?
Regards
If your goal is to provide a file for download, you may want to look at hosting that file in a storage account in BLOB storage. You could then leverage the CDN to provide better performance to your end users in their download experience.
As for troubleshooting your existing problem, here are some ideas...
You probably only need to be setting the build action to "Content". That should do the trick for making sure that your file is being included when you publish your web application.
One thing you can do to make sure is to right click your web application project in Solution Explorer and publish it to the file system. Then you can inspect it to make sure your files are staged appropriately.
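The publish-to-file-system check above can even be scripted. This is a hypothetical helper (not part of the answer's original suggestion) that reports which expected content files are missing from the publish output folder:

```python
import os

def missing_content_files(publish_dir, expected_files):
    """Return the expected content files (relative paths) that did not
    make it into the publish output directory."""
    return [f for f in expected_files
            if not os.path.isfile(os.path.join(publish_dir, f))]
```

If the file shows up in the local publish output but not on Azure, the problem is in the deployment step rather than the project settings.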
You could even go to the extent of using the new Remote Desktop capabilities in the 1.3 SDK to remote into your Web Role instance in the cloud and inspect how your package was deployed.