Configure NuGet server to get packages from a specific Azure location

I need a NuGet server for our company's internal use. I read about how to create a NuGet server, followed the steps, and published my server to Azure. I am able to upload my packages to the server through the command prompt. Everything has worked as expected up to now.

Now, as far as I know, the packages are stored in Azure where my website is deployed. If the location of the cloud storage where my website is deployed changes, there is a possibility that I will lose all the packages pushed until then. To avoid this, I would like to know whether I can configure the NuGet server to look for packages in a cloud storage location that I know for sure will always be there. Can this be done?

I am new to publishing websites to Azure, so if any of my assumptions above are wrong, please let me know.

I think it's tricky. There was a package around for this, but it is out of date, and a UNC share with authentication is a pain to use from a web app. The best option may be a VM with plenty of disk.
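If you built the server with the stock NuGet.Server package, note that it reads its package folder from the packagesPath appSetting in web.config, so you can point it at any path the worker process can reach, for example an Azure Files share (subject to the authentication pain mentioned above). A minimal sketch; the UNC path is a placeholder, not something from the question:

```xml
<!-- web.config of the NuGet.Server site. NuGet.Server reads its package
     folder from the packagesPath appSetting; pointing it outside the
     deployment folder keeps packages safe across redeployments.
     The path below is a placeholder for your durable share. -->
<appSettings>
  <add key="packagesPath" value="\\<account>.file.core.windows.net\nuget\packages" />
</appSettings>
```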

Related

Dotnet-counters output from Azure App Services

I'm trying to find a way to get dotnet-counters output from Azure App Services into either the Azure portal (good enough for some troubleshooting) or some other console tool.
Does anyone know of a way to get the data, even when the app service is running across multiple app service plans? I was picturing that maybe there is an App Service extension for this (but no luck so far).
Eventually I'll want to automate this so I can feed the data into our monitoring system, but first, baby steps... I just need something I can manually eyeball to help debug issues.
Anyone have any thoughts on how to do this?
Thanks
Ken
From the official docs, we can install dotnet-counters with a CLI command or download the .exe directly.
Since an Azure web app runs in a sandboxed environment, we can't add dotnet-counters to the environment PATH, which means we can install it but can't invoke it the usual way.
So my solution is:
Download the .exe file directly.
After the download finishes, copy and paste it into wwwroot.
Then run it from the Kudu (SCM) site; the output can be viewed there and in the Azure portal.
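Not from the answer above, but as an adjacent route: if copying the exe around is awkward, the same System.Runtime counters that dotnet-counters reads can be collected in-process with the standard EventListener API and routed to whatever logging you already ship off the box. A minimal sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics.Tracing;

// In-process listener for the System.Runtime event counters (the same
// counters dotnet-counters shows). Construct one instance at startup.
public sealed class RuntimeCounterListener : EventListener
{
    protected override void OnEventSourceCreated(EventSource source)
    {
        if (source.Name == "System.Runtime")
        {
            // Ask the source to emit counter payloads once per second.
            EnableEvents(source, EventLevel.Informational, EventKeywords.All,
                new Dictionary<string, string> { ["EventCounterIntervalSec"] = "1" });
        }
    }

    protected override void OnEventWritten(EventWrittenEventArgs e)
    {
        if (e.EventName != "EventCounters" || e.Payload == null) return;
        foreach (var item in e.Payload)
        {
            if (item is IDictionary<string, object> counter &&
                counter.TryGetValue("Name", out var name))
            {
                // Mean counters carry "Mean"; incrementing ones carry "Increment".
                counter.TryGetValue("Mean", out var value);
                Console.WriteLine($"{name}: {value}"); // swap for real logging
            }
        }
    }
}
```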

Patching a website on Azure web roles

Sometimes in our website, which is deployed on Azure web roles, issues come up related to small bugs in the JavaScript and HTML. We go to all instances of the web roles and fix these JS and HTML files on the machines.
I was looking for some automated way of doing this: downloading the files to patch from some central location and replacing the files on all Azure web roles. I am using ASP.NET MVC for the website.
It is possible to redeploy the website with the patch in the package, but we don't want to wait through the long deployment time. Please let me know if this is possible via some internal web API which replaces the content on all Azure web roles.
There are two ways to deploy a new web role:
redeploy
in-place update
The first one is the slowest: new VMs are booted.
With an in-place update (https://azure.microsoft.com/en-us/documentation/articles/cloud-services-update-azure-service/), the new application package is mounted on a new drive (usually F: instead of E:) and the IIS website is swapped over to the new drive.
You can try this by going to the old portal and uploading a new application package. In just a few seconds/minutes the update is done.
After digging through many things on Stack Overflow, I crafted my own solution: create a topic, and subscribe to that topic in code when the website starts. When I want to patch the web app, I send a message to the topic to start patching; each machine in the web role then gets a notification from the topic and starts patching itself. The patching itself is very easy: go to web storage, download the files from there, and replace the files in the approot. (A sketch of the idea follows.)
When Azure maintenance happens, this patching may be undone, so to cover that case I also made the patching run at website startup.
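A minimal sketch of that subscribe-and-patch loop, using the current Azure.Messaging.ServiceBus package (the web-role era would have used the older Service Bus SDK, but the shape is the same). The connection string, topic name, and per-instance subscription are placeholder assumptions, not details from the answer:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class PatchListener
{
    // Call once at website startup.
    public static async Task StartAsync(string connectionString)
    {
        var client = new ServiceBusClient(connectionString);

        // One subscription per instance (created beforehand, e.g. with
        // ServiceBusAdministrationClient) so every machine sees the message.
        var processor = client.CreateProcessor(
            topicName: "patch-topic",
            subscriptionName: Environment.MachineName,
            options: new ServiceBusProcessorOptions { AutoCompleteMessages = false });

        processor.ProcessMessageAsync += async args =>
        {
            await DownloadAndReplaceFilesAsync(); // pull patched files from storage
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args =>
        {
            Console.Error.WriteLine(args.Exception);
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
    }

    static Task DownloadAndReplaceFilesAsync()
    {
        // Download the JS/HTML files from a central blob container and
        // overwrite the copies under the approot, as described above.
        return Task.CompletedTask; // placeholder
    }
}
```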
Cloud service deployments tend to be slow because the package is basically a recipe for how to build and configure your deployment. A deployment not only puts the recipe out in Azure (so it can be used again if Azure needs to move your machine), but also follows the recipe to build out a VM for your cloud service. (Web roles and worker roles are platform-as-a-service, so you don't have to worry about the OS and infrastructure level the way you would with the Virtual Machines product, but they do still run in VMs on physical hardware.)
What you are looking to do is update the recipe (your cloud service package) and your deployment after it is already out and running ... there is no simple way to do that in Cloud Services.
However, yes, you could create a startup script that pulls the site files from blob storage or some other centralized location. This compares to how applications (Fiddler, for example) look for updates and then know how to update and replace themselves. For that sort of feature you will likely need to run code as an elevated user. One nice thing about startup tasks is that they can run as an elevated user, so they can do just about anything you need done on a machine (but they require you to restart the instance in order to run). Basically, you would need to write some code that allows your site to update itself. This link may help: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-startup-tasks/
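For reference, a startup task is declared in the role's ServiceDefinition.csdef. A minimal sketch, where the script name is a placeholder:

```xml
<!-- Inside the <WebRole> element of ServiceDefinition.csdef: run a script
     with elevated rights every time the instance starts. -->
<Startup>
  <Task commandLine="UpdateSite.cmd" executionContext="elevated" taskType="simple" />
</Startup>
```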
If you have the ability to migrate to Web Apps and WebJobs, I would recommend looking into that, since that compute product solves your problem really well.
Here is a useful answer on the differences between Web Apps and Cloud Services: What is the difference between an Azure Web Site and an Azure Web Role

Azure web role recycle - would local data be lost?

I am trying to host my org's NuGet repository in an Azure web role. That means the packages are configured to be stored on a local drive (say d:\NuGetPackages) on the machine in the cloud.
The hosting works perfectly as expected; I am able to push packages and download from it. However, my concern is: if the VM is recycled (either manually or automatically), will my local repository of packages be preserved? Or is the whole VM recreated on recycling, with only the web role restored, wiping out any folders outside the scope of the web role?
I can't have this folder be part of the web role folders (the default package location for NuGet is ~\packages, which would translate to the IIS folder in the web role), as it would be overwritten on every deployment. So this packages folder has to live somewhere outside. Would File storage help? (I'll need to check, though, whether a NuGet package path can accept a file storage path.)
Thanks
Yes, any data on D:\ can and will at some point vanish.
Your options are either Azure Storage (including their new File service) or the older Azure CloudDrive (I don't know whether that's even still around, but it wasn't a good option anyway).
I fear I don't really understand your use case, but wouldn't an Azure VM (IaaS) be a better choice here? It feels like the statelessness of a PaaS role is your main problem, so PaaS wouldn't really be a good fit.
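A minimal sketch of the Azure Files option with the classic WindowsAzure.Storage client, copying a pushed package off the local disk to a share that outlives the instance. The connection string, share name, and file names are placeholders:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class PersistPackages
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>");
        var share = account.CreateCloudFileClient()
                           .GetShareReference("nugetpackages");
        share.CreateIfNotExists();

        // Copy the package the role just received into the durable share.
        var file = share.GetRootDirectoryReference()
                        .GetFileReference("MyPackage.1.0.0.nupkg");
        using (var stream = File.OpenRead(@"d:\NuGetPackages\MyPackage.1.0.0.nupkg"))
        {
            file.UploadFromStream(stream);
        }
    }
}
```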

Windows Azure Cloud Service: recover lost source code

I have a cloud service (WCF role) published on Azure. The source code has been lost. Is there any way to download the deployment package back from Azure? Or any other way to get the DLLs back?
Perhaps. If you have RDP enabled, or at least configured, in your service definition for the role, you can RDP into the instance and retrieve the DLLs that way.
If you deployed using Visual Studio, then a copy of the package is in one of your storage accounts, because VS uploads the package there before deploying it. Check each of your storage accounts for a vsDeploy container in your blob storage. I think a few other deployment mechanisms use this as well. If you find it, you can download the .cspkg file, rename it to .zip, and open it up just like a zip file. Inside you'll see a .cssx file for each role. Extract that and rename it to .zip as well. Opening the resulting folder will show you the code that was deployed to your instance.
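Those rename-and-extract steps can be scripted, since both the .cspkg and the .cssx files inside it are plain zip archives. A small sketch; the file names are placeholders:

```csharp
using System.IO;
using System.IO.Compression;

class UnpackCspkg
{
    static void Main()
    {
        // The .cspkg is a zip archive; no need to rename it first.
        ZipFile.ExtractToDirectory("MyService.cspkg", "cspkg-contents");

        // Each role appears as a .cssx file, which is also a zip archive.
        foreach (var cssx in Directory.GetFiles("cspkg-contents", "*.cssx"))
        {
            ZipFile.ExtractToDirectory(
                cssx, Path.Combine("roles", Path.GetFileNameWithoutExtension(cssx)));
        }
    }
}
```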
Regardless of how you perform your deployments, I highly recommend keeping the .cspkg files you publish so that you can roll back or know exactly what went out. I'd also recommend having RDP at least configured in your service definition, but perhaps disabled, for when you need to troubleshoot. Turning it on and off is a configuration update, though even that can have its own side effects.
If all else fails and you have a Windows Azure support level of some kind above free, you can put in a ticket to see if they will retrieve the DLLs for you, I guess. I've not tried that.
Update: I didn't know about the Get Package operation that Gaurav indicated. That should be your answer for retrieving your code.
The Windows Azure Service Management API has an operation for exactly that: http://msdn.microsoft.com/en-us/library/windowsazure/jj154121.aspx. I suggest you take a look at it.
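From the linked page, Get Package asks Azure to copy the deployed .cspkg into a blob container you own. A hedged sketch of the call; the subscription ID, service name, container URI, API version, and certificate are all placeholders, and the exact URI and parameters should be verified against the documentation before relying on this:

```csharp
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;

class GetPackageSketch
{
    static async Task Main()
    {
        var handler = new HttpClientHandler();
        // Classic Service Management auth uses a management certificate.
        handler.ClientCertificates.Add(
            new X509Certificate2("management-cert.pfx", "<password>"));

        using var http = new HttpClient(handler);
        http.DefaultRequestHeaders.Add("x-ms-version", "2012-03-01");

        var url = "https://management.core.windows.net/<subscription-id>" +
                  "/services/hostedservices/<service-name>/deploymentslots/production" +
                  "/package?containerUri=<container-uri>&overwriteExisting=true";

        // Get Package is a POST with an empty body; Azure then copies the
        // package into the container you specified.
        var response = await http.PostAsync(url, content: null);
        Console.WriteLine(response.StatusCode);
    }
}
```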

Azure: download the site to a local development environment?

This might not be so much of a programming question... but still:
I need to get a site that is currently hosted in Azure down to a local development environment. Is there any way to do that? Any tools or such?
Thanks in advance!
Not currently. Once the cloud service deployment package has been handed over to the Azure Fabric controller, there is no way to reclaim it, even if you submit a support ticket. The closest you can get to this is either to upload packages to Windows Azure Blob Storage first and then deploy from there, or to enable remote desktop and copy the files from inside the VM to an external storage account.
My suggestion would be to do one of the following:
If you have RDP enabled, you can remote in and grab the files.
Otherwise, I would suggest creating a support case and having Microsoft help you get the files out: https://support.microsoft.com/oas/default.aspx?&c1=501&gprid=14928&&st=1&wfxredirect=1&sd=gn
