I'm looking to set up a CI/CD flow for the content in the developer portal, so we can make changes in our dev environment and promote them to higher environments. From what I've found, this page appears to describe the suggested way to do this:
https://learn.microsoft.com/en-us/azure/api-management/automate-portal-deployments
which links to this repo:
https://github.com/Azure/api-management-developer-portal
As far as I can tell, the majority of the code in that repository is for the developer portal itself, in case you want to self-host. I'm not interested in self-hosting; I'm only interested in the scripts contained there, since they will let me extract and publish the content between environments. Is there a repository with ONLY the code required for the scripts to run (specifically v3)? I'd prefer to avoid manually going through and deleting the non-script files if possible, as I don't really know/understand what they all are.
If such a repository existed, it would enable my ideal scenario: fork that repository, run the "capture" script, then check the extracted developer portal content into the new repository.
Well, why don't you just copy the scripts.v3 folder and use it? As you noticed, you don't need the rest of the files if you are not running the self-hosted version, so you can simply copy-paste them. Those scripts are nothing more than a client for Azure REST API endpoints, written in Node.js, and they can run completely independently from the rest of the repository.
If you don't like Node.js, you can even write your own scripts to deploy the developer portal, in the language of your choice.
The developer portal contains content types, which contain content items. One extra thing is media (fonts, images, etc.), which is stored in the APIM blob storage. Those two things determine how the developer portal looks.
So all you need to do is:
Grab all content items (using the Azure REST API) from one instance and push them to the other APIM instance.
Connect to the APIM blob storage, grab all media blobs, and push them to the other APIM instance's blob storage. You can get a SAS URL to the blob storage using the Azure REST API as well.
And if you examine them carefully, those scripts do exactly that:
capture.js - takes all files from a given APIM instance and saves them to a local folder
generate.js - takes files from your local folder and pushes them to the APIM instance of your choice
migrate.js - is just a combination of the previous two scripts: it takes files from one instance and pushes them to another
cleanup.js - is the same as the reset button in the developer portal: it brings back the default state
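To make that concrete, here's a rough sketch of the content-item half of what capture.js does (the media half is just a blob copy via the SAS URL). The endpoint paths and api-version reflect my reading of the Azure REST API reference, and the environment-variable names are placeholders of my own, so double-check everything against the actual scripts:

```js
// Minimal sketch: enumerate developer portal content types, then dump
// every content item to a local JSON file. Assumes Node 18+ (global
// fetch) and a management-plane bearer token, e.g. from
// `az account get-access-token`. All *_ID / *_NAME values are placeholders.
const fs = require("fs/promises");

const token = process.env.ARM_TOKEN; // bearer token for management.azure.com
const base =
  `https://management.azure.com/subscriptions/${process.env.SUBSCRIPTION_ID}` +
  `/resourceGroups/${process.env.RESOURCE_GROUP}` +
  `/providers/Microsoft.ApiManagement/service/${process.env.SERVICE_NAME}`;
const apiVersion = "api-version=2021-08-01";

async function get(url) {
  const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  if (!res.ok) throw new Error(`${res.status} for ${url}`);
  return res.json();
}

(async () => {
  const snapshot = {};
  const contentTypes = await get(`${base}/contentTypes?${apiVersion}`);
  for (const ct of contentTypes.value) {
    // ct.name is a content type id such as "page" or "layout"
    const items = await get(`${base}/contentTypes/${ct.name}/contentItems?${apiVersion}`);
    snapshot[ct.name] = items.value;
  }
  await fs.writeFile("portal-content.json", JSON.stringify(snapshot, null, 2));
})();
```

The real scripts.v3 additionally handle paging, the media container, and publishing, which is why copying the folder is still the easier route.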
I have created an App Service (inside a Linux-based App Service plan) and connected to the wwwroot folder using an FTP client.
In hostingstart.html, I made a simple text change.
I couldn't see the change reflected when I navigate to https://<appname>.azurewebsites.net/
However, when I navigate to Kudu and access the same page, I can see the change.
Below is the screenshot.
On the left-hand side (the Kudu website), I can see the change I made (DevOps Engineers). However, on the right-hand side, the change is not shown.
To troubleshoot, I added another file as well, which is visible in Kudu, but I'm not able to access it from the site either.
This is not an issue at all with a Windows-based App Service plan; it's an issue only with the Linux-based app.
Is there a Docker container internally that it reads the files from? If yes, how do I change those files?
Appreciate any help here.
Thanks,
Praveen
Yes, Azure App Service on Linux runs in containers, and depending on the platform, different web servers and locations are used, which can get confusing.
For SSH access into the container, go to https://<yourappname>.scm.azurewebsites.net/webssh/host. In .NET projects on Linux, the default page you see on a new App Service is stored at /defaulthome/hostingstart/wwwroot/hostingstart.html, while files you upload via FTP go to /home/site/wwwroot.
That being said, I'd recommend using the documented ways to publish content, based on the platform you want to use. For .NET, you can find the docs here.
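If you want to inspect or patch those locations without FTP, here's a hedged sketch using Kudu's VFS REST API, which sees the same /home file system the SSH session does. The app name and credential variables are placeholders; the credentials are the app's deployment credentials:

```js
// Sketch: overwrite a file under /home/site/wwwroot through the Kudu
// VFS API instead of FTP. Assumes Node 18+ (global fetch); KUDU_USER
// and KUDU_PASS are placeholder names for your deployment credentials.
const auth = "Basic " + Buffer.from(
  `${process.env.KUDU_USER}:${process.env.KUDU_PASS}`).toString("base64");

const url =
  "https://<yourappname>.scm.azurewebsites.net/api/vfs/site/wwwroot/hostingstart.html";

(async () => {
  const res = await fetch(url, {
    method: "PUT",
    headers: {
      Authorization: auth,
      "If-Match": "*", // overwrite regardless of the file's current ETag
    },
    body: "<h1>DevOps Engineers</h1>",
  });
  console.log(res.status); // expect a 2xx on success
})();
```

Note that writing there still only helps if your app actually serves from /home/site/wwwroot rather than from content baked into the container image.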
I am trying to deploy an on-prem instance of Azure DevOps Server to a VM in an Azure Government subscription (which, by its nature, does not seem to support the standard Azure DevOps service).
This template is referenced within support material directly from Microsoft:
https://github.com/usri/deploy_DevOps_Server_AzureSQL
All the referenced resources were created from scratch for the purpose of getting this server running.
This requires an AAD account with the associated password stored in a Key Vault. However, every attempt I make to run the template returns the following error on the 'Write VirtualMachines' step (when all other components pass):
The client has permission to perform action 'Microsoft.Compute/images/read' on scope '(MY_SUBSCRIPTION)\(MY_RESOURCEGROUP)\(VM)', however the current tenant '(MY_KEYVAULT)' is not authorized to access linked subscription '(ID in the template with the deployment files)'
This seems to me like the password cannot be retrieved from Key Vault. Is it a formatting issue with the secret? An access control issue somewhere? I've tried various combinations of both. Hopefully this is just a trivial issue.
I am the original author of the code in that repo. I went ahead and merged a pull request into that repo which should address your issue. I did the following:
Updated the ReadMe file to include information on creating the image
Updated the azuredeploy.json with parameters for Key Vault & image references (see the sketch after this list)
Updated the ps1 file to eliminate hard-coded Key Vault links (a particularly bad oversight on my part, my apologies).
Updated and tested everything for the latest version of Azure DevOps Server 2020
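For anyone hitting the same wall, this is the generic shape of a Key Vault reference in an azuredeploy.parameters.json file. This is the standard ARM pattern rather than the repo's exact file, every name below is a placeholder, and the vault must have "Azure Resource Manager for template deployment" enabled:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "adminPassword": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "<secret-name>"
      }
    }
  }
}
```

The key point is that the vault's resource ID must live in the same tenant as the deploying identity, which is exactly the kind of mismatch the "current tenant is not authorized" error above tends to indicate.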
This should fix your issue and several other related ones. I retested the entire deployment from scratch and it worked as designed. A couple of other quick notes:
The USRI and all of its repositories, including the one used here, are not official Microsoft repositories. They represent an open-source Azure community dedicated to regulated-entity customers. The members who contribute there are mostly Microsoft employees, and the repos themselves just represent interesting and sometimes niche templates that might be of interest.
This particular repo shows a manner in which Azure templates could be used to deploy services when no internet connection is available or permitted. I just used Azure DevOps Server because it was interesting and regulated industry customers use it.
All the best
I have an MSDN subscription which allows me to download and install various Microsoft products.
I would like to download SQL Server 2016 Developer edition from there. But for reasons that are irrelevant to this question, I would like to keep the downloaded ISO file not on the local machine, but on an Azure File share I have in my Azure account.
Right now, I first download from MSDN to my machine and then upload to Azure File storage.
That's wasteful; it would be much more efficient if I could somehow download it directly from MSDN to Azure Files.
I understand that if I had an Azure VM, that would be possible. But I do not. So, I am wondering if it is possible to do it anyway without an Azure VM.
A partial answer to your problem:
If the URL of the file you're trying to use (the ISO for SQL Server 2016 Developer edition in your case) is public (i.e. if you paste the URL into a browser's address bar, you should be able to access the file), then you don't have to download and re-upload. Azure Storage provides asynchronous server-side copy functionality, where a file can be created by copying from a publicly accessible resource.
For more about copying files, please see this link. Since you didn't mention a programming language, I referenced the REST API, but you can simply use one of the available SDKs to accomplish this.
The reason I said partial answer is that I don't know how you would obtain the file's URL programmatically.
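To make the idea concrete, here's a minimal Node.js sketch of that server-side copy using the @azure/storage-file-share SDK. The share name, file name, source URL, and connection-string variable are all placeholders, and it assumes the source really is publicly reachable:

```js
// Server-side copy into an Azure File share: the bytes move inside
// Azure, never through your machine.
const { ShareServiceClient } = require("@azure/storage-file-share");

const service = ShareServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING);

(async () => {
  const share = service.getShareClient("isos"); // placeholder share name
  const file = share.rootDirectoryClient.getFileClient("SQLServer2016-DEV.iso");
  // Kicks off an asynchronous copy; poll copyStatus until "success".
  const result = await file.startCopyFromURL("https://<public-source-url>/file.iso");
  console.log(result.copyId, (await file.getProperties()).copyStatus);
})();
```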
For a particular customer, the IT team does not approve OneDrive and Box as a source deploy repo. Box, Dropbox, etc. aren't fully supported for synchronizing with Azure Functions anyway, as the APIs operate on file events rather than folder events.
What are the alternatives that would let customer staff on each BU team work on a number of static HTML sites, deploy them easily to Azure Storage using Storage Explorer or the like, and then either have this update the web app or have the web app point to the Azure Storage file share location (whichever is easier and feasible)?
Please suggest any case studies or alternatives that I could explore for this scenario.
For a particular customer, the IT team does not approve OneDrive and Box as a source deploy repo.
A sensible bunch.
Please suggest any case studies or alternatives that I could explore for this scenario.
Why not a private repo in GitHub Business?
It even has drag and drop now (a la Dropbox).
You could deploy to Azure Storage using Storage Explorer:
http://storageexplorer.com/
Other options:
1. Write a simple C# console app that uses the Azure SDK to deploy files to your Azure storage account. (You can use FileSystemWatcher to monitor a network folder, for example; see the Node.js sketch after this list.)
2. Set up a CD infrastructure using TeamCity (or any other tool), and after the build, call a script that publishes to your Azure storage account.
3. Take a look at Azure Files: http://social.technet.microsoft.com/wiki/contents/articles/33258.azure-file-storage-on-premises-folder-sync.aspx
4. Use AzCopy: https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/ (not sure if it has a way to detect that a file already exists and is unchanged in the storage account)
5. Take a look at the Sync Framework. The page is a little outdated, but it seems there's support for Azure: https://msdn.microsoft.com/en-us/library/mt763483.aspx
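As an illustration of option 1 (suggested above in C#; this is the same watch-and-upload idea sketched in Node.js with the @azure/storage-blob package, and all names below are placeholders):

```js
// Watch a local folder and push changed files into the storage
// account's "$web" container, the one static website hosting serves
// from. Folder and connection-string names are placeholders.
const fs = require("fs");
const path = require("path");
const { BlobServiceClient } = require("@azure/storage-blob");

const container = BlobServiceClient
  .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
  .getContainerClient("$web");

const siteRoot = path.resolve("./site");

// Note: recursive fs.watch support varies by platform/Node version;
// a library like chokidar is more robust for production use.
fs.watch(siteRoot, { recursive: true }, async (_event, filename) => {
  if (!filename) return;
  const local = path.join(siteRoot, filename);
  if (!fs.existsSync(local) || fs.statSync(local).isDirectory()) return;
  // Blob name mirrors the relative path so site links keep working.
  const blob = container.getBlockBlobClient(filename.split(path.sep).join("/"));
  await blob.uploadFile(local); // uploadFile is Node.js-only in this SDK
  console.log(`uploaded ${filename}`);
});
```

Uploading into $web assumes you enable static website hosting on the storage account, which also covers the "web app points at storage" half of the question.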
I don't have the latest version of the code that's deployed in the company's Windows Azure account, and I need to provide a fix for it. I know this can be done with "Azure Web Sites", but I'm not sure it's possible with "Azure Cloud Services".
Can anyone help?
If you did a git deployment of the cloud service, you can fetch from the remote the same way you would with Windows Azure Web Sites. You may have updated the cloud service by uploading the package to blob storage first, in which case you can get the package. But the package is not source code.
From a process perspective, you should label your deployments with a tag that can be matched in source control. You never know when a "hotfix" needs to be added to a branch off of the current production code.
In Windows Azure Cloud Services, instances are uploaded in the form of .cspkg packages.
According to the documentation, the Get Package operation retrieves a cloud service package for a deployment and stores the package files in Windows Azure Blob storage.
You could then download and extract this package (it is in ZIP file format) to retrieve its content. See this answer for more details.
In the case of ASP.NET applications, that will be a mixture of text files and binary assemblies (.DLLs). In the case of Java, it will be .jar files. You could use the appropriate decompiler to retrieve an approximation of the original source code. But it probably wouldn't be safe to change this reverse-engineered source code and upload it back into production, at least not without extensive testing.
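As a concrete sketch of that retrieval step, here's how you might pull the package down with today's @azure/storage-blob SDK once Get Package has dropped it into a container. The container and blob names and the connection-string variable are placeholders:

```js
// Download the .cspkg (a ZIP archive) from blob storage so it can be
// extracted and inspected locally.
const { BlobServiceClient } = require("@azure/storage-blob");

(async () => {
  const blob = BlobServiceClient
    .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
    .getContainerClient("packages")      // placeholder container name
    .getBlobClient("production.cspkg");  // placeholder blob name
  await blob.downloadToFile("production.cspkg"); // then unzip locally
})();
```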
Yes, you can download it with an FTP client.
Go to the dashboard of your site on https://manage.windowsazure.com.
Get the credentials (username, password, host) and connect with your preferred FTP client.
Well, Azure now has a new portal and things are a bit different. I had to retrieve the code for one of my websites.
To download the code,
Go to the App Service. In the Overview panel, download the publish profile.
Now go to the Deployment credentials panel. Enter a username for FTP and choose a password.
To connect over FTP, you need the URL from the publish profile (example.PublishSettings).
Now fire up your FTP client (FileZilla in my case), enter the FTP address, set the username as sitename\ftpusername (example\ftp-example-user for me), and enter the password you chose in the Deployment credentials panel.
wwwroot contains your code!
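If you'd rather script the pull than click through FileZilla, here's a hedged sketch using the third-party basic-ftp package; the host placeholder comes from the publish profile and the credentials from the Deployment credentials panel, as described above:

```js
// Pull /site/wwwroot down to a local folder over FTPS.
const ftp = require("basic-ftp");

(async () => {
  const client = new ftp.Client();
  try {
    await client.access({
      host: "<host from the publish profile>",
      user: "sitename\\ftpusername", // placeholder, as in the steps above
      password: process.env.FTP_PASSWORD,
      secure: true, // App Service supports FTPS
    });
    await client.downloadToDir("./site-backup", "/site/wwwroot");
  } finally {
    client.close();
  }
})();
```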
I realize it's an old question, but in case anyone else needs it: I use the Cloud Explorer in Visual Studio 2017. In Cloud Explorer, you can drill down Subscription -> Resource Group -> App Service -> Files. Then, at the bottom of Cloud Explorer, click "Download Files as a Zip."