For an Azure cloud service, there are typically two configuration files:
ServiceConfiguration.Cloud.cscfg
ServiceConfiguration.Local.cscfg
When I press F5 in Visual Studio to run in the local development environment, the Local one is used.
When I publish to Azure from Visual Studio, the Cloud one is used.
Now I am building my cloud service with Team Foundation Build. Where can I configure which cscfg file the build engine should use?
Update:
With the publish wizard in VS2013, I can choose the config file to use. But when I rebuild the cloud project, it always uses the *.Local.cscfg file to generate the final ServiceConfiguration.cscfg.
How can I make my local build use the .Cloud.cscfg file? My purpose is to configure the TFS build process to use the right cscfg file.
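For reference, with the Azure SDK build targets the MSBuild property TargetProfile normally selects which ServiceConfiguration.<profile>.cscfg gets packaged, so it can be passed in the TFS build definition's MSBuild arguments. A minimal sketch (the project name is illustrative):

# Package the cloud service using ServiceConfiguration.Cloud.cscfg
msbuild MyCloudService.ccproj /t:Publish /p:Configuration=Release /p:TargetProfile=Cloud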
When I deploy my Azure Function project to my Function App based on the v2 runtime, the binding extensions my project depends on (Azure Storage in my case) are not automatically installed.
I deploy my project with an extensions.csproj file in the root, but after deploying I have to manually run the following command to create the bin and obj folders under wwwroot.
dotnet build extensions.csproj -o bin --no-incremental --packages D:\home\.nuget
If I understand correctly, this should happen automatically.
For deployment via the CLI (func azure functionapp publish):
Functions Core Tools uses zip deployment to deploy functions, and in this mode Kudu doesn't build the project by default. To enable the feature, set SCM_DO_BUILD_DURING_DEPLOYMENT to true in the Application settings on the portal.
As for why the default is false: zip deployment usually expects the deployed content to include all related files, hence there's no need to build again.
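If you'd rather set it from the command line than the portal, a hedged example using the Azure CLI (the app and resource group names are placeholders):

# Enable Kudu's build-on-deploy for zip deployments
az functionapp config appsettings set --name MyFunctionApp --resource-group MyResourceGroup --settings SCM_DO_BUILD_DURING_DEPLOYMENT=true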
For Azure Functions Core Tools, we usually use the command func extensions install to register extensions for input/output bindings when they are not installed automatically, e.g. when we create a trigger from a template. This is why func start and func azure functionapp publish don't build extensions.csproj; extensions are supposed to be installed before we run or publish functions.
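For illustration, a typical sequence before running or publishing might look like this (the package version is only an example):

# Install all extensions referenced by the project's function.json bindings
func extensions install
# Or install a specific binding extension explicitly
func extensions install --package Microsoft.Azure.WebJobs.Extensions.Storage --version 3.0.10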
Update for DevOps deployment
With an Azure pipeline, we need to build extensions.csproj before archiving the files. Add a .NET Core build task with arguments -o bin.
If you want Kudu to build the project, go to Deployment Center under Platform features. Choose VSTS as the CI repository and Kudu will build and deploy the project for you.
When using Visual Studio, you'll be referencing the extension packages directly from your project in order to use their attributes and other types, so Visual Studio handles the installation process, but registration still needs to be performed.
This is handled by a custom build task, added by the Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator NuGet package, which must be explicitly referenced (this will be automatically brought in by the SDK/Visual Studio tools in a future update).
These are the steps you must follow to use the Storage extension from the question above:
1. Add a reference to the Microsoft.Azure.WebJobs.Extensions.Storage NuGet package
2. Add a reference to the Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator NuGet package
3. Build the project
For more details, you could refer to this article.
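As a rough command-line sketch of those three steps (versions omitted, so NuGet picks the latest):

# Step 1: reference the binding extension
dotnet add package Microsoft.Azure.WebJobs.Extensions.Storage
# Step 2: reference the generator that registers extensions at build time
dotnet add package Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator
# Step 3: build the project so the extension metadata gets generated
dotnet build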
I'm setting up a test pipeline using VSTS, and now that I have my builds working using an Azure VM as a build agent, I want to also deploy to that machine.
I've tried using the Windows File Copy and IIS deploy tasks, but I understand this isn't a very good solution for security reasons. So what would be the best task for getting the build/release agent on the machine to copy the artifacts to the Azure-based VM and deploy them locally to its IIS install?
I'd strongly suggest that you not deploy your application to your build agent. Doing so will make it extremely difficult to find issues due to missing DLLs or files, because the build server has everything. I suggest either creating another VM to deploy to or leveraging Azure's PaaS for web applications.
With all of that said, because you are working locally on the same VM, you can simply leverage the Copy Files task to move the files to where they need to be. To "deploy" the application, you can simply copy the output of the website to the IIS directory.
Another option would be to create a PowerShell script that would set up, configure, and deploy the application to the local machine, in which case you could simply leverage the PowerShell task.
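For example, a minimal sketch of such a script (the app pool name, site path, and artifact layout are illustrative, and it assumes the agent account has rights to manage IIS):

# Minimal local IIS deployment sketch
Import-Module WebAdministration
Stop-WebAppPool -Name "MyAppPool"
# Copy the build output over the site's physical directory
Copy-Item -Path "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\web\*" -Destination "C:\inetpub\wwwroot\MySite" -Recurse -Force
Start-WebAppPool -Name "MyAppPool"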
The source (the Get sources section in the build definition) is downloaded to the build agent automatically during the build, so you don't need to copy the files to that machine through the Windows File Copy task. The simple workflow is:
Add a NuGet task to restore packages
Add a Visual Studio Build task (MSBuild Arguments: /p:SkipInvalidConfigurations=true /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageLocation="$(build.artifactstagingdirectory)\\web.zip" /P:PackageTempRootDir="")
Add a WinRM-IIS Web App Deployment task (Web Deploy package: $(Build.ArtifactStagingDirectory)\web.zip)
As virusstorm said, you can copy files to another path on that machine through the Copy Files task.
On the other hand, by default the artifact is downloaded to the target machine if you are using a release, and you can consider deployment groups if the deployment machine is different from the build agent machine.
I have an existing website that I would like to deploy on Azure, using Visual Studio Team Services. The website is made up of static files, there's no ASP.NET or anything else involved.
Within Visual Studio Team Services, I created a build which executes npm install and a gulp build. This results in a dist folder containing all the files for the website. In Azure, everything is set up correctly (subscription, web app,...).
However, I'm unsure on how to push my code to Azure. Exploring the options in the Release tab in VSTS, an 'artifact' always seems to be required, but I just have a bunch of files. I need to publish the files in the dist folder and make sure index.html is served.
How can I do that?
This question is related to this one; however, the answers all start from Azure and do not mention how to deploy existing code using Visual Studio Team Services.
The trick is to create the artifact yourself, which can be as simple as a zip file containing the static website files. The zip file should be copied as an artifact to the $(build.artifactstagingdirectory) target directory. Next, you can use a simple Web App deployment task to publish the zip file to Azure. If index.html is in the root directory, Azure is smart enough to serve it.
Below is a working build and deploy flow. It assumes gulp is used to build the website and write the build output (i.e. the static files) to a dist folder.
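For the packaging step, a hedged PowerShell example of zipping the gulp output into the staging directory (paths are illustrative):

# Zip the static files from dist into the artifact staging directory
Compress-Archive -Path "dist\*" -DestinationPath "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\site.zip" -Force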
The easiest way is to deploy from a source control, if you take a look under "Settings" for your Website in the Azure portal you will probably see "Continuous deployment".
From there you can deploy from Visual Studio Team Services, Github, etc.
Every check-in will be deployed, including broken ones, so you may want to introduce a staging environment as a deployment slot as well, where you can swap staging with production whenever you feel your site is ready for production.
Without the need to create an artifact, another solution could be FTP deployment after creating a Service Endpoint in VSTS.
I'm currently using the 'Add Azure WebJob' functionality, part of the Azure SDK in Visual Studio, to add a WebJob to a Web App that gets deployed to Azure. I use a custom .proj file for my build, and when I build using msbuild on the command line, the package that gets generated correctly includes the WebJob. However, when I use TFS to build the custom .proj file, WITH EXACTLY THE SAME COMMAND, the WebJob doesn't get included. For whatever reason, the publish targets for the WebJob in the targets file provided by the Microsoft.Web.WebJobs.Publish NuGet package simply don't get executed. Has anybody run across anything like this before?
We have built a small MVC4 application using Azure Cloud Services. It has been deployed through Visual Studio. Now we are going to add a test environment where the application should be tested before being deployed into production.
I would like our CI server to build, test, and create a deployable package. This package could then be deployed to any environment, provided the correct configuration.
But I have not found a convenient way to do this. It is easy to build a package for a specific environment, with configuration transformations for .config and .cscfg files.
Is having the CI server to build a separate package for each environment the way to go, or have I missed something?
There are descriptions of how the web.config could be modified when the WebRole is starting, but this feels a bit hacky, and not the way the guys at Microsoft intended Cloud Services to be used.
Using the CI server to deploy the specific configuration has been the easiest in my experience. I think the Visual Studio "Build" section in Team Explorer is what you're looking for. We use Team Foundation Service as our continuous integration and continuous delivery server. In Visual Studio we've created Production and Testing build configurations. In the Build tab we've created a Continuous Integration build, which kicks off unit tests on every check-in, and a Continuous Delivery build, which deploys newly tested checked-in code on a regular schedule. These build events can be set to use a specific (Production/Testing) build configuration.
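As an illustration of the per-environment approach, the CI build could simply package once per profile, where TargetProfile picks the matching ServiceConfiguration.<profile>.cscfg (project and profile names are examples):

# Produce one package per environment; TargetProfile selects the .cscfg
msbuild MyCloudService.ccproj /t:Publish /p:Configuration=Release /p:TargetProfile=Testing
msbuild MyCloudService.ccproj /t:Publish /p:Configuration=Release /p:TargetProfile=Production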