We currently have our source code in Bitbucket, which in turn deploys the development code to Azure.
We have a local Jenkins instance that we want to use to perform some tests against the Azure instance.
Is there a way to monitor Azure and start running the tests once the deploy has finished?
If I just poll the SCM, I have no guarantee that the app is deployed; I will only know that someone checked in some code.
Thanks in advance.
Regards
This article helped me fix the problem: https://microsoft.github.io/techcasestudies/azure%20app%20service/devops/2016/12/20/Arena.html
The key is to use their code example to create the PostDeploymentActions folder and copy any scripts to that folder after the deployment. They will get executed automatically if you are using Kudu.
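For example, a small script dropped into that folder can kick off the Jenkins job through Jenkins' "trigger builds remotely" endpoint. A minimal sketch in PowerShell, assuming Kudu picks up .ps1 files there and that a job named azure-smoke-tests exists with a trigger token configured (all names and URLs below are placeholders):

    # PostDeploymentActions\trigger-jenkins.ps1 - runs automatically after each
    # Kudu deployment. The Jenkins URL, job name and token are hypothetical.
    $jenkins = "https://jenkins.example.com"
    $job     = "azure-smoke-tests"
    $token   = $env:JENKINS_TRIGGER_TOKEN   # e.g. stored as an App Service app setting

    # Requires "Trigger builds remotely" to be enabled on the Jenkins job.
    Invoke-WebRequest -UseBasicParsing -Method POST -Uri "$jenkins/job/$job/build?token=$token"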
My website is already deployed and can be accessed on a server I've already provisioned.
I created an automated Selenium WebDriver test in Visual Studio to test my website. Now I want this to run on a nightly basis automatically; would it be possible to use Azure DevOps for this? I've been checking tutorials on running automated tests in Azure DevOps, and they always include deploying the website in Azure DevOps too, which is not applicable for my site.
Yes, it's possible. If your source code is in an Azure DevOps repo, you can add a Visual Studio Build task to generate the test assembly, and then use a Visual Studio Test task to run the tests.
Or, if you already have the test assembly at a known path, just specify the search folder and the test files in the Visual Studio Test task.
You don't need any deploy-related tasks since your website is already deployed. Just make sure your website is running, then install a self-hosted agent on the server you've already provisioned. Run the pipeline with that self-hosted agent from the Default agent pool, and you can run the tests against your website.
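As a rough sketch, such a pipeline could look like the following YAML; the cron schedule, pool name and test-assembly patterns are assumptions, not details from the question:

    # azure-pipelines.yml - minimal sketch; schedule, pool and patterns are placeholders.
    schedules:
    - cron: "0 2 * * *"          # nightly at 02:00 UTC
      displayName: Nightly UI tests
      branches:
        include: [ master ]
      always: true               # run even when nothing changed

    pool: Default                # the self-hosted agent on your own server

    steps:
    - task: VSBuild@1            # build the solution that contains the tests
      inputs:
        solution: '**/*.sln'
        configuration: 'Release'

    - task: VSTest@2             # find and run the test assemblies
      inputs:
        testSelector: 'testAssemblies'
        testAssemblyVer2: |
          **\*Tests*.dll
          !**\obj\**
        searchFolder: '$(System.DefaultWorkingDirectory)'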
We are currently testing this setup, where I'm just running the tests.
This is what my Azure pipeline looks like:
Basically we have a console app running the tests (with NUnit) on the server.
We deploy the app that contains all the tests. We have pipeline variables (the URL, the login user, etc.) that populate our runsettings file.
We then create a virtual VS session and run the .dll containing the Selenium tests.
Finally, we export the .trx report and mail it out.
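For reference, the core of such a run might look like this in PowerShell; every path, address and the SMTP setup below are hypothetical placeholders, not details from the pipeline above:

    # Run the Selenium/NUnit test assembly and mail the newest .trx report.
    $tests    = "C:\tests\MyWebsite.UiTests.dll"    # placeholder test assembly
    $settings = "C:\tests\nightly.runsettings"      # populated from pipeline variables

    # vstest.console.exe writes the .trx report under .\TestResults.
    & vstest.console.exe $tests /Settings:$settings /Logger:trx

    # Pick the newest report and mail it out.
    $trx = Get-ChildItem .\TestResults\*.trx | Sort-Object LastWriteTime | Select-Object -Last 1
    Send-MailMessage -SmtpServer "smtp.example.com" -From "ci@example.com" `
        -To "team@example.com" -Subject "Nightly UI test results" -Attachments $trx.FullName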
Hope this helps a bit
We are deploying into Azure using Octopus Deploy. We have been using it for more than a year, and about three weeks ago we suddenly started to get errors on a few deployments.
Microsoft.Web.Deployment.DeploymentDetailedClientServerException: Web Deploy cannot modify the file 'msvcr120.dll' on the destination because it is locked by an external process. In order to allow the publish operation to succeed, you may need to either restart your application to release the lock, or use the AppOffline rule handler for .Net applications on your next publish attempt. Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_FILE_IN_USE
We have the web app running with Always On enabled, and we have the app setting 'MSDEPLOY_RENAME_LOCKED_FILES' set to 1, which in theory prevents this.
Does anyone know if something was changed in Azure or Octopus?
There are a number of reasons files may be locked during deployment. You should be able to get an idea of what may be locking files by using the Kudu process explorer, which you can access at {yoursite}.scm.azurewebsites.net.
In order to avoid the locking issue altogether, you could make use of slots to achieve a zero-downtime deployment, if that's an option for you. In this case you could stop the site or enable App Offline, which should unlock any files and allow the deployment to succeed, after which a slot swap will make the deployment live. App Offline is preferred over MSDEPLOY_RENAME_LOCKED_FILES, but it will take the application offline during the deployment. Octopus also has support for this as an option on the Deploy an Azure Web App step itself, so it may be worth a try even without slots.
You can use custom pre/post-deployment scripts as part of your Deploy an Azure Web App step to make use of the Stop-AzureRmWebAppSlot, Start-AzureRmWebAppSlot and Switch-AzureRmWebAppSlot PowerShell cmdlets to achieve the above.
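A minimal sketch of that slot dance in PowerShell; the resource group, app and slot names are placeholders, and it assumes the AzureRM module with an authenticated session:

    $rg   = "my-resource-group"   # placeholders throughout
    $app  = "my-web-app"
    $slot = "staging"

    # Stop the staging slot so Web Deploy can replace otherwise-locked files.
    Stop-AzureRmWebAppSlot -ResourceGroupName $rg -Name $app -Slot $slot

    # ... deploy the package to the staging slot here (e.g. from Octopus) ...

    # Start the slot again and swap it into production for zero downtime.
    Start-AzureRmWebAppSlot -ResourceGroupName $rg -Name $app -Slot $slot
    Switch-AzureRmWebAppSlot -ResourceGroupName $rg -Name $app -SourceSlotName $slot -DestinationSlotName "production"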
An alternative may be to use zip deployments; however, the Deploy an Azure Web App Octopus step doesn't have first-class support for this quite yet. It can still be achieved using a Run an Azure PowerShell Script step along with a package reference, if this is what you want to do.
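If you do go the zip route, here is a sketch of pushing a package through Kudu's zipdeploy endpoint from PowerShell; the app name, credentials and package path are hypothetical placeholders:

    $app  = "my-web-app"                       # placeholder app name
    $user = '$my-web-app'                      # site-level publishing user
    $pass = "publish-profile-password"         # from the publish profile
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$pass"))

    # POST the zip to Kudu; it unpacks the package into wwwroot.
    Invoke-RestMethod -Uri "https://$app.scm.azurewebsites.net/api/zipdeploy" `
        -Method POST -InFile ".\package.zip" -ContentType "application/zip" `
        -Headers @{ Authorization = "Basic $auth" }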
I have a deployment process that places everything needed within a repository which my Azure App Service is already configured to pull from.
This deployment process is fully automated and works well.
I would like to amend this deployment process to include one or more console applications which would then be configured to be run as WebJobs, either when triggered, or on a continuous basis.
However, the configuration for WebJobs appears to want me to upload the .exe during configuration, rather than point at a pre-existing .exe.
This seems less than optimal, because it suggests that I'll have to reupload each time said console app changes.
It would be far more convenient to be able to point to a known location within the AppService which contained the full deployment of the WebJob console App.
Is there a way to achieve this?
As far as I know, the deployment process you want can't be done. No matter which way a WebJob is deployed, the job is in essence copied to the file system through Kudu. And a WebJob is a feature that depends on the Web App service, so the deployment can't be processed as a whole. You could read the Kudu wiki.
From your description, I suggest using Azure Functions instead. You could use a TimerTrigger, BlobTrigger, HttpTrigger, etc. You could write just the code you need for the problem at hand, without worrying about a whole application or the infrastructure to run it.
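As an illustration, a timer-triggered function can be very small. A minimal PowerShell sketch, assuming the PowerShell worker is available in your Functions app (the schedule and the body are placeholders):

    # run.ps1 - timer-triggered PowerShell Azure Function (sketch).
    # Assumes a function.json next to it with a timerTrigger binding, e.g.:
    #   { "bindings": [ { "name": "Timer", "type": "timerTrigger",
    #                     "direction": "in", "schedule": "0 */5 * * * *" } ] }
    param($Timer)

    # Whatever the console app did on each run goes here; this is a placeholder.
    Write-Host "Triggered at $(Get-Date -Format o)"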
If you still have questions, please let me know.
I've just completed an automation script that:
downloads a project build to local storage (worker role)
installs ruby, apache, and other dependencies
configures apache and the RoR application to serve requests via port 81
This is all working locally. I'm working in Visual Studio; running the application successfully takes the local machine from "blank slate" to "serving requests".
I'm now trying to push this up to Azure - no longer using the local machine, but an actual worker role.
I've packaged the project and uploaded it to a production environment via my Azure subscription portal, but navigating to the site URL doesn't give me anything (site not found).
I'm a bit new to Azure. What steps do I need to take to ensure that this application will work up in the cloud? I feel like I've forgotten to configure something, like the endpoint port (81). Any advice or recommended reading would be super helpful; thank you so much for your time!
If you need some real assistance troubleshooting the problem, it would be best to see the following three things:
Your automation script (Startup Task)
Your worker role OnStart() function
Your ServiceDefinition and ServiceConfiguration
Are you using ProgramEntryPoint to launch your RoR app, or are you doing all of this in the startup task? With the above info it is easier to understand the application architecture and make suggestions.
However, the best way to troubleshoot this problem is to enable RDP access to your Azure worker role VM and then log into the VM to understand what is going on. RDP access to the Azure VM will let you verify that your install script ran correctly and that all the modules were started.
As you are new to Windows Azure, several things could be missing; if you provide more info you will get accurate help instead of guesswork.
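Since the question mentions the endpoint port (81), one common omission is the input endpoint in ServiceDefinition.csdef. A minimal sketch, in which the role name is a hypothetical placeholder:

    <!-- ServiceDefinition.csdef (fragment); the role name is a placeholder. -->
    <WorkerRole name="RoRWorkerRole">
      <Endpoints>
        <!-- Without an InputEndpoint, the load balancer never forwards
             external traffic to port 81 on the role instance. -->
        <InputEndpoint name="HttpIn" protocol="tcp" port="81" />
      </Endpoints>
    </WorkerRole>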
I have created a sample web role application using a cloud service. Before hosting my application in the cloud, I want to test the application in the Dev Fabric. I am sure that when we run the application from VS, it creates an environment that simulates the cloud.
But if I want to give my application to QA for testing, do I still need to give them my source and have them run the application from VS under the Dev Fabric, or is there another way to run my deployed package under the Dev Fabric?
In one line, my question is: how do I run my packaged Azure application under the Dev Fabric before hosting it in the cloud?
If anyone has an idea, please share some information.
Thanks for your quick response. The csrun command helped in accomplishing my requirement, but I can see that it is using an IP address, http://127.0.0.1:80/, by default.
Also, I am trying to find out whether there is a way to change this to a proper name instead of an IP,
for example http://localhost/, or
with the deployed machine name, like http://applicationserver/webrole1/, so that we can access it from any machine in the network.
I went through the Dev Fabric UI, where we can see the current instances running, but I didn't find any options for these.
Please share some information on this.
When you run your application locally, a different kind of package gets created (actually a directory) with a .csx extension.
As long as you have that .csx directory and your configuration file (.cscfg), you can run the package by using the "csrun" command. (So no, you don't need Visual Studio.)
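For reference, a minimal sketch of packaging and running from the command line; the file names are placeholders, and it assumes the Azure SDK's bin folder is on the PATH:

    # /copyOnly makes cspack produce the .csx directory layout instead of a .cspkg.
    cspack ServiceDefinition.csdef /copyOnly /out:MyService.csx

    # Run the packaged service in the dev fabric and open a browser on it.
    # (Quoted because ';' separates statements in PowerShell.)
    csrun "/run:MyService.csx;ServiceConfiguration.cscfg" /launchBrowser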
You can use this blog post to access Azure services running in the DevFabric (DF) from other boxes:
http://blog.ehuna.org/2009/10/an_easier_way_to_access_the_wi.html