We are trying to set up CI/CD for a WinForms app developed using .NET 6. We have set up the build pipeline and it is producing the correct set of artifacts, but we are unable to publish these artifacts to a network file share (DFS server). While connecting to the network file share we get the error "Incorrect username or password".

Do we need a service account to connect to the network file share? If a service account is needed, we don't see any option in the release pipeline, under any task in Azure DevOps, to pass a username and password; the Publish Artifact task lets us define the network location but not any credentials. Can anyone please suggest?
Do we need a service account which would connect to the network file share?
For a self-hosted agent, yes, you need a service account that can connect to the network file share.
Basically, you can set up the agent with an account that has the correct permission (write permission) to access the network share, i.e. specify that account as the agent service account.
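For example, when configuring a self-hosted Windows agent you can pass the service account at registration time. A minimal sketch, assuming a PAT and a hypothetical CONTOSO\svc-build account that has write access to the share (all values are placeholders):

```powershell
# Run from the unpacked agent folder; all values below are placeholders.
# --windowsLogonAccount / --windowsLogonPassword set the account the agent
# service runs as, so pipeline tasks inherit its permissions on the share.
.\config.cmd --unattended `
    --url "https://dev.azure.com/yourorg" `
    --auth pat --token "<your-PAT>" `
    --pool "Default" --agent "build-agent-01" `
    --runAsService `
    --windowsLogonAccount "CONTOSO\svc-build" `
    --windowsLogonPassword "<account-password>"
```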
Alternatively, you can use the Windows Machine File Copy task to copy the artifacts to the network share. In that task you can specify the username and password used to access the network share.
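If you would rather script the copy yourself from a PowerShell task instead of using that task, a minimal sketch that maps the share with explicit credentials; the share path, account, and destination folder are placeholders:

```powershell
# Placeholder credentials - replace with your service account.
$password = ConvertTo-SecureString "<account-password>" -AsPlainText -Force
$cred     = New-Object System.Management.Automation.PSCredential ("CONTOSO\svc-build", $password)

# Map the DFS share using the service account's credentials, copy the
# staged artifacts across, then remove the mapping again.
New-PSDrive -Name Share -PSProvider FileSystem -Root "\\dfs-server\releases" -Credential $cred | Out-Null
Copy-Item -Path "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\*" -Destination "Share:\MyApp" -Recurse -Force
Remove-PSDrive -Name Share
```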
This can be accomplished through a service principal/agent that has write permission to the target. You can then add scripts to the build pipeline that transfer the files to an Azure Storage blob container.
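For instance, with AzCopy v10 you can authenticate as a service principal and push the artifacts to a container. A sketch under the assumption that an app registration with write access already exists; all IDs, names, and the container URL are placeholders:

```powershell
# Placeholder service principal secret, read by azcopy from the environment.
$env:AZCOPY_SPA_CLIENT_SECRET = "<client-secret>"

# Log in as the service principal, then copy the build output to the container.
azcopy login --service-principal --application-id "<app-id>" --tenant-id "<tenant-id>"
azcopy copy "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\*" `
    "https://mystorageacct.blob.core.windows.net/artifacts" --recursive
```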
Related
I want to download an ADLS Gen2 file to my local server using ADF.
I tried to set up a self-hosted integration runtime, but I am getting the error below:
the integration runtime (self-hosted) node has encountered an error during registration
The account through which I have logged into the Azure portal and the account which has access on my local machine are completely different.
Azure portal login: xyz_abc@gmail.com
Local machine login: officiallogin@companyname
Is the issue because of the two completely different logins?
Can you please let me know how to resolve this issue?
Thanks,
Raksha
When you say "The account through which I have logged into my azure portal and the account which has access on my local machine are completely different", make sure to uninstall and re-install the self-hosted integration runtime manually choosing the option2.
OR
You may uninstall the current self-hosted integration runtime and try Option 1 instead, which installs and registers the integration runtime automatically (express setup).
Reference: Create and configure a self-hosted integration runtime.
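With the manual route (Option 2), registration is done with an authentication key copied from the ADF portal. A minimal sketch using the dmgcmd utility that ships with the integration runtime; the install path depends on your version, and the key is a placeholder:

```powershell
# Default SHIR install location - adjust the version folder for your install.
$dmgcmd = "C:\Program Files\Microsoft Integration Runtime\5.0\Shared\dmgcmd.exe"

# Register this node with the authentication key copied from the Data Factory
# portal (Manage -> Integration runtimes -> your IR -> authentication keys).
& $dmgcmd -RegisterNewNode "<authentication-key-from-portal>"
```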
I'm using VSTS to deploy an Azure cloud service package, and after a successful deployment I'm getting a file access issue: web.config cannot be written to.
At the moment I manually set the file permission via RDP to correct it, e.g. on e:\siteroot\1\Web.config (Everyone: Full Control).
To avoid this manual step, how can I set a folder/file permission for a file as part of the cloud service deployment?
The issue is not directly related to Azure projects. Granting Everyone Full Control is not recommended; it may create a security risk.
For Azure Cloud Services Web Roles, the default Application Pool Identity account is “Network Service”.
Normally the application pool account only needs read permission on the web.config file so it can read the application configuration.
For this kind of issue, you could create a startup task that grants write permission to Network Service on the application's Web.config file.
Please go through the detailed steps in this similar issue: Error "Access to the path 'E:\sitesroot\Web.config' is denied" when storing Azure AD's public key in Web.config of an Azure Cloud Services application.
Also take a look at this related question: Web.config Access denied when Package Azure cloud services projects in Visual Studio 2013 for Web
You can create a .bat file where you call the icacls command with the parameters you prefer:
How to grant permission to users for a directory using command line in Windows?
and then you can configure a startup task for your cloud service to run this file:
https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-startup-tasks
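As a concrete illustration, this is roughly what such a startup script could do, written here in PowerShell rather than a .bat file; the Web.config path is a placeholder and should match your role's actual layout:

```powershell
# Grant Network Service modify rights on the deployed Web.config.
# The path below is a placeholder - cloud service roles typically deploy
# under E:\sitesroot\<index>, but verify the path on your instance.
icacls "E:\sitesroot\0\Web.config" /grant "NT AUTHORITY\NETWORK SERVICE:(M)"
```

Note that the startup task will need an elevated execution context in the service definition to be able to change ACLs.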
We are trying to upload the artifacts to blob storage from a TFS build server. The AzCopy task needs the Azure subscription details, which are not available to us. We need to upload the artifacts to Azure blob storage using an Azure blob storage connection string. Is there a way to upload files to blob storage using only the connection string?
Anything you can do from PowerShell you can do from build and release. There is a task named "PowerShell" and one named "Azure PowerShell". If you don't have the Azure subscription details, I doubt you will be able to use the "Azure PowerShell" task. However, if you have a PowerShell script that works when you run it locally, you might be able to simply run it as part of your build with the "PowerShell" task.
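For the connection-string scenario, a minimal sketch using the Az.Storage module (assumed to be installed on the build server; the container name and the variable carrying the connection string are placeholders):

```powershell
# Build a storage context from the connection string alone -
# no subscription details are required for this.
$connectionString = $env:STORAGE_CONNECTION_STRING   # e.g. passed in as a secret build variable
$context = New-AzStorageContext -ConnectionString $connectionString

# Upload each staged artifact file into the target container.
Get-ChildItem "$env:BUILD_ARTIFACTSTAGINGDIRECTORY" -Recurse -File | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName `
        -Container "artifacts" -Blob $_.Name -Context $context -Force
}
```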
Option two is to have someone who knows the details create an Azure service endpoint for you. They never have to share the details with you to make the connection. Once the connection is created, you can use it without ever knowing the details.
I have two Azure entities I'm trying to connect: a build agent running my NuGet deployments, and a VM (Windows Server 2012 R2) on which I'm trying to set up symbol hosting using SymStore.
From this page I am told to "grant full control permission to the build agent service account". How do I go about doing this? In the Publish Symbols build step I can enter a URL to store the symbols, but I have no way of logging the build agent into the VM as the user I've given full control permission to.
This setup is actually not possible; from the SymSrv docs: "writes to an http-based symbol store are not possible."
It appears that the SymStore has to be on the same machine as the Azure build agent to give it full control permission. I happened to discover that NuGet will let me publish .pdbs, so I'm going to try that route instead.
I have successfully implemented Jenkins to deploy to a locally hosted server, but now I need to create a job to deploy to an Azure-hosted website running on PaaS. Both the Jenkins host and the website hosts are Windows machines.
I have found a link for setting up a virtual machine template for the Azure Slave plugin, but that plugin provisions IaaS VMs; there is no VM in my case, and I don't have additional slaves.
I am asking about the plugins and process flow, please:
Which Azure Plugin should I use in Jenkins (if any)?
E.g. Azure PublisherSettings Credentials plugin
Do I use Get-AzurePublishSettingsFile and Import-AzurePublishSettingsFile? Would these contain all the relevant details required for Jenkins to know where to copy to?
Would I create a zip file of the build, upload the zip to blob storage, and then extract it to the website?
Is it possible to upload a zip file and then extract the files once the whole file has been uploaded?
If the connection is interrupted at any stage while uploading 1000 individual files, the website will be left unstable; therefore I need to investigate a single-file upload with extraction afterwards.
So if I were you I'd do the following:
1. Install the Jenkins PowerShell plugin and install the Azure PowerShell cmdlets.
2. Create a job in Jenkins that creates the zip file and uploads it to Azure Storage.
3. Create an ARM template to deploy the Azure Web App from the zip file in Azure Storage.
4. Create a job to deploy said template.
So the ARM template would take the zip file and push it to the Azure Web App, and the Web App would handle all the hassle with the zip file internally.
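A minimal sketch of steps 2-4 from the Jenkins job's perspective, using the Az PowerShell module; the storage account, container, resource group, template file, and parameter name are all assumptions, and the ARM template itself (which would consume the package URI, e.g. via an MSDeploy-style extension) is not shown:

```powershell
# Step 2: zip the build output and upload it to Azure Storage (placeholder names).
Compress-Archive -Path ".\build\*" -DestinationPath ".\site.zip" -Force
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"
Set-AzStorageBlobContent -File ".\site.zip" -Container "deployments" -Blob "site.zip" -Context $ctx -Force

# Generate a short-lived read-only SAS URL so the deployment can pull the package.
$sas = New-AzStorageBlobSASToken -Container "deployments" -Blob "site.zip" `
    -Permission r -ExpiryTime (Get-Date).AddHours(1) -Context $ctx -FullUri

# Steps 3-4: deploy the ARM template, passing the package URI as a parameter.
New-AzResourceGroupDeployment -ResourceGroupName "my-rg" `
    -TemplateFile ".\deploy.json" `
    -TemplateParameterObject @{ packageUri = $sas }
```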