Jenkins pipeline does not deploy to Azure

I'm trying to deploy an Azure web app to Azure from Git through a Jenkins pipeline.
The code looks like this:
pipeline {
    agent any
    stages {
        stage ('Checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/develop']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'creds', url: 'https://xx.git']]])
            }
        }
        stage ('Development - NuGet restore') {
            steps {
                bat """
                C:\\nuget\\nuget.exe restore "%WORKSPACE%\\src\\xx.sln"
                """
            }
        }
        stage ('Development - MSBuild') {
            steps {
                bat """
                "C:\\Program Files\\dotnet\\dotnet.exe" msbuild "%WORKSPACE%\\src\\xx.sln" /p:VisualStudioVersion=15.0 /p:BuildInParallel=true /m:8 /p:Configuration=Release /p:DeployOnBuild=true /t:Clean,Build
                """
            }
        }
        stage ('Development - Deploy') {
            steps {
                azureWebAppPublish appName: "xx",
                    azureCredentialsId: 'xx',
                    resourceGroup: "xx",
                    filePath: 'xx'
            }
        }
    }
    post {
        failure {
            xxx....;
        }
    }
}
Output from the Azure deployment plugin is:
Starting Azure Web App Deployment
Cloning repository https://xx.scm.azurewebsites.net:443/gitfile.git
c:\Program Files\Git\cmd\git.exe init C:\Program Files (x86)\Jenkins\workspace\xx.azure-deploy # timeout=10
Fetching upstream changes from https://xx.scm.azurewebsites.net:443/gitfile.git
c:\Program Files\Git\cmd\git.exe --version # timeout=10
using GIT_ASKPASS to set credentials
c:\Program Files\Git\cmd\git.exe fetch --tags --progress https://xx.scm.azurewebsites.net:443/gitfile.git +refs/heads/:refs/remotes/origin/ # timeout=10
c:\Program Files\Git\cmd\git.exe config remote.origin.url https://xx.scm.azurewebsites.net:443/gitfile.git # timeout=10
c:\Program Files\Git\cmd\git.exe config --add remote.origin.fetch +refs/heads/:refs/remotes/origin/ # timeout=10
Seen 0 remote branches
c:\Program Files\Git\cmd\git.exe add -A # timeout=10
Deploy repository is up-to-date. Nothing to commit.
Done Azure Web App deployment.
The plugin is trying to fetch changes from the actual Azure web app URL, which is obviously wrong, although it uses the correct Git file name.
How is this possible? Is there any way to supply the Git repo URL as a parameter to the Azure plugin?
Thanks!

The comments section of this Azure Function Plugin link discusses a similar issue (i.e. the 'Deploy repository is up-to-date. Nothing to commit.' message), and the advice given there is to have an explicit step that makes sure the changed files are included in the 'Files' list. For more information and a comparison with the Azure App Service plugin, please refer to it. Hope this helps!!
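For what it's worth, a minimal sketch of what the deploy stage could look like when pointed at the actual build output is below; the sourceDirectory/filePath parameters and the publish path are assumptions here, so adjust them to wherever your MSBuild step actually drops the published files:

stage ('Development - Deploy') {
    steps {
        // Assumption: MSBuild publishes the site under src\xx\bin\Release\Publish;
        // the plugin only commits/deploys files matched by filePath under sourceDirectory.
        azureWebAppPublish azureCredentialsId: 'xx',
            resourceGroup: 'xx',
            appName: 'xx',
            sourceDirectory: 'src\\xx\\bin\\Release\\Publish',
            filePath: '**/*'
    }
}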

Related

Renovate bot Azure DevOps npm feed auth error

I get a 401 error when I try to use my private npm registry in Azure DevOps. My configuration looks like this:
# pipeline.yaml (repo root folder)
steps:
  - task: npmAuthenticate@0
    inputs:
      workingFile: .npmrc
  - script: |
      git config --global user.email 'bot@renovateapp.com'
      git config --global user.name 'Renovate Bot'
      npx --userconfig .npmrc renovate
    env:
      TOKEN: $(System.AccessToken)
      PAT: $(PAT)
# config.js (repo root folder)
module.exports = {
  platform: 'azure',
  endpoint: 'https://devops.<url>.de/.../',
  logLevel: 'debug',
  token: process.env.TOKEN,
  repositories: ['...'],
  enabledManagers: ["npm"],
  hostRules: [
    {
      enabled: true,
      hostType: 'npm',
      matchHost: 'devops.<url>.de',
      token: process.env.PAT,
    },
  ],
};
# .npmrc (repo root folder)
registry=https://devops.<url>.de/Collaboration/_packaging/.../npm/registry/
always-auth=true
The installation of renovate works and my registry gets used for it. But renovate itself runs into a 401. How can I tell renovate to use the .npmrc generated by the `npmAuthenticate@0` task?
Error stack:
ERROR: Repository has unknown error (repository=...)
"err": {
"statusCode": 401,
"message": "Failed request: (401)",
"stack": "Error: Failed request: (401)\n at RestClient.<anonymous> (/root/.npm/_npx/05eeecd92f4e18e0/node_modules/typed-rest-client/RestClient.js:202:31)\n at Generator.next (<anonymous>)\n at fulfilled (/root/.npm/_npx/05eeecd92f4e18e0/node_modules/typed-rest-client/RestClient.js:6:58)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)"
}
The renovate command will modify the repo you defined in the config.js file (e.g. repositories: ['...']).
Since you are using $(System.AccessToken) as the authentication method, you need to grant Contribute permissions (e.g. Contribute, Contribute to pull requests, Create branch) on the target repo to the corresponding build service account.
Project Level Build Service Account Name: Your-project-name Build Service (your-collection-name)
Organization Level Build Service Account Name: Project Collection Build Service (your-collection-name)
You can navigate to Project Settings -> Repositories -> Target Repo -> Security and grant the Contribute permission to the two build service accounts.
For more detailed info, you can refer to this doc: Manage build service account permissions
On the other hand, if you need to update a repo from another project, you need to disable the option Limit job authorization scope to current project for non-release pipelines in Project Settings -> Settings.
It seems that the official renovate docs for Azure DevOps with a private feed aren't correct. This works for me:
Give the pipeline "Build User" contribute permissions on the feed:
Azure DevOps Artifacts -> Settings -> Permissions -> Add the user/service that runs the pipeline as Contributor.
azure-pipelines.yml
schedules:
  - cron: '0 3 * * *'
    displayName: 'Every day at 3am'
    branches:
      include: [main]
    always: true
trigger: none
pool:
  vmImage: ubuntu-latest
steps:
  - task: npmAuthenticate@0
    inputs:
      workingFile: .npmrc
  - bash: |
      git config --global user.email 'bot@renovateapp.com'
      git config --global user.name 'Renovate Bot'
      npx --userconfig .npmrc renovate
    env:
      LOG_LEVEL: DEBUG
      TOKEN: $(System.AccessToken)
      RENOVATE_TOKEN: AZURE_DEVOPS_PAT_TOKEN_HERE
      GITHUB_COM_TOKEN: REPLACEME
config.js
The important part here is not to use "pkgs.dev.azure.com" as the matchHost value; instead, check the debug logs to see which host the 401'd requests actually go to. In my case it is "ORG_NAME_LOWERCASED.pkgs.visualstudio.com".
const repositories = require("./repositories");

// Security token used by the running build
const pipelineToken = process.env.TOKEN;
const patTokenForFeed = process.env.RENOVATE_TOKEN;

module.exports = {
  platform: "azure",
  endpoint: "https://dev.azure.com/ORG_NAME/",
  token: pipelineToken,
  hostRules: [
    {
      hostType: "npm",
      matchHost: "ORG_NAME_LOWERCASED.pkgs.visualstudio.com",
      username: "apikey",
      password: patTokenForFeed,
    },
  ],
  repositories,
};
.npmrc
registry=https://pkgs.dev.azure.com/ORG_NAME/PROJECT_NAME/_packaging/FEED_NAME/npm/registry/
always-auth=true
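One file the config above references but does not show is ./repositories. A minimal sketch is below; the names are placeholders, and as far as I know Renovate's Azure DevOps platform expects them in "project/repo" form:

// repositories.js -- hypothetical list of repos Renovate should process
module.exports = [
  "PROJECT_NAME/REPO_NAME",
];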

How to Sync Azure with Github, issues: Cannot find path 'D:\a\1\s\ because it does not exist; Repository not found fetch failed; Filename too long

OK, so the main question here is: How to Sync Github with Azure?
Now, I had one main reference source: How to synchronize Azure Repos with External Git Repos
https://faun.pub/how-to-synchronize-azure-repos-with-external-git-repos-70ff92e51c63
And that is a perfect match, but here is the catch: it is kind of abstract, and you need experience working with Azure, like knowing what YAML is for and how to use it.
Long story short, it did not work. And then I found this gentleman's video: Merge From Github to Azure DevOps
https://www.youtube.com/watch?v=Kks1pCG51bI
That is super close, yet there were still several errors and bugs, like:
Cannot find path 'D:\a\1\s\copyrepo~' because it does not exist;
remote: Not Found fatal: repository 'https://github.com/***/' not found ##[warning]Git fetch failed with exit code 128, back off 5.443 seconds before retry.
error: unable to create file Filename too long
So, that is a bummer...
I mean, yes, you technically can create and synchronize a GitHub repo with Azure, but you have to create a new GitHub repo; with the existing one there was: error: unable to create file Filename too long.
Please tell me what you think.
Try the following YAML file, which I used to sync GitHub to Azure Repos:
name: Sync Azure with Github

variables:
  REMOTE_ADDR: 'https://github.com/{user}/{repo}.git'

stages:
  - stage: syncingRepos
    displayName: syncing Repos
    jobs:
      - job: run_Git_Commands
        displayName: run_Git_Commands
        continueOnError: false
        steps:
          - checkout: self
            clean: true
            persistCredentials: true
            displayName: run_commands
          - bash: |
              git checkout master
              git remote add repoGithub $(REMOTE_ADDR)
              git fetch repoGithub master
              git reset --hard repoGithub/master
              git pull --rebase repoGithub master
              git push --force origin
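As for the "error: unable to create file Filename too long" part, that is usually Git on Windows hitting the 260-character path limit rather than anything Azure-specific. On a self-hosted Windows build agent, a common workaround (shown as a sketch, run with admin rights) is:

# Allow Git on Windows to create paths longer than 260 characters.
# Run once on the agent machine; drop --system to apply it per repository instead.
git config --system core.longpaths true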

Publish button on Azure Synapse using code

Hello guys, I'm currently working with Azure Synapse Studio. My situation can be described this way:
I have 3 environments: Dev, Test and Prod. Each of them has an Azure Synapse workspace, but I can access only the Dev one. I need to make some changes from Dev for the other 2 environments as well (SQL scripts, pipelines etc.) and then publish them to the other environments without touching them.
So I think Azure DevOps can be the solution.
From the Dev Synapse Studio workspace I created 3 branches, one per environment, all of them linked to an Azure DevOps repo. Test and Prod are also linked to the same repo.
The problem is that the code on the Test and Prod workspaces could be different from the code on Dev, so I can't use the same ARM template (generated by publishing on the publish branch of the workspace) for all 3 environments. A good approach could be to find a way to hit the Publish button on the other environments without using the portal, for example via a REST API. Is that possible?
For now I have only set up the 3-branch solution so I can manage the 3 environments directly from the Dev environment, but I don't think this is the right solution. Are changes applied to the other environments? Can I run SQL scripts or pipelines manually from the other environments?
This is my current situation: on the other environments I asked to set the collaboration and publish branches to the same value as the environment branch name (test-test-test and prod-prod-prod).
With the new version (V2) of the Synapse workspace deployment task (in preview as of 2022-06), it is now possible to deploy from any branch using Azure DevOps, so there is no need for a workspace_publish branch or the Publish button.
Just make the object JSON files available as artifacts to the release pipeline, and select "Validate and deploy" as the Operation Type.
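For reference, a rough sketch of what such a release step might look like, reusing the input names from the validate step shown further down this thread; the 'validateDeploy' operation value and any extra authentication inputs are assumptions to check against the task's documentation:

- task: Synapse workspace deployment@2
  displayName: Validate and deploy Synapse artifacts
  inputs:
    operation: 'validateDeploy'   # assumed value for the "Validate and deploy" operation type
    ArtifactsFolder: '$(Pipeline.Workspace)/drop/synapse'   # artifact folder containing the object JSON files
    TargetWorkspaceName: 'your-test-or-prod-workspace'
    # plus the service connection / resource group inputs the task requires for deployment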
I am working with Microsoft directly, building a Synapse warehouse myself for a large corporation. We have the same issue, in that the Publish button must be pressed manually for the ARM templates to be generated. Microsoft have confirmed that there is no automatic method for this available right now; we had hoped to receive a Preview AzDevOps deployment task this month, but it turns out that it simply allows us to validate the JSON assets - it still deploys using the ARM template.
We have also looked at using Azure Data Factory tools to deploy from the JSON component files, but we run into issues with the dedicated pool stored procedure tasks being unsupported. :(
The only standard option to achieve this is to create a GitHub repository, set up continuous integration, and create a self-hosted Azure DevOps VM agent or use an Azure DevOps hosted agent.
Then you can set up release pipelines in Azure DevOps to work with the different environments. But you still need to commit the changes to the GitHub repository for each environment; there is no Publish-button equivalent available.
Refer to Continuous integration and delivery for an Azure Synapse Analytics workspace for more details.
This was bothering me as well, so I put together the following to be run once any PR is approved to merge into the Synapse collaboration branch, in our case "main".
For your case, you can modify it to target the relevant workspaces.
See the Azure DevOps pipeline code below.
What it does is:
It runs the Synapse workspace validation task, which also generates the workspace template JSONs as an artifact that needs to be published to the workspace_publish branch.
It then checks out your publish branch and commits and pushes the templates that were generated by the previous task.
So that the workspace UI does not think there are any unpublished changes when you click the "Publish" button, we need to update the workspace configuration to reflect the latest commit ID from the workspace COLLABORATION branch (main in this example) that was used to generate what we pushed to the PUBLISH branch in the previous step.
Any suggestions/improvements welcome. Hope this helps.
name: $(TeamProject)_$(Build.DefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) # sets Build.BuildNumber

trigger:
  branches:
    include:
      - main
  paths:
    include:
      - synapse/*

resources:
  repositories:
    - repository: 'Synapse-Publish'
      type: git
      name: Synapse # update to the name of your repo
      ref: workspace_publish # update to the name of your synapse PUBLISH branch

variables:
  repoName: $(Build.Repository.Name)
  azureSubscription: your_subscription
  azureTenantId: your_tenant_guid
  adoOrg: your_azure_devops_org_name
  adoProject: your_azure_devops_project_name
  SourceWorkspaceName: your_synapse_workspace_name
  workspacePublishBranch: workspace_publish # should be the same for you but update if not

stages:
  - stage: build_stage
    displayName: Build, Run Validations, Publish NonProd if merged to main
    jobs:
      # other jobs excluded from this snippet
      - job: publish_workspace_artifacts_job
        displayName: Publish for $(SourceWorkspaceName) $(workspacePublishBranch)
        # only kick off workspace publish job for non-PR builds
        condition: and(not(or(failed(), canceled())), ne(variables['Build.Reason'], 'PullRequest'))
        pool:
          name: 'linux-vmss' # update this for whatever you need
        steps:
          - checkout: self # main
            clean: true
            persistCredentials: true
          - task: Synapse workspace deployment@2
            displayName: Generate workspace artifact templates
            condition: true
            continueOnError: false
            inputs:
              operation: 'validate' # despite this name, it also generates the templates
              ArtifactsFolder: '$(Build.SourcesDirectory)/$(repoName)/synapse'
              TargetWorkspaceName: $(SourceWorkspaceName)
          - checkout: 'Synapse-Publish' # workspace_publish
            clean: true
            persistCredentials: true
          - task: CmdLine@2
            displayName: 'Set git user'
            inputs:
              workingDirectory: '$(System.DefaultWorkingDirectory)'
              failOnStderr: true
              script: |
                git config --global user.email "whatever.you.want@your_org.com"
                git config --global user.name "Whatever You Want"
          - task: AzurePowerShell@5
            displayName: Publish to $(SourceWorkspaceName) $(workspacePublishBranch)
            condition: true
            inputs:
              azureSubscription: '$(azureSubscription)'
              ScriptType: InlineScript
              Inline: |
                # the output from the workspace validate step above is saved here, also published as an artifact with name = the synapse workspace name
                # Get-ChildItem $(Build.SourcesDirectory)/ExportedArtifacts -Name
                cd $(Build.SourcesDirectory)/$(repoName)
                git pull origin $(workspacePublishBranch)
                git switch $(workspacePublishBranch)
                Move-Item -Path $(Build.SourcesDirectory)/ExportedArtifacts/*.json -Destination $(Build.SourcesDirectory)/$(repoName)/$(SourceWorkspaceName) -Force -Verbose
                git add $(Build.SourcesDirectory)/$(repoName)/$(SourceWorkspaceName)/*.json
                $diff = git diff --cached
                $status = git status
                if (!($status.ToLower() -like "*nothing to commit*"))
                {
                    echo "##[section]git push changes to repo";
                    git commit -m "Update $(workspacePublishBranch) for source workspace $(SourceWorkspaceName) [skip ci]";
                    git pull --rebase;
                    git push origin $(workspacePublishBranch);
                }
                else
                {
                    echo "##[warning]No new changes to push for source workspace $(SourceWorkspaceName) templates";
                    git reset --hard origin/$(workspacePublishBranch)
                    git clean -fxd
                }
              azurePowerShellVersion: 'LatestVersion'
          - task: AzurePowerShell@5
            displayName: Update $(SourceWorkspaceName) Git Config # required so that when you click "Publish" within the workspace it doesn't think there are any changes vs. what's already published
            inputs:
              azureSubscription: '$(azureSubscription)'
              ScriptType: InlineScript
              Inline: |
                # get the latest version of this module, which now has the LastCommitId parameter that we need
                Install-Module -Name Az.Synapse -Confirm:$false -RequiredVersion 1.5.0 -Force
                Import-Module -Name Az.Synapse -MinimumVersion 1.5.0
                cd $(Build.SourcesDirectory)/$(repoName)
                [String] $latestCommitHash = git log -n 1 origin/main --pretty=format:"%H" # format to get only the hash value of the latest commit
                $config = New-AzSynapseGitRepositoryConfig `
                    -RepositoryType AzureDevOpsGit `
                    -TenantId $(azureTenantId) `
                    -AccountName $(adoOrg) `
                    -ProjectName $(adoProject) `
                    -RepositoryName $(repoName) `
                    -CollaborationBranch main `
                    -RootFolder "/synapse" `
                    -LastCommitId $latestCommitHash
                echo "##[section] Updating $(SourceWorkspaceName) git configuration to point to the latest main branch commit ID"
                # see https://learn.microsoft.com/en-us/powershell/module/az.synapse/update-azsynapseworkspace?view=azps-8.0.0
                Update-AzSynapseWorkspace -Name $(SourceWorkspaceName) -GitRepository $config
              azurePowerShellVersion: 'LatestVersion'

Run Python Code within Azure Devops Pipeline and Export output to folder in Devops Repos ($(Build.SourcesDirectory))

How could I rewrite this Python script so that it runs within an Azure DevOps pipeline and exports the dataframe as a CSV to the DevOps repository? I'm able to achieve this locally but would like to achieve it remotely.
Put differently, how can I export a pandas dataframe to a DevOps repos folder as a CSV file using an Azure DevOps pipeline task? Below is the Python script that needs to run as a pipeline task.
local_path in this case should be an Azure DevOps path.
from azureml.core import Workspace, Dataset

# 'dataframe' is produced earlier in the script (e.g. from an AzureML Dataset)
local_path = 'data/prepared.csv'
dataframe.to_csv(local_path)
⚠️ You really should not do this. Azure Pipelines are for building code, not for processing data. (Assuming that you meant Azure DevOps Pipelines, as opposed to Azure ML Pipelines.)
Also, you should not commit data to your repository.
If you still want to proceed, here is an example of what you are trying to achieve. Note that for the last line, i.e. git push, you need to give the agent permission to write to the repository. See Run Git commands in a script for approximate ☹️ documentation on how to do that on your account.
trigger: none
pool:
  vmImage: 'ubuntu-latest'
steps:
  - checkout: self
    persistCredentials: true
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.8'
      addToPath: true
      architecture: 'x64'
  - script: |
      python your_data_generating_script.py
      git config --global user.email "you@example.com"
      git config --global user.name "Your Name"
      git add data/prepared.csv
      git commit -m 'test commit'
      git push origin HEAD:master
    displayName: 'push data to master'
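For completeness, a minimal sketch of what your_data_generating_script.py could contain, loosely following the snippet from the question; the dataframe construction is a stand-in assumption, since in the original script it presumably comes from an AzureML Dataset:

# your_data_generating_script.py -- hypothetical example, adjust to your actual data source
import os
import pandas as pd

# Stand-in dataframe so the step is runnable on the build agent;
# replace with your real data loading/preparation logic.
dataframe = pd.DataFrame({"id": [1, 2, 3], "value": [0.1, 0.2, 0.3]})

# Write relative to the checked-out repository root so `git add data/prepared.csv` finds it.
local_path = "data/prepared.csv"
os.makedirs(os.path.dirname(local_path), exist_ok=True)
dataframe.to_csv(local_path, index=False)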

How to publish a Maven project to Artifactory with a Jenkins pipeline script

I have a Maven Spring Boot project. I want to push it to Cloud Foundry. For that I have written a Groovy pipeline script for Jenkins.
What do I have to add in the script and/or in the pom.xml to publish it to Artifactory, so that Jenkins will pull the code from Git and publish it to Artifactory? In another environment I'll pull the versioned JAR from Artifactory and push it to Cloud Foundry.
Say my project's groupId is com.example, artifactId is XYZ and version is 1.0-SNAPSHOT.
So just to be clear, you aren't going to publish the code to Artifactory; you're going to publish the artifacts that the Maven step in your pipeline produces. You can do this right from Maven (mvn deploy, or mvn release:prepare release:perform), provided you have your Maven configuration set up to properly authenticate with the target Artifactory server.
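For the authentication part, that usually means a server entry in the agent's ~/.m2/settings.xml whose id matches the repository id used in the POM's distributionManagement; a sketch with placeholder values:

<!-- ~/.m2/settings.xml on the Jenkins agent (placeholder credentials) -->
<settings>
  <servers>
    <server>
      <!-- must match the <id> used in distributionManagement in the pom.xml -->
      <id>artifactory</id>
      <username>your_artifactory_user</username>
      <password>your_artifactory_password_or_api_key</password>
    </server>
  </servers>
</settings>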
You can also use the Artifactory plugin which provides steps to accomplish this:
https://jenkins.io/doc/pipeline/steps/artifactory/
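For example, with the Artifactory plugin's scripted-pipeline DSL it looks roughly like this inside a node/stage; the server id, repository names and Maven tool name below are placeholders:

// Sketch: build and deploy the Maven artifacts via the Jenkins Artifactory plugin
def server = Artifactory.server 'my-artifactory-server-id'   // server id configured in Manage Jenkins
def rtMaven = Artifactory.newMavenBuild()
rtMaven.tool = 'Maven3.0'                                     // Maven installation name in Jenkins
rtMaven.deployer server: server, releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local'
def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
server.publishBuildInfo buildInfo                             // push build info to Artifactory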
Finally, you can use the Cloud Foundry plugin in another stage of your current pipeline, or in a 'deployment' pipeline to deploy the artifact(s) to your Cloud Foundry instance.
https://jenkins.io/doc/pipeline/steps/cloudfoundry/
If you are using GitLab and talking about a Jenkins pipeline script, you have to create a pipeline job. Below is a sample Groovy script that you can enhance.
node {
    stage ("Checkout") {
        checkout changelog: false, poll: false, scm: [$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'LocalBranch', localBranch: 'master']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'your_git_cred', url: 'your_gitlab_url']]]
    }
    stage ("Build and Push to Artifactory") {
        tool name: 'Maven3.0', type: 'maven'
        sh "mvn clean deploy"
    }
}
In the pom, you have to add the Artifactory location in the distributionManagement section:
<distributionManagement>
    <repository>
        <id>artifactory</id>
        <url>your_artifactory_url</url>
    </repository>
    <snapshotRepository>
        <id>artifactory</id>
        <url>your_artifactory_url</url>
    </snapshotRepository>
</distributionManagement>
More information about Jenkins Pipeline:
https://jenkins.io/doc/book/pipeline/
