I am using Google Cloud Build for CI/CD, and I need to grant access to specific repositories (I can't use the "All repositories" option). Is there a way to grant access repository by repository through Python code? If it isn't possible through Python, is there an alternative way to meet this requirement?
Thanks,
Raghunath.
When I checked with GitHub support, they shared the links below.
To get repository id -- https://developer.github.com/v3/repos/#get-a-repository
To get installations details -- https://developer.github.com/v3/orgs/#list-installations-for-an-organization
To add repository to an installation -- https://developer.github.com/v3/apps/installations/#add-repository-to-installation
I used these links to put together the code below, which implements the desired requirement.
import requests

# personal_token, organization_name, repo_name and gcb_installation_id
# must be set before running this snippet.
# Headers to use in the requests
header = dict()
header['Authorization'] = 'token %s' % personal_token
header['Accept'] = 'application/vnd.github.machine-man-preview+json'
# Fetch the repository id
url = f'https://api.github.com/repos/{organization_name}/{repo_name}'
output = requests.get(url, headers=header)
repo_id = output.json()['id']
# Add the repository to the Google Cloud Build installation
url = f'https://api.github.com/user/installations/{gcb_installation_id}/repositories/{repo_id}'
output = requests.put(url, headers=header)
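As a small addition that is not part of the original snippet, you can make failures obvious by checking the responses; the GitHub API returns 204 No Content when the repository is added successfully.
# Optional check (not in the original snippet): fail loudly on HTTP errors.
# A successful add returns HTTP 204 No Content.
output.raise_for_status()
print('Repository %s added to installation %s' % (repo_id, gcb_installation_id))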
I am working on an Atlassian Jira Cloud product. I need to get the details of Jira Cloud assets and update a field with a specific key-value pair. To achieve this, I chose a Python API to interact with Atlassian Jira Cloud and make those changes.
But I am unable to access Jira Cloud assets using the atlassian-python-api package from https://pypi.org/project/atlassian-python-api/. I have also tried different versions of it with the same result.
Here is my sample code.
from atlassian import Jira
# Enter your Jira Cloud site URL and API token
JIRA_URL = "https://<your-domain>.atlassian.net"
JIRA_API_TOKEN = "<your-api-token>"
# Initialize the Jira API client
jira = Jira(
    url=JIRA_URL,
    username="",
    password="",
    cloud=True,
    api_token=JIRA_API_TOKEN,
)
# Fetch a list of all assets
assets = jira.assets().search("")
# Print the asset details in a tabular format
print("Asset ID\tAsset Type\tName\tDescription")
for asset in assets:
    print(f"{asset['id']}\t{asset['type']}\t{asset['name']}\t{asset['description']}")
But I am getting the error below.
assets = jira.assets().search("")
AttributeError: 'Jira' object has no attribute 'assets'
After that, I tried different calls such as jira.jira_service_desk().get_all_assets(), jira.get_all_assets() and jira.cloud.asset_management.get_all_assets(), but every time I get a similar error: 'Jira' object has no attribute '<...>'.
Could you please suggest a way to perform bulk operations on Jira Cloud assets?
FYI, I also tried the Atlassian-provided REST API - https://developer.atlassian.com/cloud/assetsapi/rest/api-group-assets/#api-asset-get - but that did not help me fetch them either.
I am expecting a solution that allows write operations on Jira Cloud assets.
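In case it helps, here is a rough sketch of calling the Assets REST API linked above directly with requests instead of atlassian-python-api. The endpoint paths and body shapes are taken from the Assets (formerly Insight) documentation and should be treated as assumptions to verify for your site; EMAIL, API_TOKEN and the IDs are placeholders.
import requests

EMAIL = "<your-email>"          # placeholder
API_TOKEN = "<your-api-token>"  # placeholder
SITE = "https://<your-domain>.atlassian.net"
auth = (EMAIL, API_TOKEN)

# 1. Discover the Assets workspace id for the site (assumed endpoint)
resp = requests.get(f"{SITE}/rest/servicedeskapi/assets/workspace", auth=auth)
workspace_id = resp.json()["values"][0]["workspaceId"]

# 2. Query objects with AQL via the Assets API (assumed endpoint and body shape)
base = f"https://api.atlassian.com/jsm/assets/workspace/{workspace_id}/v1"
objects = requests.post(f"{base}/object/aql",
                        json={"qlQuery": 'objectType = "Host"'},
                        auth=auth).json()

# 3. Update an attribute on a single object (assumed endpoint and body shape)
requests.put(f"{base}/object/<object-id>",
             json={"attributes": [{"objectTypeAttributeId": "<attr-id>",
                                   "objectAttributeValues": [{"value": "new-value"}]}]},
             auth=auth)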
I am trying to request the number of stars and commits of a public repository on GitLab using its Python client. However, I keep getting GitlabHttpError 503 when executing the following script.
import gitlab
import requests
url = 'https://gitlab.com/juliensimon/huggingface-demos'
private_token = 'xxxxxxxx'
gl = gitlab.Gitlab(url, private_token=private_token)
all_projects = gl.projects.list(all=True)
I read the previous posts, but none of them worked for me: [1], [2], and [3]. People mentioned:
Retrying later usually works [I tried at different times but still got the same error.]
Setting the no_proxy environment variable [I am not sure what that means in my case; I do not set a proxy explicitly.]
The 503 response is telling you something - your base URL is off. You only need to provide the base GitLab URL so the client makes requests against its api/v4/ endpoint.
Either use https://gitlab.com only, so that the client will correctly call https://gitlab.com/api/v4 endpoints (instead of trying https://gitlab.com/juliensimon/huggingface-demos/api/v4 as it does now), or skip it entirely when using GitLab.com if you're on python-gitlab 3.0.0 or later.
# Explicit gitlab.com
url = 'https://gitlab.com'
gl = gitlab.Gitlab(url, private_token=private_token)
# Or just use the default for GitLab.com (python-gitlab 3.0.0+ required)
gl = gitlab.Gitlab(private_token=private_token)
Edit: The original question was about the 503, but the comment to my answer is a follow-up on how to get project details. Here's the full snippet, which also fetches the token from the environment instead:
import os
import gitlab
private_token = os.getenv("GITLAB_PRIVATE_TOKEN")
gl = gitlab.Gitlab(private_token=private_token)
project = gl.projects.get("juliensimon/huggingface-demos")
print(project.forks_count)
print(project.star_count)
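The question also asked about the number of commits; assuming the same token and the project object from the snippet above, a minimal sketch is to page through the project's commits and count them:
# Page through all commits on the default branch and count them
commits = project.commits.list(get_all=True)
print(len(commits))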
Scenario: I need to find out when the latest commit in the repo was made, by whom, and to which branch that user committed.
Solution:
I'm using the Python azure.devops module, and here is my code:
from datetime import datetime

# models (azure.devops git models), repo, project and git_client are assumed
# to have been initialised earlier.
cm_search_criteria = models.GitQueryCommitsCriteria(history_mode='firstParent', top=10)
commits = git_client.get_commits(repo.id, search_criteria=cm_search_criteria, project=project.name)
for i in commits:
    datetimeobj = datetime.strptime(i.committer.date.strftime("%x"), '%m/%d/%y')
    last_commit_on = datetimeobj.date()
    last_commit_by = i.committer.email
    break
Now, how do I get the name of the branch to which the user committed the code? In the UI we can see the branch name; how can I get the same data using the Azure DevOps REST API?
You may need to use the Stats - List REST API to retrieve statistics about all branches within a repository and then evaluate which one contains the latest commit. You also need to consider the baseVersionDescriptor.versionOptions argument with the value firstParent.
I'm not sure whether the Python wrapper module supports this; it seems the GitHub project is archived now.
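If the Python wrapper does expose it, something along these lines might work as a sketch; get_branches and the attribute names used below are assumptions to verify against your azure-devops package version (it mirrors the Stats - List call mentioned above):
# Hypothetical sketch: list branch statistics and pick the branch whose tip
# commit is the most recent. Reuses repo, project and git_client from above.
branches = git_client.get_branches(repo.id, project=project.name)
latest = max(branches, key=lambda b: b.commit.committer.date)
print(latest.name, latest.commit.committer.email, latest.commit.committer.date)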
I am working on GitHub and Google Cloud Build integration for CI/CD.
I am able to link a repository to the Google Cloud Build installation in GitHub using the code below.
import requests
header = dict()
header['Authorization'] = 'token %s' % personal_token
header['Accept'] = 'application/vnd.github.machine-man-preview+json'
# Fetching repository id
url = f'https://api.github.com/repos/{organization_name}/{repo_name}'
output = requests.get(url, headers=header)
repo_id = output.json()['id']
# Adding repository to Google Cloud build installation
url = f'https://api.github.com/user/installations/{gcb_installation_id}/repositories/{repo_id}'
output = requests.put(url, headers=header)
When I try to create a trigger with the code below, I get the error google.api_core.exceptions.FailedPrecondition: 400 Repository mapping does not exist.
from google.cloud.devtools import cloudbuild_v1
client = cloudbuild_v1.CloudBuildClient()
build_trigger_template = cloudbuild_v1.types.BuildTrigger()
build_trigger_template.description = 'test to create trigger'
build_trigger_template.name = 'github-cloudbuild-trigger1'
build_trigger_template.github.name = 'github-cloudbuild'
build_trigger_template.github.pull_request.branch = 'master'
build_trigger_template.filename = 'cloudbuild.yaml'
response = client.create_build_trigger('dev', build_trigger_template)
Can anyone please help me with how to connect (map) a GitHub repository to a Google Cloud Build project using Python or any other automated process? I am able to map the repository manually but need to automate it.
Thanks,
Raghunath.
I have connected Visual Studio Online to my Azure website. This is not a .NET ASP.NET MVC project, just several static HTML files.
Now I want my files uploaded to Azure and available online after my commits/pushes to TFS.
When a build definition (based on GitContinuousDeploymentTemplate.12.xaml) is executed it fails with an obvious message:
Exception Message: The process parameter ProjectsToBuild is required but no value was set.
My question: how do I setup a build definition so that it automatically copies my static files to Azure on commits?
Or do I need to use different tooling for this task (like WebMatrix)?
Update:
I ended up creating an empty website and deploying it manually from Visual Studio using Web Deploy. Another possible option to consider is creating a local Git repository on Azure.
Alright, let me try to give you an answer:
I was having quite a similar issue. I had a static HTML, JS and CSS site which I needed to keep in TFS because of the project, and I wanted to make my life easier using continuous deployment. So what I did was the following:
When you have a Git repository in TFS, you get a URL for the repository - something like:
https://yoursite.visualstudio.com/COLLECTION/PROJECT/_git/REPOSITORY
However, in order to access the repository itself you need to authenticate, which is not currently possible if you try to put the URL with the credentials into Azure:
https://username:password@TFS_URL
It will not accept it. So what you do, in order to bind the deployment, is just put the repository URL there (the deployment will fail, but it will prepare the environment for us to proceed).
However, once you have linked it there, you can get the DEPLOYMENT TRIGGER URL on the Configure tab of the website. Its purpose is that when you push a change to your repository (say, to GitHub), GitHub makes an HTTP POST request to that link, which tells Azure to deploy the new code onto the site.
Then I went to Kudu, which is the underlying system of Azure Websites that handles deployments. I figured out that if you send the correct content in the HTTP POST (in JSON format) to the DEPLOYMENT TRIGGER URL, you can have it deploy code from any repository, and it even authenticates!
So the only thing left to do is to generate the alternate authentication credentials on the TFS site and put the whole request together. I wrapped the entire process into the following PowerShell script:
# Windows Azure Website Configuration
#
# WAWS_username: The user account which has access to the website, can be obtained from https://manage.windowsazure.com portal on the Configure tab under DEPLOYMENT TRIGGER URL
# WAWS_password: The password for the account specified above
# WAWS: The Azure site name
$WAWS_username = ''
$WAWS_password = ''
$WAWS = ''
# Visual Studio Online Repository Configuration
#
# VSO_username: The user account used for basic authentication in VSO (has to be manually enabled)
# VSO_password: The password for the account specified above
# VSO_URL: The URL to the Git repository (the branch is specified on the https://manage.windowsazure.com Configure tab under BRANCH TO DEPLOY)
$VSO_username = ''
$VSO_password = ''
$VSO_URL = ''
# DO NOT EDIT ANY OF THE CODE BELOW
$WAWS_URL = 'https://' + $WAWS + '.scm.azurewebsites.net/deploy'
$BODY = '
{
"format": "basic",
"url": "https://' + $VSO_username + ':' + $VSO_password + '#' + $VSO_URL + '"
}'
$authorization = "Basic "+[System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($WAWS_username+":"+$WAWS_password ))
$bytes = [System.Text.Encoding]::ASCII.GetBytes($BODY)
$webRequest = [System.Net.WebRequest]::Create($WAWS_URL)
$webRequest.Method = "POST"
$webRequest.Headers.Add("Authorization", $authorization)
$webRequest.ContentLength = $bytes.Length
$webRequestStream = $webRequest.GetRequestStream();
$webRequestStream.Write($bytes, 0, $bytes.Length);
$webRequest.GetResponse()
I hope that what I wrote here makes sense. The last thing you would need is to bind this script to a Git hook, so that when you perform a push the script gets triggered automatically afterwards and the site is deployed. I haven't figured this piece out yet, though.
This should also work for deploying PHP/Node.js and similar code.
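For reference, here is a minimal Python sketch of the same trigger request, using the same placeholder values as the script above; it is just the HTTP POST translated to requests, not an official client:
import requests

WAWS = '<azure-site-name>'
WAWS_USERNAME = '<deployment-trigger-username>'
WAWS_PASSWORD = '<deployment-trigger-password>'
VSO_USERNAME = '<vso-username>'
VSO_PASSWORD = '<vso-password>'
VSO_URL = '<tfs-git-repository-url>'

# Same JSON body the PowerShell script builds, sent with basic authentication
deploy_url = f'https://{WAWS}.scm.azurewebsites.net/deploy'
body = {
    'format': 'basic',
    'url': f'https://{VSO_USERNAME}:{VSO_PASSWORD}@{VSO_URL}',
}
response = requests.post(deploy_url, json=body, auth=(WAWS_USERNAME, WAWS_PASSWORD))
print(response.status_code)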
The easiest way would be to add them to an empty ASP .NET project, set them to be copied to the output folder, and then "build" the project.
Failing that, you could modify the build process template, but that's a "last resort" option.