How to retrieve the star count with the GitLab Python API?

I am trying to request the number of stars and commits of a public repository on GitLab using its Python client. However, I keep getting a GitlabHttpError 503 when executing the following script.
import gitlab
import requests
url = 'https://gitlab.com/juliensimon/huggingface-demos'
private_token = 'xxxxxxxx'
gl = gitlab.Gitlab(url, private_token=private_token)
all_projects = gl.projects.list(all=True)
I have read previous posts, but none of them works for me: [1], [2], and [3]. People mentioned:
Retrying later usually works [I tried at different times but still got the same error.]
Setting an environment variable for no_proxy [Not sure what this means in my case; I do not set a proxy explicitly.]

The 503 response is telling you something: your base URL is off. You only need to provide the base GitLab URL, so that the client makes requests against its api/v4/ endpoints.
Either use https://gitlab.com only, so that the client correctly calls the https://gitlab.com/api/v4 endpoints (instead of trying https://gitlab.com/juliensimon/huggingface-demos/api/v4, as it does now), or skip the URL entirely when using GitLab.com if you're on python-gitlab 3.0.0 or later.
# Explicit gitlab.com
url = 'https://gitlab.com'
gl = gitlab.Gitlab(url, private_token=private_token)
# Or just use the default for GitLab.com (python-gitlab 3.0.0+ required)
gl = gitlab.Gitlab(private_token=private_token)
Edit: The original question was about the 503, but a comment on my answer asks how to get the project details. Here's the full snippet, which also fetches the token from the environment instead:
import os
import gitlab
private_token = os.getenv("GITLAB_PRIVATE_TOKEN")
gl = gitlab.Gitlab(private_token=private_token)
project = gl.projects.get("juliensimon/huggingface-demos")
print(project.forks_count)
print(project.star_count)
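The question also asked about the number of commits. As a hedged sketch (not part of the original answer), you can page through the project's commits with python-gitlab and count them; note that this issues many API calls for large repositories:
# Counting commits by listing them all; `get_all=True` assumes python-gitlab 3.x
# (older releases use `all=True`). This can be slow for large repositories.
commits = project.commits.list(get_all=True)
print(len(commits))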

Related

How to activate GitLab Pages properly?

I am currently setting up GitLab Pages for our internal network. I have completed my project and the CI pipeline is working. I have gone through all the steps in the gitlab.rb configuration via the GitLab docs, but I still can't get GitLab Pages to work.
My gitlab.rb config:
gitlab_pages['enable'] = true
gitlab_pages['pages_external_url'] = "pages.domain.xyz"
gitlab_pages['external_http'] = ['192.168.x.x:80']
gitlab_pages['external_https'] = ['192.168.x.x:443']
gitlab_pages['cert'] = "/etc/gitlab/ssl/pages.domain.xyz.crt"
gitlab_pages['cert_key'] = "/etc/gitlab/ssl/pages.domain.xyz.key"
gitlab_pages['status_uri'] = "/@status"
gitlab_pages['max_connections'] = 0
gitlab_pages['log_format'] = "json"
gitlab_pages['log_verbose'] = true
gitlab_pages['redirect_http'] = true
gitlab_pages['dir'] = "/var/opt/gitlab/gitlab-pages"
gitlab_pages['log_directory'] = "/var/log/gitlab/gitlab-pages"
gitlab_pages['gitlab_server'] = 'https://gitlab.domain.xyz' # Defaults to external_url
My DNS is as follows:
A record for gitlab instance
A records for pages.domain.xyz
Wildcard for *.pages.domain.xyz
When I go to the Pages page in my project, the page URL is https://user.pages.domain.xyz/project, and I don't believe that is how it should work.
I hope someone can help me tackle this problem!
Maybe GitLab 15.4 (September 2022) will help:
Getting started with GitLab Pages just got easier
We’ve made it much easier to get started with GitLab Pages. Instead of creating configuration files by hand, build them interactively using the GitLab UI. Just answer a few basic questions on how your app is built, and we’ll build the .gitlab-ci.yml file to get you started.
This is the first time we’re using our new Pipeline Wizard, a tool that makes it easy to create .gitlab-ci.yml files by building them in the GitLab UI. You can look forward to more simplified onboarding helpers like this one.
See Documentation and Issue.

How to give repository access to an installed GitHub app

I am using Google Cloud Build for CI/CD, and I need to give access to specific repositories (I can't use the "All repositories" option). Is there any way to grant access repository by repository through Python code? If it is not possible through Python, is there an alternative way to meet this requirement?
Thanks,
Raghunath.
When I checked with GitHub support, they shared the links below with me:
To get repository id -- https://developer.github.com/v3/repos/#get-a-repository
To get installations details -- https://developer.github.com/v3/orgs/#list-installations-for-an-organization
To add repository to an installation -- https://developer.github.com/v3/apps/installations/#add-repository-to-installation
I used these links to create the code below, which helped me implement the desired requirement.
import requests

# Headers to use in requests; personal_token, organization_name, repo_name
# and gcb_installation_id are defined elsewhere
header = dict()
header['Authorization'] = 'token %s' % personal_token
header['Accept'] = 'application/vnd.github.machine-man-preview+json'
# Fetch the repository id
url = f'https://api.github.com/repos/{organization_name}/{repo_name}'
output = requests.get(url, headers=header)
repo_id = output.json()['id']
# Add the repository to the Google Cloud Build installation
url = f'https://api.github.com/user/installations/{gcb_installation_id}/repositories/{repo_id}'
output = requests.put(url, headers=header)
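For completeness, a small hedged follow-up: requests' raise_for_status() raises if GitHub rejected the call, so you can confirm the repository was actually added without checking for a specific status code.
# Raise if GitHub returned a non-2xx response; otherwise the repository
# was added to the installation successfully.
output.raise_for_status()
print(f'Added repository {repo_id} to installation {gcb_installation_id}')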

Python wrapper coinbase api errors

So I am trying to create a new wallet using the Python wrapper for the Coinbase API.
My current code is this:
from coinbase.wallet.client import Client

client = Client('API-Key',
                'SECRET',
                api_version='2019-12-30')
# Get your primary coinbase account
primary_account = client.get_primary_account()
address = primary_account.create_address()
print(address)
When trying to use the code above, I always get the error:
coinbase.wallet.error.AuthenticationError: APIError(id=authentication_error): request timestamp expired
My guess is that the wrapper is not passing the right timestamp.
On the GitHub page for this wrapper, it says that the current build is failing. I don't know how to fix this, and the repository hasn't had any recent updates. I tried looking at the client file to see if I could fix it myself, but I have had no luck.
I was facing the same issue. As I understood from the various contributions, the problem is caused by the difference between the local OS time and the Coinbase servers' time. Beyond about 30 seconds of difference, the Coinbase server returns the tedious timestamp-expiration error.
I found Python code that updates the local Windows time from various NTP servers, ntp_update_time.py (shared by gilmotta). Running the ntp_update_time script before running the Coinbase client again makes the error disappear, and everything works as described in the Coinbase API reference.
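If you would rather check the clock drift from Python before resyncing the OS clock, here is a minimal sketch using the third-party ntplib package (an assumption on my part; the original post used a separate script to actually resync Windows):
import ntplib  # third-party: pip install ntplib

# Compare the local clock with an NTP server; Coinbase rejects requests whose
# timestamp drifts roughly 30 seconds or more from its own time.
response = ntplib.NTPClient().request('pool.ntp.org', version=3)
print(f'Local clock is off by {response.offset:.2f} seconds')
if abs(response.offset) > 30:
    print('Resync your system clock before calling the Coinbase API')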

What is the suggested method to get service versions

What is the best way to get the list of service versions in Google App Engine in the flexible environment, from a service instance in Python 3? I want to authenticate using a service account JSON key file. I need to find the current default version (the one receiving most of the traffic).
Is there any library I can use, such as googleapiclient.discovery or google.appengine.api.modules? Or should I build it from scratch and call the REST API's apps.services.versions.list using OAuth? I couldn't find any information in the Google docs:
https://cloud.google.com/appengine/docs/standard/python3/python-differences#cloud_client_libraries
Finally, I was able to solve it. Simple things on GAE can become big problems.
SOLUTION:
I have the path to service_account.json set in the GOOGLE_APPLICATION_CREDENTIALS environment variable. Then you can use google.auth.default:
from googleapiclient.discovery import build
import google.auth

# APPLICATION_ID and SERVICE_ID identify the GAE application and service
creds, project = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
service = build('appengine', 'v1', credentials=creds, cache_discovery=False)
data = service.apps().services().get(appsId=APPLICATION_ID, servicesId=SERVICE_ID).execute()
print(data['split']['allocations'])
The return value includes an allocations dictionary with version IDs as keys and their traffic shares as values.
All the best!
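Since the original goal was to find the default version receiving most of the traffic, a short follow-up sketch on top of the allocations dictionary returned above:
# allocations maps version id -> share of traffic, e.g. {'v2': 1.0}
allocations = data['split']['allocations']
default_version = max(allocations, key=allocations.get)
print(default_version, allocations[default_version])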
You can use Google's Python client library to interact with the Google App Engine Admin API in order to get the list of a GAE service's versions.
Once you have google-api-python-client installed, you might want to use the list method to list all services in your application:
list(appsId, pageSize=None, pageToken=None, x__xgafv=None)
The arguments of the method should include the following:
appsId: string, Part of `name`. Name of the resource requested. Example: apps/myapp. (required)
pageSize: integer, Maximum results to return per page.
pageToken: string, Continuation token for fetching the next page of results.
x__xgafv: string, V1 error format. Allowed values: v1 error format, v2 error format
You can find more information on this method in the link mentioned above.
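Building on the same discovery client, a hedged sketch of listing the versions of a single service via apps.services.versions.list (APPLICATION_ID and SERVICE_ID are placeholders, as in the snippet above):
from googleapiclient.discovery import build
import google.auth

creds, _ = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
appengine = build('appengine', 'v1', credentials=creds, cache_discovery=False)

# List all versions of one service and print their id and serving status.
versions = appengine.apps().services().versions().list(
    appsId=APPLICATION_ID, servicesId=SERVICE_ID).execute()
for version in versions.get('versions', []):
    print(version['id'], version.get('servingStatus'))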

Serving Flask-RESTPlus on https server

I am relatively new to Python, and I created a microservice using flask-restplus.
It works fine on my computer and on the dev server served over HTTP.
I don't have control over where the microservice is deployed. In this case it seems to be behind a load balancer (not sure of the details) and served over HTTPS.
The actual error given by the browser is: "Can't read from server. It may not have the appropriate access-control-origin settings."
When I check the Network tab in the developer tools, I see that it fails to load swagger.json. It requests
http://hostname/api/swagger.json instead of https.
I have been googling and ran into discussions of this issue.
This seemed to be a fix that could work without me having to change the library or the configuration on the server. However, I still couldn't get it to work.
This is what I have:
In the API file:
api_blueprint = Blueprint('api', __name__, url_prefix='/api')
api = Api(api_blueprint, doc='/doc/', version='1.0', title='My api',
          description="My api")
In the main app file:
from flask import Flask
from werkzeug.contrib.fixers import ProxyFix
from lib.api import api_blueprint
app = Flask(__name__)
app.wsgi_app = ProxyFix(app.wsgi_app)
app.register_blueprint(api_blueprint)
I also tried adding:
app.config['SERVER_URL'] = 'http://testfsdf.co.za'  # but it doesn't look like it is being considered
I am using flask-restplus==0.9.2.
Any solution will be appreciated, as long as I don't have to change the configuration of the container where the service will be deployed (I am OK with setting environment variables), i.e. the service needs to be self-contained. If there is a version of flask-restplus that I can install with pip and that already has a fix, I would appreciate that too.
Thanks a lot, guys.
Override the Api class with the _scheme='https' option in the specs_url property:
from flask import url_for
from flask_restplus import Api

class MyApi(Api):
    @property
    def specs_url(self):
        """Monkey patch for HTTPS"""
        scheme = 'http' if '5000' in self.base_url else 'https'
        return url_for(self.endpoint('specs'), _external=True, _scheme=scheme)

api = MyApi(api_blueprint, doc='/doc/', version='1.0', title='My api',
            description="My api")
The solution above works like a charm. There are a couple of things you should check.
Before applying the fix, make sure in your Chrome developer tools -> Network tab that whenever you reload the page showing the Swagger UI (on the HTTPS server), you get a mixed-content error for the swagger.json request.
The solution in the above post solves the issue when deployed on an HTTPS server, but locally it might cause problems. For that you can use the environment variable trick.
Set a custom environment variable (or use any variable that already exists on your HTTPS server) when deploying your app. Check for the existence of that environment variable before applying the solution, to make sure your app is running on the HTTPS server.
Now when you run the app locally, this hack won't be applied and swagger.json will be served over HTTP, while on your server it will be served via HTTPS. The implementation might look similar to this:
import os
from flask import Flask, url_for
from flask_restplus import Api

app = Flask(__name__)

if os.environ.get('CUSTOM_ENV_VAR'):
    @property
    def specs_url(self):
        """Serve swagger.json over HTTPS when deployed behind TLS."""
        return url_for(self.endpoint('specs'), _external=True, _scheme='https')

    Api.specs_url = specs_url

api = Api(app)
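As a side note, and only an assumption about the deployment: if the load balancer sets the X-Forwarded-Proto header, the ProxyFix middleware already used in the question can be told to trust it, and url_for will then generate https URLs without patching specs_url. On newer Werkzeug versions it lives in werkzeug.middleware.proxy_fix:
from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)
# Trust one proxy hop for X-Forwarded-Proto so generated URLs use https
# when TLS is terminated at the load balancer (assumes the header is set).
app.wsgi_app = ProxyFix(app.wsgi_app, x_proto=1)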
