I can successfully access info about a user with this command:
curl http://gitlab.$INTERNAL_SERVER.com/api/v3/\
users/$USER_ID\?private_token\=$GITLAB_TOKEN
However, I cannot find an API endpoint that lists the commits a user has pushed to the GitLab server. Does a URL with this info exist?
To the best of my knowledge, such an API endpoint does not exist. Essentially the best I've been able to come up with is this flow:
find all the projects the user is involved with (not 100% simple in itself)
then get the commits for each of those projects
THEN filter those commits by the user's email.
I am using java-gitlab-api to access the Gitlab server, so don't have curl samples handy (sorry!).
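For reference, here is a rough sketch of that flow against the plain REST API (v4) using Python's requests instead of curl; the host, token, user id and email are placeholders, and it assumes the /users/:id/projects and /projects/:id/repository/commits endpoints plus the author_email field on each commit:

# Hedged sketch of the projects -> commits -> filter-by-email flow (GitLab API v4).
# Host, token, user id and email below are placeholders.
import requests

GITLAB_HOST = "https://gitlab.example.com"
HEADERS = {"PRIVATE-TOKEN": "<GITLAB_TOKEN>"}
USER_ID = 42
USER_EMAIL = "someone@example.com"

# 1. Projects the user is a member of (only those visible to the token).
projects = requests.get(
    "%s/api/v4/users/%d/projects" % (GITLAB_HOST, USER_ID), headers=HEADERS
).json()

user_commits = []
for project in projects:
    # 2. Commits of each project (paginated; only the first page is fetched here).
    commits = requests.get(
        "%s/api/v4/projects/%d/repository/commits" % (GITLAB_HOST, project["id"]),
        headers=HEADERS,
    ).json()
    # 3. Keep only commits authored with the user's email.
    user_commits += [c for c in commits if c.get("author_email") == USER_EMAIL]

print(len(user_commits), "commits found")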
It looks like you can get a list of commits by using the Events endpoint:
import requests  # host and user_id are assumed to be defined elsewhere
data = requests.get(host + "/api/v4/users/{id}/events".format(id=user_id),
                    params={"action": "pushed"})
You can then page backwards by updating the params:
params.update({"before": before_date})
where before_date is taken from the last element in data; you can loop continuously to collect all commits by the user back to a specific date.
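A rough sketch of that loop (hedged: host, user id and token are placeholders, and created_at is assumed to be the timestamp field on each event):

# Hedged sketch: page backwards through a user's push events.
import requests

host = "https://gitlab.example.com"
user_id = 42
headers = {"PRIVATE-TOKEN": "<GITLAB_TOKEN>"}

params = {"action": "pushed"}
all_events = []

while True:
    data = requests.get(host + "/api/v4/users/{id}/events".format(id=user_id),
                        params=params, headers=headers).json()
    if not data:
        break
    all_events.extend(data)
    # "before" only takes a date (YYYY-MM-DD), so truncate the timestamp;
    # stop if the date no longer moves, to avoid looping forever.
    before_date = data[-1]["created_at"][:10]
    if params.get("before") == before_date:
        break
    params.update({"before": before_date})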
I have written a Python script that does what #demaniak suggests. Enjoy:
import requests
import ujson as json

header = {...}

def get_all_commits_gitlab(project_id, username):
    json_loads_of_commit = []
    f_date = "2022-01-01T00:00:42.000+01:00"
    params = {"until": f_date}
    url_p = "https://gitlab.xxx.xx/api/v4/projects/%d/repository/commits" % project_id
    r = requests.get(url_p, params, headers=header)
    c = 0
    while r.status_code == 200:
        jsLoad = json.loads(r.content)
        newDate = jsLoad[-1]["committed_date"]
        if params["until"] == newDate:
            break
        user_commits = []
        for cm in jsLoad:
            if cm["author_name"] == username:
                user_commits.append(cm)
                c += 1
        json_loads_of_commit.append(user_commits)
        params["until"] = newDate
        r = requests.get(url_p, params, headers=header)
    print("project %d: %d commits by user %s, the first one %s" % (project_id, c, username, newDate))
    return json_loads_of_commit
We are building a search with the Algolia search product for our Moodle site.
To update the index on Algolia's side we want to use an automated process (uploading the index data).
I decided to start with the Python client (Python 3.9.x). Since we are working inside a corporate network, we are behind firewalls and there is a problem accessing Algolia's servers.
I'm testing with two methods:
Search within already existing indices: index.search
Update all indexes: index.replace_all_objects
Error message: 'Unreachable hosts'
Here are my code snippets:
# Search:
searchAppID = constants.ALGOLIA_APP_ID
searchKeyHash = constants.ALGOLIA_SEARCH_KEY
config = SearchConfig(searchAppID, searchKeyHash)
config.connect_timeout = 2
config.read_timeout = 5
config.write_timeout = 30
client = SearchClient.create_with_config(config)
index = client.init_index('myIndex')
result = index.search('SearchString')
# function to update index using json file:
def replace_IdxObj(client, index, fileName):
    try:
        if index.exists():
            with open(fileName, encoding="utf8") as f:
                data = json.load(f)
                result = index.replace_all_objects(data, {'autoGenerateObjectIDIfNotExist': True})
                return result
        else:
            print('No Index found! Cancel operation... \n')
            return None
    except Exception as err:
        print("Error: " + str(err))
When I test my code outside of the company's network, everything works fine!
I also have an SSL certificate that can be used to work with external resources, so I used it in the following test (run from the company's network, behind the firewall):
response = requests.get('https://google.com/', verify=("C:\\Program Files\\Common Files\\SSL\\certs\\cert.cer"))
print(response)
This test works just fine (I'm getting 200 response)!
Please advise: is there any way to make the Python process used by Algolia's Python client use the certificate that I have? Any alternatives?
Many thanks!
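Not a definitive fix, but one thing that may be worth trying: recent versions of the Algolia Python client send their HTTPS calls through the requests library, and requests honours the REQUESTS_CA_BUNDLE (and HTTPS_PROXY) environment variables. A minimal sketch, assuming that client behaviour; the certificate path is the one from the question, and the app ID, key and proxy URL are placeholders:

# Hedged sketch: point requests-based clients at the corporate CA bundle
# before the Algolia client makes any calls.
import os

os.environ["REQUESTS_CA_BUNDLE"] = r"C:\Program Files\Common Files\SSL\certs\cert.cer"
# If outbound traffic must go through a corporate proxy, requests also honours:
# os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"

from algoliasearch.search_client import SearchClient

client = SearchClient.create("<ALGOLIA_APP_ID>", "<ALGOLIA_SEARCH_KEY>")
index = client.init_index('myIndex')
result = index.search('SearchString')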
I have to export incident data from the ServiceNow REST API. The incident state is one of new, in progress, pending, not resolved, and closed. I am able to fetch data for incidents in the active state, but I am not able to apply the correct filter; also, the output shows an extra character 'b', so how do I remove that extra character?
Input:
import requests
URL = 'https://instance_name.service-now.com/incident.do?CSV&sysparm_query=active=true'
user = 'user_name'
password = 'password'
headers = {"Accept": "application/xml"}
response = requests.get(URL, auth=(user, password), headers=headers)
if response.status_code != 200:
    print('Status:', response.status_code, 'Headers:', response.headers, 'Error Response:', response.content)
    exit()
print(response.content.splitlines())
Output:
[b'"number","short_description","state"', b'"INC0010001","Test incident creation through REST","New"', b'"INC0010002","incident creation","Closed"', b'"INC0010004","test","In Progress"']
It's a bytes literal (for more info, please refer to https://docs.python.org/3/reference/lexical_analysis.html#string-and-bytes-literals).
To remove the bytes prefix, we need to decode the strings.
Give this a try:
new_list = []
s = [b'"number","short_description","state"',
     b'"INC0010001","Test incident creation through REST","New"',
     b'"INC0010002","incident creation","Closed"',
     b'"INC0010004","test","In Progress"']
for ele in s:
    ele = ele.decode("utf-8")
    new_list.append(ele)
print(new_list)
Output:
['"number","short_description","state"', '"INC0010001","Test incident
creation through REST","New"', '"INC0010002","incid
ent creation","Closed"', '"INC0010004","test","In Progress"']
Hope it works!
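As a shorter alternative, requests can do the decoding for you: response.text is the already-decoded body, so with the snippet from the question you can skip the per-element decode entirely.

# response.text is the decoded body, so splitting it yields plain strings
print(response.text.splitlines())

# or decode the raw bytes explicitly
print(response.content.decode("utf-8").splitlines())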
I am trying to figure out a way to get a user's access key age through an AWS Lambda function using Python 3.6 and Boto 3. My issue is that I can't seem to find the right API call, if one exists, for this purpose. The two closest I can find are list_access_keys, which I can use to find the creation date of the key, and get_access_key_last_used, which gives me the day the key was last used. However, neither of these, nor any other call I can find, simply gives the access key age as shown in the AWS IAM console users view. Does a way exist to simply get the access key age?
This simpler code does the same thing without all the time conversion:
import boto3
from datetime import date
client = boto3.client('iam')
username = "<YOUR-USERNAME>"
res = client.list_access_keys(UserName=username)
accesskeydate = res['AccessKeyMetadata'][0]['CreateDate'].date()
currentdate = date.today()
active_days = currentdate - accesskeydate
print (active_days.days)
There is no direct way. You can use the following code snippet to achieve what you are trying:
import boto3, json, time, datetime, sys
client = boto3.client('iam')
username = "<YOUR-USERNAME>"
res = client.list_access_keys(UserName=username)
accesskeydate = res['AccessKeyMetadata'][0]['CreateDate']  ### Use a for loop if you are going to run this in production; this just grabs the first key
accesskeydate = accesskeydate.strftime("%Y-%m-%d %H:%M:%S")
currentdate = time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime())
accesskeyd = time.mktime(datetime.datetime.strptime(accesskeydate, "%Y-%m-%d %H:%M:%S").timetuple())
currentd = time.mktime(datetime.datetime.strptime(currentdate, "%Y-%m-%d %H:%M:%S").timetuple())
active_days = (currentd - accesskeyd)/60/60/24  ### The difference is in seconds; convert it to days
print (int(round(active_days)))
Let me know if this works as expected.
Upon further testing, I've come up with the following, which runs in Lambda. This Python 3.6 function will email users if their IAM keys are 90 days or older.
Pre-requisites
all IAM users have an email tag with a proper email address as the value.
Example:
IAM user tag key: email
IAM user tag value: someone@gmail.com
every email address used needs to be verified in SES
import boto3, os, time, datetime, sys, json
from datetime import date
from botocore.exceptions import ClientError

iam = boto3.client('iam')
email_list = []

def lambda_handler(event, context):
    print("All IAM user emails that have AccessKeys 90 days or older")
    for userlist in iam.list_users()['Users']:
        userKeys = iam.list_access_keys(UserName=userlist['UserName'])
        for keyValue in userKeys['AccessKeyMetadata']:
            if keyValue['Status'] == 'Active':
                currentdate = date.today()
                active_days = currentdate - \
                    keyValue['CreateDate'].date()
                if active_days >= datetime.timedelta(days=90):
                    userTags = iam.list_user_tags(
                        UserName=keyValue['UserName'])
                    email_tag = list(filter(lambda tag: tag['Key'] == 'email', userTags['Tags']))
                    if len(email_tag) == 1:
                        email = email_tag[0]['Value']
                        email_list.append(email)
                        print(email)

    email_unique = list(set(email_list))
    print(email_unique)

    RECIPIENTS = email_unique
    SENDER = "AWS SECURITY "
    AWS_REGION = os.environ['region']
    SUBJECT = "IAM Access Key Rotation"
    BODY_TEXT = ("Your IAM Access Key needs to be rotated in AWS Account: 123456789 as it is 3 months or older.\r\n"
                 "Log into AWS and go to your IAM user to fix: https://console.aws.amazon.com/iam/home?#security_credential")
    BODY_HTML = """
    AWS Security: IAM Access Key Rotation: Your IAM Access Key needs to be rotated in AWS Account: 123456789 as it is 3 months or older. Log into AWS and go to your https://console.aws.amazon.com/iam/home?#security_credential to create a new set of keys. Ensure to disable / remove your previous key pair.
    """
    CHARSET = "UTF-8"

    client = boto3.client('ses', region_name=AWS_REGION)
    try:
        response = client.send_email(
            Destination={
                'ToAddresses': RECIPIENTS,
            },
            Message={
                'Body': {
                    'Html': {
                        'Charset': CHARSET,
                        'Data': BODY_HTML,
                    },
                    'Text': {
                        'Charset': CHARSET,
                        'Data': BODY_TEXT,
                    },
                },
                'Subject': {
                    'Charset': CHARSET,
                    'Data': SUBJECT,
                },
            },
            Source=SENDER,
        )
    except ClientError as e:
        print(e.response['Error']['Message'])
    else:
        print("Email sent! Message ID:")
        print(response['MessageId'])
The above methods only give you the age of the access keys. As a best practice, or from a security standpoint, you should also check the rotation period, i.e. when the keys were last rotated. If a key's rotation age is more than 90 days you can alert your team.
The only way to get the rotation age of the access keys is by using the credential report from IAM. Download it, parse it, and calculate the age.
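A hedged sketch of that approach with Boto3; generate_credential_report and get_credential_report are the IAM calls involved, and access_key_1_last_rotated is one of the credential report's CSV columns:

# Hedged sketch: compute access key rotation age from the IAM credential report.
import csv
import io
import time
from datetime import datetime, timezone

import boto3

iam = boto3.client("iam")

# The report must be generated before it can be fetched; poll until it is ready.
while iam.generate_credential_report()["State"] != "COMPLETE":
    time.sleep(2)

report_csv = iam.get_credential_report()["Content"].decode("utf-8")
for row in csv.DictReader(io.StringIO(report_csv)):
    rotated = row["access_key_1_last_rotated"]
    if rotated in ("N/A", "not_supported"):
        continue
    age = datetime.now(timezone.utc) - datetime.fromisoformat(rotated.replace("Z", "+00:00"))
    if age.days > 90:
        print("%s: access key 1 last rotated %d days ago" % (row["user"], age.days))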
I intend to transfer issues from Redmine to GitLab using this script:
https://github.com/sdslabs/redmine-to-gitlab/blob/master/issue-tranfer.py
It works, but I would like to keep the issue IDs during the transition. By default GitLab just starts from #1 and increments. I tried adding newissue['iid'] = issue['id'] and variations to the parameters, but apparently GitLab simply does not permit assigning an ID. Does anyone know if there's a way?
"issue" is the data acquired from Redmine:
newissue = {}
newissue['id'] = pro['id']
newissue['title'] = issue['subject']
newissue['description'] = issue["description"]
if 'assigned_to' in issue:
    auser = con.finduserbyname(issue['assigned_to']['name'])
    if auser:
        newissue['assignee_id'] = auser['id']
print newissue
if 'fixed_version' in issue:
    newissue['milestone_id'] = issue['fixed_version']['id']
newiss = post('/projects/' + str(pro['id']) + '/issues', newissue)
and this is the "post" function
def post(url, load={}):
    load['private_token'] = conf.token
    r = requests.post(conf.base_url + url, params=load, verify=conf.sslverify)
    return r.json()
The API does not allow you to specify an issue ID at creation time; the ID is intended to be sequential. The only way you could potentially accomplish this is to interact with the database directly. If you choose this route, I caution you to be extremely careful, and have backups.
I'm trying to write a Python script to talk to my instance of Jenkins. I am using the newest version of the jenkinsapi module and querying Jenkins 1.509.3.
I can get a job list like follows:
l=j.get_jobs_list()
where j is an instance of jenkinsapi.Jenkins (I used the requester from jenkinsapi.utils.requester to skip SSL verification).
However, when I try to get more information on an individual job with
j.get_job(l[0])
it fails with this error: "Inappropriate content found at [some_address]", and what is returned is a bunch of HTML (which looks like my instance's landing page, the one you see when you log in) instead of anything resembling the expected response. Pasting [some_address] into the browser gives me the response I expect.
While I can get some information about the Jenkins instance, what I am really interested in is info on individual jobs. Any ideas how to fix this and get the job info?
Using Python 3.6, python-jenkins 1.0.1 and Jenkins 2.121.1, the following works nicely:
import pprint
import jenkins

IP = 'localhost'
USERNAME = 'my_username'
PW = 'my_password'

def get_version(server):
    user = server.get_whoami()
    version = server.get_version()
    print('Hello %s from Jenkins %s' % (user['fullName'], version))

def get_jobs(server):
    jobs = server.get_jobs()  # List[dict]
    print("Here are top 5 jobs")
    pprint.pprint(jobs[:5])
    return jobs

def get_job(server, job_name):
    job_config = server.get_job_config(job_name)  # XML
    job_info = server.get_job_info(job_name)  # dict
    print("\n --- JOB CONFIG --- ")
    print(job_config)
    print("\n --- JOB INFO --- ")
    pprint.pprint(job_info)

if __name__ == "__main__":
    _server = jenkins.Jenkins(IP, username=USERNAME, password=PW)
    get_version(_server)
    _jobs = get_jobs(_server)
    get_job(_server, _jobs[0]['name'])
The Jenkins API I was using is documented here: https://python-jenkins.readthedocs.io/en/latest/index.html