Download dotenv job artifact via GitLab API - gitlab

I'm trying to download a dotenv artifact called release.env from the latest job on the branch feature/tests-update, but the API returns a 404 error.
API docs: https://docs.gitlab.com/14.8/ee/api/job_artifacts.html#download-a-single-artifact-file-by-job-id
Request example:
curl -v --get --header "PRIVATE-TOKEN: <TOKEN>" --data-urlencode "job=create-release" "https://<gitlab-host>/api/v4/projects/2/jobs/artifacts/feature%2Ftests-update/raw/release%2Eenv"
GitLab Community Edition 14.8.

Unfortunately, artifacts:reports:dotenv artifacts are not exposed by the API. Only files in the artifacts archive (e.g. in artifacts:paths:) can be retrieved from this endpoint.
You are, however, able to download the dotenv reports from the UI.
As far as I can tell, this seems to be an oversight in the jobs API.
You can see that the jobs API lists dotenv reports in its artifacts with a filename of .env.gz:
...
'artifacts': [{'file_type': 'trace',
               'size': 9954,
               'filename': 'job.log',
               'file_format': None},
              {'file_type': 'dotenv',
               'size': 66,
               'filename': '.env.gz',
               'file_format': 'gzip'}],
...
However, even if you use the filename .env.gz, it seems you cannot download this file from the API.
As a workaround, you can add your release.env file to artifacts:paths: and retrieve it in the way you described.
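A minimal sketch of that workaround (the job name, file, project ID, and branch follow the question; the file contents are illustrative):

create-release:
  script:
    - echo "RELEASE_VERSION=1.2.3" > release.env   # illustrative content
  artifacts:
    reports:
      dotenv: release.env   # still exposed to downstream jobs as variables
    paths:
      - release.env         # also added to the artifacts archive so the API can serve it

The file can then be fetched with the raw single-artifact endpoint from the question:

curl --header "PRIVATE-TOKEN: <TOKEN>" \
  "https://<gitlab-host>/api/v4/projects/2/jobs/artifacts/feature%2Ftests-update/raw/release.env?job=create-release"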

Related

Get the curl command for each api endpoint defined in swagger.yaml file

I have a swagger.yaml file where I document all the node.js api endpoints. It uses openapi version 3.0.0. We also use the express-openapi-validator package to validate the request and response of every api endpoint defined in the yaml file.
I want to get just the curl command for each endpoint defined in the yaml file. Is there an npm package, or anything else, that can generate curl commands from the swagger.yaml file so that I can display them inside a custom React component on a custom-built page?

How to download artifacts using a URL in a gitlab job?

In the gitlab documentation, some URLs are described for the purpose of downloading artifacts from pipelines HERE. They seem to have forgotten to describe HOW to actually download artifacts given these URLs.
Can it be done in a simple way? Or do I have to install e.g. wget, create a token, define a token, use a token?
If someone could give an example that would be great.
The documentation referenced in the question describes a REST API call, using the Job Artifacts API:
GET /projects/:id/jobs/artifacts/:ref_name/download?job=name
To use this in a script definition inside .gitlab-ci.yml, you can use either:
The JOB-TOKEN header with the GitLab-provided CI_JOB_TOKEN variable.
For example, the following job downloads the artifacts of the test job of the main branch.
The command is wrapped in single quotes because it contains a colon (:):
artifact_download:
  stage: test
  script:
    - 'curl --location --output artifacts.zip --header "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts/main/download?job=test"'
Or the job_token attribute with the GitLab-provided CI_JOB_TOKEN variable.
For example, the following job downloads the artifacts of the test job of the main branch:
artifact_download:
  stage: test
  script:
    - 'curl --location --output artifacts.zip "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts/main/download?job=test&job_token=$CI_JOB_TOKEN"'
But the artifacts: directive is meant to store data from the job workspace, for a new iteration of the job to get back, in the same folder.
No "download" involved, as illustrated in the article "GitLab CI: Cache and Artifacts explained by example" by Anton Yakutovich.
As such, no curl/wget/TOKEN should be needed to access an artifact stored by a previous job execution.
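A minimal sketch of that idea, with illustrative job names, where the downstream job simply finds the artifact in its own workspace:

build:
  stage: build
  script:
    - mkdir -p build && echo "hello" > build/output.txt   # illustrative artifact
  artifacts:
    paths:
      - build/

use_artifact:
  stage: test
  needs: ["build"]   # artifacts from build are extracted into this job's workspace
  script:
    - cat build/output.txt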

Updating a file for a quick-pull using github cli

Currently in the github UI, a user can edit a file and create a new branch in a single action. This can also be done through the github api using something like this:
curl 'https://github.com/<my_org>/<my_repo>/tree-save/master/<path_to_file>' \
-H 'content-type: application/x-www-form-urlencoded' \
--data-raw 'authenticity_token=<generated_token>&filename=<filename>&new_filename=<filename>&content_changed=true&value=<new_contents_of_file>&message=Updated+file+in+my+repo&placeholder_message=Update+<filename>&description=&commit-choice=quick-pull&target_branch=<new_branch_name>&quick_pull=master&guidance_task=&commit=<target_commit_checksum>&same_repo=1&pr='
What I would like to be able to do, is perform the same action using the github cli* (gh). I have tried using the following commands:
gh api <my_org>/<my_repo>/tree-save/master/<path_to_file> -F "filename=<filename>" -F ...
and
gh api repos/<my_org>/<my_repo>/contents/<path_to_file> -F "filename=<filename>" -F ...
For both cases (and many variations on these options), I'm getting a 404** back. Any ideas what I'm doing wrong? Does the github cli even allow the functionality allowed in the above curl?
* For those curious, I want to use the CLI because of how it handles auth and its statelessness. I can't generate a token to use, like in the curl above. And, due to multiple issues, I also can't clone the repo locally.
** I'm able to retrieve the file just fine using the simple GET command (the second command above without the '-F' flags)
After reading documentation, and then verifying by altering credentials, it appears to be a permissions issue. Evidently, for security reasons, if a token is used that does not meet the required permissions, a 404 is returned instead of a 403.
Interestingly, I can still use the curl above through the browser. So now I need to figure out why the gh cli token does not have the same permissions as my user.
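For reference, a sketch of the same edit-and-branch flow using only documented REST endpoints through gh api (the org, repo, branch name, file path, and message placeholders mirror the question, and the token still needs write access to the repo contents):

# SHA the new branch should start from (tip of master)
sha=$(gh api repos/<my_org>/<my_repo>/git/ref/heads/master --jq '.object.sha')

# create the new branch
gh api repos/<my_org>/<my_repo>/git/refs -f ref="refs/heads/<new_branch_name>" -f sha="$sha"

# current blob SHA of the file, then update it on the new branch
file_sha=$(gh api repos/<my_org>/<my_repo>/contents/<path_to_file> --jq '.sha')
gh api -X PUT repos/<my_org>/<my_repo>/contents/<path_to_file> \
  -f message="Updated file in my repo" \
  -f branch="<new_branch_name>" \
  -f sha="$file_sha" \
  -f content="$(base64 <<< '<new_contents_of_file>')"   # contents must be base64-encoded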

Is there an API in SwaggerHub to update the file definition?

Is there an API to update the file definition?
I am looking for a way to keep my project in Git and SwaggerHub in sync automatically, so I would like to update the file definition at every merge.
Is it possible? How do you manage keeping your project and SwaggerHub definition in sync automatically?
Yes, SwaggerHub has an API:
https://api.swaggerhub.com
Integrating with the SwaggerHub API
and a number of official API clients.
API
cURL command to create or update an API (note the use of --data-binary instead of -d/--data):
curl -X POST "https://api.swaggerhub.com/apis/OWNER/API_NAME" \
  -H "Authorization: YOUR_API_KEY" \
  -H "Content-Type: application/yaml" \
  --data-binary @myapi.yaml
Raw HTTP request for the reference:
POST https://api.swaggerhub.com/apis/OWNER/API_NAME
Authorization: YOUR_API_KEY
Content-Type: application/yaml
# Request body is your complete YAML/JSON file
swagger: '2.0'
info:
  title: My API
  version: 1.0.0
paths:
  ...
Use the correct Content-Type header value: application/yaml for YAML or application/json for JSON.
SwaggerHub CLI
A command-line wrapper around the SwaggerHub API, available as an npm module.
npm install -g swaggerhub-cli
Specify your API key (get it from https://app.swaggerhub.com/settings/apiKey):
swaggerhub configure
? SwaggerHub URL: https://api.swaggerhub.com
? API Key: <paste your key>
Create a new API:
swaggerhub api:create OWNER/API_NAME --file myapi.yaml
Update an existing API:
swaggerhub api:update OWNER/API_NAME/VERSION --file myapi.yaml --visibility private
Maven plugin
https://github.com/swagger-api/swaggerhub-maven-plugin/
Gradle plugin
https://github.com/swagger-api/swaggerhub-gradle-plugin/

How to get subfolders and files using gitlab api

I am using the gitlab api to get the files and folders and succeeded,
but I can only get the directory names, not their subfolders and files.
So, how can I get the full tree of my repository?
Please let me know.
Thanks in advance,
Mallikarjuna
According to the API, we can use
GET /projects/:id/repository/tree
to list files and directories in a project. But this only returns the files and directories at the top level of the repo; sub-directories of top-level directories can be listed by passing the path param.
If you wanna get directories of script/js/components, for example, you can use
GET /projects/:id/repository/tree?path=script/js/components
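A minimal curl sketch of that call (host, project ID, token, and ref are placeholders):

curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://<gitlab-host>/api/v4/projects/<project_id>/repository/tree?path=script/js/components&ref=master"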
REST API
You can use the recursive option to get the full tree using /projects/:id/repository/tree?recursive=true
For example : https://your_gitlab_host/api/v4/projects/:id/repository/tree?recursive=true&per_page=100
GraphQL API
You can also use the recently released Gitlab GraphQL API to get the trees in a recursive way :
{
  project(fullPath: "project_name_here") {
    repository {
      tree(ref: "master", recursive: true) {
        blobs {
          nodes {
            name
            type
            flatPath
          }
        }
      }
    }
  }
}
You can go to the following URL: https://$gitlab_url/-/graphql-explorer and paste the above query.
The GraphQL endpoint is a POST on "https://$gitlab_url/api/graphql".
An example using curl & jq :
gitlab_url=<your gitlab host>
access_token=<your access token>
project_name=<your project name>
branch=master
curl -s -H "Authorization: Bearer $access_token" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "{ project(fullPath: \"'$project_name'\") { repository { tree(ref: \"'$branch'\", recursive: true){ blobs{ nodes { name type flatPath }}}}}}"
  }' "https://$gitlab_url/api/graphql" | jq '.'
You should URL-encode the full path of the file. For example, let's assume the path to the file under your repository is javascript/header.js.
Then you could use:
curl --head --header "PRIVATE-TOKEN: <your_access_token>" "https://<gitlab-host>/api/v4/projects/<project_id>/repository/files/javascript%2Fheader%2Ejs"
Of course, as mentioned in other responses, you have missed the path attribute of the gitlab repositories API which lets you browse the file hierarchy.
In addition, for simplicity, the python-gitlab project exposes it through its projects API. Example:
# list the content of the root directory for the default branch
items = project.repository_tree()
# list the content of a subdirectory on a specific branch
items = project.repository_tree(path='docs', ref='branch1')
For getting the whole tree with sub-directories and files, you can set the recursive parameter to true.
By default it's false.
API: {Gitlab_URL}/api/v4/projects/{Project_id}/repository/tree?recursive=true
Thanks!
