How to create multiple repositories in OpenCMIS?

I am working on Java integrated with a CMIS repository, and I have tried all the functions of the OpenCMIS FileShare server by changing repository.properties to match our requirements.
Now I want to create multiple repositories, and I have configured them in the repository.properties file at /src/main/webapp/WEB-INF/classes/repository.properties:
repository.secondRepository = {user.home}
repository.secondRepository.readwrite = test, cmisuser
repository.secondRepository.readonly = reader,user
Is there a REST API or another way to create multiple repositories?

Repository management is not in the scope of CMIS and therefore there is no standardized API for adding repositories.
But the OpenCMIS FileShare implementation can dynamically add new repositories. The class FileShareRepositoryManager provides the method addRepository() to do this.
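A minimal sketch of what that could look like, assuming you can obtain the FileShareRepositoryManager from your service factory: addRepository() is the documented entry point, but the FileShareRepository constructor and the user-mapping setters shown here are assumptions based on how the FileShare service factory wires repositories from repository.properties, so verify them against your OpenCMIS version.

// Hedged sketch: programmatically registering a second repository with the
// OpenCMIS FileShare server. Only addRepository() is confirmed by the answer;
// the constructor arguments and setter names below are assumptions.
import org.apache.chemistry.opencmis.fileshare.FileShareRepository;
import org.apache.chemistry.opencmis.fileshare.FileShareRepositoryManager;
import org.apache.chemistry.opencmis.fileshare.FileShareTypeManager;

public class AddRepositoryExample {

    public static void addSecondRepository(FileShareRepositoryManager manager,
                                           FileShareTypeManager typeManager) {
        // repository id, root folder on disk, shared type manager (assumed signature)
        FileShareRepository repository =
                new FileShareRepository("secondRepository", "/data/secondRepository", typeManager);

        // mirror the readwrite/readonly entries from repository.properties (assumed method names)
        repository.setUserReadWrite("test");
        repository.setUserReadWrite("cmisuser");
        repository.setUserReadOnly("reader");
        repository.setUserReadOnly("user");

        // register the new repository with the running FileShare server
        manager.addRepository(repository);
    }
}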

Related

How to clone all public repositories from a GitLab server?

There is an unstable GitLab server and I am not sure it will keep working in the future. Therefore, I want to make a backup copy of all the repositories (projects) that are on it.
Cloning the source code will be enough, but it would be great if there were a way to save the issues as well. Is there any way to do this?
It depends on what kind of access you have, but if you don't have administrator access to do a full backup, then the best thing to do is to use a couple of API endpoints to get the information you need and go from there.
Use the Projects API to get a list of all projects accessible to you.
Note the pagination limits.
What you store depends on how you want to get the information.
Store at least the ID number of each.
Filter by membership if you only want the ones you're a member of.
Filter by min_access_level = maintainer (or higher) if you want to export whole projects.
Use the Project export API to trigger a project export for each project where you're a member and a maintainer (or higher).
For all other projects, where you have a lower role or the project is public, you can still git clone the repositories by storing the ssh_url_to_repo or http_url_to_repo from the Projects API and running through each.
For all other parts of a project, you could store the JSON version to recreate them later if you want to go through the hassle. For example, for issues, use the Issues API.
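A minimal sketch of the list-and-clone part of this approach, using Java 11's HttpClient with Jackson (jackson-databind on the classpath) for JSON parsing; the GitLab URL, the GITLAB_TOKEN environment variable, and the backup directory layout are placeholders.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CloneAllProjects {

    private static final String GITLAB_URL = "https://gitlab.example.com"; // placeholder instance URL
    private static final String TOKEN = System.getenv("GITLAB_TOKEN");     // personal access token

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        ObjectMapper mapper = new ObjectMapper();

        // The Projects API is paginated (max 100 per page), so keep requesting
        // pages until an empty page comes back. Add &membership=true if you
        // only want the projects you are a member of.
        for (int page = 1; ; page++) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(GITLAB_URL + "/api/v4/projects?per_page=100&page=" + page))
                    .header("PRIVATE-TOKEN", TOKEN)
                    .GET()
                    .build();
            JsonNode projects = mapper.readTree(
                    client.send(request, HttpResponse.BodyHandlers.ofString()).body());
            if (!projects.isArray() || projects.size() == 0) {
                break; // no more pages
            }
            for (JsonNode project : projects) {
                long id = project.get("id").asLong();
                String cloneUrl = project.get("http_url_to_repo").asText();
                // Plain clone as a fallback; for private projects use
                // ssh_url_to_repo with a configured SSH key, or embed credentials.
                new ProcessBuilder("git", "clone", cloneUrl, "backup/" + id)
                        .inheritIO()
                        .start()
                        .waitFor();
            }
        }
    }
}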

How to list all repositories (cross organization) using Azure DevOps API

Is there a way to list every repository from an ADO instance using the API?
I see it is possible to list the repositories of a specific project, but that would result in many API calls (one per project in each organization). Is there a more efficient way to do this?
There is no other way. What you can do to make this more efficient is to first call for the list of projects
GET https://dev.azure.com/{organization}/_apis/projects?api-version=6.0
and then iterate over the response, requesting the repositories of each project.
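A minimal sketch of that two-step approach with Java 11's HttpClient and Jackson (jackson-databind) for JSON parsing; the organization URL and the ADO_PAT environment variable holding a personal access token are placeholders.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ListAllAdoRepositories {

    private static final String ORG_URL = "https://dev.azure.com/myorg"; // placeholder organization
    private static final String AUTH = "Basic " + Base64.getEncoder()
            .encodeToString((":" + System.getenv("ADO_PAT")).getBytes(StandardCharsets.UTF_8));

    private static final HttpClient CLIENT = HttpClient.newHttpClient();
    private static final ObjectMapper MAPPER = new ObjectMapper();

    private static JsonNode get(String url) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Authorization", AUTH)
                .GET()
                .build();
        return MAPPER.readTree(CLIENT.send(request, HttpResponse.BodyHandlers.ofString()).body());
    }

    public static void main(String[] args) throws Exception {
        // 1. One call for all projects in the organization
        JsonNode projects = get(ORG_URL + "/_apis/projects?api-version=6.0");

        // 2. One call per project for its Git repositories (project ID avoids URL-encoding issues)
        for (JsonNode project : projects.get("value")) {
            String projectId = project.get("id").asText();
            JsonNode repos = get(ORG_URL + "/" + projectId
                    + "/_apis/git/repositories?api-version=6.0");
            for (JsonNode repo : repos.get("value")) {
                System.out.println(project.get("name").asText() + " / " + repo.get("name").asText());
            }
        }
    }
}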

GitLab change permission of protected branches

We recently migrated to GitLab Self Hosted (V14.3.0)
We migrated 100+ repos to GitLab and then we realized that, by default, only maintainers have write access to GitLab protected branches.
Is there a way to change the following setting in one shot for multiple repositories, or will we have to change it manually for every repository?
We want to change "Allowed to merge" from "Maintainers" to "Developers + Maintainers"
In the main group we have already set it accordingly, hoping that this would make it work, but no luck.
Doing it manually would be a bad approach, but the GitLab API offers a lot of functionality for this problem. I will not write the script, but I will outline the APIs you can use and why.
Fetch a list of all projects you want to change - the Projects API
GET /projects
With this endpoint you will receive a list of all the projects within your instance to which the user has access. Be aware that this is a paginated request, so calling it once will not be sufficient.
Adapt the protected branches - the Protected Branches API
With the project IDs from the first step you can now query each project and change the protection. We ended up first deleting the protection and then recreating it, because that proved to be easier (see the sketch below).
In any case, I recommend automating this with a script, and doing it sooner rather than later, as some projects might start adding custom protections, which can make the migration harder.
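A minimal sketch of the unprotect-and-reprotect step for a single project via the Protected Branches API (DELETE followed by POST); the GitLab URL, token, project ID, and branch name are placeholders, while the access-level constants 30 (Developer) and 40 (Maintainer) are GitLab's documented values.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ReprotectBranch {

    private static final String GITLAB_URL = "https://gitlab.example.com"; // placeholder instance URL
    private static final String TOKEN = System.getenv("GITLAB_TOKEN");

    public static void reprotect(long projectId, String branch) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String encodedBranch = URLEncoder.encode(branch, StandardCharsets.UTF_8);
        String base = GITLAB_URL + "/api/v4/projects/" + projectId + "/protected_branches";

        // 1. Remove the existing protection
        HttpRequest delete = HttpRequest.newBuilder()
                .uri(URI.create(base + "/" + encodedBranch))
                .header("PRIVATE-TOKEN", TOKEN)
                .DELETE()
                .build();
        client.send(delete, HttpResponse.BodyHandlers.discarding());

        // 2. Recreate it: merge for Developers + Maintainers (30), push for Maintainers (40)
        HttpRequest create = HttpRequest.newBuilder()
                .uri(URI.create(base + "?name=" + encodedBranch
                        + "&merge_access_level=30&push_access_level=40"))
                .header("PRIVATE-TOKEN", TOKEN)
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
        client.send(create, HttpResponse.BodyHandlers.discarding());
    }

    public static void main(String[] args) throws Exception {
        // Iterate this over the project IDs collected from GET /projects
        reprotect(123, "main"); // placeholder project ID and branch name
    }
}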
the GitLab API offers a lot of functionality for this problem
Actually, GitLab 15.6 (November 2022) does provide said API:
Update access levels from Protected Branch API
Previously, the UI was required to update the access levels of protected
branches. The API required you to unprotect, then reprotect, a branch when
updating its access levels.
Now, the protected branches API enables you to directly update which users or groups are allowed_to_push, allowed_to_merge, allowed_to_unprotect, and more.
This one-step method decreases the risk of a bot changing this setting and leaving a branch unprotected.
See Documentation and Issue.
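A minimal sketch of the one-step update available since GitLab 15.6, assuming the PATCH payload shape with an allowed_to_merge array of access levels as described in the protected branches documentation (verify the exact attribute format against your GitLab version); the URL, token, project ID, and branch name are placeholders.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class UpdateBranchProtection {

    public static void main(String[] args) throws Exception {
        String gitlabUrl = "https://gitlab.example.com"; // placeholder instance URL
        String token = System.getenv("GITLAB_TOKEN");
        long projectId = 123;                            // placeholder project ID
        String branch = URLEncoder.encode("main", StandardCharsets.UTF_8);

        // 30 = Developer access level, so merges are allowed for Developers + Maintainers
        // (payload shape assumed from the API docs)
        String body = "{\"allowed_to_merge\": [{\"access_level\": 30}]}";

        HttpRequest patch = HttpRequest.newBuilder()
                .uri(URI.create(gitlabUrl + "/api/v4/projects/" + projectId
                        + "/protected_branches/" + branch))
                .header("PRIVATE-TOKEN", token)
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(patch, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}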

How to restrict exposing project settings file to all in mavenExecute step of cloud-s4-sdk pipeline?

We are working on the s4sdk pipeline implementation for delivery of SAP CloudFoundry applications (spring-boot micro-services) using the SAP Cloud SDK for Java.
We have multiple developers working on multiple micro-services but all these micro-services are having some common dependencies.
We want to control the versions for all the common dependencies from a central location.
For this we have created a Maven BOM (Bill of Materials) dependency and added it as the parent in pom.xml of all the micro-services.
The aforementioned BOM is housed in a Nexus repository, and all pom.xml files (of the micro-services) can access the parent using a repository tag like the one below.
<repository>
<id>my-repo</id>
<name>nexus-repo</name>
<url>http://some/url</url>
</repository>
The credentials for the above nexus repository are placed in the settings.xml file.
We want to run the above model in the cloud-s4-sdk pipeline. Although it works fine, the problem is that we need to expose the Nexus repository access credentials in the settings.xml file.
Per documentation in https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#mavenexecute, the settings.xml for maven builds needs to be placed relative to the
project root. This is a security concern for us as the project repository is in GitHub and as such projectSettingsFile can be accessed by the developers.
We don't want these credentials to be exposed to the developers. It should be limited to only the admin team.
Is there a way we can achieve this using the cloud-s4-sdk pipeline?
Although Nexus supports user tokens for the Maven settings.xml, that does not work here, as GUI login is still possible using the token values.
I think you could consider the following options:
Allow anonymous read access for artifacts
The developers need a way to build the artifacts locally anyway; how could they build your service without having access to its dependencies? Allowing anonymous read access would enable them to do that.
Commit credentials to git but make git repository private
If you don't want to allow all employees (I guess only employees have access to your Nexus), you can commit the credentials together with the settings.xml but make the repository private so that these details are not shared.
Inject credentials as environment variable
You can inject the credentials as environment variables into your settings.xml file. See also: How to pass Maven settings via environment variables.
To set up the environment variables, you can surround the full pipeline in your Jenkinsfile with the withCredentials step. For details see: https://jenkins.io/doc/pipeline/steps/credentials-binding/
String pipelineVersion = "master"
node {
    deleteDir()
    sh "git clone --depth 1 https://github.com/SAP/cloud-s4-sdk-pipeline.git -b ${pipelineVersion} pipelines"
    withCredentials([usernamePassword(credentialsId: 'nexus', usernameVariable: 'NEXUS_USERNAME', passwordVariable: 'NEXUS_PASSWORD')]) {
        load './pipelines/s4sdk-pipeline.groovy'
    }
}
and a settings.xml like:
<username>${env.NEXUS_USERNAME}</username>
<password>${env.NEXUS_PASSWORD}</password>

Transfer set of repositories from one gitlab group to another subgroup all at once

How do I transfer a repository, or a whole set of repositories, from one GitLab group to another subgroup? For example, from companyname.gitlab.com/team one/ to gitlab.com/team_first/phase1/.
The repositories/projects themselves still need to be exported by API, one by one.
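For the per-project part, a minimal sketch of triggering exports via the Project export API (POST /projects/:id/export); once an export has finished you can download it from /projects/:id/export/download. The GitLab URL, token, and project IDs are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class ExportProjects {

    public static void main(String[] args) throws Exception {
        String gitlabUrl = "https://gitlab.example.com"; // placeholder instance URL
        String token = System.getenv("GITLAB_TOKEN");
        List<Long> projectIds = List.of(101L, 102L, 103L); // placeholder IDs collected from GET /projects

        HttpClient client = HttpClient.newHttpClient();
        for (long id : projectIds) {
            // Schedules an export; poll GET /projects/{id}/export until
            // "export_status" is "finished", then download the archive.
            HttpRequest trigger = HttpRequest.newBuilder()
                    .uri(URI.create(gitlabUrl + "/api/v4/projects/" + id + "/export"))
                    .header("PRIVATE-TOKEN", token)
                    .POST(HttpRequest.BodyPublishers.noBody())
                    .build();
            HttpResponse<String> response =
                    client.send(trigger, HttpResponse.BodyHandlers.ofString());
            System.out.println("Project " + id + ": " + response.statusCode());
        }
    }
}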
But the new "Group Import/Export" feature from GitLab 13.0 (May 2020) can be a welcome addition.
Export and Import Groups in the UI
Previously, users could only migrate groups by using the Export/Import API to create an Export file, then using the API a second time to upload the file to the target instance.
As a first step toward a more frictionless solution, we have enabled Group Export in the GitLab UI.
We plan to introduce similar Import functionality to the UI within the next few weeks.
See documentation, issue and Epic.
See GitLab 14.2 (August 2021)
Group Migration achieves parity with group import/export
The new GitLab Migration feature can now migrate an entire group with all its subgroups and related data. The data migrated includes everything contained in group exports, making this a much easier way to migrate entire groups.
The pre-existing group import/export is a two-step process that requires exporting a file and then importing it into another GitLab instance.
Now, users can initiate a group migration with a single click. Migration also includes all the subgroups and their data, which previously required separate export and import processes for each subgroup.
See Documentation and Epic.
See GitLab 15.6 (November 2022)
Associate MRs to issues when migrating groups with projects
When migrating groups using GitLab Migration, GitLab now preserves associations of imported merge requests to issues.
This populates the Related merge requests section
on the issue details page.
See Documentation and Issue.
