I have a private GitHub repo which looks like this:
[Project root]
- angular
- nodejs
Of course, it would be cleaner to separate these two projects into two
repos, but due to the limited number of private repos on GitHub I
decided to place both parts in the same repo.
Is there a way I can work with my own GitHub repo and push a new version of a nodejs application to OpenShift without having to use OpenShift's git repo?
I remember that CloudFoundry - another PaaS - has its own CLI tool which simply lets you run cf push <app>, with no need to have a git repo at CF. Is there something similar with rhc?
The easiest (and probably best) solution to this problem is to use GitLab (my preference) or another service such as Bitbucket for hosting private repos. GitLab offers unlimited free private repos.
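If you do want to keep the single private repo and still deploy to OpenShift, one workable approach is to add OpenShift's git URL as a second remote of the same local repository. This is only a sketch: the remote URL below is a placeholder for the one shown on your application page, and the nodejs/ prefix assumes your layout from above.

```bash
# Keep GitHub as "origin" and add OpenShift's git URL as a second remote
# (the URL below is a placeholder; copy the real one from the application page).
git remote add openshift ssh://<app-id>@<app>-<domain>.rhcloud.com/~/git/<app>.git

# Since the node.js app lives in the nodejs/ subdirectory, push only that
# subtree to OpenShift's master branch, which is what OpenShift builds.
git subtree push --prefix nodejs openshift master
```

This keeps GitHub as the single source of truth; the OpenShift repo just receives whatever you choose to deploy.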
Related
I'm new to Google Cloud Source Repositories and confused as to why code pushed to one repository is not syncing with another repository. Is it because all of the cloud repos need to be connected to the same GitLab/GitHub repository, or am I missing something else with respect to syncing code across repositories?
I have two repositories, namely DEV and QA. I pushed my code to DEV and cannot see any of the pushed code in the QA repository. As of now, neither is connected to any GitLab/GitHub repository, as far as I can see in the settings of the cloud repositories.
Posting #ezkl's comment as a Community wiki for better visibility.
If you are looking for the code you pushed to one repository to also be available in another repository, that isn't how git repositories work.
To achieve this you would need some automation.
There are some out-of-the-box tools that might be useful:
Push code from an existing repository on your local machine to Cloud Source Repositories
Mirroring of GitHub or Bitbucket repositories
Add a Google Cloud repository as a remote to a local Git repository (sketched below)
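For the last option, a minimal sketch of keeping two Cloud Source Repositories in step from one local clone. PROJECT_ID and the repo names DEV and QA are placeholders for your own project.

```bash
# One-time setup: authenticate git to Cloud Source Repositories.
gcloud init
git config credential.helper gcloud.sh

# Add both cloud repos as remotes of the same local repository.
git remote add dev https://source.developers.google.com/p/PROJECT_ID/r/DEV
git remote add qa  https://source.developers.google.com/p/PROJECT_ID/r/QA

# Pushing the same branch to both remotes is what keeps them "in sync";
# nothing happens automatically unless you set up mirroring.
git push dev master
git push qa  master
```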
I have enabled Bitbucket Pipelines in one of my node.js repositories to have it run the build on every commit. My repository depends on another node.js repository. For development I've linked the one to the other using npm link.
I've tried adding a git clone of that repository to the bitbucket-pipelines.yml file, but the build gets stuck on that command. I guess that's because git is asking for authentication at that point.
Is there a way to allow the container to access other repositories in the same team? Or is there a better way altogether on how to solve this? I'd also be fine with switching to another CI tool if Bitbucket Pipelines aren't capable of this – the only requirement is that it's free for teams < 5 people.
Btw. I'd like to avoid paying for npm private packages if possible.
Thanks!
You can set up access to the other repo with an SSH key, as described in the official docs: https://confluence.atlassian.com/bitbucket/access-remote-hosts-via-ssh-847452940.html
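Roughly, the flow is: generate an SSH key pair in the Pipelines settings of the repo that runs the build, register the public key as a read-only access key on the dependency repo, and then clone over SSH inside the build. A sketch of the commands that would go into the script section of bitbucket-pipelines.yml; the team and repo names are placeholders.

```bash
# The SSH key configured in the Pipelines settings is available inside the
# build container, so this clone no longer prompts for credentials.
git clone git@bitbucket.org:your-team/shared-lib.git ../shared-lib

# Install the sibling repo in place of the npm-linked package, then build as usual.
npm install ../shared-lib
npm install
npm test
```

This avoids npm private packages entirely, since the dependency is pulled straight from the Bitbucket repo.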
I have searched for this a lot but I didn't find the solution I want.
I have one private repository on GitLab with the Flask microframework and I want to put it on OpenShift. I followed this link, then changed the file structure so it works for OpenShift, and everything was going well. But then I realized that any commit/push I make to the OpenShift repo does not appear in the GitLab repo. Any solution? Or what is a useful workflow for this?
This is because they are two different repositories, and you need to push to both.
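A minimal sketch of that workflow, assuming GitLab stays your "origin" and OpenShift is added as a second remote (the URL below is a placeholder for the one shown on your OpenShift application page):

```bash
# Add OpenShift's git URL alongside the existing GitLab origin.
git remote add openshift ssh://<app-id>@<app>-<domain>.rhcloud.com/~/git/<app>.git

# After each change, push the same branch to both remotes so they stay in sync.
git push origin master
git push openshift master
```

Commits made directly in the OpenShift repo will never show up on GitLab unless you pull them back and push them there yourself, so it's simplest to treat GitLab as the source of truth and OpenShift as a deploy target.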
Is it OK to push to GitLab repositories directly, from outside GitLab?
Mainly what I would like to know is:
* Would GitLab detect the changes?
* Is it safe, as in it will not break repos due to concurrency?
If I understand your question correctly, you're asking whether it is possible to push commits to a GitLab instance from a Git client other than GitLab.
There's no problem at all with this; actually, this is exactly what Git and GitLab are about.
It doesn't matter at all which Git client you use to create your commits and push them to the server running GitLab. Think of GitLab as just one possible frontend to your repositories.
If you are interested in the technical background:
Git is completely file-based and doesn't rely on any kind of central server managing your repositories. All relevant data is stored in the .git subdirectory of your project. This enables the use of multiple clients with a single repository - for example plain git and GitLab.
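A quick way to see this for yourself, using nothing but the stock git CLI:

```bash
# Everything the repository knows lives under .git — no server involved.
git init demo && cd demo
echo "hello" > README.md
git add README.md && git commit -m "first commit"
ls .git    # objects/, refs/, HEAD, config, ... — the entire repository state
```

Any tool that reads and writes this directory layout (or talks the git protocols to a remote) is a valid client, which is why pushing from the command line to a GitLab-hosted repo is completely normal.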
Gitlab internally uses the gem gitlab_git which itself uses the library rugged that provides Ruby bindings for libgit2. That library is also used in the implementation of other git clients, "including the GitHub.com site, in Plastic SCM and also powering Microsoft's Visual Studio tools for Git".
Regarding the handling of actual concurrency problems, have a look at this answer by kan. Correct permissions are handled via git hooks, as kindly pointed out in the comment below by Ciro Santilli.
OpenShift seems to encourage developers to push a repo directly to OpenShift, which is pretty convenient. If an application (using node.js, but that probably doesn't matter much for this question) has mostly public files, but a few private files for things like DB passwords, external api keys, license keys, New Relic config, etc., what is the recommended way of deploying?
One idea that comes to mind is to have 1 public repo, 1 private repo, and a deploy script that puts everything together, commits to a separate private deploy repo, and then push that deploy repo to OpenShift.
This seems like it would be a common use case though, so perhaps the deploy script with extra repo is unnecessary if OpenShift already has a process for this.
The recommended way is to use environment variables for things like DB passwords and keys. And unless you are also copying your code to a public GitHub repo, everything you push to your OpenShift-hosted git repo is private to you.
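A minimal sketch for OpenShift Online v2 using the rhc client; the variable names and the app name myapp are placeholders for your own setup. The node.js code then reads the values via process.env (e.g. process.env.DB_PASSWORD) instead of a checked-in config file.

```bash
# Set secrets on the gear as environment variables, outside of git entirely.
rhc env set DB_PASSWORD=s3cr3t NEW_RELIC_LICENSE_KEY=abc123 --app myapp

# Verify what is currently set on the application.
rhc env list --app myapp
```

With this in place, the repo can stay fully public-safe: only non-secret code and config templates get committed, and the deploy-script-plus-private-repo dance from the question becomes unnecessary.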