Google Cloud Source Repositories are not in sync

I'm new to Google Cloud Source Repositories and confused as to why code pushed to one repository is not syncing to another repository. Is it because all of the cloud repos need to be connected to the same GitLab/GitHub repository, or am I missing something else with respect to syncing code across repositories?
I have two repositories, DEV and QA. I pushed my code to DEV and am not able to see any of the pushed code in the QA repository. As of now, neither is connected to any GitLab/GitHub repository, as far as I can see in the settings of each cloud repository.

Posting #ezkl's comment as a Community wiki for better visibility.
If you are looking for code you push to one repository to also be available in another repository, that isn't how git repositories work.
To achieve this you would need some automation.
There are some out-of-the-box tools that might be useful; a short sketch of the last option follows the list:
Push code from an existing repository on your local machine to Cloud Source Repositories
Mirroring of GitHub or Bitbucket repositories
Add a Google Cloud repository as a remote to a local Git repository
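
For instance, the last option (adding a repository as a remote) could keep DEV and QA in step from a single local clone. A minimal sketch, assuming a project ID of my-project and that the gcloud credential helper is already configured for authentication:

# Clone the DEV repository (standard Cloud Source Repositories URL format).
git clone https://source.developers.google.com/p/my-project/r/DEV
cd DEV

# Add the QA repository as a second remote of the same clone.
git remote add qa https://source.developers.google.com/p/my-project/r/QA

# Push the same history to both repositories.
git push origin master
git push qa master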

Related

How to automate pushing code from a local repository to a remote Bitbucket repository?

We have a job that automatically generates Java code and pushes it to a local repository on Linux.
I need to find a solution for pushing the newly created Java code from the local repo to the Bitbucket repo automatically.
Each time the Java code is generated, it will be in a folder with a different name.
Would it help to use branches?
You could create branches automatically and push them to Bitbucket, for example along the lines of the sketch below.
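
A minimal sketch of that automation, run after each generation (the repository path, remote name, and the way the generated folder name is passed in are assumptions):

#!/bin/sh
# $1 is the name of the folder the job just generated, e.g. build-20170512.
GENERATED_DIR="$1"

cd /path/to/local/repo || exit 1

# Create a branch named after the generated folder, commit it, and push it to Bitbucket.
git checkout -b "generated/$GENERATED_DIR"
git add "$GENERATED_DIR"
git commit -m "Add generated code: $GENERATED_DIR"
git push origin "generated/$GENERATED_DIR"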

How to set up the authoring env to publish a site to a remote git repo?

1. Downloaded and started the authoring environment (crafter-cms-authoring.zip).
2. Created a site backed by a remote git repo as described in: Create site based on a blueprint then push to remote bare git repository.
3. Created a content type and a new page.
4. Published everything.
Now I would expect to see my changes in the remote repo. But all I can see are the initial commits from step 2 above. No new content type, no new page, no branch "live". (The content items are, however, visible in the local repo.)
What is missing?
Edit:
Since Crafter can be set up in many ways, in order to clarify my deployment scenario, I am adding a deployment diagram and a short description.
There are 3 hosts, one for each environment, plus a shared git repo.
Authoring
This is where Studio is located and content authors make changes. Each change is saved to the sandbox local git repository. When content is published, the changes are pulled into the published local git repository. These two local repos are not accessible from other hosts.
Delivery
This is what provides published content to the end user/application.
The Deployer is responsible for getting new publications to the Delivery instance. It does so by polling (periodically pulling from) a specific git repository. When it pulls new changes, it updates the local site git repository and the Solr indexes.
Gitlab
This hosts the site git repository. It is accessible from both the Authoring and Delivery hosts. After its creation, the new site is pushed to this repo. The repo is also polled for new changes by the Deployers of the Delivery instances.
In order for this setup to work, the published changes must somehow end up in GitLab's site repo, but they do not (the red communication path in the diagram from the Authoring Deployer to GitLab's site repo).
Solution based on summerz's answer
I implemented GitPushProcessor and configured a new deployment target in the authoring Deployer by adding mysite-live.yaml to /opt/crafter-cms-authoring/data/deployer/target/:
target:
  env: live
  siteName: codelists
  engineUrl: http://localhost:9080
  localRepoPath: /opt/crafter-cms-authoring/data/repos/sites/mysite/published
  deployment:
    pipeline:
      - processorName: gitPushProcessor
        remoteRepo:
          url: ssh://path/to/gitlab/site/mysite
I think you might have confused push with publish.
On Publishing
Authoring (Studio) publishes to Delivery (Engine) after an approval workflow that makes content go live. Authoring is where content (and code if you like) is managed and previewed safely, then that is published to the live delivery nodes for delivery to the end-user.
On DevOps
A site's local git repository can be pushed/pulled to/from remote repositories. This means:
Code can flow from a developer's workstation to Studio (via GitHub, GitLab, Bitbucket, etc.) <== this is code moving forward (and it can flow via environments like QA, Load Testing, etc.)
Content can flow back, from Studio to the developer's local workstation in a similar manner <== this is content moving backward (you can have production content on your laptop if you want)
When code flows forward from a developer to Studio, that's when Studio pulls from the remote git repo.
When content flows backward from Studio to the developer, that's when Studio pushes to the remote git repo.
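
In plain git terms, and assuming the site's sandbox repository is mirrored on a shared remote such as GitLab (the URL below is a placeholder), the developer-side half of that flow might look like this:

# Code moving forward: the developer commits locally and pushes to the shared remote;
# Studio later pulls those changes into its sandbox repository.
git clone ssh://git@gitlab.example.com/mysite/sandbox.git
cd sandbox
# ...edit templates, scripts, etc....
git commit -am "Update page template"
git push origin master

# Content moving backward: once Studio has pushed authored content to the remote,
# the developer simply pulls it down.
git pull origin master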
Documentation
A good bird's eye view of the architecture of the system relating to publishing can be found here: http://docs.craftercms.org/en/3.0/developers/architecture.html
A good article that explains the DevOps workflow/Git stuff is here: http://docs.craftercms.org/en/3.0/developers/developer-workflow.html
Update based on the expanded question
My new understanding based on your question is: You can't allow the deployers in Delivery to access Authoring's published repo to poll due to some constraint (even over SSH and even with limits on the source IP). You'd like to use GitLab as a form of content depot that's accessible as a push from Authoring and pull from Delivery.
If my understanding is correct, I can think of two immediate solutions.
1. Set up a cron job in Authoring to push to GitLab periodically.
You'll need to add GitLab as a remote repo in published and then set up a cron like this:
* * * * * git --git-dir /opt/crafter/data/repos/sites/{YOUR_SITE}/published/.git push 2>&1
Test it out by hand first, then cron it.
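
Adding the remote by hand might look like the following sketch (the GitLab URL and the remote name gitlab are placeholders). Note that if the remote is not the clone's default, the push in the cron entry above would need to name it explicitly (e.g. git push gitlab master):

cd /opt/crafter/data/repos/sites/{YOUR_SITE}/published
git remote add gitlab ssh://git@gitlab.example.com/mysite/site.git

# Try the push manually before relying on the cron entry.
git push gitlab master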
2. Write a deployer processor that can push content out to an endpoint upon a change, or wait for this ticket: https://github.com/craftercms/craftercms/issues/2017.
Once this is built, you'll need to configure another deployer in Authoring that will push to GitLab.
In either case, beware not to update things directly in GitLab, since you're using published and not sandbox. (See the DevOps notes above to learn why.)

Subversion Edge SVN cloud solution

At the moment we have a local server as our SVN server, using Subversion Edge 3.1.0, where users push their commits; it is used as the main repository. Recently this has been giving us some problems: the server tends to shut down or run into problems, after which it needs to be restarted.
Since we also have some people offshore working on the same repository, we decided it's best to set up an Azure VM; this will act as a backup server and also keep the repository updated with each commit (like Dropbox, File Sync, etc.).
My questions are:
Has anyone actually managed to set up an environment similar to this?
How do the commits work when someone pushes to the cloud repository and someone then pushes to the local repository?
Has anyone actually managed to set up an environment similar to this?
As long as you have the networking configured such that users can reach this Azure VM via HTTP (preferably HTTPS), it should be no different from hosting a repository on your company network.
How do the commits work when someone pushes to the cloud repository and someone then pushes to the local repository?
Subversion has no notion of a "cloud repository" vs. a "local repository" because it's a centralized VCS - there is only one repository, ever.
Users would simply commit to your Azure-hosted repository instead of the on-premises one. The commits work exactly the same.
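
If existing working copies still point at the old on-premises server, they can be repointed instead of being checked out again. A sketch assuming Subversion 1.7+ and placeholder URLs:

# Run inside an existing working copy to point it at the Azure-hosted repository.
svn relocate http://old-svn.internal/svn/myrepo https://myrepo.azure.example.com/svn/myrepo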
this will act as a backup server and also have the repository updated with each commit
Subversion on its own is not a backup! You must take regular backups of your repository and keep them in a location separate from the repository server to truly keep your repository data safe.
Your repository will always be "updated with each commit" because that's how Subversion works in the first place - assuming, of course, that your developers are committing code regularly.
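
For the backup side, a minimal sketch using standard Subversion tooling (repository and backup paths are placeholders):

# Portable full dump of the repository, written to storage that is not on the repository server.
svnadmin dump /var/svn/myrepo > /mnt/backup/myrepo-$(date +%F).dump

# Alternatively, a byte-for-byte copy of the repository, which is faster to restore.
svnadmin hotcopy /var/svn/myrepo /mnt/backup/myrepo-hotcopy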

Bitbucket Pipelines access other node repository

I have enabled Bitbucket Pipelines in one of my Node.js repositories to have it run the build on every commit. My repository depends on another Node.js repository. For development I've linked the one to the other using npm link.
I've tried doing a git clone of that other repository, specified in the bitbucket-pipelines.yml file, but the build gets stuck on that command. I guess it's because git is asking for authentication at that point.
Is there a way to allow the container to access other repositories in the same team? Or is there a better way altogether on how to solve this? I'd also be fine with switching to another CI tool if Bitbucket Pipelines aren't capable of this – the only requirement is that it's free for teams < 5 people.
Btw. I'd like to avoid paying for npm private packages if possible.
Thanks!
You can set up access to the other repository with an SSH key, as described in the official docs: https://confluence.atlassian.com/bitbucket/access-remote-hosts-via-ssh-847452940.html
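
Once the Pipelines SSH key has been registered as an access key on the other repository (per the docs above), the clone can use the SSH URL so no interactive authentication is needed. A rough sketch of the commands for the step's script section in bitbucket-pipelines.yml (team and repository names are placeholders):

# The SSH key configured for Pipelines authenticates this clone automatically.
git clone git@bitbucket.org:myteam/other-repo.git

# Install the cloned dependency from its local path instead of using npm link.
npm install ./other-repo
npm install
npm test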

Push Node.js app from a subdirectory of a GitHub repo to OpenShift

I have a private GitHub repo which looks like this:
[Project root]
- angular
- nodejs
Of course, it would be cleaner to separate these two projects into two repos, but due to the limited number of private repos on GitHub I decided to place both parts in the same repo.
Is there a way I can work with my own GitHub repo and push a new version of a Node.js application to OpenShift without having to use OpenShift's git repo?
I remember that CloudFoundry - another PaaS - has its own CLI tool which simply lets you run cf push <app>, with no need to have a git repo at CF. Is there something similar with rhc?
The easiest (and probably best) solution to this problem is to use GitLab (my preference) or another service such as Bitbucket for hosting private repos. GitLab offers unlimited free private repos.
