Subversion Edge SVN cloud solution - Azure

At the moment we have a local server acting as our SVN server, using Subversion Edge 3.1.0, where users push their commits; it serves as the main repository. Recently it has been giving us problems: the server tends to switch off or run into issues that require a restart.
Since we also have some people offshore working on the same repository, we decided it's best to set up an Azure VM. This would act as a backup server and also keep the repository updated with each commit (like Dropbox, File Sync, etc.).
My questions are:
Has anyone actually managed to set up an environment similar to this?
How do the commits work when someone pushes to the cloud repository and someone else then pushes to the local repository?

Has anyone actually managed to set up an environment similar to this?
As long as you have the networking configured such that users can reach this Azure VM via HTTP (preferably HTTPS), it should be no different from hosting a repository on your company network.
How do the commits work when someone pushes to the cloud repository and someone else then pushes to the local repository?
Subversion has no notion of a "cloud repository" vs. a "local repository" because it's a centralized VCS - there is only one repository, ever.
Users would simply commit to your Azure-hosted repository instead of the on-premises one. The commits work exactly the same.
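For example, existing working copies could simply be repointed at the new URL; a minimal sketch, assuming an SVN 1.7+ client and a hypothetical Azure hostname:
# Repoint an existing working copy at the Azure-hosted repository:
svn relocate https://myvm.cloudapp.net/svn/myrepo
# Commits then go to the Azure VM like to any other SVN server:
svn commit -m "First commit against the Azure-hosted repository"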
this would act as a backup server and also keep the repository updated with each commit
Subversion on its own is not a backup! You must take regular backups of your repository and keep them in a location that is separate from the repository server to truly keep your repository data safe.
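For example, a scheduled job could dump or hotcopy the repository and ship the result off-host; a minimal sketch, with hypothetical paths:
# Full portable dump of the entire revision history:
svnadmin dump /var/svn/myrepo | gzip > /backups/myrepo-$(date +%F).svndump.gz
# Or a consistent copy of the whole repository directory:
svnadmin hotcopy /var/svn/myrepo /backups/myrepo-hotcopy
# Either way, copy the result to a machine other than the repository server.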
Your repository will always be "updated with each commit" because that's how Subversion works in the first place. Assuming your developers are committing code regularly, that is.

Related

Google Cloud Source Repositories are not in sync

I'm new to Google Cloud Source Repositories and confused as to why code pushed to one of the repositories is not syncing to another repository. Is it because all of the cloud repos need to be connected to the same GitLab/GitHub repository, or am I missing something else with respect to syncing code across repositories?
I have two repositories, namely DEV and QA. I pushed my code to DEV and am not able to see any of the pushed code in the QA repository. As of now, neither is connected to any GitLab/GitHub repository, as I can see in the settings of the cloud repositories.
Posting @ezkl's comment as a Community wiki for better visibility.
If you are looking for the code you pushed to one repository to also be available in another repository, that isn't how git repositories work.
To achieve this goal you would need automation.
There are some out of the box tools that might be useful:
Push code from an existing repository on your local machine to Cloud Source Repositories
Mirroring of GitHub or Bitbucket repositories
Add a Google Cloud repository as a remote to a local Git repository (see the sketch below)
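As an illustration of that last option, the same local clone could push to both of your repositories by adding each as a remote; a minimal sketch, with a hypothetical project ID:
# Add both Cloud Source Repositories as remotes of one local clone
# (the project ID "my-project" is hypothetical):
git remote add dev https://source.developers.google.com/p/my-project/r/DEV
git remote add qa https://source.developers.google.com/p/my-project/r/QA
# Push the same branch to both to keep them in sync by hand:
git push dev master
git push qa master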

How to setup authoring env to publish site to remote git repo?

I downloaded and started the authoring environment (crafter-cms-authoring.zip)
Created a site backed by a remote git repo as described in: Create site based on a blueprint then push to remote bare git repository
Created a content type and a new page.
Published everything.
Now I would expect to see my changes in the remote repo. But all I can see are the initial commits from step 2 above. No new content type, no new page, no "live" branch. (The content items are, however, visible in the local repo.)
What is missing?
Edit:
Since Crafter can be set up in many ways, in order to clarify my deployment scenario I am adding a deployment diagram plus a short description.
There are three hosts: one for each environment, plus a shared git repo.
Authoring
This is where Studio is located and content authors make changes. Each change is saved to the sandbox local git repository. When content is published, the changes are pulled into the published local git repository. These two local repos are not accessible from other hosts.
Delivery
This is what provides published content to the end user/application.
The Deployer is responsible for getting new publications to the delivery instance. It does so by polling (periodically pulling from) a specific git repository. When it pulls new changes, it updates the local site git repository and the Solr indexes.
Gitlab
This hosts the site git repository. It is accessible from both the Authoring and Delivery hosts. After its creation, the new site is pushed to this repo. The repo is also polled for new changes by the Deployers of the Delivery instances.
In order for this setup to work, the published changes must somehow end up in GitLab's site repo, but they do not (the red communication path from the Authoring Deployer to GitLab's site repo).
Solution based on @summerz's answer
I implemented GitPushProcessor and configured a new deployment target in the authoring Deployer, adding mysite-live.yaml to /opt/crafter-cms-authoring/data/deployer/target/:
target:
  env: live
  siteName: codelists
  engineUrl: http://localhost:9080
  localRepoPath: /opt/crafter-cms-authoring/data/repos/sites/mysite/published
  deployment:
    pipeline:
      - processorName: gitPushProcessor
        remoteRepo:
          url: ssh://path/to/gitlab/site/mysite
I think you might have confused push with publish.
On Publishing
Authoring (Studio) publishes to Delivery (Engine) after an approval workflow that makes content go live. Authoring is where content (and code if you like) is managed and previewed safely, then that is published to the live delivery nodes for delivery to the end-user.
On DevOps
A site's local git repository can be pushed/pulled to/from remote repositories. This means:
Code can flow from a developer's workstation to Studio (via GitHub, GitLab, Bitbucket, etc.) <== this is code moving forward (and can flow via environments like QA, Load Testing, etc.)
Content can flow back, from Studio to the developer's local workstation in a similar manner <== this is content moving backward (you can have production content on your laptop if you want)
When code flows forward from a developer to Studio, that's when Studio pulls from the remote git repo.
When content flows backward from Studio to the developer, that's when Studio pushes to the remote git repo.
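From the developer's workstation, both flows are ordinary git operations; a minimal sketch, with a hypothetical remote already configured:
# Code moving forward: commit locally and push to the shared remote,
# which Studio then pulls from:
git commit -am "Add new page template"
git push origin master
# Content moving backward: after Studio pushes authored content
# to the remote, pull it down to the workstation:
git pull origin master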
Documentation
A good bird's eye view of the architecture of the system relating to publishing can be found here: http://docs.craftercms.org/en/3.0/developers/architecture.html
A good article that explains the DevOps workflow/Git stuff is here: http://docs.craftercms.org/en/3.0/developers/developer-workflow.html
Update based on the expanded question
My new understanding based on your question is: You can't allow the deployers in Delivery to access Authoring's published repo to poll due to some constraint (even over SSH and even with limits on the source IP). You'd like to use GitLab as a form of content depot that's accessible as a push from Authoring and pull from Delivery.
If my understanding is correct, I can think of two immediate solutions.
Set up a cron job in authoring to push to GitLab periodically.
You'll need to add GitLab as a remote repo in published and then set up a cron job like this:
* * * * * git --git-dir /opt/crafter/data/repos/sites/{YOUR_SITE}/published/.git push 2>&1
Test it out by hand first, then cron it.
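The one-time setup behind that cron entry might look like this; a minimal sketch, with a hypothetical GitLab URL:
# Add GitLab as a remote of the published repository:
cd /opt/crafter/data/repos/sites/{YOUR_SITE}/published
git remote add gitlab ssh://git@gitlab.example.com/mysite/site.git
# Push once with -u so the bare `git push` in the cron entry
# knows which remote and branch to use:
git push -u gitlab master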
Write a deployer processor that can push content out to an endpoint upon a change, or wait for this ticket: https://github.com/craftercms/craftercms/issues/2017
Once this is built, you'll need to configure another deployer in Authoring that will push to GitLab.
In either case, be careful not to update things in GitLab directly, since you're using published and not sandbox. (See the DevOps notes above to learn why.)

Is it safe to push to GitLab repositories directly, from outside GitLab?

Is it OK to push to GitLab repositories directly, from outside GitLab?
Mainly what I would like to know is:
* would GitLab detect the changes?
* is it safe, as in, will it not break repos due to concurrency?
If I understand your question correctly, you're asking whether it is possible to push commits from a git client other than GitLab to a GitLab instance.
There's no problem at all with this; actually, this is exactly what git and GitLab are about.
It doesn't matter at all which Git client you use to create your commits and push them to the server running GitLab. Think of GitLab as just one possible frontend to your repositories.
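For example, a stock command-line client can work against a GitLab-hosted project over SSH; a minimal sketch, with a hypothetical host and project path:
git clone git@gitlab.example.com:mygroup/myproject.git
cd myproject
echo "fix" >> README.md
git commit -am "Commit made outside the GitLab UI"
git push origin master
# GitLab picks the new commits up through its server-side hooks.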
If you are interested in the technical background:
Git is completely file-based and doesn't rely on any kind of central server managing your repositories. All relevant data is stored in the .git subdirectory of your project. This enables the use of multiple clients with a single repository - for example, plain git and GitLab.
GitLab internally uses the gitlab_git gem, which itself uses the rugged library, which provides Ruby bindings for libgit2. That library is also used in the implementation of other git clients, "including the GitHub.com site, in Plastic SCM and also powering Microsoft's Visual Studio tools for Git".
Regarding the handling of actual concurrency problems, have a look at this answer by kan. Correct permissions are handled via git hooks, as Ciro Santilli kindly pointed out in a comment below.

Looking for implementing a centralized git repository... with a catch

The idea behind what I want to do is to create a centralized server on a Linux system. I understand how to set this up, and already have. Next I would like to set up git on a Windows system, aka the client, which I understand is possible through msysgit and gitextensions. The problem, though, is that I want the Windows client to be able to push and pull Visual Studio files while keeping the repositories on the Linux server. So in short, my question is how to have a centralized git server on Linux while the client on Windows is able to push to this centralized server. Thanks in advance!
I solved my problem. What I wanted was to create an SSH connection between the server (Linux) and the client (Windows). I used TortoiseGit in this case with the Git Source Control Provider (Visual Studio integration). Just follow the steps in the link and anybody else who has this problem will be set!
Links:
For tortoise setup: http://theswarmintelligence.blogspot.com/2009/11/windows-tortoisegit-client-for-linux.html
What's the catch here? This sounds like a completely standard use case. It's probably best to use SSH as a transport to push to the server. A couple of things to be aware of:
You should create your centralized repository as a bare repository (i.e. one without a working tree)
If you have multiple users who will push to that repository, create a group for them on the Linux machine, and make sure that the permissions for the repository are appropriate, e.g.:
git init --bare --shared=group newrepository.git
chgrp -R developers newrepository.git
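On the Windows side, the client then only needs an SSH key and the repository's path; a minimal sketch, with hypothetical user, host, and paths:
# From Git Bash (msysgit) on the Windows client:
git clone ssh://youruser@yourserver/srv/git/newrepository.git
cd newrepository
# ...add the Visual Studio solution files, then:
git add .
git commit -m "Initial import of the Visual Studio solution"
git push origin master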
Or if you're going to have multiple repositories or need more sophisticated access control, you may want to look at using gitolite on the server.
On the client side, GitHub has a nice walkthrough for installing msysgit on Windows (and generating an SSH key) here:
http://help.github.com/win-set-up-git/
... and there are tutorials for gitextensions on its site.

Syncing website files between local and live servers using GIT?

Say I have two web servers, one local development and one live.
Under SVN, I would check out the website files to my local webserver's public_html directory and also to the live webserver's public_html directory. I would then work on the files directly on the local server and commit any changes to the central repository.
When I'm ready for those changes to go live on the live server, I would SSH in and perform an SVN update.
Essentially I have two working copies, one on the live server and one locally, though other users may also have working copies on their local machines. There will only ever be one working copy on the live server, so that we can just perform an SVN update on the live server every time we want changes to be published.
How can a similar workflow be accomplished using git?
To model your current workflow almost exactly, do:
Set up a git repo.
Clone the repo on the server and locally.
Work locally.
git push to the git repo.
ssh to the server.
git pull.
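Concretely, those steps might look like this; a minimal sketch, with a hypothetical host and paths:
# One-time setup: a bare central repo, cloned on both machines:
git init --bare /srv/git/site.git                 # on the central server
git clone ssh://you@yourserver/srv/git/site.git   # locally and on the web server
# Day-to-day: work locally, then publish:
git commit -am "Update homepage"
git push origin master
ssh you@yourserver "cd /var/www/site && git pull"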
Another way to do it would be to set up a "production" branch in git, have a cron job on the server that continually pulls this branch, and then just merge into and push to the "production" branch any time you want to publish your changes. It sounds like you need a more concrete branching strategy.
See: Git flow branching model && git flow cli tool
Good luck! This is a very solvable problem with git.
You might find this useful: http://joemaller.com/990/a-web-focused-git-workflow/
In your local working copy:
git push ssh://you@yourserver/path/to/your/wc
will push the committed changes in your local version to yourserver.
Having a setup that automatically triggers a pull, like leonbloy and codemac suggested, may seem like a good idea at first, but it tends to be very fragile. I suggest a different alternative.
http://toroid.org/ams/git-website-howto
