Concatenate multiple launchpad repositories - linux

I want to download many branches at once with bzr branch. I tried several things, but nothing seems to be working.
I tried a file.sh with this kind of structure:
sudo bzr branch lp:~jmarquez/openerp-tecvemar/tcv_bank_deposit lp:~jmarquez/openerp-tecvemar/initial_stock lp:~jmarquez/openerp-tecvemar/tcv_sale lp:~jmarquez/openerp-tecvemar/tcv_mrp lp:~jmarquez/openerp-tecvemar/tcv_label_request lp:~jmarquez/openerp-tecvemar/tcv_check_voucher lp:~jmarquez/openerp-tecvemar/tcv_stock
But when I execute file.sh it just doesn't work; it can't read the other paths after the first one. Is there some particular command in bzr to achieve this?
Thanks in advance!

I don't think it's possible to download multiple branches at once.
But you can rewrite your script like this:
#!/bin/sh
# Create a shared repository so the branches can share common revisions.
localrepo=/tmp/repo
bzr init-repo "$localrepo"
cd "$localrepo" || exit 1
baseurl=lp:~jmarquez/openerp-tecvemar
for branch in tcv_bank_deposit initial_stock tcv_sale tcv_mrp tcv_label_request tcv_check_voucher tcv_stock; do
    bzr branch "$baseurl/$branch"
done
Change the path in localrepo as you like. It's important to create a shared repository with bzr init-repo to contain the branches. This way the common revisions in the branches can be shared, which will save disk space and speed up your download.

Apply all stashed changes within a subfolder

I had an awful list of old stashes.
I first removed the very old ones:
git reflog expire --expire-unreachable=7.days refs/stash
I have one huge stash left, which contains many stashed changes. Some are worth keeping; others would damage my production system. I went through the diff
git diff stash@{0}^1 stash@{0}
and I know which ones to keep.
I could do
git checkout --patch stash@{0} -- myfilename
to unstash the changes to myfilename, and it works fine.
However, I have a large folder with many files with stashed changes inside. I would like to apply all of them but only within that subfolder.
I have tried to approach it using wildcards in ksh, but it does not work:
git checkout --patch stash@{0} -- myfolder/*
results in
error pathspec [...] did not match any files known to git
The solution does not need to be git-based; it can be a shell script that wraps git calls.
Have you tried:
git checkout --patch stash@{0} -- myfolder
without the trailing *?
Chances are your shell expands myfolder/* before executing the git command, and lists the elements which currently exist on disk, which is probably not what you want.
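A quick way to see the difference, with hypothetical folder contents: the glob is expanded by the shell to on-disk names before git ever runs, whereas the bare directory is a pathspec that git matches against everything it tracks under myfolder.
$ echo myfolder/*
myfolder/subdir myfolder/untracked.tmp   # hypothetical on-disk contents, not git pathspecs
$ git checkout --patch stash@{0} -- myfolder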

Get only structure of repo folders without files

I have a repo in Perforce, and I want to download only the structure of the folders, without the files. Do you know how I can do this?
Cheers
To learn about the folders/directories that are in a certain section of your Perforce repository, you can use the p4 dirs command (see http://www.perforce.com/perforce/doc.current/manuals/cmdref/p4_dirs.html).
For example,
p4 dirs //depot/*
will tell you all the top-level directories under //depot. Suppose the list that comes back is:
//depot/main
//depot/r1.0
Then you could subsequently issue:
p4 dirs //depot/main/*
and
p4 dirs //depot/r1.0/*
to learn about the next level of directories, and so forth, until you find no further child directories under the section of the repository that you are searching.
Once you have learned the correct set of directories that correspond to the current contents of your repository in Perforce, you can issue the corresponding mkdir commands to make those directories on your workstation.
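A small shell sketch of that walk, assuming p4 is already configured for the depot and using a hypothetical local destination directory of depot-tree:
#!/bin/sh
# Recursively mirror the depot's directory structure (no files) under
# ./depot-tree (hypothetical destination).
walk() {
    p4 dirs "$1/*" 2>/dev/null | while read -r dir; do
        mkdir -p "depot-tree/${dir#//}"   # e.g. //depot/main -> depot-tree/depot/main
        walk "$dir"                       # descend into the child directory
    done
}
walk //depot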

Piping the Results of git status to Subsequent Commands

I have a large list of active files in my Git repository. One change I made was deleting a large number of images across a number of directories. I want to commit that change immediately but I don't want to include all active files and I don't want to manually type out git rm myfile.png for every single image.
So essentially what I want to do is run git rm on all active files ending in .png. I'm trying to accomplish this by piping the results of git status into git rm but I'm having trouble isolating the file name and getting this to work as I'd like.
Is this a proper use of piping and if so what syntax do I need?
Any help is appreciated, thanks.
If you already removed the files, you can type:
git add -u
And the deletions will be staged in the git repository.
From git help add:
-u, --update
Only match <filepattern> against already tracked files in the index
rather than the working tree. That means that it will never stage
new files, but that it will stage modified new contents of tracked
files and that it will remove files from the index if the
corresponding files in the working tree have been removed.
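If you do want the pipeline the question asks about, here is one sketch: list the tracked files that have been deleted from the working tree, keep the .png ones, and stage their removal.
git ls-files --deleted | grep '\.png$' | xargs git rm   # assumes no whitespace in filenames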

Local Git as autosave

I would like a local Git repository in my home directory that autosaves to the repository every five minutes.
I have two Questions:
Is this a sane thing to do?
How does one go about writing a script that implements this functionality for a specified set of directories in the home directory on Linux?
The aim is to capture the history of all the important files in my home directory automatically, without any input from me. I can use this whenever I screw up.
Sanity is all relative!
I guess it depends on why you are backing up. If it's for hardware failure, then this won't work because the repository is in the same folder (/home/) so if the folder goes, the repo goes. Unless of course you are pushing it to a storage repo on another machine somewhere as the actual backup.
We do use git to store important things, especially research papers and PDF's, so we can easily share them.
You would write a cron job that runs a script every so often. Basically you would write a simple bash script that does a git commit -a -m "commit message" periodically in your folder. The tricky part is doing the git add on the new files that were created so they are tracked. You will likely need to do a git status and parse the output from it in your script to find the new files, then git add that list. Python may be the easiest way to do that. Then you register that with cron.
Google is your friend here, there are plenty of examples on how to register scripts with cron.
Write a shell script that would enter each directory you want and run
git add .
git commit -m "new change"
git push
and then use cron to run the script each 5 minutes.
Write a shell script to do the following
1) git status --untracked-files=no   # lists the files which are modified
2) Iterate through the file list from step 1 and do git add <file>
3) git commit -m "latest change <date:time>"
Schedule this script in cron.
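A minimal sketch combining these answers, assuming the hypothetical script lives at $HOME/bin/autosave.sh and watches a single directory, ~/projects. Note that git add -A stages new, modified, and deleted files in one step, so no git status parsing is needed:
#!/bin/sh
# Hypothetical autosave script: commit everything under ~/projects.
cd "$HOME/projects" || exit 1
git add -A                                            # stage new, modified, and deleted files
git commit -m "autosave $(date '+%Y-%m-%d %H:%M')"    # exits nonzero when there is nothing to commit
And the crontab entry to run it every five minutes:
*/5 * * * * $HOME/bin/autosave.sh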

Moving a git repo to another server

I have a git clone/repo on a development server, but I am now moving to another one. I don't want to commit all my local branches and changes to the main repository, so how can I make an exact copy of everything on oldserver to newserver?
I tried oldserver:~$ scp -rp project newserver:~/project
but then I just get loads and loads of "typechange" errors when trying to do anything on newserver.
Someone said something about x-modes, but how can I preserve that when moving files between servers?
If you want a git solution, you could try
git clone --mirror <oldurl> <newurl>
though this is only for bare repositories.
If this is a non-bare repo, you could also do the normal clone, followed by something like this:
git fetch origin
git branch -r | grep '^ *origin/[^ ]*$' |
while read rb; do git branch --no-track ${rb#*/} $rb; done
git remote rm origin
The middle step can of course be done in 5000 different ways, but that's one! (note that the continuation line \ isn't necessary after the pipe in bash - it knows it needs more input)
Finally, I'd suggest using rsync instead of scp (probably with -avz options?) if you want to directly copy. (What exactly are these typechange errors?)
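For example, one possible invocation (hypothetical paths; rsync's -a preserves symlinks, permissions, and timestamps, which is exactly where scp tends to introduce typechanges):
oldserver:~$ rsync -avz project/ newserver:~/project/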
I've actually done this, and all I did was tar the repo up first and scp it over. I would think that scp -rp would work as well.
"Typechange" would normally refer to things like a symlink becoming a file or vice-versa. Are the two servers running the same OS?
You may also want to try the simple dumb solution -- don't worry about how the typechanges got there, but let git fix them with a reset command:
git reset --hard HEAD
That only makes sense if (1) the problems all pertain to the checked-out files (and not the repository structure itself) and (2) you haven't made any changes on newserver which you need to preserve.
Given those caveats, it worked for me when I found myself with the same problem, and it doesn't require you to think about git's internals or how well your file-transfer process is preserving attributes.
