I work on a lot of projects and sometimes I'd prefer to just delete my local repositories. However, I am limited in this by my occasional use of the secret phase (as I need to check how my repository differs from the server).
Is there a query I can use to find changesets in the secret phase, or do I have to resort to (the slow)
hg log --debug | grep secret -B 2 -A 15
See hg help revsets for information on how to specify ranges of revisions. This should do the trick:
hg log -r "secret()"
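If you also want to see draft changesets that haven't been made public yet, the same revset language covers that. A couple of hedged variations (the template keywords are standard hg templating, and not public() selects both draft and secret changesets):
hg log -r "not public()"
hg log -r "secret()" --template "{rev}:{node|short} {desc|firstline}\n"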
I have this command:
git checkout -b <name>
What does -b do in this command? Where can I read about options like this, for git and for terminal commands in general?
The -b option tells git checkout to create a new branch with the given name and check it out.
For more information, view the git documentation.
The description of the -b option in the git documentation is a little dense:
git checkout -b|-B <new_branch> [<start point>]
Specifying -b causes a new branch to be created as if
git-branch[1] were called and then checked out. In this case you can
use the --track or --no-track options, which will be passed to git
branch. As a convenience, --track without -b implies branch
creation; see the description of --track below.
If -B is given, <new_branch> is created if it doesn’t exist;
otherwise, it is reset. This is the transactional equivalent of
$ git branch -f <branch> [<start point>]
$ git checkout <branch>
that is to say, the branch is not reset/created unless "git checkout"
is successful.
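In other words, as a rough sketch (the branch name feature/login and start point main below are made up for illustration), git checkout -b feature/login main behaves roughly like running:
git branch feature/login main
git checkout feature/login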
They are command options or parameters.
Commands can take many different options as input, and usually (but not always) these options are prefixed by - or --, followed by a letter or word, and then sometimes followed again by a value for that option.
For git checkout, the -b option lets you supply the name of a new branch to create and check out.
You can type git --help to view high-level options, or git checkout -h to find out about options specific to the checkout subcommand. However, Git is a large, complex tool with a great many options, so I'd suggest checking the official documentation online rather than relying only on the built-in help in the terminal.
Getting help for terminal commands in general: for most commands you can type <command> --help, or try -h if that doesn't work. To read the long-form manual for a command, type man <command>. To search the descriptions of all available commands, try apropos <search terms> to find the one you want.
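For example (checkout is just one subcommand here, and branch is an arbitrary search term):
git checkout -h
man git-checkout
apropos branch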
BONUS TIP: If you are new to the Linux terminal in general and want to learn common commands quickly without having to google a lot, may I suggest installing the tldr tool.
sudo apt install tldr
Once installed via the above command, you can run tldr <command name>.
For example, try tldr tar and it gives you some nice examples of how to use the tool.
I have a performance problem when I'm using svnlook tree /var/www/svn/TEST --full-paths | grep -E 'RV/13\.9\.4\.[0-9]+/$'
/var/www/svn/TEST is quite a big project, with many folders and files.
Explanation:
I want to list only the directories named like RV/13.9.4.n/
to get this result:
RV/13.9.4.0/
RV/13.9.4.1/
RV/13.9.4.2/
PROBLEM
My command takes too long to execute because it needs to fetch the whole project tree.
How can I avoid that and make it faster?
PS: something like svn list [repo_path] | grep '13\.9\.4\.[0-9]+'
Unfortunately, I can't use the svn list command on the Subversion server side (in hooks).
It is technically possible to run this command in your hooks, but I am not sure whether this will be faster than svnlook tree:
svn list file:///var/www/svn/ILS | grep '13\.9\.4\.[0-9]+'
Note the file:// protocol.
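If the RV directory sits at a fixed place in the repository (the command below assumes it is directly under the repository root; adjust the path to your layout), you could also point svn list at just that subtree, so the server only lists its immediate children instead of walking the whole tree:
svn list file:///var/www/svn/TEST/RV | grep -E '^13\.9\.4\.[0-9]+/$'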
I wrote a bash script for the prepare-commit-msg git hook. It lists all staged files that exist, but I only want the staged files that are actually being committed (see the example of desired input/output at the bottom of the question).
My script's job is to prevent a commit from happening if the files being committed don't follow a certain commenting convention (think Javadoc). Not only that, it also edits and auto-formats the comments to meet my convention. This is extremely important to note because I can't just grab the commit's SHA-1: the script has to run before that hash is ever created.
This works perfectly when I execute commit -a (i.e. commit all files). However, I run into problems when I want to just commit a few of my staged files.
Is there a way I can catch only the staged files that are attempting to be committed, not just every single staged file that exists?
For example, let's say my staged files were the following:
file1.txt
file2.txt
file3.txt
file4.txt
file5.txt
When I execute git commit file1.txt file2.txt file3.txt, I want to catch file1.txt file2.txt file3.txt in my script...but not file4.txt and file5.txt.
Is there any way to do this?
EDIT: Definitely not a duplicate. The solution to the "duplicate" question is definitely not what I'm asking for.
$ git status -s -uno
M  E
A  R
The file E is modified and staged, the file R is staged (added).
An unstaged file has the action marker in the second column (after git reset E, to unstage the file E):
$ git status -s -uno
 M E
A  R
These can be dropped with grep -v '^ ' for example.
Here is a complete proof in my test directory:
Tracked Files
~/test/ed $ git ls-tree HEAD
100644 blob 96bf192a9be8d1cecc314f66bb1ef5961564e983 E
100644 blob 11470e37f3d22a2548ce5c85040a44c9581d7727 I
100644 blob 8f2f9e95d9b00595d1588ccef91495c06295f5fa O
Filesystem Files (all, as in git commit -a)
~/test/ed $ ls -l .
total 16
-rw-r--r-- 1 ingo ingo 140 25. Jun 05:48 E
-rw-r--r-- 1 ingo ingo 143 25. Jun 05:39 I
-rw-r--r-- 1 ingo ingo 106 25. Jun 05:29 O
-rw-r--r-- 1 ingo ingo 157 25. Jun 05:28 R
Status of the working directory: Changes against HEAD and staged files
~/test/ed $ git status -s -uno
 M E
A  R
The output without the files whose modifications are not yet, or no longer (after git reset), in the index, i.e. without the unstaged files:
~/test/ed $ git status -s -uno|grep -v '^ '
A  R
Staged filenames only, without the operation flag
~/test/ed $ git status -s -uno|grep -v '^ '|awk '{print $2}'
R
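Inside the hook, that pipeline could be captured into a shell variable, roughly like this (a minimal sketch; the echo is a placeholder for your own convention check, and it assumes filenames without spaces):
staged=$(git status -s -uno | grep -v '^ ' | awk '{print $2}')
for f in $staged; do
    echo "checking $f"
done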
Git commit operation, status and control
Git introduces its own terminology. Some of these words are often used in the wrong way; I will describe the misunderstood concepts and the problematic commands that lead to the erroneous formulations.
Luckily git has a very strong, well-defined language, where each term has an exact meaning; some of them can be seen in git help gitglossary. To understand the concepts git uses, the git help git page is worth reading 5-50 times, together with the introductory pages that are linked from there.
If you installed a git version without the documentation, slap your system administrator. I assume that most people who actively read questions, answers and articles are their own administrators, so slap yourself, but not too hard ;) Of course the docs can be found on the net, but they are an integral part of a usable git installation.
Luckily git was initiated, and its core completely written, by one of the most excellent minds of our day, or at least by one who uses the strictest logical concepts, rather than killer tools, to write and control his software development: Linus Torvalds.
That makes it possible to use the same terms with defined meanings when we talk about git and operations in a git repository. I won't go too deep though, as some of the concepts are rooted in quite advanced theoretical computer science.
The git repository
There are two main types of git repositories, called bare and non-bare, or, as I sometimes say, checked-out (git help init). In this article I only talk about non-bare repositories, where the tracked files of the repository live in the working directory.
from gitglossary(7):
working tree
The tree of actual checked out files. The working tree normally
contains the contents of the HEAD commit’s tree, plus any local
changes that you have made but not yet committed.
Note for the noobs: gitglossary(7) means the manual page named "gitglossary" in section 7. With man, this page can be reached via man -s7 gitglossary. git help gitglossary shows exactly the same text, and git help --web gitglossary shows a well-formatted document in your browser, if your system is configured to open an HTML page in your browser session. On Windows, where there is no man, you will always be directed to the browser. For git commands such as add, the manual page is man 1 git-add, or git-add(1).
Tracked Files
We have seen here that the term tracked means that the git repository knows about and controls that file. The following quote does not come from gitglossary(7), but from git-add(1), option
-u, --update
Update the index just where it already has an entry matching
<pathspec>. This removes as well as modifies index entries to
match the working tree, but adds no new files.
If no <pathspec> is given when -u option is used, all tracked
files in the entire working tree are updated (old versions of
Git used to limit the update to the current directory and
its subdirectories).
The command git add --update is one of the most important operations for understanding how git handles files in the working tree.
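For example (the src/ path below is hypothetical):
git add -u          # stage modifications and deletions of all tracked files
git add -u src/     # limit the update to tracked files under src/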
This is where the problem with git commit file1.txt file2.txt file4.txt shows up, but let's first define some more terms.
Staged Files or Index
The set of staged files builds the index (see gitglossary(7) for index, but ignore the several merge levels and the unmerged index). For our purposes:
The index is a stored version of your working tree.
namely the stored version of your working tree that is prepared to be committed as one commit (again gitglossary(7)):
commit
As a noun: A single point in the Git history;
... "revision" or "version" are synonyms from other version control systems. As Git users we say "commit".
... to be continued (26.Friday) ...
Not sure if this is possible or not, but I figured I'd ask to see if anyone knows. Is it possible to find a file containing a string in a Perforce repository? Specifically, is it possible to do so without syncing the entire repository to a local directory first? (It's quite large - I don't think I'd have room even if I deleted lots of stuff - that's what the archive servers are for anyhow.)
There are any number of tools that can search through files in a local directory (I personally use Agent Ransack, but it's just one of many), but these will not search a remote Perforce directory, unless there's some (preferably free) tool I'm not aware of that has this capability, or maybe some hidden feature within Perforce itself?
p4 grep is your friend. From the Perforce blog:
'p4 grep' allows users to use simple file searches as well as regular
expressions to search through file contents of head as well as earlier
revisions of files stored on the server. While not every single option
of a standard grep is supported, the most important options are
available. Here is the syntax of the command according to 'p4 help
grep':
p4 grep [ -a -i -n -v -A after -B before -C context -l -L -t -s -F -G ] -e pattern file[revRange]...
See also, the manual page.
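For example, a minimal sketch (the depot path and search string are placeholders) that searches the head revisions under a subtree and prints matching line numbers:
p4 grep -n -i -e "the_search_string" //depot/trunk/...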
Update: Note that there is a limitation on the number of files that Perforce will search in a single p4 grep command. Presumably this is to help keep the load on the server down. This manifests as an error:
Grep revision limit exceeded (over 10000).
If you have sufficient perforce permissions, you can use p4 configure to increase the dm.grep.maxrevs setting from this default of 10K to something larger. e.g. to set to 1 million:
p4 configure set dm.grep.maxrevs=1M
If you do not have permission to change this, you can work around it by splitting the p4 grep up into multiple commands over the subdirectories. You may need to split further into sub-subdirectories etc., depending on your depot structure.
For example, this command can be used in a bash shell to search each subdirectory of //depot/trunk one at a time. It makes use of the p4 dirs command to obtain the list of subdirectories from the server.
for dir in $(p4 dirs //depot/trunk/*); do
p4 grep -s -i -e the_search_string $dir/...
done
Actually, I solved this one myself. p4 grep indeed does the trick. Doc here. You have to narrow the search down carefully before it will work properly - on our server at least you have to get it down to fewer than 10000 files. I also had to redirect the output to a file instead of printing it in the console, by adding > output.txt, because there's a limit of 4096 characters per line in the console and the file paths are quite long.
It's not something you can do with the standard Perforce tools. One helpful command might be p4 print, but I would think it's not really faster than syncing.
This is a big if, but if you have access to the server you can run Agent Ransack on the Perforce directory. Perforce stores all versioned files on disk; only the metadata is in a database.
I have a git clone/repo on a development server, but I am now moving to another one. I don't want to commit all my local branches and changes to the main repository, so how can I make an exact copy of everything on oldserver to newserver?
I tried oldserver:~$ scp -rp project newserver:~/project
but then I just get loads and loads of "typechange" errors when trying to do anything on newserver.
Someone said something about x-modes, but how can I preserve that when moving files between servers?
If you want a git solution, you could try
git clone --mirror <oldurl> <newurl>
though this is only for bare repositories.
If this is a non-bare repo, you could also do the normal clone, followed by something like this:
git fetch origin
git branch -r | grep '^ *origin/[^ ]*$' |
while read rb; do git branch --no-track ${rb#*/} $rb; done
git remote rm origin
The middle step can of course be done in 5000 different ways, but that's one! (note that the continuation line \ isn't necessary after the pipe in bash - it knows it needs more input)
Finally, I'd suggest using rsync instead of scp (probably with -avz options?) if you want to directly copy. (What exactly are these typechange errors?)
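For instance, run from the old server (mirroring the scp attempt above; note that rsync treats a trailing slash on the source specially, so it is omitted here):
oldserver:~$ rsync -avz ~/project newserver:~/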
I've actually done this, and all I did was tar the repo up first and scp it over. I would think that scp -rp would work as well.
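A sketch of that approach, run on the old server (same hypothetical paths as above):
oldserver:~$ tar czf project.tar.gz project
oldserver:~$ scp project.tar.gz newserver:~/
oldserver:~$ ssh newserver 'tar xzf project.tar.gz'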
"Typechange" would normally refer to things like a symlink becoming a file or vice-versa. Are the two servers running the same OS?
You may also want to try the simple dumb solution -- don't worry about how the typechanges got there, but let git fix them with a reset command:
git reset --hard HEAD
That only makes sense if (1) the problems all pertain to the checked-out files (and not the repository structure itself) and (2) you haven't made any changes on newserver which you need to preserve.
Given those caveats, it worked for me when I found myself with the same problem, and it doesn't require you to think about git's internals or how well your file-transfer process is preserving attributes.