How to use cscope with paths that contain spaces - vim

Some of my folders contain spaces, and as a result those folders cannot be indexed by cscope.
Can I ask for help to solve this, or any suggestions?
Thanks,
Julius
Thanks for your reply.
My steps to use cscope are the following:
find . -name '*.scala' > cscope.files
cscope -b
At this step I see messages indicating that files cannot be found:
cscope: cannot find file /work/project/copy
cscope: cannot find file of
cscope: cannot find file fp/src/main/jav....
Actually "copy of fp" is a folder, so I think cscope cannot handle the space in the folder name.
I encountered this problem when I tried to use vim with cscope; maybe I need to move this question to another tag.

With GNU find at least, you can do it simply using the -printf or -fprintf options:
find . -type f -fprintf cscope.files '"%p"\n'

pydave's answer is very slow: this way took 0.10 s where pydave's took 14 s:
find . -name "*.scala" | awk '{print "\""$0"\""}' > cscope.files

You can use find's -exec to force quotes around your output:
find . -name "*.scala" -exec echo \"{}\" \; > cscope.files
You might need to mess around with quoting/escaping if you're doing this from a script.
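For instance, run from a script, the quoting above comes out intact; this is a minimal sketch with an invented directory layout (a folder name containing a space, as in the question):

```shell
#!/bin/sh
# Demo tree with a directory name that contains a space (invented layout).
demo=$(mktemp -d)
mkdir -p "$demo/copy of fp"
touch "$demo/copy of fp/Main.scala"
cd "$demo"
# Wrap every path in double quotes so cscope reads each as one file name.
find . -name '*.scala' -exec echo \"{}\" \; > cscope.files
cat cscope.files
# -> "./copy of fp/Main.scala"
```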

Double-quoting the file names works in Cygwin, whereas escaping with backslashes does not.
$ find $PWD -name "*.scala" | sed -e 's/^/"/g' -e 's/$/"/g' > cscope.files

Related

Bash Script - Need find command to output file paths enclosed in double quotes without line breaks

I have been using a cscope/ctags database. However, after some time I noticed that some file names in my cscope.files (which stores the result of my find command) are broken across two or more lines. This causes them to be ignored by cscope/ctags while indexing.
Currently I use it in an alias:
alias prp_indx='
rm cscope.in.out cscope.out cscope.files tags
find . -name '\''*.[chS]'\'' >> cscope.files
find . -name '\''*.cpp'\'' >> cscope.files
find . -name '\''*.hpp'\'' >> cscope.files
find . -name '\''*.cxx'\'' >> cscope.files
find . -name '\''*.hxx'\'' >> cscope.files
cscope -b -q -k; ctags -R
'
Please help me with an appropriate command for my alias/function that produces the file names in double quotes, without paths broken across lines.
There is no reason I can think of for find to split a file name across several lines, except if the name itself contains newline characters.
If you have such file names, it is probably better to rename these files, as I think cscope does not really support file names with newlines in them. At least, I don't think there is a way to list such files in a cscope.files file, even with quoting or any kind of escaping (but if you know how to do it, please let us know, so that we can adapt what follows accordingly).
So, the best you could do is to let cscope do the search itself (-R) instead of providing a cscope.files file. If you do so, cscope will indeed find and analyse these files, but then, when interacting with cscope, you will discover that it gets confused and splits the names anyway...
If you do not have such unusual file names, but there are unwanted newline characters in your cscope.files file, there must be something else that tampers with it.
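One way to check whether any names in the tree really do contain a newline is to match one directly with find; this sketch builds a throwaway tree with an invented pathological name to test against:

```shell
#!/bin/sh
# Build a throwaway tree containing one pathological file name.
dir=$(mktemp -d)
nl=$(printf '\nx'); nl=${nl%x}          # capture a literal newline, POSIX-portably
touch "$dir/normal.c" "$dir/bad${nl}name.c"
# A glob containing a real newline byte; find matches it as-is,
# and the printed path spans two output lines.
find "$dir" -name "*${nl}*"
```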
Anyway, prefer a function. Compared to functions, aliases mainly have drawbacks. With a bash function:
prp_indx () {
rm cscope.in.out cscope.out cscope.files tags
find . -name '*.[chS]' -o -name '*.[ch]pp' -o -name '*.[ch]xx' > cscope.files
cscope -b -q -k
ctags -R "$@"
}
Note: if you can have directories with names matching one of the 3 patterns add a -type test to exclude directories:
find . ! -type d \( -name '*.[chS]' -o -name '*.[ch]pp' -o -name '*.[ch]xx' \) > cscope.files
If you have unusual file names containing spaces, double-quotes and/or backslashes, you can add a post-processing with, e.g., sed:
sed -i 's/["\]/\\&/g;s/^\|$/"/g' cscope.files
This will add a backslash before any double-quote or backslash, plus double-quote all file names. Add this sed command to the function definition, after the find command.
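As a quick illustration of what that post-processing does (the sample file name is invented; GNU sed is assumed for -i and the \| alternation):

```shell
#!/bin/sh
# A path containing a space, a double quote and a backslash.
printf '%s\n' 'dir/odd "name"\file.c' > cscope.files
# Escape " and \, then wrap the whole line in double quotes.
sed -i 's/["\]/\\&/g;s/^\|$/"/g' cscope.files
cat cscope.files
# -> "dir/odd \"name\"\\file.c"
```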

Display multiple files in Linux/Unix

I'm looking to display 3 different files, if they exist. I thought the following would work, but it doesn't:
ls -R | grep 6-atom2D.vector$ 6-atom2D.klist 6-atom2D.struct
How can I do it?
Knowing the (base) filenames, you can use find:
find . -name '6-atom2D.vector$' -o -name '6-atom2D.klist' -o -name '6-atom2D.struct'
It searches recursively by default.
For case-insensitive search, use -iname instead.
ls -R | egrep "6-atom2D\.vector$|6-atom2D\.klist|6-atom2D\.struct"
If $ is supposed to be end of line regexp, then you might need to use \> instead. That works for me at least.
Edit: Backslash before .
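The `\>` end-of-word anchor (a GNU grep extension) keeps the pattern from also matching longer names; a quick check on two sample lines:

```shell
#!/bin/sh
# \> matches only at the end of a word, so "vectors" is rejected.
printf '6-atom2D.vector\n6-atom2D.vectors\n' | grep '6-atom2D\.vector\>'
# -> 6-atom2D.vector
```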

Replace a part of statement with another in whole source code

I am trying to search the whole source code for occurrences of, say, "MY_NAME" and want to replace them with, say, "YOUR_NAME". I already know the files and the line numbers where they occur, and I want to make a patch so that anyone running the patch can apply the same change. Can anyone please help?
You can do it from the console. Just use find to locate the target files, then declare what you want to replace and with what. For example:
find . -type f -print0 | xargs -0 perl -pi -e 's/MY_NAME/YOUR_NAME/g'
It might be easier to do a sed command, and then generate a patch.
sed -e '12s/MY_NAME/YOUR_NAME/g;32s/MY_NAME/YOUR_NAME/g' file > file2
This will replace MY_NAME with YOUR_NAME on lines 12 and 32, and save the output into file2.
You can also generate a sed script if there are many changes:
#!/bin/sed -f
12s/MY_NAME/YOUR_NAME/g
32s/MY_NAME/YOUR_NAME/g
Then, for applying to many files, you should use find:
find -type f '(' -iname "*.c" -or -iname "*.h" ')' -exec "./script.sed" '{}' \;
Hope this helps =)
Use the command diff to create a patch-file that can then be distributed and applied with the patch-command.
man diff will give you a lot of information on the process.
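A minimal round trip with diff and patch could look like this (the file names are invented):

```shell
#!/bin/sh
dir=$(mktemp -d); cd "$dir"
printf 'hello MY_NAME\n' > a.c               # original
sed 's/MY_NAME/YOUR_NAME/' a.c > a.c.new     # edited copy
# diff exits 1 when the files differ, so guard it for `set -e` scripts.
diff -u a.c a.c.new > rename.patch || true
patch a.c < rename.patch                     # apply the patch to the original
cat a.c
# -> hello YOUR_NAME
```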

grep ignore vim temporary files

What is the best way to ignore vim temporary files when doing a search with grep?
grep --exclude='*~'
I believe that should work (quote the glob so the shell does not expand it first).
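For a recursive search you can stack several exclude globs, one per pattern of temporary file; the sample files below are invented:

```shell
#!/bin/sh
dir=$(mktemp -d); cd "$dir"
printf 'needle\n' > real.c        # a real source file
printf 'needle\n' > real.c~       # vim backup file
printf 'needle\n' > .real.c.swp   # vim swap file
# Only the real source file survives the excludes.
grep -r --exclude='*~' --exclude='.*.swp' needle .
# -> ./real.c:needle
```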
I find Ack to be a drop in replacement for my grepping needs. No need to worry about excluding a bunch of file types or directories by default. You can always setup an .ackrc file in order to add more file types or alter ack's default behavior.
You haven't said this but I suspect that you're grepping through a directory tree.
This may not be the most elegant solution but you might use the output of 'find'.
I often find myself recursively grepping a directory tree like this:
grep <needle> `find . \( -name '*.cpp' -o -name '*.h' \) -print`
You could certainly do something like:
grep <needle> `find . \! -name '.??*swp' -print`

What's the best way to find a string/regex match in files recursively? (UNIX)

I have had to do this several times, usually when trying to find in what files a variable or a function is used.
I remember using xargs with grep in the past to do this, but I am wondering if there are any easier ways.
grep -r REGEX .
Replace . with whatever directory you want to search from.
The portable method* of doing this is
find . -type f -print0 | xargs -0 grep pattern
-print0 tells find to use ASCII nuls as the separator and -0 tells xargs the same thing. If you don't use them you will get errors on files and directories that contain spaces in their names.
* as opposed to grep -r, grep -R, or grep --recursive which only work on some machines.
This is one of the cases for which I've started using ack (http://petdance.com/ack/) in lieu of grep. From the site, you can get instructions to install it as a Perl CPAN component, or you can get a self-contained version that can be installed without dealing with dependencies.
Besides the fact that it defaults to recursive searching, it allows you to use Perl-strength regular expressions, use regex's to choose files to search, etc. It has an impressive list of options. I recommend visiting the site and checking it out. I've found it extremely easy to use, and there are tips for integrating it with vi(m), emacs, and even TextMate if you use that.
If you're looking for a string match, use
fgrep -r pattern .
which is often faster than grep, since it searches for fixed strings rather than regular expressions.
More about the subject here: http://www.mkssoftware.com/docs/man1/grep.1.asp
grep -r if you're using GNU grep, which comes with most Linux distros.
On most UNIXes it's not installed by default so try this instead:
find . -type f | xargs grep regex
If you use the zsh shell you can use
grep REGEX **/*
or
grep REGEX **/*.java
This can run out of steam if there are too many matching files.
The canonical way though is to use find with exec.
find . -name '*.java' -exec grep REGEX {} \;
or
find . -type f -exec grep REGEX {} \;
The -type f bit restricts the match to regular files.
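One detail worth knowing: with `-exec grep REGEX {} \;` grep is run once per file and, given a single file argument, omits the file name from its output. Ending the -exec with `+` instead batches many files into one grep invocation, so the names come back (and far fewer processes are spawned); the sample files are invented:

```shell
#!/bin/sh
dir=$(mktemp -d)
printf 'REGEX here\n' > "$dir/a.java"
printf 'nothing\n'    > "$dir/b.java"
# `{} +` passes the whole batch to one grep, which then prints file names.
find "$dir" -name '*.java' -exec grep REGEX {} +
```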
I suggest changing the answer to:
grep REGEX -r .
The -r switch doesn't indicate regular expression. It tells grep to recurse into the directory provided.
This is a great way to find an expression recursively in one or more file types:
find . \( -name '*.java' -o -name '*.xml' \) | xargs egrep REGEX
Where
-name '*.<filetype>' -o
is repeated inside the parentheses for each additional file type you want in the recursive search.
As a bash alias it looks like this (the '\'' sequences are the escaping needed to embed single quotes inside the single-quoted alias body):
alias fnd='find . \( -name '\''*.java'\'' -o -name '\''*.xml'\'' \) | xargs egrep'