Create Google Chrome extension package without specified files - google-chrome-extension

Right now, to create an extension from the Google Chrome Extensions page, we select the directory that contains the extension and Chrome generates a .crx file.
The problem is that the package contains all files from this directory - for example, docs, asset drafts, etc.
Is it possible to create some kind of blacklist to ignore specified files, like *.psd, *.pdf, docs/* ... ?

The Chromium team decided not to implement a manifest (or similar mechanism) for including only the desired files in a .crx.
The recommended workflow is to have a build step that outputs only the needed files into a dist directory, and to create the .crx from that directory. This is common practice for JavaScript libraries.
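For illustration, a minimal build step along those lines might look like this (the file names are placeholders for whatever your extension actually ships):
$ rm -rf dist && mkdir dist
$ cp manifest.json background.js dist/
$ cp -r icons dist/icons
# point Chrome's "Pack extension" dialog at dist/, or zip it yourself:
$ cd dist && zip -qr -9 -X ../extension.zip .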

My solution
I created a custom ignore file, .crxignore, like this:
.*
Makefile
*.md
bin
It is made for the zip command, so it works differently from .gitignore - you can't add comments, for example. See the documentation at https://linux.die.net/man/1/zip and look for the --exclude option.
Now you can create a zip without the ignored files:
$ zip -qr -9 -X .out/out.zip . -x@.crxignore
# ^^^^^^^^^^^^^ the @ prefix reads the exclude patterns from the ignore file
After that, I convert the zip to a crx file with this Go tool: https://github.com/mmadfox/go-crx3. You have to build it with the go build -o out/crx3 crx3/main.go command (you may get missing module errors, but the output will tell you which commands to run). You can then move out/crx3 wherever you want, e.g. [project]/bin/crx3.
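For example, the build might go like this (the clone step is an assumption; check the project README):
$ git clone https://github.com/mmadfox/go-crx3
$ cd go-crx3
$ go build -o out/crx3 crx3/main.go
# move the binary wherever your project expects it:
$ mv out/crx3 [project]/bin/crx3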
I haven't tried it, but perhaps the chrome command can also convert a zip to a crx.
You have to generate a private key:
$ bin/crx3 keygen .out/key.pem
Final Makefile
build:
	rm -f .out/out.zip
	zip -qr -9 -X .out/out.zip . -x@.crxignore
	bin/crx3 pack .out/out.zip -p .out/key.pem -o .out/out.crx
Build process:
$ make build
# --> builds the .out/out.crx file

Try the command-line program zip, which can be found in Cygwin if you're on Windows, is likely already present on OS X, and is easy to install if you're using Linux.
zip -r package.zip * -x package.sh -x "*.git*" -x "*.*~" -x "*.pdf" -x "docs/*" -x "*.psd"

Related

Is there a way to access a file inside a .zip in a Linux environment

I want to find a specific line of text inside a file and print it on my screen using Linux commands. I know I could do:
find . -name '[filename]' | xargs grep -i '[text I am looking for inside the file]'
However, my file is inside a .zip file. I know unzip -l [.zip file name] will list all files inside the .zip file, but it won't let me access them in order to "grep" the information I need.
Is there a solution to this?
You can use the find_zip tool from the open-source Zip-Ada project:
get / download the sources
build with the command gnatmake -P zipada (you can get GNAT through apt or yum)
you now have a find_zip binary.
Usage: find_zip archive[.zip] ["]text["]
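Alternatively, if you would rather stay with the standard Info-ZIP tools, a rough sketch (assuming GNU grep, whose --label option names the stdin stream, and a hypothetical archive.zip) is to stream each member through grep:
$ unzip -Z1 archive.zip | while IFS= read -r f; do
    unzip -p archive.zip "$f" | grep -iH --label="$f" 'text to find'
  done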

unzip file into same directory in linux

Example:
Here's the list of files in /tmp/test_dir:
file1
zip -r Test_Files.zip *
When I unzip Test_Files.zip, I get the output below.
Current working directory: /tmp/test_dir
/tmp/test_dir/file1
What I expect when I unzip Test_Files.zip:
/tmp/test_dir/Test_Files/file1
Can anyone help me get the expected result shown above?
Use unzip. You can add -o to overwrite existing files and -q to make it quiet. In doubt? Just open a terminal and type unzip (or try /usr/bin/unzip) to see its help text.
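To get the directory layout the question asks for, one option is unzip's -d flag, which extracts into a named directory (creating it if necessary):
$ cd /tmp/test_dir
$ unzip -q Test_Files.zip -d Test_Files
# files now land in /tmp/test_dir/Test_Files/file1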

How can I download all the files from a remote directory to my local directory?

I want to download all the files in a specific directory of my site.
Let's say I have 3 files in my remote SFTP directory
www.site.com/files/phone/2017-09-19-20-39-15
a.txt
b.txt
c.txt
My goal is to create a local folder on my desktop containing ONLY those downloaded files - no parent files or parent directories. I am trying to get a clean report.
I've tried
wget -m --no-parent -l1 -nH -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
I got a mirrored directory tree with extra index files rather than just the files themselves. I want to get only a.txt, b.txt, and c.txt directly inside ~/Desktop/phone/.
How do I tweak my wget command to get something like that?
Should I use anything other than wget?
Ihue,
Taking a shell-programmatic perspective, I would recommend you try the following command-line script; note I also added the citation so you can see the original thread.
wget -r -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
-r enables recursive retrieval. See Recursive Download for more information.
-P sets the directory prefix where all files and directories are saved to.
-A sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma-separated list. See Types of Files for more information.
Ref: #don-joey
https://askubuntu.com/questions/373047/i-used-wget-to-download-html-files-where-are-the-images-in-the-file-stored
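If the goal is a flat folder containing only the files themselves, a variant worth trying (untested here, same URL assumed) adds --no-parent and --no-directories so wget neither ascends nor recreates the remote tree:
$ wget -r -np -nd -A txt -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
# -np (--no-parent): never ascend to the parent directory
# -nd (--no-directories): save all files directly into the -P prefix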

Checking changes made before/after installing application?

On Linux, I need to know which files were added/modified/moved/deleted after compiling and installing an application from source code, i.e. the command-line Linux equivalent of the venerable InCtrl5.
Is there a utility that does this, or a set of commands that I could run and would show me the changes?
Thank you.
Edit: The following commands are sort of OK, but I don't need to know the line numbers on which changes occurred, or that "./.." was updated:
# ls -aR /tmp > b4.txt
# touch /tmp/test.txt
# ls -aR /tmp > after.txt
# diff -u b4.txt after.txt
If you only need to know which files were touched, then you can use find for this:
touch /tmp/MARK
# install application here
find / -newercm /tmp/MARK
This will show you all files whose contents or metadata have changed since you touched /tmp/MARK (including newly added files).
I would personally use something like Mercurial (version control) to do this.
The main reason is that it is not only effective but also clean, since it only adds a single hidden directory at the top of the tree you want to watch.
Let's say you need to know which files changed in /etc/. Before the installation (you need to have Mercurial installed), put the directory under version control:
cd /etc
hg init
hg add
hg ci -m "adding all files in /etc/ to track them down"
The above effectively "adds" all the files so they are tracked. To verify nothing has changed:
hg st
It should return no files.
If you (or the installation) modify a file, you should see something like this:
hg st
M foo.sh
The "M" before the file states the given file was modified.
For new files you would see a ? before the file like:
? bar.sh
After you are done and no longer want Mercurial, simply remove the hidden directory:
cd /etc
rm -rf .hg

Command to zip a directory using a specific directory as the root

I'm writing a PHP script that downloads a series of generated files (using wget) into a directory and then zips them up using the zip command.
The downloads work perfectly, and the zipping mostly works. I run the command:
zip -r /var/www/oraviewer/rgn_download/download/fcst_20100318_0319.zip /var/www/oraviewer/rgn_download/download/fcst_20100318_0319
which yields a zip file containing all the downloaded files, but the archive includes the full /var/www/oraviewer/rgn_download/download/ directory structure before reaching the fcst_20100318_0319/ directory.
I'm probably just missing a flag or something small from the zip command, but how do I get it to use fcst_20100318_0319/ as the root directory?
I don't think zip has a flag to do that. I think the only way is something like:
cd /var/www/oraviewer/rgn_download/download/ && \
zip -r fcst_20100318_0319.zip fcst_20100318_0319
(The backslash is just for clarity, you can remove it and put everything on one line.)
Since PHP is executing the command in a subshell, it won't change your current directory.
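The directory change can also be confined to a subshell, which keeps the caller's working directory untouched even outside PHP; a one-line sketch:
$ (cd /var/www/oraviewer/rgn_download/download && zip -r fcst_20100318_0319.zip fcst_20100318_0319)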
I also got it to work by using this command:
exec('cd '.$_SERVER['DOCUMENT_ROOT'].' && zip -r com.zip "./"');
cd /home/public_html/site/upload/ && zip -r sub_upload.zip sub_upload/
Use the -j or --junk-paths option in your zip command.
From the zip man page:
-j
--junk-paths
Store just the name of a saved file (junk the path), and do not store
directory names. By default, zip will store the full path (relative
to the current directory).
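For example (note that -j flattens everything, so files with the same base name in different subdirectories would collide):
$ zip -rj fcst_20100318_0319.zip /var/www/oraviewer/rgn_download/download/fcst_20100318_0319
# stores the files at the top level of the archive, with no directory prefix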
