Can I create a .Z file with gzip? [closed] - linux

I need to create a .Z (compress) file because the receiver is expecting to read it with the uncompress utility, but I can't install the compress package on my Linux host.
Is there a way to produce the compressed .Z file (adaptive Lempel-Ziv coding) with the gzip command?

No. gzip cannot compress to the .Z format.
Download the source code for compress, compile it, and use it. (You do not need to have it installed on your system.)
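For example, building it from source without installing anything system-wide might look roughly like this (a sketch assuming the ncompress sources, which provide the classic compress/uncompress tools, can be cloned from GitHub and built with a plain make; adjust to however you actually obtain the code):
git clone https://github.com/vapier/ncompress.git   # assumed source location for the classic compress utility
cd ncompress
make                            # builds the compress binary in the working directory
./compress /path/to/SOMEFILE    # produces /path/to/SOMEFILE.Z, nothing installed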

A couple of ideas:
scp your file to a system that is less hobbled and compress it there
use a docker image
You can run the fedora docker image like this with a "bind mount" so that the files on your local host are visible in /data in the container:
docker run -it --rm -v "$(pwd)":/data fedora
Then, inside the container, run:
yum install ncompress
compress SOMEFILE
exit
Your container will then be removed, nothing will have been installed on your local host, and you'll have a lovely, compressed SOMEFILE.Z.
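If you prefer not to work interactively, the same idea collapses into a single command (a sketch; it assumes the fedora image and the ncompress package name used above):
docker run --rm -v "$(pwd)":/data -w /data fedora bash -c 'yum install -y ncompress && compress SOMEFILE'
SOMEFILE.Z will then sit next to SOMEFILE in your current directory.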

Related

How to force open a file in linux as a normal user [closed]

"/etc/docker/daemon.json"
"/etc/docker/daemon.json" E212: Can't open file for writing
I'm currently trying to set up a Loki server, Promtail, and Grafana as Docker images.
I installed all the plugins needed; however, when I tried editing the Docker daemon config file with this command:
sudo nano /etc/docker/daemon.json
It does not allow me to write due to permissions, so I tried using
sudo vi chmod 666 /etc/docker/daemon.json
but this only creates a new file in my directory called chmod
The Docker containers are up, but I can't see the Loki metrics in my web browser at localhost:3100/metrics, nor can Loki be added as a data source.
Please can you help?
It should be sudo chmod 666 /etc/docker/daemon.json.
What you are doing is running vi against three files: chmod, 666, and /etc/docker/daemon.json.
The directory /etc/docker must also exist as a directory, and not as a file.
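Putting that together, the corrected sequence might look like this (a sketch; same paths as in the question):
ls -ld /etc/docker                       # confirm /etc/docker is a directory, not a regular file
sudo chmod 666 /etc/docker/daemon.json   # make the file writable (note: this loosens its permissions)
sudo nano /etc/docker/daemon.json        # then edit it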

Where is my folder copied when I scp from a server to a name [closed]

So I wanted to copy a folder from a remote server to my local computer. I am using WSL (Windows Subsystem for Linux)/ Ubuntu.
First, I SSH into the server using ssh user@host.
Then I ran the scp command: scp -r user@host:/var/www backup-9-feb
But now I can't find this backup-9-feb folder; please help. Unfortunately, I forgot the name of the folder too; this is just an example.
After I executed these commands, a long list of files with their paths was shown.
You don't need to ssh into the server to use scp. You want to do the following on your own computer: scp -r server_user@server_host:/var/www backup-9-feb. This will copy (recursively) the directory /var/www of the server into the directory where you ran the command on your machine.
Note: scp is going to be deprecated, so you probably want to start using a utility like rsync (it works similarly).
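For reference, a rough rsync equivalent of the scp command above might be (a sketch; same placeholder user, host, and paths as in the question):
rsync -avz server_user@server_host:/var/www ./backup-9-feb
This pulls the server's /var/www directory under ./backup-9-feb on the machine where you run it.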

rsync command on target server is located in different directory than on source server. How to use rsync? [closed]

I want to copy files from one server (server A) to another (server B) using rsync (yes, only rsync).
When using rsync on server A it returns:
bash: /usr/local/bin/rsync: No such file or directory
Typing rsync on both server A and server B shows information about rsync, so it is available on both. But I discovered that on server A the rsync command is located in /usr/local/bin/rsync, while on server B it is located in /usr/bin/rsync.
How do I tell rsync that on server A rsync is located in path /usr/bin/rsync?
Server A runs on SunOS and server B runs on Linux.
You need to specify the --rsync-path option:
rsync --rsync-path=/usr/bin/rsync SRC... DEST
See man rsync for the details.
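For example, copying to the server whose rsync lives in /usr/bin might look like this (a sketch with placeholder paths and hostname):
rsync -av --rsync-path=/usr/bin/rsync /local/dir/ user@serverB:/remote/dir/
The --rsync-path option tells the local rsync where to find the rsync binary on the remote side.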

Zip and unzip a directory and its file in linux [closed]

I am a newbie in Linux. What is the complete process to zip and unzip a directory and its files? Please mention if any installation has to be done.
To zip a folder and its contents recursively:
zip -r archivefile foldername
To unzip a zip file:
unzip archivefile
I had a lot of trouble using unzip, which gave me errors like
sql.zip has more than one entry--rest ignored
etc.
Using PHP worked like a charm. One-liner:
php -r '$zip = new ZipArchive; $zip->open("db.sql.zip"); $zip->extractTo("./"); $zip->close(); echo "Yay!";'
Run it in a terminal after PHP is installed.
Several options exist, the most common ones:
On the CLI (command line interface) there are the two utilities zip and unzip, which do the obvious thing. For example, to compress a directory "my-folder" with all its content using the zip algorithm you would do zip -r my-folder.zip my-folder. To uncompress it you would use unzip my-folder.zip. Paths are always relative to the current working directory, i.e. the directory where you execute the command. Take a look at the man page to find out about the usage: man zip. (A short install-and-usage sketch follows below.)
There are also GUI utilities (utilities with a graphical user interface), but which ones depends on the desktop environment you use, since they are typically integrated. There is Ark for KDE, and various service menus that can be used, for example, in the file manager Dolphin. There are certainly similar solutions for desktop environments like GNOME or Unity.
Which packages you have to install depends a bit on the Linux distribution you use. The package names may vary slightly, but in general you should be able to find the "zip" package in your distribution's package management system.
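As a concrete sketch on a Debian- or Ubuntu-based system (the package manager and package names differ on other distributions):
sudo apt-get install zip unzip     # install the command-line tools (Debian/Ubuntu)
zip -r my-folder.zip my-folder     # compress the directory and everything inside it
unzip my-folder.zip                # extract it again into the current directory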

Is there a single command to download binary files from an HTTP server to linux/solaris and install the binary subsequently? [closed]

I want to automate a process of downloading a binary file from a server (HTTP) and then install it on either Solaris or Linux servers.
I use the wget command, followed by executing the binary file as the root user.
Can we combine these two steps?
Put both commands into a single script that gets executed by root.
You can do
wget -O - url | command-to-install
(the -O - makes wget write the download to standard output so it can be piped into the installer). However, if it is a tarball and you know how to build and install it, you can write a shell script to do it, e.g.:
wget url
tar xvzf archive
./configure
make
make install
and run it as root, e.g. sudo ./installscript.sh.
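A minimal version of such a script might look like the sketch below; the URL and archive names are placeholders, and the ./configure/make steps only apply to sources that use that build system:
#!/bin/sh
# installscript.sh -- download, unpack, build, and install (placeholders throughout)
set -e
wget http://example.com/software.tar.gz    # placeholder URL
tar xvzf software.tar.gz
cd software                                # placeholder: directory the tarball extracts to
./configure
make
make install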
