I have a question. I am trying to access Varnish logs (the website is localhost), but when I type varnishncsa -w test.log in the terminal it just sits there with no output. Any ideas on how I can fix this? I checked /var/log.varnish/access.log and it is empty.
Try varnishncsa -w /var/log.varnish/access.log, if that's the path where you want the logs, and make sure your current user has write access to that file.
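Before starting varnishncsa, it can help to confirm that the target file is actually writable by your user. A minimal sketch (the check_writable helper is hypothetical, and /var/log.varnish/access.log is simply the path from the question):

```shell
# check_writable FILE -> prints "writable" or "not writable".
# A file counts as writable if it exists and is writable, or if it
# does not exist yet but its directory is writable.
check_writable() {
    f=$1
    if [ -w "$f" ] || { [ ! -e "$f" ] && [ -w "$(dirname "$f")" ]; }; then
        echo "writable"
    else
        echo "not writable"
    fi
}

# Path from the question; adjust to wherever you want the log.
check_writable /var/log.varnish/access.log
```

If this prints "not writable", fix the permissions or run varnishncsa as a user that can write there.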
Related
cp /dev/null log works when run manually, but from a shell script it does not work for some logs.
Any idea why, and how to fix this on AIX? Note: there is no ownership/permission issue.
There's important information missing. But what about simply using
: > log
to empty the log? This needs neither cp nor /dev/null.
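The `: >` form truncates the file in place, with the same effect as cp /dev/null log. A minimal demonstration with a throwaway file:

```shell
log=$(mktemp)                 # throwaway file for the demo
printf 'old contents\n' > "$log"

: > "$log"                    # truncate to zero length; no cp or /dev/null needed

wc -c < "$log"                # file size is now 0
```

Because `:` is a shell builtin, this also avoids spawning an external cp process.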
I need to see server logs on different machines (around 20), and the server logs are stored in locations with the hostname in their path (I am using SuperPuTTY).
So I don't have a single command to change directory; instead I have to do it individually on each machine.
With the hostname command I can get the machine name, but I am not able to use it as a variable in my cd command.
>hostname
mymachinename
>cd /opt/$"hostname"/logs
no directory name /opt/hostname/logs
Any help on this?
Pardon me if it's a duplicate. I searched but didn't find any questions related to this.
It should be
cd /opt/$(hostname)
See:
root@mongodbServer1:~# cd /opt/$(hostname)
root@mongodbServer1:/opt/mongodbServer1# pwd
/opt/mongodbServer1
root@mongodbServer1:/opt/mongodbServer1#
Use $HOSTNAME, $(hostname), or `hostname` (backticks) to retrieve the hostname.
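The key difference is command substitution: $(hostname) runs the command and substitutes its output into the string, whereas $"hostname" does not run anything. A short sketch (the /opt/&lt;hostname&gt;/logs layout is the one from the question):

```shell
host=$(hostname)          # command substitution: capture the machine name
dir="/opt/$host/logs"     # per-machine log path, as in the question
echo "$dir"

# Equivalent in one step:
# cd "/opt/$(hostname)/logs"
```

Since the expansion happens on each machine, the same one-liner works across all 20 hosts.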
With wget I am trying to get the list of files and their properties from an FTP server, using the wget command below:
wget --no-remove-listing ftp://myftpserver/ftpdirectory/
This generates two files: .listing (this is what I am looking for in cURL) and index.html, which is the HTML version of the listing file.
My expectation:
How do I achieve this scenario in cURL?
What is the command to get the .listing and index.html files from an FTP/SFTP server using cURL?
This is what I found on http://curl.haxx.se/docs/faq.html#How_do_I_get_an_FTP_directory_li
If you end the FTP URL you request with a slash, libcurl will provide you with a directory listing of that given directory. You can also set CURLOPT_CUSTOMREQUEST to alter what exact listing command libcurl would use to list the files.
The follow-up question that tends to follow the previous one is how a program is supposed to parse the directory listing: how does it know what's a file, what's a dir, what's a symlink, etc.? The harsh reality is that FTP provides no such fine and easy-to-parse output. The output format FTP servers use in response to LIST commands is entirely at the server's own liking, and the NLST output doesn't reveal any types and in many cases doesn't even include all the directory entries. Also, both LIST and NLST tend to hide unix-style hidden files (those that start with a dot) by default, so you need to do "LIST -a" or similar to see them.
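As the FAQ says, LIST output is free-form, so any parsing is server-specific. Purely as an illustration, here is how you might pull the type flag and name out of a typical unix-style LIST line (the sample line is made up; real servers may format their output differently):

```shell
# A made-up unix-style LIST line, as many (but not all) FTP servers emit:
line='drwxr-xr-x   2 ftp      ftp          4096 Jan 01 12:00 ftpdirectory'

ftype=$(printf '%s\n' "$line" | cut -c1)          # 'd' = dir, '-' = file, 'l' = symlink
name=$(printf '%s\n' "$line" | awk '{print $NF}') # last field: the entry name

echo "$ftype $name"
```

Note this breaks on names containing spaces, which is exactly the fragility the FAQ is warning about.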
I have checked it on Windows (Cygwin) and it works for me. Do not forget the / at the end of the path:
$ curl -s -l -u test1@test.com:Test_PASSWD --ftp-ssl 'ftps://ftp.test.com/Ent/'
I have a log file which can only be appended to. I want to pipe this log file to the console, so that when new data arrives in the log file, the console reflects it immediately. When there is no data to append, the console should block, waiting for updates.
It's like a proxy, redirecting file updates to the console.
How to achieve this using bash?
Thank you.
Use tail:
tail -f logfile
-f means follow, which is exactly what you need.
Also, in case you run into problems with the file becoming inaccessible (for example, being rotated or recreated), try using -F instead.
The tail command should work for you.
tail -f <log file>
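A self-contained way to watch the follow behaviour, using a temporary file in place of a real log: start tail -f in the background, append a line, and confirm it shows up in the captured output.

```shell
log=$(mktemp)                  # stand-in for the real log file
out=$(mktemp)                  # where we capture tail's output

tail -f "$log" > "$out" &      # follow the (currently empty) log in the background
tailpid=$!

echo "new log line" >> "$log"  # simulate the application appending to the log
sleep 1                        # give tail a moment to pick the line up

kill "$tailpid"
grep -q "new log line" "$out" && echo "tail saw the update"
```

In real use you would simply run tail -f on the log and leave it in the foreground; it blocks until new data arrives, which is exactly the behaviour asked for.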
I want to include an FTP share in my normal filesystem by performing the following commands:
$ mkdir /Volumes/myfolder
$ mount_ftp ftp://user:password@ftp.domain.tld /Volumes/myfolder
I get no error message, but when I open the folder, it is just empty. I also tried to follow the instructions listed here, but I receive the error "The share does not exist on the server. Please check the share name, and then try again." However, I can access the share by typing ftp://user:password@ftp.domain.tld into my address bar.
Any ideas what I could be doing wrong?
As you'll see in man mount_ftp, mount_ftp sets several return values. One way to get the return value is to run echo $? right after the mount_ftp command.
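The $? pattern is plain shell, not specific to mount_ftp, so it can be demonstrated with ordinary commands (mount_ftp itself would be checked the same way; consult man mount_ftp for what each numeric value means):

```shell
true                        # a command that succeeds
echo "exit status: $?"      # 0 means success

false                       # a command that fails
echo "exit status: $?"      # non-zero means failure

# With mount_ftp it would look like:
#   mount_ftp ftp://user:password@ftp.domain.tld /Volumes/myfolder
#   echo $?
```

Note that $? holds the status of the most recent command only, so capture it immediately (e.g. rc=$?) if you need it later.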