Can't increment a 0-padded number past 8 in busybox sh - linux

This is the code I am using to save files from a camera and name them from 0001 onward. The camera is running Busybox, and it has an ash shell inside.
The code is based on a previous answer by Charles Duffy here.
#!/bin/sh
# Snapshot script
cd /mnt/0/foto
sleep 1
set -- *.jpg # put the sorted list of picture filenames on argv (the number of files in the list can be read from $#)
while [ $# -gt 1 ]; do # as long as there's more than one...
    shift # ...drop arguments until only the last one remains
done
if [ "$1" = "*.jpg" ]; then # the glob didn't match: no jpg file is present in the dir, so seed argv so the following commands start the sequence from 0
    set -- snapfull0000.jpg
else
    echo "More than one jpg file found."
fi
num=${1#*snapfull} # $1 is the last surviving argument; strip the alphabetic part of the filename
num=${num%.*} # strip the .jpg suffix
num=$(printf "%04d" "$(($num + 1))") # increment to the next number and pad it back to four digits with zeroes
# echoes for debugging
echo "variable num=$num" # shows the number recognized in the latest filename
echo "\$#=$#" # displays the number of argv variables
echo "\$1=$1" # displays the first argv variable
wget http://127.0.0.1/snapfull.php -O "snapfull${num}.jpg" # request the snapshot from the camera, with sequential naming of the jpeg file
This is what I get on the command line while the script runs. I manually ran the script nine times, but after file snapfull0008.jpg is saved, as you can see in the last lines, every subsequent file is named snapfull0000.jpg.
# ./snap4.sh
variable num=0001
$#=1
$1=snapfull0000.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:22 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0001.jpg 100% |*******************************| 246k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0002
$#=1
$1=snapfull0001.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:32 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0002.jpg 100% |*******************************| 249k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0003
$#=1
$1=snapfull0002.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:38 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0003.jpg 100% |*******************************| 248k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0004
$#=1
$1=snapfull0003.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:43 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0004.jpg 100% |*******************************| 330k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0005
$#=1
$1=snapfull0004.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:51 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0005.jpg 100% |*******************************| 308k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0006
$#=1
$1=snapfull0005.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:55 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0006.jpg 100% |*******************************| 315k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0007
$#=1
$1=snapfull0006.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:22:59 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0007.jpg 100% |*******************************| 316k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0008
$#=1
$1=snapfull0007.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:23:04 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0008.jpg 100% |*******************************| 317k --:--:-- ETA
# ./snap4.sh
More than one jpg file found.
variable num=0000
$#=1
$1=snapfull0008.jpg
Connecting to 127.0.0.1 (127.0.0.1:80)
127.0.0.1 127.0.0.1 - [05/Dec/2014:20:23:10 +0000] "GET /snapfull.php HTTP/1.1" 302 0 "-" "Wget"
snapfull0000.jpg 100% |*******************************| 318k --:--:-- ETA
What could be the cause of the sequence stopping after file number 8?

The problem is that leading 0s cause the number to be read as octal, and 8 and 9 are not valid octal digits.
In bash, using $((10#$num)) will force decimal. Thus:
num=$(printf "%04d" "$((10#$num + 1))")
To work with busybox ash, you'll need to strip the leading 0s instead. One way to do this which works even in busybox ash:
while [ "${num:0:1}" = 0 ]; do
    num=${num:1}
done
num=$(printf '%04d' "$((num + 1))")
See the below transcript showing use (tested with ash from busybox v1.22.1):
$ num=0008
$ while [ "${num:0:1}" = 0 ]; do
> num=${num:1}
> done
$ num=$(printf '%04d' "$((num + 1))")
$ echo "$num"
0009
If your shell doesn't support even the baseline set of parameter expansions required by POSIX, you could instead end up using:
num=$(echo "$num" | sed -e 's/^0*//')
num=$(printf '%04d' "$(($num + 1))")
...though this would imply that your busybox was built with a shell other than ash, a decision I would strongly suggest reconsidering.
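Putting the pieces together, the octal-safe increment can be factored into a small helper. This is a sketch, not the asker's exact script: the filename prefix, padding width, and camera URL are taken from the question, and the zero-stripping uses the POSIX `${n#0}` expansion rather than the `${num:0:1}` substring form, so it works in any POSIX shell.

```shell
#!/bin/sh
# Print the next zero-padded sequence number after $1
# (e.g. snapfull0008.jpg -> 0009), avoiding octal interpretation.
next_num() {
    n=${1#snapfull}                 # drop the alphabetic prefix
    n=${n%.jpg}                     # drop the extension
    while [ "${n#0}" != "$n" ]; do  # strip leading zeros (else 0008 is read as octal)
        n=${n#0}
    done
    printf '%04d' "$((n + 1))"
}

# In the snapshot script this would be used roughly as:
#   cd /mnt/0/foto
#   set -- *.jpg
#   while [ $# -gt 1 ]; do shift; done
#   [ "$1" = "*.jpg" ] && set -- snapfull0000.jpg
#   wget http://127.0.0.1/snapfull.php -O "snapfull$(next_num "$1").jpg"

next_num snapfull0008.jpg   # prints 0009
```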

Related

Cannot download with Curl and Wget in AWS EC2 Linux server

I am using the EC2 server with Putty.
I want to download the latest sonar-scanner to the EC2 server.
I tried to access the download-URL using both Wget & Curl but they kept failing with the same messages.
This is the server system I use: Red Hat Enterprise Linux Server 7.8 (Maipo)
WGET
GNU Wget 1.14 built on linux-gnu.
[root@ip-10-X-X-X ~]# wget -v https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.7.0.2747-linux.zip
--2022-06-09 09:56:55-- https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.7.0.2747-linux.zip
Resolving binaries.sonarsource.com (binaries.sonarsource.com)... 99.84.191.23, 99.84.191.71, 99.84.191.75, ...
Connecting to binaries.sonarsource.com (binaries.sonarsource.com)|99.84.191.23|:443... connected.
Unable to establish SSL connection.
CURL
curl 7.29.0 (x86_64-redhat-linux-gnu) libcurl/7.29.0 NSS/3.44 zlib/1.2.7 libidn/1.28 libssh2/1.8.0
Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtsp scp sftp smtp smtps telnet tftp
Features: AsynchDNS GSS-Negotiate IDN IPv6 Largefile NTLM NTLM_WB SSL libz unix-sockets
[root@ip-10-X-X-X ~]# curl -O -v https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.7.0.2747-linux.zip
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* About to connect() to binaries.sonarsource.com port 443 (#0)
* Trying 99.84.208.28...
* Connected to binaries.sonarsource.com (99.84.208.28) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
0 0 0 0 0 0 0 0 --:--:-- 0:00:29 --:--:-- 0* NSS error -5938 (PR_END_OF_FILE_ERROR)
* Encountered end of file
0 0 0 0 0 0 0 0 --:--:-- 0:00:30 --:--:-- 0
* Closing connection 0
curl: (35) Encountered end of file
I'm new to using this EC2 server. Do you know what I could do to solve this?
Thank you, any help would be really appreciated!
UPDATE:
I added -k and --no-check-certificate to curl and wget respectively, but they still return the same error messages.
I tried to check the wget connection, but it doesn't seem to work for URLs with a download endpoint:
[root@ip-10-70-10-87 settings]# wget -q --spider https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.7.0.2747-linux.zip
[root@ip-10-70-10-87 settings]# echo $?
4
[root@ip-10-70-10-87 settings]# wget -q --spider https://www.google.com/
[root@ip-10-70-10-87 settings]# echo $?
0
[root@ip-10-70-10-87 settings]# wget -q --spider https://dlcdn.apache.org/maven/maven-3/3.8.6/binaries/apache-maven-3.8.6-bin.tar.gz
[root@ip-10-70-10-87 settings]# echo $?
4
[root@ip-10-70-10-87 settings]# wget -q --spider https://assets.ctfassets.net/br4ichkdqihc/6jNPyoUDznu06Mk4dr9CEn/560e34fec221fad43a501442a551ad92/SimpliSafe_Outdoor_Battery_Camera_Open_Source_Disclosures_Launch.DOCX
[root@ip-10-70-10-87 settings]# echo $?
4
[root@ip-10-70-10-87 settings]# wget -q --spider https://twitter.com/home
[root@ip-10-70-10-87 settings]# echo $?
0
I checked for a configured proxy following this answer (i.e. env | grep -i proxy), and nothing came up in the response, so I assume I have no proxy configured.
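For reference, the wget manual documents exit status 4 as "network failure" and 5 as "SSL verification failure". The handshake failure here surfaces as 4, i.e. the connection dies before any certificate is checked, which is consistent with --no-check-certificate not helping. A small illustrative helper (a sketch; the mapping is taken from the wget manual for GNU wget >= 1.12) that decodes the statuses seen above:

```shell
# Decode the exit statuses documented in the wget manual (GNU wget >= 1.12).
wget_status() {
    case "$1" in
        0) echo "No problems occurred" ;;
        1) echo "Generic error code" ;;
        2) echo "Command-line parse error" ;;
        3) echo "File I/O error" ;;
        4) echo "Network failure" ;;
        5) echo "SSL verification failure" ;;
        6) echo "Username/password authentication failure" ;;
        7) echo "Protocol errors" ;;
        8) echo "Server issued an error response" ;;
        *) echo "Unknown exit status: $1" ;;
    esac
}

wget_status 4   # prints: Network failure
```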
Have you tried updating the OS and using the wget command without the -v flag, like this:
wget https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.7.0.2747-linux.zip
You can also add --no-check-certificate, or modify the ~/.wgetrc file and add
check_certificate = off
You can do these two things if you trust the host; hope this helps.
May I know if you are using any proxies for this EC2 instance? If so, could you try executing the wget command with the --no-proxy option?
Found a similar issue here; perhaps it's good to check for TLS version compatibility, as mentioned in the answers to: Unable to establish SSL connection upon wget on Ubuntu 14.04 LTS

Using echo and grep together

I want to use an echo and a grep statement together. I have tried many things but couldn't get the exact output that I want.
aa=$(grep -A100000 "2010-03-24" log.txt|grep "ORA")
echo "Ip-Address|Directory Name|${aa}" > output.txt
I am grepping the date because I want all the lines after the current date, and then grep "ORA" from them. There are other ways, but given my log file this is the most suitable one.
I am getting output like this:
10.46.162.86|ASD----Exception|2010-03-24 07 ORA-00001 - 80 -
173.45.230.59
2010-03-24 07:00:47 ORA-00942 - 80 - 173.45.230.59
2010-03-24 07:01:15 ORA-00001 - 80 - 173.45.230.59
2010-03-24 07:02:17 ORA-12849 - 80 - 173.45.230.59
2010-03-24 07:05:09 ORA-00001 - 80 - 173.45.230.59
The ideal output should look like this:
10.46.162.86|ASD----Exception|2010-03-24 07 ORA-00001 - 80 -
173.45.230.59
10.46.162.86|ASD----Exception|2010-03-24 07:00:47 ORA-00942 - 80 -
173.45.230.59
10.46.162.86|ASD----Exception|2010-03-24 07:01:15 ORA-00001 - 80 -
173.45.230.59
10.46.162.86|ASD----Exception|2010-03-24 07:02:17 ORA-12849 - 80 -
173.45.230.59
10.46.162.86|ASD----Exception|2010-03-24 07:05:09 ORA-00001 - 80 -
173.45.230.59
I am fetching the ORA lines from log files in different directories.
The input looks like this:
2010-03-22 07:00:47 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00001 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)
2010-03-22 07:00:47 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/2009/10/yep-twitter-down.ht
2010-03-22 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/img/input-bg.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-23 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00001 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-23 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/img/topnav-about.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319
2010-03-23 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/img/entry-hr.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319+Firefox
2010-03-23 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00001 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-24 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/img/header-bg.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319
2010-03-24 07:00:48 ZZZZC941948879 RUFFLES 222.222.222.222 GET
/img/bullet.gif - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319+Firefox
2010-03-24 07:00:49 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00001 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-24 07:00:49 ZZZZC941948879 RUFFLES 222.222.222.222 GET /img/bg-
module.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319
2010-03-24 07:00:50 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00942 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-24 07:00:50 ZZZZC941948879 RUFFLES 222.222.222.222 GET /img/bg-
sidebarul.jpg - 80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+
(Windows;+U;+Windows+NT+9.0;+en-US;+rv:1.9.2.2)+Gecko/20100319
2010-03-24 07:00:50 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00001 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
2010-03-24 07:00:51 ZZZZC941948879 RUFFLES 222.222.222.222 ORA-00942 -
80 - 98.88.35.133 HTTP/1.1 Mozilla/5.0+(Windows;+U;+Windows+NT+9.0;+en-
US;+rv:1.9.2.2)+Gecko/20100319+Firefox/3.9.2
The problem is that the grep operation fetches 100 or more lines, depending on the exception, but I am able to prepend the IP address and node name to one line only.
Also, the IP address and node name are generated at run time.
Please suggest a way to get the desired output.
Thanks.
Since I just know that special characters are going to show up in the directory names, I'd prefer awk over sed for this to avoid code injection problems:
grep -A100000 "2010-03-24" log.txt | awk -v prefix="IP-Address|Directory name|" '/ORA/ { print prefix $0 }' > output.txt
The relevant part is
awk -v prefix="IP-Address|Directory name|" '/ORA/ { print prefix $0 }'
With -v prefix=value, a variable named prefix with the given value is made known to awk, and /ORA/ { print prefix $0 } instructs awk to process all lines that match the regex ORA by printing prefix followed by the line (which is $0).
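For example, run against two sample lines (fabricated to match the log format in the question, with the IP and directory name from its desired output), only the ORA line is printed, with the prefix attached:

```shell
# Demonstrate the awk prefixing on two sample lines in the question's format.
printf '%s\n' \
  '2010-03-24 07:00:47 ORA-00942 - 80 - 173.45.230.59' \
  '2010-03-24 07:00:48 GET /img/bullet.gif' |
  awk -v prefix="10.46.162.86|ASD----Exception|" '/ORA/ { print prefix $0 }'
# prints: 10.46.162.86|ASD----Exception|2010-03-24 07:00:47 ORA-00942 - 80 - 173.45.230.59
```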
@etanreisner gave you the answer.
One way (using IFS= read -r so that leading whitespace and backslashes in the log lines survive):
grep -A100000 "2010-03-24" log.txt | grep "ORA" |
while IFS= read -r aa
do
    echo "Ip-Address|Directory Name|${aa}"
done > output.txt

Block User Agent when it is a number - htaccess

I've been receiving a lot of visits to my site from bad bots.
The pattern is this:
190.204.58.162 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.0" 200 318 mysite.com "-" "881087" "-"
201.243.204.1 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.0" 200 318 mysite.com "-" "442762" "-"
200.109.59.218 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.0" 200 318 mysite.com "-" "717724" "-"
113.140.25.4 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.1" 200 318 mysite.com "-" "360319" "-"
183.136.221.6 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.1" 200 318 mysite.com "-" "989851" "-"
195.154.78.122 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.0" 200 318 mysite.com "-" "122984" "-"
59.151.103.52 - - [20/Oct/2014:16:46:54 +0200] "GET / HTTP/1.1" 200 318 mysite.com "-" "375843" "-"
Different IP and different user-agent.
However, the user-agent is always numeric, and it is normally 6 characters long.
For example on the first line, the user-agent is "881087" instead of being something like "Chrome", "Opera", "Safari", etc.
Does anyone know how to block it via .htaccess?
Sure, you can block that; how depends on the platform, PHP or .NET.
Personally I would test whether the User-Agent is numeric at the application level: if it is, die() in PHP, or end the response in .NET.
As far as .htaccess goes, you can try a regex on the User-Agent.
Please let me know if you want the exact script for any of the above.
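An illustrative .htaccess rule for this (a sketch assuming mod_rewrite is enabled; the pattern forbids any request whose User-Agent header consists solely of digits):

```apache
RewriteEngine On
# Forbid requests whose User-Agent is nothing but digits (e.g. "881087")
RewriteCond %{HTTP_USER_AGENT} ^[0-9]+$
RewriteRule .* - [F,L]
```

Without mod_rewrite, SetEnvIf User-Agent with the same regex plus a Deny from env= directive should achieve the same effect.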

Why is MiniRedir losing authentication?

I'm trying to use this project to integrate WebDAV into my .NET MVC2 application.
I've traced the traffic from Office to my WebDAV server and compared it to this example of how Office determines whether the document should be read-only or editable.
After Office successfully authenticates with the server I see these requests as the document is opening.
2014-07-22 18:41:36 127.0.0.1 OPTIONS / - 80 username@mydomain.com 127.0.0.1 Microsoft+Office+Protocol+Discovery 200 0 0 23
2014-07-22 18:41:36 127.0.0.1 OPTIONS /wordstorage - 80 username@mydomain.com 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 5
2014-07-22 18:41:36 127.0.0.1 PROPFIND /wordstorage - 80 username@mydomain.com 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 29
2014-07-22 18:41:36 127.0.0.1 PROPFIND /wordstorage - 80 username@mydomain.com 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 10
2014-07-22 18:41:36 127.0.0.1 OPTIONS / - 80 - 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 403 0 0 7
2014-07-22 18:41:36 127.0.0.1 PROPFIND /wordstorage - 80 - 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 302 0 0 9
2014-07-22 18:41:36 127.0.0.1 PROPFIND /Account/LogOn ReturnUrl=%2fwordstorage 80 - 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 29
2014-07-22 18:42:25 127.0.0.1 PROPFIND /wordstorage - 80 username@mydomain.com 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 33
2014-07-22 18:42:25 127.0.0.1 PROPFIND /wordstorage - 80 username@mydomain.com 127.0.0.1 Microsoft-WebDAV-MiniRedir/6.1.7601 200 0 0 6
2014-07-22 18:42:59 127.0.0.1 GET /wordstorage/Test-2.docx - 80 username@mydomain.com 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.1;+WOW64;+Trident/7.0;+SLCC2;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30729;+Media+Center+PC+6.0;+.NET4.0C;+.NET4.0E;+InfoPath.2;+IPH+1.1.21.4019;+MSOffice+12) 200 0 0 37
2014-07-22 18:42:59 127.0.0.1 HEAD /wordstorage/Test-2.docx - 80 username@mydomain.com 127.0.0.1 Microsoft+Office+Existence+Discovery 200 0 0 186
The first two OPTIONS and PROPFIND requests return a 200 OK, but the third OPTIONS request is denied with a 403 Forbidden code.
If authentication is successful why would MiniRedir not send authentication with the OPTIONS request?
Here's my environment:
Win 7
Office 2007
IIS 7.5
Have you checked whether the IIS WebDAV module is disabled?
It seems it may cause problems if it isn't.
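If the MVC application is meant to serve the WebDAV verbs itself, the stock IIS module and handler can intercept them first. A sketch of the web.config fragment that removes them ("WebDAVModule" and "WebDAV" are the names IIS 7.5 typically registers; verify them against your applicationHost.config):

```xml
<system.webServer>
  <modules>
    <remove name="WebDAVModule" />
  </modules>
  <handlers>
    <remove name="WebDAV" />
  </handlers>
</system.webServer>
```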

Question about how to make a filter using script

I'm trying to write a filter script to make this happen:
Before:
123.125.66.126 - - [05/Apr/2010:09:18:12 -0300] "GET / HTTP/1.1" 302 290
66.249.71.167 - - [05/Apr/2010:09:18:13 -0300] "GET /robots.txt HTTP/1.1" 404 290
66.249.71.167 - - [05/Apr/2010:09:18:13 -0300] "GET /~leonardo_campos/IFBA/Web_Design_Aula_17.pdf HTTP/1.1" 404 324
After:
[05/Apr/2010:09:18:12 -0300] / 302 290
[05/Apr/2010:09:18:13 -0300] /robots.txt 404 290
[05/Apr/2010:09:18:13 -0300] /~leonardo_campos/IFBA/Web_Design_Aula_17.pdf 404 324
If someone could help it would be great...
Thanks in advance !
Supporting all HTTP methods:
sed 's#.*\(\[[^]]*\]\).*"[A-Z]* \(.*\) HTTP/[0-9.]*" \(.*\)#\1 \2 \3#'
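As a quick check, that sed reproduces the desired output on the question's sample lines (sample data copied from the question):

```shell
# The sed keeps only the bracketed timestamp, the request path, and the
# trailing status/size fields, for any HTTP method.
printf '%s\n' \
  '123.125.66.126 - - [05/Apr/2010:09:18:12 -0300] "GET / HTTP/1.1" 302 290' \
  '66.249.71.167 - - [05/Apr/2010:09:18:13 -0300] "GET /robots.txt HTTP/1.1" 404 290' |
  sed 's#.*\(\[[^]]*\]\).*"[A-Z]* \(.*\) HTTP/[0-9.]*" \(.*\)#\1 \2 \3#'
# prints:
# [05/Apr/2010:09:18:12 -0300] / 302 290
# [05/Apr/2010:09:18:13 -0300] /robots.txt 404 290
```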
This seems a perfect job for sed.
You can easily construct "s" replacement patterns to remove the unwanted pieces of each line.
sed is your friend here, with regexps.
sed 's/^\(\[.*\]\) "GET \(.*\) .*" \(.*\)$/\1 \2 \3/'
If your file structure is always like that, you can just use the fields; no need for a complex regex:
$ awk '{print $4,$5,$7,$9,$10}' file
[05/Apr/2010:09:18:12 -0300] / 302 290
[05/Apr/2010:09:18:13 -0300] /robots.txt 404 290
[05/Apr/2010:09:18:13 -0300] /~leonardo_campos/IFBA/Web_Design_Aula_17.pdf 404 324
