Automated script to zip IIS logs? - iis

I'd like to write a script/batch that will bunch up my daily IIS logs and zip them up by month.
ex080801.log which is in the format of exyymmdd.log
ex080801.log - ex080831.log gets zipped up and the log files deleted.
The reason we do this is that on a heavy site a log file for one day can be 500 MB to 1 GB, so we zip them up, which compresses them by 98%, and dump the real log file. We use WebTrends to analyze the log files, and it is capable of reading them from inside a zip file.
Does anyone have any ideas on how to script this or would be willing to share some code?

You'll need a command line tool to zip up the files. I recommend 7-Zip which is free and easy to use. The self-contained command line version (7za.exe) is the most portable choice.
Here's a two-line batch file that would zip the log files and delete them afterwards:
7za.exe a -tzip ex%1-logs.zip %2\ex%1*.log
del %2\ex%1*.log
The first parameter is the 4 digit year-and-month, and the second parameter is the path to the directory containing your logs. For example: ziplogs.bat 0808 c:\logs
It's possible to get more elaborate (e.g. searching the filenames to determine which months to archive). You might want to check out the Windows FINDSTR command for searching input text with regular expressions.
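If a scripting language is an option, the same monthly roll-up can be sketched in Python with the standard-library zipfile module. This is only a sketch of the approach, assuming the exYYMMDD.log naming from the question; the function name and directory layout are illustrative:

```python
import os
import re
import zipfile
from collections import defaultdict

def zip_logs_by_month(log_dir):
    """Group exYYMMDD.log files by month, zip each group, delete originals."""
    by_month = defaultdict(list)
    for name in os.listdir(log_dir):
        m = re.match(r"ex(\d{4})\d{2}\.log$", name)  # exYYMMDD.log -> YYMM
        if m:
            by_month[m.group(1)].append(name)
    for yymm, names in by_month.items():
        archive = os.path.join(log_dir, "ex%s-logs.zip" % yymm)
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for name in sorted(names):
                zf.write(os.path.join(log_dir, name), arcname=name)
        # Only delete the logs once the archive has been written.
        for name in names:
            os.remove(os.path.join(log_dir, name))
```

Like the batch version, it deletes logs only after the month's archive exists.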

Here's my script, which basically adapts David's: it zips up last month's logs, moves the archive, and deletes the original log files. It can be adapted for Apache logs too.
The only problem with this is that you may need to edit the replace commands if your locale's %DATE% format includes the day of the week.
You'll also need to install 7-Zip.
You can also download IISLogsLite, but it compresses each day's file into a single zip file, which I didn't find useful. There is a VBScript floating around the web that does the same thing.
-------------------------------------------------------------------------------------
@echo on
:: Name - iislogzip.bat
:: Description - Server Log File Manager
::
:: History
:: Date Author Change
:: 27-Aug-2008 David Crow Original (found on stack overflow)
:: 15-Oct-2008 AIMackenzie Slimmed down commands
:: ========================================================
:: setup variables and parameters
:: ========================================================
:: generate date and time variables
set month=%DATE:~3,2%
set year=%DATE:~8,2%
:: Get last month and check edge conditions.
:: Prefix with 1 before the subtraction so that "08" and "09"
:: are not treated as invalid octal numbers by SET /A.
set /a lastmonth=1%month% - 101
if %lastmonth% equ 0 set /a year=1%year% - 101
if %lastmonth% equ 0 set lastmonth=12
if %lastmonth% lss 10 set lastmonth=0%lastmonth%
if %year% lss 10 set year=0%year%
set yymm=%year%%lastmonth%
set "logpath=C:\WINDOWS\system32\LogFiles"
set "zippath=C:\Program Files\7-Zip\7z.exe"
set "arcpath=C:\WINDOWS\system32\LogFiles\WUDF"
:: ========================================================
:: Change to log file path
:: ========================================================
cd /D "%logpath%"
:: ========================================================
:: zip last month's IIS log files, move zipped file to archive,
:: then delete old logs
:: ========================================================
"%zippath%" a -tzip "ex%yymm%-logs.zip" "%logpath%\ex%yymm%*.log"
move "%logpath%\*.zip" "%arcpath%"
del "%logpath%\ex%yymm%*.log"
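The fragile part of the batch file above is parsing %DATE%, which is locale-dependent. For comparison, here is how the same "last month as yymm" value can be computed in Python, where the date arithmetic is locale-independent (a sketch of just this step; the function name is mine):

```python
import datetime

def last_month_yymm(today=None):
    """Return last month in yymm form, e.g. '0812' for any day in January 2009."""
    today = today or datetime.date.today()
    first_of_month = today.replace(day=1)
    # The day before the 1st is always the last day of the previous month,
    # which handles the January -> December year rollover for free.
    last_month_end = first_of_month - datetime.timedelta(days=1)
    return last_month_end.strftime("%y%m")
```

The subtract-one-day trick avoids all of the month/year edge-condition checks the batch file needs.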

We use a script like the following. gzip here comes from the Cygwin project; I'm sure you could modify the syntax to use a zip tool instead. The "skip" argument is the number of newest files to leave un-archived -- we keep 11 days' worth in the 'current' directory.
@echo off
setlocal
For /f "skip=11 delims=/" %%a in ('Dir D:\logs\W3SVC1\*.log /B /O:-N /T:C')do move "D:\logs\W3SVC1\%%a" "D:\logs\W3SVC1\old\%%a"
d:
cd "\logs\W3SVC1\old"
gzip -n *.log
Endlocal
exit
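The same "keep the newest N, archive the rest" policy can be sketched in Python, which avoids the Cygwin dependency since gzip is in the standard library (directory names and the function name are illustrative):

```python
import glob
import gzip
import os
import shutil

def archive_old_logs(log_dir, old_dir, keep=11):
    """Move all but the newest `keep` logs to old_dir and gzip them there."""
    logs = sorted(glob.glob(os.path.join(log_dir, "*.log")),
                  key=os.path.getctime, reverse=True)
    os.makedirs(old_dir, exist_ok=True)
    for path in logs[keep:]:  # skip the newest `keep` files
        dest = os.path.join(old_dir, os.path.basename(path))
        shutil.move(path, dest)
        # Compress in place, then drop the uncompressed copy.
        with open(dest, "rb") as src, gzip.open(dest + ".gz", "wb") as gz:
            shutil.copyfileobj(src, gz)
        os.remove(dest)
```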

You can grab the command-line utilities package from DotNetZip to get tools for creating zips from scripts. There's a nice little tool called Zipit.exe that runs on the command line and adds files or directories to zip files. It's fast and efficient.
A better option might be to just do the zipping from within PowerShell.
function ZipUp-Files ( $directory )
{
    $children = get-childitem -path $directory
    foreach ($o in $children)
    {
        if ($o.Name -ne "TestResults" -and
            $o.Name -ne "obj" -and
            $o.Name -ne "bin" -and
            $o.Name -ne "tfs" -and
            $o.Name -ne "notused" -and
            $o.Name -ne "Release")
        {
            if ($o.PSIsContainer)
            {
                ZipUp-Files ( $o.FullName )
            }
            else
            {
                if ($o.Name -ne ".tfs-ignore" -and
                    !$o.Name.EndsWith(".cache") -and
                    !$o.Name.EndsWith(".zip") )
                {
                    Write-output $o.FullName
                    $e = $zipfile.AddFile($o.FullName)
                }
            }
        }
    }
}

[System.Reflection.Assembly]::LoadFrom("c:\bin\Ionic.Zip.dll");
$zipfile = new-object Ionic.Zip.ZipFile("zipsrc.zip");
ZipUp-Files "DotNetZip"
$zipfile.Save()

Borrowed zip function from http://blogs.msdn.com/daiken/archive/2007/02/12/compress-files-with-windows-powershell-then-package-a-windows-vista-sidebar-gadget.aspx
Here is a PowerShell answer that works wonders:
param([string]$Path = $(read-host "Enter the path"))

function New-Zip
{
    param([string]$zipfilename)
    set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}

function Add-Zip
{
    param([string]$zipfilename)
    if(-not (test-path($zipfilename)))
    {
        set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }
    $shellApplication = new-object -com shell.application
    $zipPackage = $shellApplication.NameSpace($zipfilename)
    foreach($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Start-sleep -milliseconds 500
    }
}

$FilesToZip = dir $Path -recurse -include *.log
foreach ($file in $FilesToZip) {
    New-Zip $file.BaseName
    dir $($file.directoryname+"\"+$file.name) | Add-zip $($file.directoryname+"\$($file.basename).zip")
    del $($file.directoryname+"\"+$file.name)
}
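The magic string written by New-Zip ("PK" + [char]5 + [char]6 + 18 zero bytes) is the 22-byte end-of-central-directory record of an empty zip archive, which is what makes the shell treat the new file as a zip it can copy into. That can be verified from Python's standard library:

```python
import io
import zipfile

# An empty zip file is nothing but the end-of-central-directory record:
# the signature "PK\x05\x06" followed by 18 zero bytes (22 bytes total).
EMPTY_ZIP = b"PK\x05\x06" + b"\x00" * 18

assert len(EMPTY_ZIP) == 22
# zipfile recognizes the 22-byte stub as a valid, empty archive.
with zipfile.ZipFile(io.BytesIO(EMPTY_ZIP)) as zf:
    assert zf.namelist() == []
```

Writing a ZipFile in "w" mode and closing it without adding entries produces exactly these bytes.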

We use this powershell script: http://gallery.technet.microsoft.com/scriptcenter/31db73b4-746c-4d33-a0aa-7a79006317e6
It uses 7-Zip and verifies the files before deleting them.

Regex will do the trick... create a Perl/Python/PHP script to do the job for you.
Plain Windows batch files can't do real regular expressions, although FINDSTR supports a limited regex syntax.
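For example, a few lines of Python regex are enough to pick out the exYYMMDD.log names and derive the month bucket to archive into (a sketch of the matching step only; zipping would follow as in the other answers):

```python
import re

# exYYMMDD.log, per the naming convention in the question.
LOG_RE = re.compile(r"^ex(?P<yy>\d{2})(?P<mm>\d{2})(?P<dd>\d{2})\.log$")

def month_key(filename):
    """Return 'yymm' for an IIS log name like ex080801.log, else None."""
    m = LOG_RE.match(filename)
    return m.group("yy") + m.group("mm") if m else None
```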


(Batch) How to get text inside href=""?

I have an .htm file with href="example.com/page" somewhere in its source code. How can I get the link between the " "?
So far I have tried modifying this piece of code:
@echo off
setlocal EnableDelayedExpansion
set "str="
set "string=stuff href="example.com/page"end morestuff"
set string=!string:href=^
!
set string=!string:end=^
!
FOR /F skip^=1eol^= %%S in ("!string!") do if NOT DEFINED str set "str=%%S"
echo(!str!
pause > nul
However, on line 6 it appears that changing href to href=" breaks the code, and changing end to " also breaks something. I would like to know if it is possible to fix this, or if there is an alternative?
I believe you just want the example.com/page part:
@echo off
set "string=stuff href="example.com/page"end morestuff"
for /f tokens^=2delims^="" %%a in ("%string%") do set "substr=%%a"
echo %substr%
Reading from file as per comment:
@echo off
set "file=file.txt"
for /f tokens^=2delims^="" %%a in ('findstr /IRC:"href=" "%file%"') do set "substr=%%a"
echo %substr%
Seeing that you require reading from an html file, I would recommend using something a little more robust, like powershell.
Create a file with a .ps1 extension, paste in the content, and make sure you put the path and filename of your file in $file_path, replacing file.txt:
$file_path = 'file.txt'
$rgx = '(?<=href\=").*?(?=">)'
select-string -Path $file_path -Pattern $rgx -AllMatches | % { $_.Matches } | % { $_.Value }
now you can either run it from cmd:
powershell -File test_url.ps1
Or simply open powershell cli and run directly from there:
.\test_url.ps1
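The same extraction is just as short in Python, if that's available; a capturing group is a simpler alternative to the lookbehind/lookahead pattern used above (the function name is mine):

```python
import re

def extract_hrefs(text):
    """Return every value found inside href="..." in the given text."""
    return re.findall(r'href="([^"]*)"', text)
```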

Possible to convert xlsx file to csv file with bash script? [duplicate]

How do you convert multiple xlsx files to csv files with a batch script?
Try in2csv!
Usage:
in2csv file.xlsx > file.csv
Alternative way of converting to csv. Use libreoffice:
libreoffice --headless --convert-to csv *
Please be aware that this will only convert the first worksheet of your Excel file.
Get all the file items, filter them by suffix, and then use the Excel COM object from PowerShell to save the Excel files as CSV files.
$excelApp = New-Object -ComObject Excel.Application
$excelApp.DisplayAlerts = $false
Get-ChildItem -File -Filter '*.xlsx' | ForEach-Object {
    $workbook = $excelApp.Workbooks.Open($_.FullName)
    $csvFilePath = $_.FullName -replace "\.xlsx$", ".csv"
    $workbook.SaveAs($csvFilePath, [Microsoft.Office.Interop.Excel.XlFileFormat]::xlCSV)
    $workbook.Close()
}
You can find the complete sample here How to convert Excel xlsx file to csv file in batch by PowerShell
To follow up on the answer by user183038, here is a shell script to batch convert all xlsx files to csv while preserving the file names. The xlsx2csv tool needs to be installed prior to running.
for i in *.xlsx; do
    filename=$(basename "$i" .xlsx)
    xlsx2csv "$i" "$filename.csv"
done
You need an external tool, for example: SoftInterface.com - Convert XLSX to CSV.
After installing it, you can use following command in your batch:
"c:\Program Files\Softinterface, Inc\Convert XLS\ConvertXLS.EXE" /S"C:\MyExcelFile.xlsx" /F51 /N"Sheet1" /T"C:\MyExcelFile.CSV" /C6 /M1 /V
This needs Excel installed, as it uses the Excel.Application COM object. Save this as a .bat file:
@if (@X)==(@Y) @end /* JScript comment
@echo off
cscript //E:JScript //nologo "%~f0" %*
exit /b %errorlevel%
@if (@X)==(@Y) @end JScript comment */
var ARGS = WScript.Arguments;
var xlCSV = 6;
var objExcel = WScript.CreateObject("Excel.Application");
var objWorkbook = objExcel.Workbooks.Open(ARGS.Item(0));
objExcel.DisplayAlerts = false;
objExcel.Visible = false;
var objWorksheet = objWorkbook.Worksheets(ARGS.Item(1))
objWorksheet.SaveAs( ARGS.Item(2), xlCSV);
objExcel.Quit();
It accepts three arguments - the absolute path to the xlsx file, the sheet name and the absolute path to the target csv file:
call toCsv.bat "%cd%\Book1.xlsx" Sheet1 "%cd%\csv.csv"
Adding to @marbel's answer (which is a great suggestion!), here's the script that worked for me on Mac OS X El Capitan's Terminal, for batch conversion (since that's what the OP asked). I thought it would be trivial to do a for loop, but it wasn't! (I had to change the extension by string manipulation, and it looks like Mac's bash is a bit different too.)
for x in *.xlsx; do in2csv "$x" > "${x%.xlsx}.csv"; echo "${x%.xlsx}.csv done."; done
Note:
${x%.xlsx} is bash string manipulation which clips .xlsx from the end of the string.
in2csv creates separate csv files (it doesn't overwrite the xlsx files).
Globbing with *.xlsx and quoting "$x" (rather than parsing ls output) also keeps the loop safe for filenames containing white space.

How to rename all files in a folder and create a renaming map

Note: I have access to both Linux and Windows platform so answers for any of these platforms are fine.
I have a folder which contains less than 10K .png files. I would like to:
1. rename all files as follows:
<some_filename>.png to 0001.png
<some_other_name>.png to 0002.png
<another_name>.png to 0003.png
and so on...
2. keep this name mapping in a file (see 1 for mapping)
In Windows: This should sort the list alphabetically and rename them all with numbers, padded to 4 characters.
It writes the bat file that does the renaming. You can examine the file before running it, and it doubles as a map of the filenames.
Filenames with ! characters will probably be an issue.
@echo off
setlocal enabledelayedexpansion
set c=0
for %%a in (*.png) do (
set /a c=c+1
set num=0000!c!
set num=!num:~-4!
>>renfile.bat.txt echo ren "%%a" "!num!%%~xa"
)
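Since the question allows either platform, the same rename-and-record idea can be sketched in Python, which runs unchanged on both Windows and Linux (the function and map-file names are mine):

```python
import os

def rename_with_map(directory, map_name="renaming-map.txt"):
    """Rename *.png files to 0001.png, 0002.png, ... and record the mapping."""
    names = sorted(n for n in os.listdir(directory) if n.endswith(".png"))
    with open(os.path.join(directory, map_name), "w") as map_file:
        for i, old in enumerate(names, start=1):
            new = "%04d.png" % i  # zero-padded to 4 digits, as requested
            map_file.write("%s %s\n" % (old, new))
            # Note: assumes no source file is already named like 0001.png,
            # otherwise a rename could clobber a not-yet-processed file.
            os.rename(os.path.join(directory, old),
                      os.path.join(directory, new))
```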
To rename all .png files in the current directory and to save the renaming map to renaming-map.txt file:
$ perl -E'while (<*.png>) { $new = sprintf q(%04d.png), ++$i; say qq($_ $new);
rename($_, $new) }' > renaming-map.txt
For example, given the following directory content:
$ ls
a.png b.png c.png d.png e.png f.png g.png h.png i.png j.png
It produces:
$ perl -E'while (<*.png>) { $new = sprintf q(%04d.png), ++$i; say qq($_ $new);
rename($_, $new) }'
a.png 0001.png
b.png 0002.png
c.png 0003.png
d.png 0004.png
e.png 0005.png
f.png 0006.png
g.png 0007.png
h.png 0008.png
i.png 0009.png
j.png 0010.png
Result:
$ ls
0001.png 0003.png 0005.png 0007.png 0009.png
0002.png 0004.png 0006.png 0008.png 0010.png
It should work both on Windows and Linux if perl is available (replace perl -E'...' with perl -E "..." on Windows (single -> double quotes)).

How to know from a fullname string if a path is a folder or a file?

Sorry for the title. I'll try to explain better.
Let's suppose that I run this command in order to get all paths of a directory and I want to redirect them to a text file.
gci e:\mytree -r | % {$_.fullname} | out-file e:\folderstructure.txt
Now I need to recreate this nested structure using new-item cmdlet.
Let's suppose now that I run this command:
gc e:\folderstructure.txt | % {
[system.io.fileinfo]$info = $_
write-host $info $info.extension
}
that produces this output:
E:\mytree\folder1
E:\mytree\folder2
E:\mytree\folder3
E:\mytree\file1.txt .txt
E:\mytree\file12.txt .txt
E:\mytree\folder1\folder.with.dots .dots
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt .txt
E:\mytree\folder3\file4.doc .doc
As you can see, folder.with.dots is a folder, but "it's seen" as a file (it gives me a .dots extension) because it contains dots within its name. If I don't know all the possible extensions of my files, is there any property that can tell me whether an object is a container or not, so that I can use new-item with the right switch (file or directory) to create it?
I hope you have understood my problem despite my English. Thanks in advance.
edit. UPDATE after JPBlanc answer
Thank you very much. :)
I was trying in this way:
gc e:\folderstructure.txt | % {
[system.io.directoryinfo]$info = $_
if ($info.psiscontainer) {
write-host "$info is a folder" }
else {
write-host "$info is a file" }
}
and the output was:
E:\mytree\folder1 is a file
E:\mytree\folder2 is a file
E:\mytree\folder3 is a file
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a file
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
Following your advice:
gc e:\folderstructure.txt | % {
[system.io.directoryinfo]$info = $_
if ((get-item $info).psiscontainer) {
write-host "$info is a folder" }
else {
write-host "$info is a file" }
}
E:\mytree\folder1 is a folder
E:\mytree\folder2 is a folder
E:\mytree\folder3 is a folder
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a folder
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
everything works fine. Now I can achieve my goal. Thanks again.
LAST EDIT.
I had another idea. I decided to check whether an object is a file or a folder before creating the txt file. After some difficulties (for example, I soon discovered that I can't pipe format-table, which I was using to hide the table headers, to export-csv:
http://blogs.msdn.com/b/powershell/archive/2007/03/07/why-can-t-i-pipe-format-table-to-export-csv-and-get-something-useful.aspx )
I came up with this solution:
gci e:\mytree -r |
select fullname,@{n='folder';e={ switch ($_.psiscontainer) {
        $true  {1}
        $false {0}
    }
  }
} | % {($_.fullname,$_.folder) -join ","} | out-file e:\structure.txt
that gets me this output:
E:\mytree\folder1,1
E:\mytree\folder2,1
E:\mytree\folder3,1
E:\mytree\file1.txt,0
E:\mytree\file12.txt,0
E:\mytree\folder1\folder.with.dots,1
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt,0
E:\mytree\folder3\file4.doc,0
So I can easily split two parameters and use new-item cmdlet accordingly to object type.
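The "fullname plus an is-folder flag" dump format from the last snippet is easy to reproduce in other languages too; here is a Python sketch of the same output (the function name is mine, os.path.isdir playing the role of PSIsContainer):

```python
import os

def dump_structure(root, out_path):
    """Write 'fullname,1' for directories and 'fullname,0' for files."""
    with open(out_path, "w") as out:
        for dirpath, dirnames, filenames in os.walk(root):
            for name in dirnames:
                out.write(os.path.join(dirpath, name) + ",1\n")
            for name in filenames:
                out.write(os.path.join(dirpath, name) + ",0\n")
```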
Both the FileInfo and DirectoryInfo types have a PSIsContainer property that allows you to see whether the object is a directory or not.
PS C:\temp> (Get-Item 'Hello world.exe').PSIsContainer
False
PS C:\temp> (Get-Item 'c:\temp').PSIsContainer
True
I modified your code slightly to this:
gc c:\scripts\files.txt | % {
$item = Get-item $_
Write-Host $item.fullName $item.PSIscontainer
}
Now, my output looks like this:
C:\Scripts\mytree\folder1 True
C:\Scripts\mytree\folder2 True
C:\Scripts\mytree\file1.txt False
C:\Scripts\mytree\file2.txt False
C:\Scripts\mytree\folder.with.dots True
C:\Scripts\mytree\folder.with.dots\file inside folder with dots.txt False

How to correct this PowerShell script that auto deletes Sharepoint backups

I've been using this (and this) script to delete older sharepoint backups, but it deletes all backups rather than the 14+ day old ones.
I ran it through powershell_ise.exe and put a break point under the line that has $_.SPStartTime in it, and it shows $_.SPStartTime = as if the date isn't populated. I looked inside $sp.SPBackupRestoreHistory.SPHistoryObject and that contains the data I expect.
The part that has the issue is on these lines:
# Find the old backups in spbrtoc.xml
$old = $sp.SPBackupRestoreHistory.SPHistoryObject |
? { $_.SPStartTime -lt ((get-date).adddays(-$days)) }
I get all of the dates output (which I would expect). This tells me the problem is in the 'where' or '?' -- I understand they are interchangeable. Regardless, $old always appears to be null.
As Requested:
<?xml version="1.0" encoding="utf-8"?>
<SPBackupRestoreHistory>
<SPHistoryObject>
<SPId>a8a03c50-6bc2-4af4-87b3-caf60e750fa0</SPId>
<SPRequestedBy>ASERVER\AUSER</SPRequestedBy>
<SPBackupMethod>Full</SPBackupMethod>
<SPRestoreMethod>None</SPRestoreMethod>
<SPStartTime>01/09/2011 00:00:13</SPStartTime>
<SPFinishTime>01/09/2011 00:05:22</SPFinishTime>
<SPIsBackup>True</SPIsBackup>
<SPConfigurationOnly>False</SPConfigurationOnly>
<SPBackupDirectory>E:\Backups\spbr0003\</SPBackupDirectory>
<SPDirectoryName>spbr0003</SPDirectoryName>
<SPDirectoryNumber>3</SPDirectoryNumber>
<SPTopComponent>Farm</SPTopComponent>
<SPTopComponentId>689d7f0b-4f64-45d4-ac58-7ab225223625</SPTopComponentId>
<SPWarningCount>0</SPWarningCount>
<SPErrorCount>0</SPErrorCount>
</SPHistoryObject>
<SPHistoryObject>
<SPId>22dace04-c300-41d0-a9f1-7cfe638809ef</SPId>
<SPRequestedBy>ASERVER\AUSER</SPRequestedBy>
<SPBackupMethod>Full</SPBackupMethod>
<SPRestoreMethod>None</SPRestoreMethod>
<SPStartTime>01/08/2011 00:00:13</SPStartTime>
<SPFinishTime>01/08/2011 00:05:26</SPFinishTime>
<SPIsBackup>True</SPIsBackup>
<SPConfigurationOnly>False</SPConfigurationOnly>
<SPBackupDirectory>E:\Backups\spbr0002\</SPBackupDirectory>
<SPDirectoryName>spbr0002</SPDirectoryName>
<SPDirectoryNumber>2</SPDirectoryNumber>
<SPTopComponent>Farm</SPTopComponent>
<SPTopComponentId>689d7f0b-4f64-45d4-ac58-7ab225223625</SPTopComponentId>
<SPWarningCount>0</SPWarningCount>
<SPErrorCount>0</SPErrorCount>
</SPHistoryObject>
</SPBackupRestoreHistory>
I believe the issue was to do with the date formatting.
Final working script:
# Location of spbrtoc.xml
$spbrtoc = "E:\Backups\spbrtoc.xml"
# Days of backup that will be remaining after backup cleanup.
$days = 14
# Import the Sharepoint backup report xml file
[xml]$sp = gc $spbrtoc
# Find the old backups in spbrtoc.xml
$old = $sp.SPBackupRestoreHistory.SPHistoryObject |
? { ( (
[datetime]::ParseExact($_.SPStartTime, "MM/dd/yyyy HH:mm:ss", [System.Globalization.CultureInfo]::InvariantCulture)
) -lt (get-date).adddays(-$days)
)
}
if ($old -eq $Null) { write-host "No reports of backups older than $days days found in spbrtoc.xml.`nspbrtoc.xml isn't changed and no files are removed.`n" ; break}
# Delete the old backups from the Sharepoint backup report xml file
$old | % { $sp.SPBackupRestoreHistory.RemoveChild($_) }
# Delete the physical folders in which the old backups were located
$old | % { Remove-Item $_.SPBackupDirectory -Recurse }
# Save the new Sharepoint backup report xml file
$sp.Save($spbrtoc)
Write-host "Backup(s) entries older than $days days are removed from spbrtoc.xml and harddisc."
It looks to me like you're ending up with the comparison being string-based rather than date-based, so for example:
"10/08/2007 20:20:13" -lt (Get-Date -Year 1900)
casts the DateTime on the right to a string and compares the two lexicographically, so the result has nothing to do with which date is actually earlier ...
I don't have access to a set of backups I could test this on, but for starters, you should fix that, and at the same time, make sure that you're not deleting the backup just because the value is null:
# Find the old backups in spbrtoc.xml
$old = $sp.SPBackupRestoreHistory.SPHistoryObject |
Where { (Get-Date $_.SPStartTime) -lt ((get-date).adddays(-$days)) }
The date string format in the XML file (according to the docs page) is one that Get-Date can readily parse, so that should work without any problems.
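The underlying pitfall is the same in any language: compare parsed dates, not date strings. A Python sketch of the fixed filter, using the SPStartTime format shown in the XML above (the function name and the `days` default are illustrative):

```python
from datetime import datetime, timedelta

SP_FORMAT = "%m/%d/%Y %H:%M:%S"  # matches e.g. 01/09/2011 00:00:13

def is_old(sp_start_time, days=14, now=None):
    """True if the backup's start time is more than `days` days ago."""
    now = now or datetime.now()
    started = datetime.strptime(sp_start_time, SP_FORMAT)
    # Comparing datetime objects, not strings, gives a true chronological test.
    return started < now - timedelta(days=days)
```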
Incidentally, your assumption is right about $_ being the current iteration object from the array ;)
