This is related to my other two posts. I'm extracting text from a text file and analyzing it, and I've run into a problem. For a while I've been using a method that captures all the text between two other strings into a variable, but here is the situation I have now. I need to extract the speed (the numbers) from the string below: "etc...,query":{"ping":47855},"cmts":...etc. The problem is that the text cmts sometimes changes to something else, so really I need to extract all the numbers from this:
,query":{"ping":47855},"
One more thing that makes this difficult is that the characters }," appear all over the file. Thank you for helping me! -Lucas EDG Programmer.
Here's the full file:
{"_id":53291,"ip":"158.69.22.95","domain":"jectile.com","port":25565,"url":"","date_add":1453897770,"status":1,"scan":1,"uptime":99.53,"last_update":1485436105,"geo":{"country":"US","country_name":"United States","city":"Lake Forest"},"info":{"name":" Jectile | jectile.com [1.8-1.11]\n Shoota (Call of Duty) \/ Zambies (Zombie Survival)","type":"FML","version":"1.10","plugins":[],"players":18,"max_players":420,"players_list":[],"map":"world","software":"BungeeCord 1.8.x, 1.9.x, 1.10.x, 1.11.x","avg_player_day":24.458333,"avg_load_day":5.8234,"platform":"MINECRAFT","icon":true},"counter":{"online":47871,"offline":228,"players":{"date":"2017-01-26","total":0},"last_offline":0,"query":{"ping":47855},"cmts":1},"rating":{"main":19.24,"difference":-0.64,"content_up":0.15,"K":0},"last":{"offline":1485415702,"online":1485436105},"chart":{"14:30":14,"14:40":16,"14:50":15,"15:00":18,"15:10":12,"15:20":13,"15:30":9,"15:40":9,"15:50":11,"16:00":12,"16:10":11,"16:20":11,"16:30":18,"16:40":25,"16:50":23,"17:00":27,"17:10":27,"17:20":23,"17:30":24,"17:40":26,"17:50":33,"18:00":31,"18:10":31,"18:20":32,"18:30":37,"18:40":38,"18:50":39,"19:00":38,"19:10":34,"19:20":33,"19:30":40,"19:40":36,"19:50":37,"20:00":38,"20:10":36,"20:20":38,"20:30":37,"20:40":37,"20:50":37,"21:00":34,"21:10":32,"21:20":33,"21:30":33,"21:40":29,"21:50":28,"22:00":26,"22:10":21,"22:20":24,"22:30":29,"22:40":22,"22:50":23,"23:00":27,"23:10":24,"23:20":26,"23:30":25,"23:40":28,"23:50":27,"00:00":32,"00:10":29,"00:20":33,"00:30":32,"00:40":31,"00:50":33,"01:00":40,"01:10":40,"01:20":40,"01:30":41,"01:40":45,"01:50":48,"02:00":43,"02:10":45,"02:20":46,"02:30":46,"02:40":43,"02:50":42,"03:00":39,"03:10":36,"03:20":44,"03:30":34,"03:40":0,"03:50":32,"04:00":35,"04:10":35,"04:20":33,"04:30":43,"04:40":37,"04:50":26,"05:00":31,"05:10":31,"05:20":27,"05:30":25,"05:40":26,"05:50":18,"06:00":13,"06:10":15,"06:20":17,"06:30":18,"06:40":17,"06:50":15,"07:00":16,"07:10":17,"07:20":16,"07:30":16,"07:40":18,"07:50":19,"08:00"
:14,"08:10":12,"08:20":12,"08:30":13,"08:40":17,"08:50":20,"09:00":18,"09:10":0,"09:20":0,"09:30":27,"09:40":18,"09:50":20,"10:00":15,"10:10":13,"10:20":12,"10:30":10,"10:40":10,"10:50":11,"11:00":13,"11:10":13,"11:20":16,"11:30":19,"11:40":17,"11:50":13,"12:00":10,"12:10":11,"12:20":12,"12:30":16,"12:40":15,"12:50":16,"13:00":14,"13:10":10,"13:20":13,"13:30":16,"13:40":16,"13:50":17,"14:00":20,"14:10":16,"14:20":16},"query":"ping","max_stat":{"max_online":{"date":1470764061,"players":129}},"status_query":"ok"}
By the way, the reason the text changes is that the file pulls info from different servers.
Very similar to the answer I gave you to your first question:
@Echo Off
Rem Read the first line of the JSON file into a variable.
Set /P var=<some.json
Rem Strip everything up to and including :{"ping":
Set var=%var:*:{"ping":=%
Rem Cut the string at the first }, leaving just the number.
Set var=%var:},=&:%
Echo=%var%
Timeout -1
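Since the file shown above is valid JSON, another route (sketched here in Python rather than batch) is to parse it and read the value directly, which sidesteps the changing "cmts" key entirely:

```python
import json

# Trimmed sample with the same structure as the file above.
raw = '{"counter":{"last_offline":0,"query":{"ping":47855},"cmts":1}}'

data = json.loads(raw)
ping = data["counter"]["query"]["ping"]
print(ping)  # 47855
```

The lookup works no matter what key follows "query", because it navigates the structure instead of matching surrounding text.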
We have a requirement where the contents of our text files look like this:
[some-section-1]
big_msg_line1 var=random_value1
big_msg_line2 var=random_value2
big_msg_line3 var=random_value3
[some-section-2]
"lots of irrelevant data"
[some-section-3]
"lots of irrelevant data"
[some-section-4]
big_msg_line4 var=random_value4
big_msg_line5 var=random_value5
big_msg_line6 var=random_value6
big_msg_line7 var=random_value7
big_msg_line8 var=random_value8
[some-section-5]
"lots of irrelevant data"
All the lines that we want to modify start with common characters; in this example, all the lines we would like to modify start with the word "big". We would like to change them to something like this:
[some-section-1]
random_value1 msg=big_msg_line1
random_value2 msg=big_msg_line2
random_value3 msg=big_msg_line3
[some-section-2]
"lots of irrelevant data"
[some-section-3]
"lots of irrelevant data"
[some-section-4]
random_value4 msg=big_msg_line4
random_value5 msg=big_msg_line5
random_value6 msg=big_msg_line6
random_value7 msg=big_msg_line7
random_value8 msg=big_msg_line8
[some-section-5]
"lots of irrelevant data"
These were just examples. The original file contains far more data than this: hundreds, if not thousands, of lines.
I am currently doing this using a for loop: reading each line, cutting out the values, formatting them the way I want, putting them in a separate file, and then replacing the original file with the new file. Is there a way to achieve this with a one-liner? That would really be of great help. Hope I am clear with my question.
Thanks in advance.
From what I understood, this awk one-liner should do the job:
cat a
[some-section-1]
big_msg_line1 var=random_value1
big_msg_line2 var=random_value2
big_msg_line3 var=random_value3
[some-section-2]
lots of irrelevant data
[some-section-3]
lots of irrelevant data
[some-section-4]
big_msg_line4 var=random_value4
big_msg_line5 var=random_value5
big_msg_line6 var=random_value6
big_msg_line7 var=random_value7
big_msg_line8 var=random_value8
[some-section-5]
lots of irrelevant data
This:
awk -F'var=' '{ if ($1 ~ /^big/) print $2 "\tmsg=" $1; else print }' a
Gives
[some-section-1]
random_value1 msg=big_msg_line1
random_value2 msg=big_msg_line2
random_value3 msg=big_msg_line3
[some-section-2]
lots of irrelevant data
[some-section-3]
lots of irrelevant data
[some-section-4]
random_value4 msg=big_msg_line4
random_value5 msg=big_msg_line5
random_value6 msg=big_msg_line6
random_value7 msg=big_msg_line7
random_value8 msg=big_msg_line8
[some-section-5]
lots of irrelevant data
This sed command should do the job:
sed -e 's/\(big[^ ]*\)\([ ]*\)var=\([^ ]*\)/\3\2msg=\1/' [your file] > [output file]
EDIT: You might need to change the slashes (/) to a character that is not used in your file.
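For comparison, the same per-line rewrite as a small Python sketch (assuming, as above, that every line to change starts with "big" and contains a single "var="):

```python
import re

def swap(line):
    # Turn "big_msg_lineN var=random_valueN" into "random_valueN msg=big_msg_lineN";
    # pass every other line through unchanged.
    m = re.fullmatch(r"(big\S*)\s+var=(\S*)", line)
    if m:
        return f"{m.group(2)} msg={m.group(1)}"
    return line

print(swap("big_msg_line1 var=random_value1"))  # random_value1 msg=big_msg_line1
print(swap("[some-section-1]"))                 # [some-section-1]
```

Applied line by line over the file, this produces the same output as the awk and sed one-liners above.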
I recently started to use Log Parser with visual interface.
The logs that I want to parse come from IIS, and they are related to SharePoint. For example, I want to know how many people were visiting particular web pages, etc.
It seems that IIS creates logs in different folders (I don't know why), and every day there is a new log file in a different folder.
So my question is: is it possible to access all those files in the different folders?
I know you can use the FROM clause and list the different folders, but that is too cumbersome, especially if new folders are added in the future. The goal is to create one script which can be executed.
So for example, in a folder named LogFiles, I have folders folder1, folder2, folder3, folder4, etc., and in each folder there are log files log1, log2, log3, logN, etc.
So my query should be like this: Select * FROM path/LogFiles/*/*.log, but Log Parser doesn't accept it. How can I achieve this?
You can use the -recurse option when calling logparser.
For example:
logparser file:"query.sql" -i:IISW3C -o:CSV -recurse
where query.sql contains:
select *
from .\Logs\*.log
and in my current directory, there is a directory called "Logs" that contains multiple sub-directories, each containing log files. Such as:
\Logs\server1\W3SVC1
\Logs\server1\W3SVC2
\Logs\server2\W3SVC1
\Logs\server2\W3SVC2
etc.
You can merge the logs and then query the merged log.
What I had to do is:
LogParser.exe -i:w3c "select * into E:\logs\merged.log from E:\logs\WEB35\*, E:\logs\WEB36\*, E:\logs\WEB37\*" -o:w3c
I prefer PowerShell, like this:
Select-String C:\Logs\diag\*.log -pattern "/sites/Very" | ?{$_.Line -match "Important"}
Log Parser's help does not list the -recurse option, so I'm not sure if it's still supported. However, this is what I did to get around it:
Let's say you use the following command to execute logparser -
logparser "SELECT * INTO test.csv FROM 'D:\samplelog\test.log'" -i:COM -iProgID:Sample.LogParser.Scriptlet -o:CSV
Then simply create a batch script to "recurse" through the folder structure and parse all the files in it. The batch script looks like this:
@echo off
for /r %%a in (*) do (
    for %%F in ("%%a") do (
        logparser "SELECT * INTO '%%~nxF'.csv FROM '%%a'" -i:COM -iProgID:Sample.LogParser.Scriptlet
        REM echo %%~nxF
    )
)
Execute it from the path where the log files that need to be parsed are located.
This can be customized further to send all the parsed output to one file using the append (>>) operator.
Hope this helps.
Check this out: https://stackoverflow.com/a/31024196/4502867 — it uses PowerShell to recursively get file items in subdirectories and parse them.