Transfer some phrase between characters in folder names - linux

I need a little help from you: I need to rename some folders that I have. The problem is that I need to move one part of the folder name and place it near the beginning, for example:
Original:
1982-11-03 - Seibu Stadium, Saitama, Japan [FLAC]Space Boogie [version 2][Wardour][AUD][109.56][560.72MB]
Required:
1982-11-03 - Space Boogie - Seibu Stadium, Saitama, Japan [FLAC][version 2][Wardour][AUD][109.56][560.72MB]
So I need to move the "Space Boogie" phrase to just after the date, as in the example above.
I know this may be difficult, but I've tried to do it with sed and other tools with no good results.
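One way to approach it (a sketch, assuming every folder follows the same "date - venue [FLAC]phrase [tags]" pattern as the example, and assuming GNU sed for -E) is a small bash loop that captures the date, the venue text, and the phrase between [FLAC] and the next bracket, then rebuilds the name with the phrase moved up behind the date:
for dir in */; do
    old=${dir%/}
    new=$(printf '%s\n' "$old" |
        sed -E 's/^([0-9]{4}-[0-9]{2}-[0-9]{2}) - (.*)\[FLAC\]([^][]*[^][ ]) ?(\[.*)$/\1 - \3 - \2[FLAC]\4/')
    # only rename folders that actually matched the pattern
    [ "$new" != "$old" ] && mv -- "$old" "$new"
done
Echo the new names first (replace mv with echo) to check the result before renaming anything.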

Related

Trying to bind a complex file name in a DBI insert place holder, but special characters get mangled

Guys, I don't know if this is a bind problem, a character set problem, or something I haven't considered. Here's the challenge: the file system was written by some CD-ripping software, but the problem is more generic - I need to be able to insert any legal filename from a Linux OS into a database; this set of titles is just a test case.
I tried to make this as clear as I could:
find(\&process, $_);   # populates @fnames with $File::Find::name
my $count = scalar @fnames;
print "File count $count\n";
So I use File::Find to fill the array with the file name strings; some are really problematic.
$stm = $dbh_sms->prepare(qq{
INSERT INTO $table (path) VALUES (?)
});
foreach $fname (@fnames) {
print "File:$fname\n";
$stm->execute("$fname");
}
Now here's what I see printed, compared to what comes back out of MariaDb, just a few examples showing the problem:
File:/test1/music/THE DOOBIE BROTHERS - BEST OF THE DOOBIES/02 - THE DOOBIE BROTHERS - LONG TRAIN RUNNIN´.mp3
A bunch of these titles have the back ticks, they are problem #1 - here's how they come back out of a select against the table after I populate it:
| /test1/music/THE DOOBIE BROTHERS - BEST OF THE DOOBIES/02 - THE DOOBIE BROTHERS - LONG TRAIN RUNNIN´.mp3
This character is also a problem:
File:/test1/music/Blue Öyster Cult - Workshop Of The Telescopes/01 - Blue Öyster Cult - Don't Fear The Reaper.mp3
From the database:
/test1/music/Blue Ãyster Cult - Workshop Of The Telescopes/01 - Blue Ãyster Cult - Don't Fear The Reaper.mp3
And one more, but far from the last of the problematic strings:
File:/test1/music/Better Than Ezra - Deluxe/03 - Better Than Ezra - Southern Gürl.mp3
Comes out as:
/test1/music/Better Than Ezra - Deluxe/03 - Better Than Ezra - Southern Gürl.mp3
I thought this was a character set problem, so I added this to the connect string:
{ RaiseError => 1, mysql_enable_utf8mb4 => 1 }
I've also tried:
$dbh_sms->do('SET NAMES utf8mb4');
Is that a hex C2 being added to the string? (Edit: it's an octal 303, hex c3) I've also tried changing the column to varbinary and still see the same results. Anyone have a clue why this is happening? I'm at a loss....
TIA
Edit - I dumped the table with OD to find out what is actually getting inserted since I was of the belief that with a placeholder, a bind variable would be written without interpolation, basically a binary transfer. To save my eyes, I just did a handful of records concentrating on what I thought was a 'back tick' from the above example, which is an octal 264, AKA "Acute accent - spacing acute".
It is in the table, but preceded by 303 202 and 302, which are a couple of those 'A' characters with the icing on top and a "low quote" character in between, which contradicts my prior understanding of how placeholders and bind variables work.
So I am more confused now than before.
I found the problem; it was in the Perl character encoding. Without the decode, the raw UTF-8 bytes coming back from the file system were being treated as Latin-1 and re-encoded on the way into the database, which is where the extra 303/302 bytes came from.
use Encode qw(decode);   # decode() comes from the core Encode module
$fname = decode("UTF-8", $fname);
was all I needed.

Move non-sequential files to new directory

I have no previous programming experience. I know this question has been asked before or the answer is out there but I, for the life of me, cannot find it. I have searched google for hours trying to figure this out. I am working on a Red Hat Linux computer and it is in bash.
I have a directory of files 0-500 in /directory/.
They are named as such,
/directory/filename_001, /directory/filename_002, and so forth.
After running my analysis for my research, I have a listofnumbers.txt (txt file, with each row being a new number) of the numbers that I am interested in. For example,
015
124
187
345
412
A) Run a command from the list of files the files from the list of numbers? Our code looks like this:
g09slurm filename_001.com filename_001.log
Is there a way to write something like:
find value (row1 of listofnumbers.txt) then g09slurm filename_row1value.com filename_row1value.log
find value (row2 of listofnumbers.txt) then g09slurm filename_row2value.com filename_row2value.log
find value (row3 of listofnumbers.txt) then g09slurm filename_row3value.com filename_row3value.log
etc etc
B) Move the selected files from the list to a new directory, so I can rename them sequentially, then run a sequential number command?
Thanks.
First, read the list of numbers into an array:
readarray -t myarray < /path/to/filename.txt
Next, we'll get all the filenames based on those numbers, and move them
cd /path/to/directory
mv -t /path/to/new_directory "${myarray[@]/#/filename_}"
After this... honestly, I got bored. Stack Overflow is about helping people who make a good start at a problem, and you've done zero work toward figuring this out (other than writing "I promise I tried google").
I don't even understand what
Run a command from the list of files the files from the list of numbers
means.
To rename them sequentially (once you've moved them), you'll want to do something based on this code:
for i in $(ls); do
*your stuff here*
done
You should be able to research and figure stuff out. You might have to do some bash tutorials; here's a reasonable starting place
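For part A of the question, a minimal sketch (assuming listofnumbers.txt holds one zero-padded number per line and the .com files live in /directory) would be a read loop over the list:
cd /directory
while read -r num; do
    # build both file names from the number in the list and submit the job
    g09slurm "filename_${num}.com" "filename_${num}.log"
done < listofnumbers.txt
For part B, putting mv filename_"${num}"* /path/to/new_directory/ inside the same loop would move the matching files instead of running them.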

batch file extract numbers from text file with little information

So this is related to my other two posts. I'm dealing with extracting text from a text file and analyzing it, and I've run into some problems. For a while I've been using a method that sets all the text between two other strings as a variable, but here is the situation I have. I need to extract the speed (numbers) from the below string: "etc...,query":{"ping":47855},"cmts":...etc. The problem is that the text cmts sometimes changes to something else, so really I need to extract all the numbers from this:
,query":{"ping":47855},"
One more thing that makes this difficult is that the characters }," are all over the file. Thank you for helping me! -Lucas EDG Programmer.
Here's the full file:
{"_id":53291,"ip":"158.69.22.95","domain":"jectile.com","port":25565,"url":"","date_add":1453897770,"status":1,"scan":1,"uptime":99.53,"last_update":1485436105,"geo":{"country":"US","country_name":"United States","city":"Lake Forest"},"info":{"name":" Jectile | jectile.com [1.8-1.11]\n Shoota (Call of Duty) \/ Zambies (Zombie Survival)","type":"FML","version":"1.10","plugins":[],"players":18,"max_players":420,"players_list":[],"map":"world","software":"BungeeCord 1.8.x, 1.9.x, 1.10.x, 1.11.x","avg_player_day":24.458333,"avg_load_day":5.8234,"platform":"MINECRAFT","icon":true},"counter":{"online":47871,"offline":228,"players":{"date":"2017-01-26","total":0},"last_offline":0,"query":{"ping":47855},"cmts":1},"rating":{"main":19.24,"difference":-0.64,"content_up":0.15,"K":0},"last":{"offline":1485415702,"online":1485436105},"chart":{"14:30":14,"14:40":16,"14:50":15,"15:00":18,"15:10":12,"15:20":13,"15:30":9,"15:40":9,"15:50":11,"16:00":12,"16:10":11,"16:20":11,"16:30":18,"16:40":25,"16:50":23,"17:00":27,"17:10":27,"17:20":23,"17:30":24,"17:40":26,"17:50":33,"18:00":31,"18:10":31,"18:20":32,"18:30":37,"18:40":38,"18:50":39,"19:00":38,"19:10":34,"19:20":33,"19:30":40,"19:40":36,"19:50":37,"20:00":38,"20:10":36,"20:20":38,"20:30":37,"20:40":37,"20:50":37,"21:00":34,"21:10":32,"21:20":33,"21:30":33,"21:40":29,"21:50":28,"22:00":26,"22:10":21,"22:20":24,"22:30":29,"22:40":22,"22:50":23,"23:00":27,"23:10":24,"23:20":26,"23:30":25,"23:40":28,"23:50":27,"00:00":32,"00:10":29,"00:20":33,"00:30":32,"00:40":31,"00:50":33,"01:00":40,"01:10":40,"01:20":40,"01:30":41,"01:40":45,"01:50":48,"02:00":43,"02:10":45,"02:20":46,"02:30":46,"02:40":43,"02:50":42,"03:00":39,"03:10":36,"03:20":44,"03:30":34,"03:40":0,"03:50":32,"04:00":35,"04:10":35,"04:20":33,"04:30":43,"04:40":37,"04:50":26,"05:00":31,"05:10":31,"05:20":27,"05:30":25,"05:40":26,"05:50":18,"06:00":13,"06:10":15,"06:20":17,"06:30":18,"06:40":17,"06:50":15,"07:00":16,"07:10":17,"07:20":16,"07:30":16,"07:40":18,"07:50":19,"08:00":14,"08:10":12,"08:20":12,"08:30":13,"08:40":17,"08:50":20,"09:00":18,"09:10":0,"09:20":0,"09:30":27,"09:40":18,"09:50":20,"10:00":15,"10:10":13,"10:20":12,"10:30":10,"10:40":10,"10:50":11,"11:00":13,"11:10":13,"11:20":16,"11:30":19,"11:40":17,"11:50":13,"12:00":10,"12:10":11,"12:20":12,"12:30":16,"12:40":15,"12:50":16,"13:00":14,"13:10":10,"13:20":13,"13:30":16,"13:40":16,"13:50":17,"14:00":20,"14:10":16,"14:20":16},"query":"ping","max_stat":{"max_online":{"date":1470764061,"players":129}},"status_query":"ok"}
By the way, the reason things change is because it looks at info from different servers
Very similar to the answer I gave you to your first question:
@Echo Off
Set/P var=<some.json
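REM The next line strips everything up to and including :{"ping": from the start of the line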
Set var=%var:*:{"ping":=%
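REM The next substitution turns "}," into "&:", so var keeps only the digits and the leftover text is parsed as a harmless label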
Set var=%var:},=&:%
Echo=%var%
Timeout -1

File transfer: Attachmate Extra appends username to host file name

Hi, when I try to download a file from the mainframe using Attachmate Extra, it adds my username to the host file name. I don't know where to turn this off.
For example, if the file name is yyyy.file.name, the transfer produces username.yyyy.file.name.
In version 3.4 the option to append the user name is turned off, but it still happens.
Enclose the entire dataset name (including the high-level qualifier) in single quotes. This is a TSO (not JCL) convention - if you refer to a dataset without single quotes, it pre-pends your user ID as the high-level qualifier; however if you place single quotes around the dataset name it will take it 'as is' (well, it will uppercase it, since all z/OS dataset names are uppercase, but otherwise it will be 'as is').
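For illustration (USERID here is just a placeholder for whatever your TSO user ID is), an unquoted name such as:
yyyy.file.name
is resolved by TSO as:
USERID.YYYY.FILE.NAME
while the quoted form:
'yyyy.file.name'
transfers:
YYYY.FILE.NAME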

Manipulating Strings that are inside Text Files

Here's my situation:
We depend on users to click a .bat file to do the daily backup.
I did some batch programming; now when they (are forced to) back up, it leaves a log file in its own folder on the server. And yes, they need to be forced, otherwise they just won't do the g.damn backup... Welcome to my company.
Then, after some more research, I developed another bat which checks whether the file is outdated and writes the result to a text file.
This is the kind of files that I have now:
File: DailyBkp
Content:
".\John\log.log"
".\Department1\Andy\log.log"
".\Department1\Nicole\log.log"
".\Department2\Ann\log.log"
File: Departments
Content:
Department1
Department2
...
Great, after some string treatment in PowerShell I managed to get DailyBkp content to be this:
John
Department1 Andy
...
Note that it has a space at the beginning and another at the end which won't go away no matter what trim I use...
So now I have this setup: SERVER > E:\Backup\
Inside Backup we have, e.g., "Department 1" ... "Dep.-N"
Inside each one: "User-Lastname1" .. "User-lastname-N"
What I want to do is use PowerShell to take the entries in DailyBkp that contain a "Dept_User" string and export them to a CSV file like this:
COLUMN_Name COLUMN_Dept
Andy Department1
So, how do I do this? I can't find anything on the internet that uses text files.
So, assuming I am reading your question correctly, you have a DailyBkp file that looks like this:
Department1 Andy
Department2 Jim
...
To get what you want, you could read each line in, split it out, then spit it back out as a CSV like this:
"COLUMN_Name,COLUMN_Dept" | out-file <path>\outfile.csv
$bkpFile = get-content $dailyBkpFile
$bkpFile | ForEach-Object {$line = $_.split(" ")
"$($line[1]),$($line[0])" | out-file <path>\outfile.csv -Append}
output should look like this:
COLUMN_Name,COLUMN_Dept
Andy,Department1
Jim,Department2
Just doing a quick post here.
My question was answered, thanks.
And, if anyone is interested in PowerShell programming, google Mastering PowerShell by Tobias Weltner; it's a free book with lots of explanations.

Resources