Unable to append the output of for loop - linux

I am trying to get the total space for those tablespaces where a given threshold is breached.
As I am getting more than one tablespace in the output, I split them and then run a for loop over the array. Now I need to get the bytes for each element, but it only prints the value for the first occurrence. How should I append all the values?
Please check the code below:
output1=`sqlplus -s username/password as sysbackup <<END1
set linesize 200
set head off
select tablespace_name||',' from dba_tablespace_usage_metrics where used_percent > 50;
exit;
END1`
IFS=', ' read -r -a array <<< $output1
for val in "${array[@]}"
do
output2=`sqlplus -s username/password as sysbackup <<END2
set serveroutput on;
set linesize 200
set head off
spool test.txt
SELECT TABLESPACE_NAME, ROUND (SUM (BYTES) / 1048576) FROM DBA_DATA_FILES WHERE TABLESPACE_NAME = '${array[val]}' GROUP BY TABLESPACE_NAME;
spool off;
@test.txt
exit;
END2`
done
@test.sql should print:
SYSAUX 2048
SYSTEM 4027
but it's only printing:
SYSAUX 2048

Your code isn't printing anything from the loop at the moment; if you print $output2 after the loop then you'd see that value once; if you printed it inside the loop then you'd see it multiple times, with the same value. Each output would also include an SP2-0734 error, which you may be suppressing.
When you write to test.txt you're putting the result of the query against dba_data_files into that file; and $val holds a value from the array, not an index, so ${array[val]} doesn't mean what you probably think, and always gives you the first element in the array. So each time around the loop you've got the same result from the same query in test.txt.
Then you execute that file, which gets an error. But both the initial query and that error are all being captured into output2 - just as you'd see it all if you ran test.txt manually. And that is overwritten each time. So if you process or display it inside the loop you'd see the same result repeated; if you read it after the loop (which is what you seem to be doing, from your description) then you'd see it once.
If you really wanted to append the output each time around the loop you could do:
output2=${output2}`sqlplus -s username/password as sysbackup <<END2
...
END2`
or:
output2+=`sqlplus -s username/password as sysbackup <<END2
...
END2`
but you would need to clean up the output that both queries are producing, and you might also need to include a newline in that concatenation.
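For example, a minimal sketch of the append approach, assuming the credentials from the question, and using $val directly (rather than ${array[val]}) as the tablespace name:
output2=''
for val in "${array[@]}"
do
  result=`sqlplus -s username/password as sysbackup <<END2
set pagesize 0
set feedback off
SELECT TABLESPACE_NAME || ' ' || ROUND (SUM (BYTES) / 1048576)
FROM DBA_DATA_FILES WHERE TABLESPACE_NAME = '$val'
GROUP BY TABLESPACE_NAME;
END2`
  output2+="${result}"$'\n'   # append this iteration's row plus a newline
done
printf '%s' "$output2"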
This would be much simpler as a single query, which just outputs directly instead of to a variable:
sqlplus -s username/password as sysbackup <<END1
set pagesize 0
set linesize 200
set feedback off
SELECT TABLESPACE_NAME, ROUND (SUM (BYTES) / 1048576)
FROM DBA_DATA_FILES
WHERE TABLESPACE_NAME IN (
select tablespace_name
from dba_tablespace_usage_metrics
where used_percent > 50
)
GROUP BY TABLESPACE_NAME
ORDER BY TABLESPACE_NAME;
END1
If you don't want the column alignment then concatenate the name and size values:
SELECT TABLESPACE_NAME || ' ' || ROUND (SUM (BYTES) / 1048576)
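If you then need to process those rows back in the shell, a sketch like this (assuming the same connection details) captures the output once and loops over each name/size pair:
db_output=`sqlplus -s username/password as sysbackup <<END1
set pagesize 0
set linesize 200
set feedback off
SELECT TABLESPACE_NAME, ROUND (SUM (BYTES) / 1048576)
FROM DBA_DATA_FILES
WHERE TABLESPACE_NAME IN (
  select tablespace_name from dba_tablespace_usage_metrics where used_percent > 50
)
GROUP BY TABLESPACE_NAME
ORDER BY TABLESPACE_NAME;
END1`
while read -r ts_name ts_mb; do
  [ -n "$ts_name" ] || continue            # skip any blank lines
  echo "tablespace $ts_name uses ${ts_mb} MB"
done <<< "$db_output"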

Related

Setting value of command prompt (PS1) based on present directory string length

I know I can do this to reflect just the last 2 directories in the PS1 value:
PS1=${PWD#"${PWD%/*/*}/"}#
but let's say we have a directory name that's really messy and will reduce my working space, like
T-Mob/2021-07-23--07-48-49_xperia-build-20191119010027#
OR
2021-07-23--07-48-49_nokia-build-20191119010027/T-Mob#
those are the last 2 directories before the prompt.
I want to set a condition: if the length of either of the last 2 directories is more than a threshold, e.g. 10 chars, shorten the name to the first 3 and last 3 chars of whichever directory(s) exceed 10.
e.g.
2021-07-23--07-48-49_xperia-build-20191119010027 &
2021-07-23--07-48-49_nokia-build-20191119010027
both are longer than 10 chars, so both will be shortened to 202*027, and PS1 will be, respectively:
T-Mob/202*027/# for T-Mob/2021-07-23--07-48-49_xperia-build-20191119010027# and
202*027/T-Mob# for 2021-07-23--07-48-49_nokia-build-20191119010027/T-Mob#
A quick one-liner to get this done?
I can't post this in comments so I am updating here. Ref to Joaquin's answer (thx J):
PS1=''`echo ${PWD#"${PWD%/*/*}/"} | awk -v RS='/' 'length() <=10{printf $0"/"}; length()>10{printf "%s*%s/", substr($0,1,3), substr($0,length()-2,3)};'| tr -d "\n"; echo "#"`''
see the outputs below:
/root/my-applications/bin    # it shortened as expected
my-*ons/bin/# cd -           # going back to the previous directory
/root
my-*ons/bin/#                # value of the prompt is the same, but I am in /root
A one-liner is basically always the wrong choice. Write code to be robust, readable and maintainable (and, for something that's called frequently or in a tight loop, to be efficient) -- not to be terse.
Assuming availability of bash 4.3 or newer:
# Given a string, a separator, and a max length, shorten any segments that are
# longer than the max length.
shortenLongSegments() {
  local -n destVar=$1; shift               # arg1: where do we write our result?
  local maxLength=$1; shift                # arg2: what's the maximum length?
  local IFS=$1; shift                      # arg3: what character do we split into segments on?
  read -r -a allSegments <<<"$1"; shift    # arg4: break into an array
  for segmentIdx in "${!allSegments[@]}"; do               # iterate over array indices
    segment=${allSegments[$segmentIdx]}                    # look up value for index
    if (( ${#segment} > maxLength )); then                 # value over maxLength chars?
      segment="${segment:0:3}*${segment:${#segment}-3:3}"  # build a short version
      allSegments[$segmentIdx]=$segment                    # store shortened version in array
    fi
  done
  printf -v destVar '%s\n' "${allSegments[*]}"             # build result string from array
}
# function to call from PROMPT_COMMAND to actually build a new PS1
buildNewPs1() {
  # declare our locals to avoid polluting global namespace
  local shorterPath
  # but to cache where we last ran, we need a global; be explicit.
  declare -g buildNewPs1_lastDir
  # do nothing if the directory hasn't changed
  [[ $PWD = "$buildNewPs1_lastDir" ]] && return 0
  shortenLongSegments shorterPath 10 / "$PWD"
  PS1="${shorterPath}\$"
  # update the global tracking where we last ran this code
  buildNewPs1_lastDir=$PWD
}
PROMPT_COMMAND=buildNewPs1   # call buildNewPs1 before rendering the prompt
Note that printf -v destVar %s "valueToStore" is used to write to variables in-place, to avoid the performance overhead of var=$(someFunction). Similarly, we're using the bash 4.3 feature namevars -- accessible with local -n or declare -n -- to allow destination variable names to be parameterized without the security risk of eval.
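As a toy illustration of that pattern (the function and variable names here are hypothetical):
setResult() {
  local -n out=$1           # "out" is now a nameref to the caller's variable
  printf -v out '%s' "$2"   # assign through the nameref, no subshell needed
}
setResult myVar "hello"
echo "$myVar"               # prints: hello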
If you really want to make this logic only apply to the last two directory names (though I don't see why that would be better than applying it to all of them), you can do that easily enough:
buildNewPs1() {
  local pathPrefix pathSuffix pathSuffixShortened
  pathPrefix=${PWD%/*/*}              # everything but the last 2 segments
  pathSuffix=${PWD#"$pathPrefix"}     # only the last 2 segments
  # shorten the last 2 segments, store in a separate variable
  shortenLongSegments pathSuffixShortened 10 / "$pathSuffix"
  # combine the unshortened prefix with the shortened suffix
  PS1="${pathPrefix}${pathSuffixShortened}\$"
}
...adding the performance optimization that only rebuilds PS1 when the directory changed to this version is left as an exercise to the reader.
Probably not the best solution, but a quick solution using awk:
PS1=`echo ${PWD#"${PWD%/*/*}/"} | awk -v RS='/' 'length()<=10{printf $0"/"}; length()>10{printf "%s*%s/", substr($0,1,3), substr($0,length()-2,3)};'| tr -d "\n"; echo "#"`
I got these results with your examples:
T-Mob/202*027/#
202*027/T-Mob/#
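Note that this assigns PS1 once, which is why the prompt stayed the same after cd - in the update above. To have the awk version recomputed for every prompt, you could wrap it in PROMPT_COMMAND; a sketch reusing the same awk pipeline (the function name is just an example):
shorten_pwd_prompt() {
  PS1=$(echo ${PWD#"${PWD%/*/*}/"} | awk -v RS='/' 'length()<=10{printf $0"/"}; length()>10{printf "%s*%s/", substr($0,1,3), substr($0,length()-2,3)};' | tr -d "\n"; echo "#")
}
PROMPT_COMMAND=shorten_pwd_prompt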

Excel file generation using SQL query, SPOOL command in Shell script

I have to generate an Excel file from tables in an Oracle database. The code is working fine, however the column names are not coming out completely; they are truncated to the length of their data.
I want the complete header/column names.
The column names are coming out like this:
ST STAT_TYPE_DESC ST S NXT_STATME DELAY_DAYS ANN_PREM_LIM_LOW ANN_PREM_LIM_HI CONTRIB_HIST_LEN EVENT_DO C P
But what I want is the complete names of the columns; for example ST is STATEMENT_TYPE_ID.
#!/bin/ksh
FILE="A.csv"
sqlplus -s lifelite/lifelite@dv10 <<EOF
SPOOL $FILE
SET HEADING ON
SET HEADSEP OFF
SET FEEDBACK OFF
SET LINESIZE 250
SET PAGESIZE 5000 embedded ON
SET COLSEP ","
SELECT * FROM TLLI_01150STATTYPE;
EOF
SPOOL OFF
EXIT 0
Before your select add a new one with the column names:
SELECT 'STATEMENT_TYPE_ID' || ',' || 'STAT_TYPE_DESC' || ',' || ... FROM dual;
And set heading off
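Putting that together, a sketch of the full script (connection string and table from the question; the header column list here is illustrative and would need to name every real column):
#!/bin/ksh
FILE="A.csv"
sqlplus -s lifelite/lifelite@dv10 <<EOF
SET HEADING OFF
SET FEEDBACK OFF
SET LINESIZE 250
SET PAGESIZE 5000
SET COLSEP ","
SPOOL $FILE
SELECT 'STATEMENT_TYPE_ID' || ',' || 'STAT_TYPE_DESC' || ',' || 'NXT_STATME' FROM dual;
SELECT * FROM TLLI_01150STATTYPE;
SPOOL OFF
EXIT
EOF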

Shell script retrieving extra echo output along with edbplus SQL results

I am trying to call edbplus from a command-line Linux shell script to count the rows in a table, but the response I retrieve from edbplus has other output mixed in with the number; I am trying to retrieve only the integer count.
#!/bin/sh
COUNT=`./edbplus.sh -silent user/password@localhost:5444/mydb <<-EOF
SET PAGESIZE 0 FEEDBACK OFF VERIFY OFF HEADING OFF ECHO OFF
SELECT COUNT(ID) FROM MYTABLE
EXIT;
EOF`
echo $COUNT
Response:
$ echo $COUNT
6-------------------d always takes 2 parameters: variable_name value
Do you know how to get only the integer number?
If the first value is always going to be the integer, please try the command below:
echo $COUNT | cut -d - -f 1
(or)
if only a one-digit integer value is required, then please try
echo $COUNT | cut -c 1
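For example, with the response shown in the question:
COUNT="6-------------------d always takes 2 parameters: variable_name value"
echo $COUNT | cut -d - -f 1    # prints: 6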
To solve it from the EDB side:
If the flags below are given to edbplus on a single line, the above issue occurs.
SET PAGESIZE 0
SET FEEDBACK OFF
SET VERIFY OFF
SET HEADING OFF
SET ECHO OFF
Update the script to issue each SET command on its own individual line, as above.
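A sketch of the corrected script with that change applied (same connection string and query as the question):
#!/bin/sh
COUNT=`./edbplus.sh -silent user/password@localhost:5444/mydb <<EOF
SET PAGESIZE 0
SET FEEDBACK OFF
SET VERIFY OFF
SET HEADING OFF
SET ECHO OFF
SELECT COUNT(ID) FROM MYTABLE;
EXIT;
EOF`
echo "$COUNT"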

Shell script to fetch sql query data in csv file

I need to extract the data from the below query, along with a header, into a csv file using a shell script.
Below is the query:
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode,NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UT)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj=0
UNION
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode, NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UTG)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj<>0
Could you please let me know if the below approach to fetch the data using a shell script is correct, and whether the script itself is correct?
#!/bin/bash
file="output.csv"
sqlplus -s username/password@Oracle_SID << EOF
SPOOL $file
select 'SourceIdentifier','SourceFileName','ProfitCentre2','PlantCode',
'tax_retur ReturnPeriod','document_number DocumentNumber','TO_CHAR(invoice_generation_date,'YYYY-MM-DD') Docume','Original','customer_name CustomerName','NVL(sns_pos,new_state_code)POS','PortCode','NEW_HSN_CODE HSNorSAC','(SGSATE+UTGSATE) Stat','(SGS+UT)StateUT','Userde' from dual
Union all
select 'TO_CHAR(SourceIdentifier)','TO_CHAR(SourceFileName)','TO_CHAR(ProfitCentre2)','TO_CHAR(PlantCode)',
'TO_CHAR(tax_retur ReturnPeriod)','TO_CHAR(document_number DocumentNumber)','TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume','TO_CHAR(Original)','TO_CHAR(customer_name CustomerName)','TO_CHAR(NVL(sns_pos,new_state_code)POS)','TO_CHAR(PortCode)','TO_CHAR(NEW_HSN_CODE HSNorSAC)','TO_CHAR((SGSATE+UTGSATE) Stat)','TO_CHAR((SGS+UT)StateUT)','TO_CHAR(Userde)' from
(SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode,NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UT)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj=0
UNION
SELECT SourceIdentifier,SourceFileName,ProfitCentre2,PlantCode,
tax_retur ReturnPeriod,document_number DocumentNumber,TO_CHAR(invoice_generation_date,'YYYY-MM-DD')
Docume,Original,customer_name CustomerName,NVL(sns_pos,new_state_code)POS,PortCode, NEW_HSN_CODE HSNorSAC,(SGSATE+UTGSATE) Stat,(SGS+UTG)StateUT,Userde FROM arbor.INV_REPO_FINA WHERE UPPER(document_type)='INV' AND UPPER(backout_flag)='VALID' AND new_gst_id_new IS NOT NULL AND new_charges<>0 AND taxable_adj<>0)
SPOOL OFF
EXIT
EOF
In short: the ; is missing from the end of the select statement.
Some unsolicited advice:
I think spool will put extra stuff into your file (at least some new lines); a redirect is better. Furthermore, the first line is not DB-related:
echo "SourceIdentifier;SourceFileName;ProfitCentre2..." > $file
I recommend generating the CSV format right in the select query; doing it later is more of a headache (and you can escape whatever you want there):
query="select SourceIdentifier || ';' || SourceFileName || ';' || ProfitCentre2 ... ;"
Then query the DB (I think capital -S is the right one) and append the formatted records (and maybe you want to format your columns too):
sqlplus -S username/password@Oracle_SID >> $file << EOF
set linesize 32767 pagesize 0 heading off
$query
EOF
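Putting those pieces together, a sketch of the whole script (the column list is shortened here for illustration; the real query would include every column and condition from the question):
#!/bin/bash
file="output.csv"
# header row, not DB-related
echo "SourceIdentifier;SourceFileName;ProfitCentre2" > $file
# build the CSV rows inside the query itself
query="select SourceIdentifier || ';' || SourceFileName || ';' || ProfitCentre2 from arbor.INV_REPO_FINA where UPPER(document_type)='INV';"
sqlplus -S username/password@Oracle_SID >> $file << EOF
set linesize 32767 pagesize 0 heading off feedback off
$query
EOF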
For me the following is working, but one empty line comes before the output of the first query and of the second query. The empty lines are removed using an awk command:
#!/bin/bash
FILE="A.csv"
$ORACLE_HOME/bin/sqlplus -s username/password@Oracle_SID<<EOF
SET PAGESIZE 50000 COLSEP "," LINESIZE 20000 FEEDBACK OFF HEADING off
SPOOL $FILE
select 'TYPE_OF_CALL_V','SWITCH_CALL_TYPE_V','RECORD_TYPE_V','TARF_TYPE_V' from dual;
SELECT TYPE_OF_CALL_V,SWITCH_CALL_TYPE_V,RECORD_TYPE_V,TARF_TYPE_V FROM TABLE;
SPOOL OFF
EXIT
EOF
awk 'NF > 0' $FILE > out.txt
mv out.txt $FILE

How to delete older contents of file that is being continuously written to?

I have a simulation running and expect it to go on for at least 10 more hours. I have directed the console output to a .txt file using
(binary) > out.txt
This out.txt is becoming huge. I do not need most of the contents of this file. How can I delete the older parts of the file without harming the writing process? The contents that will be written towards the end of the simulation are important to me.
As Carl mentioned in the comments, you cannot really do this on an actively written log file. However, if the initial data is not relevant to you, you can do the following (though beware that you will lose all data):
> out.txt
For the future, you can use a utility called logrotate(8).
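For example, a minimal logrotate entry might look like this (the path and limits are illustrative); copytruncate matters here because the file is being written continuously:
# /etc/logrotate.d/simulation (hypothetical)
/path/to/out.txt {
    size 100M
    rotate 1
    copytruncate
    missingok
}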
You could use tail to only store the end of the file:
# Say you want to save the last 100 lines
your_binary | tail -n 100 > out.txt
This assumes that the output ends at some point.
Saw your comments - the file is 10 GB now... Try using sed -i to reduce the size so that it will work with the other tools; if you want to completely erase it, then use :> logfile.
Tools can cope with a file that is as big as their buffer; otherwise the data should be streamed. Something like split used to have trouble with a 4 GB file; I don't know if the code has been adjusted for this, as it's been a long time since I had to work with a file that big.
Two suggestions:
1
There were a few methods I could think of, like using split, but almost all involved creating a separate (reduced) file from the log and renaming it or redirecting to it.
Use split to break the log into smaller logs (split -l 100 ...) and just redirect the program output to the most recent log found using ls -1.
This seems to work fine.
2
I also tried a second method, to edit/truncate the top 10 lines in the same file:
Kaizen ~/shell_prac
$ cat zcntr.sh
## test truncate a log file
##set -xv
:> zcntr.log ;
## fxn
cntr_log()
{
    limit=$1 ;
    start=0 ;
    while [ $start -lt $limit ]
    do
        echo "count is $start" >> zcntr.log ;   ## generate a continuous log
        start=$(($start + 1));
        sleep 1;
        cnt=$(($start % 10)) ;
        if [ $cnt -eq 0 ]                       ## check to truncate the top 10 lines using sed
        then
            echo "truncate at $start " >> zcntr.log ;
            sed -i "1,10d" zcntr.log ;
        fi
    done ;
}
## main cntrlr
echo "enter a limit" ;
read lmt ;
cntr_log $lmt ;
This seems to work.
I tested it with a counter printing up to the value 25.
Output:
Kaizen ~/shell_prac
$ cat zcntr.log
count is 19
truncate at 20
count is 20
count is 21
count is 22
count is 23
count is 24
I think either of the two will help.
Let me know if there is something else on your mind!
Truncate the file with cat:
cat /dev/null > out.txt
