printf statement with multiple arguments - Java

I am trying to display data in a tabular format with multiple headers. I am using the following printf statement:
System.out.format("%-15s %-15s %-25s %-15s %-20s %s %n", "Id", "name", "Mode", "Total weight", "Arrival loc", "Departure loc");
But that generates the following error:
Method format in the type PrintStream is not applicable for the arguments (String, String, String, String, String, String)
What could be a possible solution to print this statement with formatting?

Can you post the full code? Because System.out.format("%-15s %-15s %-25s %-15s %-20s %s %n", "Id", "name", "Mode", "Total weight", "Arrival loc", "Departure loc"); works fine for me.


Pass keys from array from bash into jq -- output all nulls

I'm trying to do something akin to this:
jq -r '. | ."Time Series (Daily)"."2020-12-02" | ."1. open"' newdata.json
...but with the key coming from a variable, as in:
jq -r --arg key "$key" '. | ."Time Series (Daily)"."[$key]" | ."1. open"' newdata.json
The first one works just fine, but when I assign the date to a variable called key and then try to get the data, it fails.
I tried this answer and this answer, but they did not work for me.
{
  "Meta Data": {
    "1. Information": "Daily Prices (open, high, low, close) and Volumes",
    "2. Symbol": "AB",
    "3. Last Refreshed": "2020-12-02",
    "4. Output Size": "Compact",
    "5. Time Zone": "US/Eastern"
  },
  "Time Series (Daily)": {
    "2020-12-02": {
      "1. open": "32.6700",
      "2. high": "33.3300",
      "3. low": "32.5000",
      "4. close": "33.1200",
      "5. volume": "273799"
    },
    "2020-12-01": {
      "1. open": "32.1500",
      "2. high": "32.8000",
      "3. low": "32.0000",
      "4. close": "32.6000",
      "5. volume": "265086"
    },
    "2020-11-30": {
      "1. open": "32.3800",
      "2. high": "32.4900",
      "3. low": "31.7500",
      "4. close": "31.8700",
      "5. volume": "251970"
    }
  }
}
The above is the newdata.json file.
What I want to get is the "1. open" value.
I am using a for loop to iterate over all the keys of "Time Series (Daily)" and the keys are generated correctly. There is no issue with that. I then want to use the $key variable in each iteration to get the data I need.
readarray keys <<< "$(jq '."Time Series (Daily)" | keys[]' newdata.json)"
for key in "${keys[@]}"; do
jq -r --arg key "$key" '. | ."Time Series (Daily)" | .[$key] | ."1. open"' newdata.json
done
Focusing On The Immediate Issue
The problem isn't how you're passing key to jq; the problem is how you're populating the key variable in the first place.
Change:
readarray keys <<< "$(jq '."Time Series (Daily)" | keys[]' newdata.json)"
...to:
readarray -t keys <<< "$(jq -r '."Time Series (Daily)" | keys[]' newdata.json)"
There are two changes here:
We added the -t option to readarray, so the trailing newline on each line is no longer stored in the array elements.
We added the -r option to jq, so it no longer puts literal quotes around the strings.
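A minimal sketch of what each flag changes. To keep it runnable without the real file, jq's two output styles are simulated here with printf (quoted lines stand in for jq without -r, bare lines for jq -r):

```shell
#!/usr/bin/env bash
# jq WITHOUT -r prints JSON strings, quotes included; simulate that with printf:
readarray keys_raw <<< "$(printf '"%s"\n' 2020-12-01 2020-12-02)"
# jq WITH -r prints bare strings; readarray -t additionally drops the newlines:
readarray -t keys <<< "$(printf '%s\n' 2020-12-01 2020-12-02)"

# keys_raw[0] is the string '"2020-12-01"' plus a trailing newline, which
# never matches a real object key; keys[0] is the clean string 2020-12-01.
```

Passing an element of keys_raw to jq --arg therefore looks up a key that contains literal quote characters and a newline, which is why the original loop produced nulls.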
Sidebar: Retrieving both keys and values at the same time
There's no reason to do one pass to retrieve keys and another to retrieve values -- better to just get them all at once:
dates=( )
opening_prices=( )
while IFS=$'\t' read -r date opening_price; do
dates+=( "$date" )
opening_prices+=( "$opening_price" )
done < <(
jq -r '
."Time Series (Daily)" | to_entries[] | [.key, .value."1. open"] | @tsv
' <newdata.json
)
...after which, declare -p dates opening_prices emits:
declare -a dates=([0]="2020-12-02" [1]="2020-12-01" [2]="2020-11-30")
declare -a opening_prices=([0]="32.6700" [1]="32.1500" [2]="32.3800")
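Once the two arrays are populated, iterating them in lockstep is just an index loop. A small sketch, with the arrays filled in literally rather than read from the file:

```shell
#!/usr/bin/env bash
# The arrays as populated above, hard-coded for this sketch:
dates=("2020-12-02" "2020-12-01" "2020-11-30")
opening_prices=("32.6700" "32.1500" "32.3800")

# "${!dates[@]}" expands to the indices, so both arrays stay in step:
for i in "${!dates[@]}"; do
  printf '%s opened at %s\n' "${dates[i]}" "${opening_prices[i]}"
done
```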
Original response (before population of keys was shown)
Here's a different approach that only calls jq once, instead of once per item, while still getting your keys from an array. It does this by using -R to read raw strings as input; . is then used to address those inputs (which we rename to $key to make it clear how this lines up with the old code).
keys=("2020-12-02" "2020-12-01" "2020-11-30")
readarray -t openingPrices < <(
jq -Rr --slurpfile indatas newdata.json '
$indatas[0] as $indata | . as $key |
$indata."Time Series (Daily)"[$key]["1. open"]
' < <(printf '%s\n' "${keys[@]}")
)
After running that, declare -p keys openingPrices (to show how both arrays are defined) emits:
declare -a keys=([0]="2020-12-02" [1]="2020-12-01" [2]="2020-11-30")
declare -a openingPrices=([0]="32.6700" [1]="32.1500" [2]="32.3800")
...so you have an output array that lines up with your input array (so long as the latter isn't sparse).
Use .[$key] to index with the value of your $key variable:
key="2020-12-02"
jq -r --arg key "$key" '."Time Series (Daily)" | .[$key] | ."1. open"' newdata.json
# output: 32.6700
Or, combined with the for loop (hardcoded keys, since we're not sure how you get those):
keys=("2020-12-02" "2020-12-01" "2020-11-30")
for key in "${keys[@]}"; do
jq -r --arg key "$key" '."Time Series (Daily)" | .[$key] | ."1. open"' newdata.json
done
32.6700
32.1500
32.3800

Assign variable separated by comma based on user input using function

I want to build a variable with values separated by commas, based on user input, using functions.
I get the user input using the script below, which calls a function for the variable assignment:
while [ "$ans" != "q" ]
do
clear
echo "Choose your subject"
echo "Press q once done"
echo " 1.Science"
echo " 2.Maths"
echo " 3.English"
...
read ans
case $ans in
1) clear
Science;;
2) clear
Maths;;
3) clear
English;;
....
esac
done
clear
subjects=""
Science()
{
subjects+="$subjects Science"
}
Maths()
{
subjects+="$subjects Maths"
}
English()
{
subjects+="$subjects English"
}
At the end I want the subjects variable to hold the options chosen by the user, e.g.:
Science,Maths
Maths,English
Science,English
English
In bash, the function definition must be placed before any calls to the function.
The line subjects="" must be placed before the while loop. Otherwise the accumulated value gets lost (reset to the empty string) when that assignment runs after the loop.
The += operator causes double concatenation in the line subjects+="$subjects Science", since the right-hand side already contains the expansion of the subjects variable. Either subjects="$subjects Science" or subjects+=" Science" should have been used (the same is true for the other lines using the += operator). Besides, since a comma-separated list is desired, a , character should be used when concatenating instead of a space character. For example: subjects="$subjects,Science"
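The double concatenation is easy to reproduce in isolation; a minimal sketch:

```shell
#!/usr/bin/env bash
# += together with $subjects on the right-hand side appends the old value twice:
subjects="Science"
subjects+="$subjects Maths"
echo "$subjects"    # prints: ScienceScience Maths

# A plain assignment expands the old value exactly once:
subjects2="Science"
subjects2="$subjects2,Maths"
echo "$subjects2"   # prints: Science,Maths
```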
So a corrected script could be like this:
#!/bin/bash
subjects=""
Science() {
subjects="$subjects,Science"
}
Maths() {
subjects="$subjects,Maths"
}
English() {
subjects="$subjects,English"
}
while [ "$ans" != "q" ]; do
clear
echo "Choose your subject"
echo "Press q once done"
echo " 1.Science"
echo " 2.Maths"
echo " 3.English"
read -r ans
case $ans in
1) Science;;
2) Maths;;
3) English;;
esac
done
subjects=${subjects:1} # to remove the leading ',' character
echo "Your selections are $subjects"
Note: I wouldn't normally use a function just to append a simple string to a variable.
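For comparison, a hypothetical variant that appends inside the case itself and skips the functions entirely. User input is simulated by the hard-coded list (1, 3, q) so the sketch runs non-interactively; a real script would read ans in the loop instead:

```shell
#!/usr/bin/env bash
subjects=""
# Simulated user input; a real script would use: read -r ans
for ans in 1 3 q; do
  case $ans in
    1) subjects+="${subjects:+,}Science";;
    2) subjects+="${subjects:+,}Maths";;
    3) subjects+="${subjects:+,}English";;
  esac
done
echo "Your selections are $subjects"   # prints: Your selections are Science,English
```

The ${subjects:+,} expansion produces a comma only when subjects is already non-empty, so no leading comma is ever created and the ${subjects:1} trim becomes unnecessary.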

Alter output using exported Excel/CSV data if null values are found within 'foreach'

Sorry I wasn't quite sure how to word the question.
This is a follow-on from a previous question here: Take Excel cell content and place it into a formatted .txt file
The Import-XLS function I'm using is from here:
https://gallery.technet.microsoft.com/office/17bcabe7-322a-43d3-9a27-f3f96618c74b
My current code looks like this:
. .\Import-XLS.ps1
$OutFile = ".\OutTest$(get-date -Format dd-MM).txt"
$Content = Import-XLS '.\DummyData.xlsx'
$Content | foreach-object{
$field1 = $("{0},{1},{2},{3},{4},{5},{6}" -f "Field1", $_."Value1", $_."Value2", $_."Value3", $_."Value4", $_."Value5", $_."Value6")
$field1.Split(",",[System.StringSplitOptions]::RemoveEmptyEntries) -join ","
$field2 = $("{0},{1}" -f "Field2", $_."Value1")
$field2.Split(",",[System.StringSplitOptions]::RemoveEmptyEntries) -join ","
} | Out-File $OutFile
My Dummydata is essentially this (I've inserted $null to point out the blank values)
Entries Value1 Value2 Value3 Value4 Value5 Value6 Value7 Value8
Entry 1 1 2 $null 4 5 6 7 8
Entry 2 $null B A B A B A B
So I've managed to have the code 'ignore/skip' a null value within a set.
My output looks like this
Field1,1,2,4,5,6
Field2,1
Field1,B,A,B,A,B
Field2
What I would like help with now is how to either remove "Field2" because it has no value, or comment it out using ;.
So my output would look like
Field1,1,2,4,5,6
Field2,1
Field1,B,A,B,A,B
or
Field1,1,2,4,5,6
Field2,1
Field1,B,A,B,A,B
;Field2
Essentially, if a row has no data in any of its fields that are being written for that line, it should be ignored.
Thanks SO MUCH for your help.
EDIT:
I've discovered I need to remove the comma "," between the {0},{1} and use a space instead. So I'm using
$field2 = $("{0} {1}" -f "Field 2", $_."Value1")
$field2 = $field2.Split(" ",[System.StringSplitOptions]::RemoveEmptyEntries)
if ( $field2.Count -le 1) { ";$field2" } else { $field2 -join "`t`t" }
Which works for 'most' of my fields.
However there are 'some' Fields and Values that have spaces in them.
Additionally there are some values like "TEST TEXT".
So now I'm getting
Field1 3,B,A,B,A,B
Field 2 TEST TEXT
Instead of (quotes for clarity)
"Field1" 3,B,A,B,A,B
"Field 2" "TEST TEXT"
I'm happy to just use some kind of exception only for these few fields.
I've tried a few other things, but I end up breaking the IF statement, and it ;comments out fields with values, or doesn't ;comment out fields with no values.
The next code snippet could help:
### … … … ###
$Content | foreach-object{
$field1 = $("{0},{1},{2},{3},{4},{5},{6}" -f "Field1", $_."Value1", $_."Value2", $_."Value3", $_."Value4", $_."Value5", $_."Value6")
$auxarr = $field1.Split(",",[System.StringSplitOptions]::RemoveEmptyEntries)
if ( $auxarr.Count -le 1) { ";$auxarr" } else { $auxarr -join "," }
$field2 = $("{0},{1}" -f "Field2", $_."Value1")
$auxarr = $field2.Split(",",[System.StringSplitOptions]::RemoveEmptyEntries)
if ( $auxarr.Count -le 1) { ";$auxarr" } else { $auxarr -join "," }
} | Out-File $OutFile
Edit to answer the additional (extending) subquestion "I need to remove the comma "," between the {0},{1} and use a space instead":
$field2 = $("{0},{1}" -f "Field 2", $_."Value1")
### keep comma ↑ ↓ or any other character not contained in data
$field2 = $field2.Split(",",[System.StringSplitOptions]::RemoveEmptyEntries)
if ( $field2.Count -le 1) { ";$field2" } else { $field2 -join "`t`t" }
If your data contains a significant comma, then use the Split(String[], StringSplitOptions) overload of the String.Split method, e.g. as follows:
$burstr = [string[]]'#-#-#'
$field2 = $("{0}$burstr{1}" -f "Field 2", $_."Value1")
$field2 = $field2.Split($burstr,[System.StringSplitOptions]::RemoveEmptyEntries)
if ( $field2.Count -le 1) { ";$field2" } else { $field2 -join "`t`t" }

Perl String comparison

I'm trying to compare one string to another.
If it's a JSON structure which contains things, I want to print "contains things".
If it's a JSON structure which doesn't contain anything, I print "empty".
If it's something which is not between curly brackets "{}", I print that there's an error.
Here's what I've done :
if($content =~ m/{.+}/){
print "Contains things \n";
} elsif($content eq "{}"){
$job_status{$url}="";
print "empty \n";
} else {
print "Error \n";
}
When I pass "{}" in the variable $content, it does not enter the "elsif", but goes to the "else", and throws an error.
I've tried to put "==" instead of the "eq", even though I know it's for numbers. In that case it enters the "elsif" and prints "empty", as it should with "eq", but throws:
Argument "{}" isn't numeric in numeric eq (==).
I could use the JSON library but I prefer not.
Thanks for your help !
Bidy
It works for me. Does $content have a newline character? Try chomp $content;.
use warnings;
use strict;
my $content = '{}';
if($content =~ m/{.+}/){
print "Contains things \n";
} elsif($content eq "{}"){
print "empty \n";
} else {
print "Error \n";
}
__END__
empty
I can replicate the behaviour if I add a newline after the {}:
#!/usr/bin/perl
use strict;
use warnings;
my $content = "{}\n";
if($content =~ m/{.+}/){
print "Contains things \n";
} elsif($content eq "{}"){
print "empty \n";
} else {
print "Error \n";
}
It returns "Error"; if I replace eq with ==, it returns "empty", because both "{}" and "{}\n" are numerically 0. A warning is thrown, as you mentioned.
You might try to chomp the $content before processing it.
A top-level JSON thingy can be an object ({...}) or an array ([...]), but you're only checking for one of those. If you merely want to see if it's empty, I'd check the length of the string:
chomp $possible_json;
if( length($possible_json) >= 3 ) { ... }
You might also consider Randal Schwartz's regex for JSON parsing. It doesn't handle everything, but it's often enough for simple things.
I'd probably end up breaking it up:
unless ($content) {print "Error\n"}
$content =~ /{(.*)}/;
my $resp = $1;
if ($resp) {
print "Contains Things ($resp)\n";
} else {
print "Empty\n";
}

Add double quotes around fields in AWK script output?

I have written an awk script that converts a distributor flatfile into a CSV importable into Magento. The file is semicolon-delimited.
The script works fairly well, but it does not put quotes around each field as the importer requires, which causes some issues on the data import. I spent a couple of hours trying to figure out how to add the quotes to the existing script, without much luck. Any help would be greatly appreciated - I am pretty new to AWK.
Current Output
store;websites;attribute_set;type;category_ids;sku;has_options;name;meta_title;meta_description;image;small_image;thumbnail;url_key;url_path;config_attributes;custom_design;page_layout;options_container;country_of_manufacture;msrp_enabled;msrp_display_actual_price_type;gift_message_available;rsr_pn;manufacturer_pn;price;special_price;cost;weight;msrp;status;visibility;manufacturer;enable_googlecheckout;tax_class_id;is_recurring;description;short_description;meta_keyword;custom_layout_update;news_from_date;news_to_date;special_from_date;special_to_date;custom_design_from;custom_design_to;qty;min_qty;use_config_min_qty;is_qty_decimal;backorders;use_config_backorders;min_sale_qty;use_config_min_sale_qty;max_sale_qty;use_config_max_sale_qty;is_in_stock;low_stock_date;notify_stock_qty;use_config_notify_stock_qty;manage_stock;use_config_manage_stock;stock_status_changed_auto;use_config_qty_increments;qty_increments;use_config_enable_qty_inc;enable_qty_increments;is_decimal_divided;stock_status_changed_automatically;use_config_enable_qty_increments;product_name;store_id;product_type_id;product_status_changed;product_changed_websites;gallery;related;upsell;crosssell;tier_prices;associated;bundle_options;grouped;group_price_price;downloadable_options;super_attribute_pricing;product_tags
admin;base;Default;simple;2,35,36;844802016148;0;5.11 HOLSTER SHIRT L WHITE;;;/5/1/511-40011-010-L_1.jpg;/5/1/511-40011-010-L_1.jpg;/5/1/511-40011-010-L_1.jpg;511-40011-010-L;511-40011-010-L.html;;;No layout updates;Block after Info Column;;Use config;Use config;No;511-40011-010-L;40011;74.99;;48.00;5;74.99;Enabled;Catalog, Search;5.11 Tactical;Yes;Taxable Goods;No;5.11 Tactical Short Sleeve Shirt L White Holster Shirt Crew 40011;5.11 Tactical Short Sleeve Shirt L White Holster Shirt Crew 40011;;;;;;;;;0;0;1;0;0;1;1;1;0;1;1;;;1;0;1;0;1;0;1;0;0;0;1;5.11 HOLSTER SHIRT L WHITE;0;simple;;;;;;;;;;;;;;
Desired Output
"store";"websites";"attribute_set";"type";"category_ids";"sku";"has_options";"name";"meta_title";"meta_description";"image";"small_image";"thumbnail";"url_key";"url_path";"config_attributes";"custom_design";"page_layout";"options_container";"country_of_manufacture";"msrp_enabled";"msrp_display_actual_price_type";"gift_message_available";"rsr_pn";"manufacturer_pn";"price";"special_price";"cost";"weight";"msrp";"status";"visibility";"manufacturer";"enable_googlecheckout";"tax_class_id";"is_recurring";"description";"short_description";"meta_keyword";"custom_layout_update";"news_from_date";"news_to_date";"special_from_date";"special_to_date";"custom_design_from";"custom_design_to";"qty";"min_qty";"use_config_min_qty";"is_qty_decimal";"backorders";"use_config_backorders";"min_sale_qty";"use_config_min_sale_qty";"max_sale_qty";"use_config_max_sale_qty";"is_in_stock";"low_stock_date";"notify_stock_qty";"use_config_notify_stock_qty";"manage_stock";"use_config_manage_stock";"stock_status_changed_auto";"use_config_qty_increments";"qty_increments";"use_config_enable_qty_inc";"enable_qty_increments";"is_decimal_divided";"stock_status_changed_automatically";"use_config_enable_qty_increments";"product_name";"store_id";"product_type_id";"product_status_changed";"product_changed_websites";"gallery";"related";"upsell";"crosssell";"tier_prices";"associated";"bundle_options";"grouped";"group_price_price";"downloadable_options";"super_attribute_pricing";"product_tags"
"admin";"base";"Default";"simple";"2,35,36";"844802016148";"0";"5.11 HOLSTER SHIRT L WHITE";"";"";"/5/1/511-40011-010-L_1.jpg";"/5/1/511-40011-010-L_1.jpg";"/5/1/511-40011-010-L_1.jpg";"511-40011-010-L";"511-40011-010-L.html";"";"";"No layout updates";"Block after Info Column";"";"Use config";"Use config";"No";"511-40011-010-L";"40011";"74.99";"";"48.00";"5";"74.99";"Enabled";"Catalog, Search";"5.11 Tactical";"Yes";"Taxable Goods";"No";"5.11 Tactical Short Sleeve Shirt L White Holster Shirt Crew 40011";"5.11 Tactical Short Sleeve Shirt L White Holster Shirt Crew 40011";"";"";"";"";"";"";"";"";"0";"0";"1";"0";"0";"1";"1";"1";"0";"1";"1";"";"";"1";"0";"1";"0";"1";"0";"1";"0";"0";"0";"1";"5.11 HOLSTER SHIRT L WHITE";"0";"simple";"";"";"";"";"";"";"";"";"";"";"";"";"";"
Script - rsrimport.awk
#!/bin/awk -f
# ----------------------------------------------------------------------------------------
# Copyright (c) 2012 - 2013 John Steensen <john.steensen@live.com>
# All rights reserved. No warranty, explicit or implicit, provided.
# ----------------------------------------------------------------------------------------
# AWK Processing
# Updated 03DEC2012@1552 MST
# ----------------------------------------------------------------------------------------
# Warnings/Dependency Notes
# AWK
# ----------------------------------------------------------------------------------------
BEGIN {
FS=";";
OFS=";";
CATEGORY="47";
IMAGE="imagepathfail";
URLKEY="urlkeyfail";
URLPATH="urlpathfail";
print "store", "websites", "attribute_set", "type", "category_ids", "sku", "has_options", "name", "image", "small_image", "thumbnail", "url_key", "url_path", "page_layout", "options_container", "msrp_enabled", "msrp_display_actual_price_type", "gift_message_available", "rsr_pn", "manufacturer_pn", "price", "cost", "weight", "msrp", "manufacturer", "status", "is_recurring", "visibility", "enable_googlecheckout", "tax_class_id", "description", "short_description", "qty", "min_qty", "use_config_min_qty", "is_qty_decimal", "backorders", "use_config_backorders", "min_sale_qty", "use_config_min_sale_qty", "max_sale_qty", "use_config_max_sale_qty", "is_in_stock", "notify_stock_qty", "use_config_notify_stock_qty", "manage_stock", "use_config_manage_stock", "stock_status_changed_auto", "use_config_qty_increments", "qty_increments", "use_config_enable_qty_inc", "enable_qty_increments", "is_decimal_divided", "stock_status_changed_automatically", "use_config_enable_qty_increments", "product_name", "store_id", "product_type_id";
}
{
# DEFINE CATEGORY
if ($4=="1") CATEGORY="2,3,4";
else if ($4=="2") CATEGORY="2,3,7";
else if ($4=="3") CATEGORY="2,3,8";
else if ($4=="4") CATEGORY="2,3,22,23";
else if ($4=="5") CATEGORY="2,3,5";
else if ($4=="7") CATEGORY="2,3,6";
else if ($4=="8") CATEGORY="2,27,28";
else if ($4=="9") CATEGORY="2,27,29";
else if ($4=="10") CATEGORY="2,9,13";
else if ($4=="11") CATEGORY="2,9,14";
else if ($4=="12") CATEGORY="2,35,38";
else if ($4=="13") CATEGORY="2,9,16";
else if ($4=="14") CATEGORY="2,35,37";
else if ($4=="15") CATEGORY="2,19,21";
else if ($4=="16") CATEGORY="2,9,15";
else if ($4=="17") CATEGORY="2,9,16";
else if ($4=="18") CATEGORY="2,19,20";
else if ($4=="20") CATEGORY="2,27,33";
else if ($4=="21") CATEGORY="2,9,17";
else if ($4=="22") CATEGORY="2,3,22,24";
else if ($4=="23") CATEGORY="2,3,22,25";
else if ($4=="24") CATEGORY="2,9,13";
else if ($4=="25") CATEGORY="2,40,43";
else if ($4=="26") CATEGORY="2,40,44";
else if ($4=="27") CATEGORY="2,3,22,26";
else if ($4=="28") CATEGORY="2,27,31";
else if ($4=="29") CATEGORY="2,27,32";
else if ($4=="30") CATEGORY="2,27,30";
else if ($4=="31") CATEGORY="2,27,34";
else if ($4=="32") CATEGORY="2,9,11";
else if ($4=="33") CATEGORY="2,35,36";
else if ($4=="34") CATEGORY="2,9,10";
else if ($4=="35") CATEGORY="2,9,18";
else if ($4=="36") CATEGORY="2,40,42";
else if ($4=="38") CATEGORY="2,40,41";
else if ($4=="39") CATEGORY="2,40,45";
else if ($4=="40") CATEGORY="2,35,39";
else if ($4=="41") CATEGORY="2,9,12";
else if ($4=="43") CATEGORY="2,9,12";
else if ($4=="01") CATEGORY="2,3,4";
else if ($4=="02") CATEGORY="2,3,7";
else if ($4=="03") CATEGORY="2,3,8";
else if ($4=="04") CATEGORY="2,3,22,23";
else if ($4=="05") CATEGORY="2,3,5";
else if ($4=="07") CATEGORY="2,3,6";
else if ($4=="08") CATEGORY="2,27,28";
else if ($4=="09") CATEGORY="2,27,29";
else CATEGORY="47";
# DEFINE IMAGE WITH PATH.
IMAGE="/5/1/"$1"_1.jpg";
# DEFINE URL KEY
URLKEY=$1;
# DEFINE URL PATH
URLPATH=$1".html";
print "admin", "base", "Default", "simple", CATEGORY, $1, "0", $3, IMAGE, IMAGE, IMAGE, URLKEY, URLPATH, "No layout updates", "Block after Info Column", "Use config", "Use config", "No", $1, $12, $6, $7, $8, $6, $11, "Enabled", "No", "Catalog, Search", "Yes", "Taxable Goods", $14, $14, $9, "0", "1", "0", "0", "1", "1", "1", "0", "1", "1", "0", "1", "0", "1", "0", "1", "0", "1", "0", "0", "0", "1", $3, "0", "simple";
}
END {}
If you want to "add this to the existing script", you can wrap each argument of print in escaped quotes (\"), like this:
print "\"admin\"", "\"base\"", ...
Edited:
Yes, perhaps setting OFS is a better solution:
BEGIN { OFS="\";\""; } ... print "\"admin", ...., "simple\"";
awk '{for (i=1;i<=NF;i++) $i="\""$i"\""}1' FS=";" OFS=";" input
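A quick way to sanity-check that one-liner without the input file is to feed it a sample line on stdin (note how an empty field becomes an empty quoted field):

```shell
# Wrap every field in quotes; trailing FS/OFS assignments apply before
# awk falls back to reading stdin, since no file operand is given.
out=$(echo "a;b;;c" | awk '{for (i=1;i<=NF;i++) $i="\""$i"\""}1' FS=";" OFS=";")
echo "$out"    # prints: "a";"b";"";"c"
```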
To add quotes around the entries you could use a simple AWK loop:
Script - simple_loop.awk
BEGIN {FS=";"}
{
for(i=1;i<NF;i++){
printf("\"%s\";", $i);
}
printf("\"%s\"\n",$NF);
}
For instance
echo "admin;base;5.11 HOLSTER SHIRT L WHITE;;" | awk -f simple_loop.awk
Should output
"admin";"base";"5.11 HOLSTER SHIRT L WHITE";"";""
In this case I would use a sed expression instead of AWK.
If your data is in a file called data.txt, you can get it writing:
sed "s/;/\";\"/g;s/^/\"/;s/$/\"/" data.txt
That will print the result to the std output, but if you want to replace the content of the file just use sed -i this way:
sed -i "s/;/\";\"/g;s/^/\"/;s/$/\"/" data.txt
And that is all !!
Explanation:
The sed expression consists of three sed commands separated by ";" that you can also run separately:
sed "s/;/\";\"/g"
It makes a substitution (that is what the first "s" means). Then comes "/" (the default separator), then ";", which is what we want to replace. Then the second separator "/", and the replacement \";\": an escaped quote, a semicolon and an escaped quote. So with this command we replace each semicolon ; by ";". The trailing /g means that every ; is replaced (not only the first semicolon).
If the input was a;b;c, after running the first command it will be a";"b";"c.
Now we need to add quotes in the beginning (^ in a regular expression) and in the end ($). So that is what it means:
sed "s/^/\"/"   # the first quote
And
sed "s/$/\"/"   # the last quote
Getting the desired output:
"a";"b";"c"
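The three commands can be verified step by step on a sample line (single quotes are used here to avoid the double-quote escaping):

```shell
#!/usr/bin/env bash
# Run the three sed commands one at a time on a sample line:
line='a;b;c'
step1=$(printf '%s\n' "$line"  | sed 's/;/";"/g')   # a";"b";"c
step2=$(printf '%s\n' "$step1" | sed 's/^/"/')      # "a";"b";"c
step3=$(printf '%s\n' "$step2" | sed 's/$/"/')      # "a";"b";"c"
echo "$step3"
```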
Let me refactor your program a bit:
#!/bin/awk -f
BEGIN {
FS=";";
OFS="\";\"";
IMAGE="imagepathfail";
URLKEY="urlkeyfail";
URLPATH="urlpathfail";
# DEFINE CATEGORY
CATEGORY["1"] ="2,3,4";
CATEGORY["2"] ="2,3,7";
CATEGORY["3"] ="2,3,8";
CATEGORY["4"] ="2,3,22,23";
CATEGORY["5"] ="2,3,5";
CATEGORY["7"] ="2,3,6";
CATEGORY["8"] ="2,27,28";
CATEGORY["9"] ="2,27,29";
CATEGORY["10"]="2,9,13";
CATEGORY["11"]="2,9,14";
CATEGORY["12"]="2,35,38";
CATEGORY["13"]="2,9,16";
CATEGORY["14"]="2,35,37";
CATEGORY["15"]="2,19,21";
CATEGORY["16"]="2,9,15";
CATEGORY["17"]="2,9,16";
CATEGORY["18"]="2,19,20";
CATEGORY["20"]="2,27,33";
CATEGORY["21"]="2,9,17";
CATEGORY["22"]="2,3,22,24";
CATEGORY["23"]="2,3,22,25";
CATEGORY["24"]="2,9,13";
CATEGORY["25"]="2,40,43";
CATEGORY["26"]="2,40,44";
CATEGORY["27"]="2,3,22,26";
CATEGORY["28"]="2,27,31";
CATEGORY["29"]="2,27,32";
CATEGORY["30"]="2,27,30";
CATEGORY["31"]="2,27,34";
CATEGORY["32"]="2,9,11";
CATEGORY["33"]="2,35,36";
CATEGORY["34"]="2,9,10";
CATEGORY["35"]="2,9,18";
CATEGORY["36"]="2,40,42";
CATEGORY["38"]="2,40,41";
CATEGORY["39"]="2,40,45";
CATEGORY["40"]="2,35,39";
CATEGORY["41"]="2,9,12";
CATEGORY["43"]="2,9,12";
CATEGORY["01"]="2,3,4";
CATEGORY["02"]="2,3,7";
CATEGORY["03"]="2,3,8";
CATEGORY["04"]="2,3,22,23";
CATEGORY["05"]="2,3,5";
CATEGORY["07"]="2,3,6";
CATEGORY["08"]="2,27,28";
CATEGORY["09"]="2,27,29";
# header
print "\"store", "websites", "attribute_set", "type", "category_ids", "sku", "has_options", "name", "image", "small_image", "thumbnail", "url_key", "url_path", "page_layout", "options_container", "msrp_enabled", "msrp_display_actual_price_type", "gift_message_available", "rsr_pn", "manufacturer_pn", "price", "cost", "weight", "msrp", "manufacturer", "status", "is_recurring", "visibility", "enable_googlecheckout", "tax_class_id", "description", "short_description", "qty", "min_qty", "use_config_min_qty", "is_qty_decimal", "backorders", "use_config_backorders", "min_sale_qty", "use_config_min_sale_qty", "max_sale_qty", "use_config_max_sale_qty", "is_in_stock", "notify_stock_qty", "use_config_notify_stock_qty", "manage_stock", "use_config_manage_stock", "stock_status_changed_auto", "use_config_qty_increments", "qty_increments", "use_config_enable_qty_inc", "enable_qty_increments", "is_decimal_divided", "stock_status_changed_automatically", "use_config_enable_qty_increments", "product_name", "store_id", "product_type_id\"";
}
function getCategory(val) {
return (val in CATEGORY) ? CATEGORY[val] : "47";
}
{
# DEFINE IMAGE WITH PATH.
IMAGE="/5/1/"$1"_1.jpg";
# DEFINE URL KEY
URLKEY=$1;
# DEFINE URL PATH
URLPATH=$1".html";
print "\"admin", "base", "Default", "simple", getCategory($4), $1, "0", $3, IMAGE, IMAGE, IMAGE, URLKEY, URLPATH, "No layout updates", "Block after Info Column", "Use config", "Use config", "No", $1, $12, $6, $7, $8, $6, $11, "Enabled", "No", "Catalog, Search", "Yes", "Taxable Goods", $14, $14, $9, "0", "1", "0", "0", "1", "1", "1", "0", "1", "1", "0", "1", "0", "1", "0", "1", "0", "1", "0", "0", "0", "1", $3, "0", "simple\"";
}
In my opinion, we could use printf (formatted output); a double quote is obtained with \" inside the format string.
e.g.
gawk 'BEGIN{print "WKT,punto";}{printf "\"LINESTRING Z (%f %f 0,%f %f 0)\",\"%d\"\n",$3,$2,$4,$5,$1}' Frecce_geoloc_12-24.txt
output (the columns used are $3 $2 $4 $5 $1):
"LINESTRING Z (-72.319686 -50.609328 0,-50.609309 -72.319499 0)","6582"
"LINESTRING Z (-72.319245 -50.609215 0,-50.609195 -72.319052 0)","6583"
"LINESTRING Z (-72.318799 -50.609101 0,-50.609081 -72.318607 0)","6584"
"LINESTRING Z (-72.318366 -50.608990 0,-50.608969 -72.318169 0)","6585"
