Not using colnames when reading .xls files with RODBC

I have another puzzling problem.
I need to read .xls files with RODBC. Basically I need a matrix of all the cells in one sheet, and then use grep, strsplit etc. to get the data out. As each sheet contains multiple tables in a different order, plus some text fields with other options in between, I need something that works like readLines(), but for Excel sheets. I believe RODBC is the best way to do that.
The core of my code is the following function:
library(RODBC)

.read.info.default <- function(file, sheet) {
  fc <- odbcConnectExcel(file)  # file connection
  tryCatch({
    x <- sqlFetch(fc,
                  sqtable  = sheet,
                  as.is    = TRUE,
                  colnames = FALSE,
                  rownames = FALSE)
  },
  error   = function(e) stop(e),
  finally = close(fc))
  return(x)
}
Yet, whatever I tried, it always takes the first row of the sheet as the variable names of the returned data frame, and I have no clue how to solve it. According to the documentation, colnames=FALSE should prevent that.
I'd like to avoid the xlsReadWrite package. Edit: also the gdata package; the client doesn't have Perl on the system and won't install it.
Edit:
I gave up and went with read.xls() from the xlsReadWrite package. Apart from the name problem, it turned out RODBC can't really read cells with special characters like slashes: a date in the format "dd/mm/yyyy" just gave NA.
Looking at the source code of sqlFetch, sqlQuery and sqlGetResults, I realized the problem is more than likely in the drivers. Somehow the first line of the sheet is treated as column metadata rather than as ordinary cells, so the names are the equivalent of DB field names rather than colnames, and that's not an option you can set...
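For reference, the "field names" behaviour comes from the Excel ODBC driver itself: whether the first row is treated as names is governed by the driver's FirstRowHasNames setting (a registry value under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel), and there is a long-standing driver issue where HDR=NO in the connection string is ignored unless that registry value is also set to 0. A rough, untested sketch of connecting with an explicit connection string (the driver name and DBQ path are assumptions):
library(RODBC)
# Hypothetical connection string; the driver name and file path are placeholders,
# and HDR=NO may only take effect if FirstRowHasNames is set to 0 in the registry.
con <- odbcDriverConnect(
  "Driver={Microsoft Excel Driver (*.xls)};DBQ=C:/data/myfile.xls;ReadOnly=1;HDR=NO")
x <- sqlQuery(con, "SELECT * FROM [Sheet1$]", as.is = TRUE)
close(con)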

Can you use the Perl-based solution in the gdata package instead? That happens to be portable too...
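If Perl were available (the question's later edit rules that out for this client), the gdata call would look roughly like this; a sketch that reads the whole sheet without treating the first row as names, with the file name as a placeholder:
library(gdata)
# header and stringsAsFactors are passed through to read.csv by read.xls
x <- read.xls("myfile.xls", sheet = 1, header = FALSE, stringsAsFactors = FALSE)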

Related

Matlab writematrix in Excel gives error: Name cannot be the same as built-in name

I want to copy an Excel file to a different path with Matlab and then write to it, also using Matlab. Somehow I get the error: Name cannot be the same as built-in name.
As I want to write to the file multiple times, I don't want to solve this problem manually each time; I want the code to run through without me having to intervene constantly.
Is there any way I can solve this problem once and for all through code? Does this happen because I copy the Excel file first?
The code looks like this:
path_source_template1 = 'Blabla1\Template1.xlsx';
timestamp = datestr(now);
timestamp = strrep(timestamp, ':', '-');
timestamp = strrep(timestamp, ' ', '-');
path_output = fullfile('Blabla2\',timestamp);
mkdir(fullfile(path_output));
path_output_template1 = strcat(path_output,'\Template1.xlsx');
copyfile(path_source_template1,path_output_template1);
Then I want to write in the Template1.xlsx:
writematrix(test,path_output_template1,'Sheet','Test','Range','A1',UseExcel=true,AutoFitWidth=false);
Then I get this error:
[screenshot of the error: "Name cannot be the same as built-in name"]
The input to writematrix uses the name, value format, so in this line:
writematrix(test,path_output_template1,'Sheet','Test','Range','A1',UseExcel=true,AutoFitWidth=false);
you should have:
..., 'Sheet','Test','Range','A1','UseExcel', true,'AutoFitWidth', false);
Disclaimer: I haven't tested this, but I'm fairly certain this will fix your problem.
The correct call to writematrix would be:
writematrix(test,path_output_template1, 'Sheet','Test','Range','A1', 'UseExcel', true,'AutoFitWidth',false);
When writing datetime data to a spreadsheet file, you must set both the 'PreserveFormat' and the 'UseExcel' name-value pairs to true to preserve the existing cell formatting. You can check the writematrix documentation.
To investigate the error, I tested the code locally in MATLAB R2019b and it works well when the test variable is set, for example, as test = magic(5);.
The error may come from the data you use in the test variable, or from the fact that if you run the code repeatedly in quick succession, path_output may already exist. One way to improve this is a more accurate timestamp, as sketched below.
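For instance, including milliseconds in the folder name makes collisions between quick consecutive runs unlikely; a minimal sketch using the same datestr approach as the question:
% include milliseconds (FFF) so consecutive runs get distinct folder names
timestamp = datestr(now, 'yyyy-mm-dd-HH-MM-SS-FFF');
path_output = fullfile('Blabla2\', timestamp);
if ~exist(path_output, 'dir')
    mkdir(path_output);
end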

phpspreadsheet merge error with variables

When I'm merging cells with phpspreadsheet using a variable I have an issue.
On opening in MS Excel (2019), it says that the program can try to recover the document if I'm sure it's a reliable one.
When I say yes, the document is ok and the merging worked fine.
Why do I have that message?
I don't get this message when I do it this way:
$spreadsheet->getActiveSheet()->mergeCells('B2:F2');
But I do get the message this way:
$cellRange = 'B2:F2';
$spreadsheet->getActiveSheet()->mergeCells($cellRange);
mergeCells is a sensitive function: if you try to, or accidentally, merge overlapping cell ranges, you get this kind of error. Be sure that your code does not do something like this:
for($i=1; $i<3; $i++){
$cellRange = 'B'.$i.':F'.$i;
$spreadsheet->getActiveSheet()->mergeCells($cellRange);
}
My mistake was this: I was using the merge inside a "for" loop and was trying to merge an already-merged cell with another one.
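A sketch of one way to guard against that in PhpSpreadsheet (the loop bounds are placeholders): getMergeCells() returns the ranges already merged on the sheet, so exact duplicates can be skipped; overlapping but non-identical ranges still have to be avoided by the loop logic itself.
$sheet = $spreadsheet->getActiveSheet();
for ($i = 2; $i <= 5; $i++) {
    $cellRange = 'B' . $i . ':F' . $i;
    // skip ranges that have already been merged, which is what triggers
    // the "repair" prompt when the file is opened in Excel
    if (!in_array($cellRange, $sheet->getMergeCells(), true)) {
        $sheet->mergeCells($cellRange);
    }
}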

Loop through to Import Multiple Excel Files and convert each workbook in a file to .dta

I have a folder that contains over 60 Excel workbooks that I would like to convert to .dta files in Stata. I searched the net but could not find a decent way of doing it with a loop. I have written some code that needs expert help: the workbooks are all in one directory, and I want a loop that imports each one and saves it as a .dta file. The code goes as follows:
forvalues i=1/60{
import excel "D:\Okay\""`i'.xlsx", sheet("Sheet1") firstrow clear
save "D:\Okay\""`i'.dta"
}
We can't try out your code because it's specific to your computer. Please study https://stackoverflow.com/help/mcve before posting questions.
But it's evident that
\Okay\""`i'.xlsx"
is unlikely to help. As documented many times over -- e.g. [U] 18.3.10 within http://www.stata.com/manuals14/u18.pdf and http://www.stata-journal.com/sjpdf.html?articlenum=pr0042 -- the backslash you want to use under Windows (it's best not to assume everyone recognises your OS) also has a role in Stata as an escape character.
That command would be better off ending
\Okay/`i'.xlsx"
and similar comments apply to the other command lines mentioning files: change the backward slash before a local macro reference to a forward slash, and remove the unnecessary double quotation marks.
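So, keeping the D:\Okay folder from the question, the loop would become something like this (untested; replace is added so re-runs don't fail on existing files):
forvalues i = 1/60 {
    import excel "D:\Okay/`i'.xlsx", sheet("Sheet1") firstrow clear
    save "D:\Okay/`i'.dta", replace
}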
In fact all this is totally avoidable. Consider
cd "D:\Rami Chehab\University Degrees & Courses\PhD in Labour Economics\Data\Data 2016\UNCTAD\Okay"
forvalues i=1/60 {
import excel `i'.xlsx, sheet("Sheet1") firstrow
save `i'.dta
}
Once you cd to work within a directory or folder, you can keep file names to the bare minimum.
There were plenty of errors I made over there; however, I believe I have figured it out. This is what you have to code to make it work:
forvalues date=1/57{
import excel "D:/Rami Chehab/University Degrees & Courses/PhD in Labour Economics/Data/Data 2016/UNCTAD/Okay/`date'.xlsx", sheet("Sheet1") firstrow clear
save "D:/Rami Chehab/University Degrees & Courses/PhD in Labour Economics/Data/Data 2016/UNCTAD/Okay/`date'.dta"
}

Extract the ANOVA table to Excel/.csv

I'm wondering if it's possible to extract the table that results from running an ANOVA to an Excel or .csv file. I'm running repeated-measures two-way ANOVAs with RMAOV2 (http://uk.mathworks.com/matlabcentral/fileexchange/5578-rmaov2). Here is the code I'm using, which works fine and produces a table with the ANOVA results.
dir ='/Users/Documents/folder';
cd(dir)
file = readtable('file.csv');
toAnalyse = table2array(file);
RMAOV2(toAnalyse);
However, when I tried to save the ANOVA results in order to then export them to Excel or in a .csv file, this doesn't work:
ANOVAresults = RMAOV2(toAnalyse);
Error:
Output argument "RMAOV2" (and maybe others) not assigned during call to "RMAOV2".
Any suggestion would be very appreciated.
If you take a look at the source code of the file, you will notice that it never assigns anything to a return variable; it only prints data to the command window.
To resolve this you have to edit the source code and assign the data you want returned. Alternatively, you can contact the author.
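A hypothetical sketch of that edit; the variable names inside RMAOV2 are assumptions, so check the actual source for the ones that hold the F and p values:
% In RMAOV2.m, change the function line so it returns a value, e.g.
%   function results = RMAOV2(X, alpha)
% and, where the ANOVA table is printed, also collect the numbers, e.g.
%   results = [F_A P_A; F_B P_B; F_AB P_AB];   % hypothetical variable names
% Then, back in your script:
ANOVAresults = RMAOV2(toAnalyse);
writematrix(ANOVAresults, 'anova_results.csv');   % csvwrite/xlswrite on older releases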

Can I import SAP tables that were exported by SE16?

I have exported the contents of a table with transaction SE16, by selecting all the entries and choosing Download, unconverted.
I'd like to import these entries into another system (where the same table exists and is active).
Furthermore, when I import, there's a possibility that the specific key already exists for a number of entries (old entries).
Other entries won't have a field with the same key present in the table where they're to be imported (new entries).
Is there a way to easily update my table in the second system with the file provided from the first system? If needed, I can export the data in the 3 other format types (Spreadsheet, Rich text format and HTML format). It seems to me though like the spreadsheet and rich text formats sometimes corrupt the data, and the html is far too verbose.
[EDIT]
As per popular demand: the table I'm trying to export/import is a Z table whose fields are all numeric, character, date or time fields (flat data types).
I'm trying to do it like this because the clients don't have any Basis resource to help them with transports, and they would like to more or less automate the process of updating one of the tables in one system.
At the moment it's a business request to do it like this, but I'm open to suggestions (and so are the clients).
Edit
OK, I doubt that what you describe in your comment exists out of the box, but you can easily write something like that:
Create a method (or function module if that floats your boat) that accepts the following:
iv_table_name TYPE string and
iv_filename TYPE string
This would be the method:
method upload_table.
  data: lt_table type ref to data,
        lx_root  type ref to cx_root.
  field-symbols: <table> type standard table.

  try.
      " create an internal table typed at runtime after the requested DB table
      create data lt_table type table of (iv_table_name).
      assign lt_table->* to <table>.

      call method cl_gui_frontend_services=>gui_upload
        exporting
          filename            = iv_filename
          has_field_separator = abap_true
        changing
          data_tab            = <table>
        exceptions
          others              = 4.
      if sy-subrc <> 0.
        "Some appropriate error handling
        "message id sy-msgid type 'I'
        "        number sy-msgno
        "        with sy-msgv1 sy-msgv2
        "             sy-msgv3 sy-msgv4.
        return.
      endif.

      modify (iv_table_name) from table <table>.
      "write: / sy-tabix, ' entries updated'.

    catch cx_root into lx_root.
      "lv_text = lx_root->get_text( ).
      "some appropriate error handling
      return.
  endtry.
endmethod.
This would still require that you make sure that the exported file matches the table that you want to import. However cl_gui_frontend_services=>gui_upload should return sy-subrc > 0 in that case, so you can bail out before you corrupt any data.
Original Answer:
I'll assume that you want to update a z-table and not a SAP standard table.
You will probably have to format your datafile a little bit to make it tab or comma delimited.
You can then upload the data file using cl_gui_frontend_services=>gui_upload
Then if you want to overwrite the existing data in the table you can use
modify zmydbtab from table it_importeddata.
If you do not want to overwrite existing entries, you can use:
insert zmydbtab from table it_importeddata accepting duplicate keys.
You will get a return code of sy-subrc = 4 if any of the keys already exists, but any new entries will be inserted.
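Put together, this amounts to something like the following sketch; ZMYDBTAB and the file path are placeholders, and the file is assumed to be tab-delimited with columns matching the table structure:
data it_importeddata type standard table of zmydbtab.

cl_gui_frontend_services=>gui_upload(
  exporting
    filename            = 'C:\temp\se16_export.txt'   " placeholder path
    has_field_separator = abap_true                   " tab-delimited file
  changing
    data_tab            = it_importeddata ).

" overwrite existing keys and insert new ones:
modify zmydbtab from table it_importeddata.
" ...or keep existing rows and only add new ones (sy-subrc = 4 flags duplicates):
" insert zmydbtab from table it_importeddata accepting duplicate keys.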
Note
There are many reasons why you would NOT do this for an SAP standard table. The most prominent is that there is almost always more to the data model than we are aware of. Also, when creating transactional data there are often follow-on events or workflows that kick off; that will not happen if you're updating the database directly. As a rule of thumb, it is usually a bad idea to update SAP standard tables directly.
In that case try to find a BADI, or if that's not available, record a BDC and do the updates that way.
If the system landscape were set up correctly, your client would not need any kind of Basis operations support whatsoever to perform the transports. So instead of re-inventing the wheel, I'd strongly suggest catching up on what the CTS and TMS can do once they're set up with sensible settings.
