Can I import SAP tables that were exported by SE16?

I have exported the contents of a table with transaction SE16 by selecting all the entries and choosing Download, unconverted.
I'd like to import these entries into another system (where the same table exists and is active).
Furthermore, when I import, there's a possibility that the key already exists for a number of entries (old entries).
Other entries won't have a matching key in the table they're imported into (new entries).
Is there a way to easily update my table in the second system with the file produced by the first system? If needed, I can export the data in the three other formats (Spreadsheet, Rich text format and HTML format). It seems to me, though, that the spreadsheet and rich text formats sometimes corrupt the data, and the HTML is far too verbose.
[EDIT]
As per popular demand, the table I'm trying to export/import is a Z table whose fields are all numeric, character, date or time fields (flat data types).
I'm trying to do it like this because the clients don't have any Basis resource to help them with transports, and would like to more or less automate the process of updating one of the tables in one system.
At the moment it's a business request to do it like this, but I'm open to suggestions (and the clients are open too).

Edit
OK, I doubt that what you describe in your comment exists out of the box, but you can easily write something like it yourself:
Create a method (or a function module, if that floats your boat) that accepts the following:
iv_table_name TYPE string
iv_filename TYPE string
This would be the method:
method upload_table.

  data: lt_table type ref to data,
        lx_root  type ref to cx_root.

  field-symbols: <table> type standard table.

  try.
      " Create an internal table typed dynamically after the DDIC table
      create data lt_table type table of (iv_table_name).
      assign lt_table->* to <table>.

      call method cl_gui_frontend_services=>gui_upload
        exporting
          filename            = iv_filename
          has_field_separator = abap_true
        changing
          data_tab            = <table>
        exceptions
          others              = 4.
      if sy-subrc <> 0.
        " Some appropriate error handling
        " message id sy-msgid type 'I'
        "   number sy-msgno
        "   with sy-msgv1 sy-msgv2
        "        sy-msgv3 sy-msgv4.
        return.
      endif.

      " Dynamic MODIFY on the table whose name was passed in
      modify (iv_table_name) from table <table>.
      " write: / sy-dbcnt, ' entries updated'.

    catch cx_root into lx_root.
      " lv_text = lx_root->get_text( ).
      " some appropriate error handling
      return.
  endtry.

endmethod.
This would still require you to make sure that the exported file matches the table that you want to import. However, cl_gui_frontend_services=>gui_upload should set sy-subrc > 0 in that case, so you can bail out before you corrupt any data.
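For illustration, a minimal report wrapping this could look like the following sketch. The class name zcl_table_upload is hypothetical, and I'm assuming upload_table is a static method of it:

report ztable_upload.

parameters: p_table type tabname obligatory,
            p_file  type localfile obligatory.

data: lv_table type string,
      lv_file  type string.

start-of-selection.
  lv_table = p_table.
  lv_file  = p_file.
  " zcl_table_upload is a hypothetical class containing the
  " upload_table method shown above.
  zcl_table_upload=>upload_table( iv_table_name = lv_table
                                  iv_filename   = lv_file ).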
Original Answer:
I'll assume that you want to update a z-table and not a SAP standard table.
You will probably have to format your data file a little bit to make it tab- or comma-delimited.
You can then upload the data file using cl_gui_frontend_services=>gui_upload
Then, if you want to overwrite the existing data in the table, you can use
modify zmydbtab from table it_importeddata.
If you do not want to overwrite existing entries, you can use
insert zmydbtab from table it_importeddata accepting duplicate keys.
(Without ACCEPTING DUPLICATE KEYS, a duplicate key would cause a runtime error.) You will get a return code of sy-subrc = 4 if any of the keys already exist, but the new entries will still be inserted.
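In code, a hedged sketch of that check (table and variable names as above):

insert zmydbtab from table it_importeddata accepting duplicate keys.
if sy-subrc = 4.
  " at least one key already existed; those rows were skipped,
  " all other rows were inserted
endif.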
Note
There are many reasons why you would NOT do this for an SAP standard table. The most prominent is that there is almost always more to the data model than we are aware of. Also, when transactional data is created, there are often follow-on events or workflows that kick off, which will not happen if you update the database directly. As a rule of thumb, it is usually a bad idea to update SAP standard tables directly.
In that case, try to find a BAdI, or if none is available, record a BDC and do the updates that way.

If the system landscape was set up correctly, your client would not need any kind of Basis operations support whatsoever to perform the transports. So instead of re-inventing the wheel, I'd strongly suggest catching up on what CTS and TMS can do once they're set up with sensible settings.

Related

Skip element in BizTalk flat file assembly?

I've been tasked to map an input XML (actually an SAP IDoc XML) and to generate a number of flat files. Each input XML may yield multiple output files (one output file per lot number), so I will be using xsl:key and the key() function in my mapping, based on the lot number.
The thing is, the lot number will not be in the file content itself, but the output file name needs to contain that lot number value.
So the question really is: can I map the lot number into the XML and have the flat file assembler skip it when it produces the file? Or is there another way the lot number can be applied as the file name by the assembler without having it inside the file itself?
In your orchestration you can set a context property for each output message:
msgOutput(FILE.ReceivedFileName) = "DynamicStuff";
msgOutput then goes to the send shape.
In your send port you set the output file like this:
FixedStuff_%SourceFileName%.xml
The result:
FixedStuff_DynamicStuff.xml
If the value is not required in the message content, don't map it. That's it.
To insert a value in the file name, the lot number in this case, you will need to promote that value to the FILE.ReceivedFileName context property. Then you can use the %SourceFileName% macro as part of the name setting in the send port. You can set FILE.ReceivedFileName by either property promotion or xpath() in an orchestration.
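For illustration, a hedged sketch of the xpath() route in a Message Assignment shape; msgInput, msgOutput and the LotNumber element are placeholders for your actual messages and IDoc structure:

// Sketch only: adjust the XPath to wherever the lot number
// actually lives in the IDoc.
msgOutput = msgInput;
msgOutput(FILE.ReceivedFileName) = xpath(msgInput, "string(//*[local-name()='LotNumber'])");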
Bonus: sorting and grouping in XSLT is rather unwieldy, which is why I don't do that anymore. Instead, you can use SQL: BizTalk: Sorting and Grouping Flat File Data In SQL Instead of XSL

2 Sequential Transactions, setting Detail Number (Revit API / Python)

Currently, I made a tool to rename view numbers (“Detail Number”) on a sheet based on their location on the sheet. Where this is breaking is the transactions. I'm trying to do two transactions sequentially in Revit Python Shell. I also did this originally in Dynamo, and that had a similar failure, so I know it's something to do with transactions.
Transaction #1: Add a suffix (“-x”) to each detail number to ensure the new numbers won’t conflict (1 will be 1-x, 4 will be 4-x, etc)
Transaction #2: Change detail numbers with calculated new number based on viewport location (1-x will be 3, 4-x will be 2, etc)
Better visual explanation here: https://www.docdroid.net/EP1K9Di/161115-viewport-diagram-.pdf.html
Py File here: http://pastebin.com/7PyWA0gV
Attached is the Python file, but essentially what I'm trying to do is:
# <---- Make unique numbers
t = Transaction(doc, 'Rename Detail Numbers')
t.Start()
for i, viewport in enumerate(viewports):
    setParam(viewport, "Detail Number", getParam(viewport, "Detail Number") + "x")
t.Commit()

# <---- Do the thang
t2 = Transaction(doc, 'Rename Detail Numbers')
t2.Start()
for i, viewport in enumerate(viewports):
    setParam(viewport, "Detail Number", detailViewNumberData[i])
t2.Commit()
As I explained in my answer to your comment in the Revit API discussion forum, the behaviour you describe may well be caused by a need to regenerate between the transactions. The first modification does something, and the model needs to be regenerated before the modifications take full effect and are reflected in the parameter values that you query in the second transaction. You are accessing stale data. The Building Coder provides all the nitty gritty details and numerous examples on the need to regenerate.
Summary of this entire thread including both problems addressed:
http://thebuildingcoder.typepad.com/blog/2016/12/need-for-regen-and-parameter-display-name-confusion.html
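A minimal sketch of what that suggestion amounts to, assuming the same doc, viewports, setParam and getParam as in the snippet above (Document.Regenerate must be called while a transaction is open, so both passes run inside one transaction here):

# Sketch only: force a regeneration between the two passes so the
# second pass reads fresh parameter values.
t = Transaction(doc, 'Rename Detail Numbers')
t.Start()
for viewport in viewports:
    setParam(viewport, "Detail Number", getParam(viewport, "Detail Number") + "x")
doc.Regenerate()  # flush the first batch of changes
for i, viewport in enumerate(viewports):
    setParam(viewport, "Detail Number", detailViewNumberData[i])
t.Commit()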
So this issue actually had nothing to do with transactions or doc regeneration. I discovered (with some help :) ) that the problem lay in how I was setting/getting the parameter. "Detail Number", like a lot of parameters, has duplicate versions that share the same descriptive parameter name in a viewport element.
Apparently the reason for this might be legacy issues, though I'm not sure. Thus, when I was trying to get/set the detail number, it was occasionally grabbing the incorrect read-only parameter, the one whose built-in enumeration is "VIEWER_DETAIL_NUMBER". The correct one is "VIEWPORT_DETAIL_NUMBER". This was happening because I was trying to get the parameter just by passing the descriptive name "Detail Number". Revising how I get/set parameters via the built-in enum resolved this issue. See images below.
Please see pdf for visual explanation: https://www.docdroid.net/WbAHBGj/161206-detail-number.pdf.html
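For reference, a hedged sketch of fetching the parameter by its built-in enumeration instead of by display name (viewport as in the code above):

from Autodesk.Revit.DB import BuiltInParameter

# Sketch only: get_Parameter with the built-in enum avoids grabbing
# the read-only VIEWER_DETAIL_NUMBER twin that shares the name.
param = viewport.get_Parameter(BuiltInParameter.VIEWPORT_DETAIL_NUMBER)
if param is not None and not param.IsReadOnly:
    param.Set("3")  # must run inside an open transaction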

Netsuite - Transfer Inventory error

I have been using NetSuite for only a short time, and already hate it. I am sorry if this is a stupid question, but I haven't been able to find an answer so far, either in the NetSuite docs, Stack Overflow or other websites. In fact, the answers I found have resulted in an error.
My company requires a script to transfer inventory based on an EDI input file. Reading the file is no problem, even parsing it is working. However, actually inserting the data is proving problematic.
I have been able to insert normal records, but Inventory Transfer records are giving me problems.
From Stack Overflow I found and adapted some code into the following:
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);
nlapiSelectNewLineItem('invt');
nlapiSetLineItemValue("invt","invtid",1, 189);
nlapiSetLineItemValue("invt","adjustqtyby", 1, "5");
nlapiCommitLineItem('invt');
var id = nlapiSubmitRecord(xfer);
The FormatDate function just converts the date from the text file into a date NetSuite can understand.
However, when I run this code I get the following error:
USER_ERROR: You must enter at least one line item for this transaction.
I thought inserting the line item was the reason to use nlapiSelectNewLineItem, but I guess not. Also nlapiCreateNewLineItem doesn't seem to exist.
The values I am inserting are all just test data, as I'm testing this in the debugger. Location 9 exists, as does item 189.
My full script finds these IDs based on string values from the text files. But since this is the section that doesn't work, I have set it apart to test.
Can anyone help with this?
You did not specify the type of script you are using, but it looks like you are not setting the line values on the record object; instead you are setting them on the current record. Below is the suggested code.
Also, there is no sublist named invt; it should be inventory. Likewise, there is no field invtid; most probably you want to set the item, whose field ID is item. You might want to refer to the SuiteScript Records Browser for help with the correct IDs.
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);
xfer.selectNewLineItem('inventory');
xfer.setCurrentLineItemValue("inventory", "item", 189);
xfer.setCurrentLineItemValue("inventory","adjustqtyby", "5");
xfer.commitLineItem('inventory');
var id = nlapiSubmitRecord(xfer);
If you are using Bin/Lot Numbered Items, please see help topic "Sample Scripts for Advanced Bin / Numbered Inventory Management"

Oracle DB View -> Copying View to Excel using VBScript

English isn't my native tongue, but I hope I can explain my problem sufficiently.
I made a view in the Oracle DB which contains only the data I need.
Using SQL in my VBScript file, I select the View by using:
"SELECT * FROM TEST_1234"
I have selected the complete view now; that works fine.
Now I need to 'export' or copy the complete View to Excel using VBScript (via UFT [Unified Functional Testing]).
1. Is there an easy way to just copy the whole thing at once, or at least complete rows or columns?
2. If 1. doesn't work, can I just iterate through the rows and columns using two loops and copy the data from every field to the respective field in Excel?
It would be nice to be able to copy the data without using the column names of a recordset (is there a way to use numeric indexes until the end of the columns?), because there is a very large number of columns to be copied and the column names are subject to change.
Thanks for any help!
From a programmer==code writer's point of view, the most attractive solution is your very first approach (copy the whole thing with just one SQL statement). Depending on the providers' capabilities, this statement could look like
INSERT INTO [DstTable] SELECT * FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
or
SELECT * INTO [DstTable] FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
Look here for a working solution that couldn't be simpler; but I admit that a DSN-less connection to the destination database looks more complicated, and your drivers may have other incantations to refer to the external database. Furthermore, your pair of providers may not support an external connection from the source to the destination, and the dirty trick of using the Access OLEDB driver (which came/still comes? with ADO) to connect to both databases externally may not work for you. In all, it's certainly not easy to get "INSERT/SELECT INTO External Database" right. [Look at my (just downvoted) answer to see that people despair and fall back on (and upvote) code that uses single-item copy loops.] In your case, you'll have to research whether at least one of the Oracle providers available to you supports external connections to Excel (or vice versa).
From a programmer==hacker's point of view (let's get the job done with minimal fuss), an easy solution could be to export the views/tables to .csv (I looked at this and was disappointed, but you may know much better) and to import them into Excel (just load the .csv and save as .xls).
If you can't/won't use the file system, you could go through memory: use GetRows to get the data into a two-dimensional array and assign that to the desired Excel range (see the sketch below).
If all of the above fails and you need to assign single cells while looping over the recordset's rows and columns, remember that the Fields collection gives you access not only to the data but to the meta-info (number of columns, column names, types, ...) too.
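For what it's worth, a hedged sketch of the GetRows route; the connection string, view name and output file name are placeholders:

' Sketch only: pull the recordset into a 2-D array and paste it into
' Excel in one assignment. GetRows returns arr(column, row), so the
' array is transposed before it goes into the worksheet range.
Dim conn, rec, arr, nRows, nCols, xl, wb
Set conn = CreateObject("ADODB.Connection")
conn.Open "[your DB-connection]"
Set rec = CreateObject("ADODB.Recordset")
rec.Open "SELECT * FROM TEST_1234", conn
arr = rec.GetRows()
nCols = UBound(arr, 1) + 1
nRows = UBound(arr, 2) + 1
rec.Close
conn.Close
Set xl = CreateObject("Excel.Application")
Set wb = xl.Workbooks.Add()
wb.Sheets(1).Range(wb.Sheets(1).Cells(1, 1), wb.Sheets(1).Cells(nRows, nCols)).Value = _
    xl.WorksheetFunction.Transpose(arr)
wb.SaveAs "C:\test_getrows.xlsx"
wb.Close
xl.Quit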
Thanks for the help and the links you provided, Ekkehard and Bond! After reading them and trying a lot, I got a very simple solution.
Here's some working code, if anybody else faces the same or a similar problem:
Option Explicit
Dim conn, rec, xlStat, xlStatW, dbCnnStr, SQLSec, statArt
Set conn = Createobject("ADODB.Connection")
Set rec = CreateObject("ADODB.Recordset")
Set xlStat = CreateObject("Excel.Application")
dbCnnStr = "[your DB-connection]"
conn.open dbCnnStr
'Start Excel XXX
Set xlStatW = xlStat.Workbooks.Add()
xlStatW.Sheets(1).Name = "AAA_123"
xlStatW.Sheets(2).Name = "BBB_123"
xlStatW.Sheets(3).Name = "CCC_123"
SQLSec = "SELECT * FROM XXX_123"
rec.open SQLSec,conn
xlStatW.Sheets(1).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM YYY_123"
rec.open SQLSec,conn
xlStatW.Sheets(2).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM ZZZ_123"
rec.open SQLSec,conn
xlStatW.Sheets(3).cells(2,1).CopyFromRecordset rec
rec.Close
xlStatW.SaveAs ("C:\test.xlsx")
xlStatW.Close
'End Excel XXX
conn.Close

Not using colnames when reading .xls files with RODBC

I have another puzzling problem.
I need to read .xls files with RODBC. Basically, I need a matrix of all the cells in one sheet, and then use greps and strsplits etc. to get the data out. As each sheet contains multiple tables in different order, and some text fields with other options in between, I need something that functions like readLines(), but for Excel sheets. I believe RODBC is the best way to do that.
The core of my code is following function :
.read.info.default <- function(file, sheet){
  fc <- odbcConnectExcel(file) # file connection
  tryCatch({
      x <- sqlFetch(fc,
                    sqtable = sheet,
                    as.is = TRUE,
                    colnames = FALSE,
                    rownames = FALSE
      )
    },
    error = function(e) {stop(e)},
    finally = close(fc)
  )
  return(x)
}
Yet, whatever I tried, it always takes the first row of the mentioned sheet as the variable names of the returned data frame. No clue how to get that solved. According to the documentation, colnames=FALSE should prevent that.
I'd like to avoid the xlsReadWrite package. Edit: and the gdata package. The client doesn't have Perl on the system and won't install it.
Edit:
I gave up and went with read.xls() from the xlsReadWrite package. Apart from the name problem, it turned out RODBC can't really read cells with special characters like slashes. A date in the format "dd/mm/yyyy" just gave NA.
Looking at the source code of sqlFetch, sqlQuery and sqlGetResults, I realized the problem is more than likely in the drivers. Somehow the first line of the sheet is seen as column metadata instead of ordinary cells. So instead of colnames, they're the equivalent of DB field names. And that's an option you can't set...
Can you use the Perl-based solution in the gdata package instead? That happens to be portable too...
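A hedged sketch of that route (gdata::read.xls shells out to Perl, so a Perl installation is required; the file name is a placeholder):

# Sketch only: read the raw cells of sheet 1 without treating the
# first row as column names; extra arguments are passed on to read.csv.
library(gdata)
x <- read.xls("myfile.xls", sheet = 1, header = FALSE,
              stringsAsFactors = FALSE)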
