SAP: how to check in document files?

I want to create a new document in SAP. Additionally, I have some files that belong to this document, and I want to upload these files to the SAP knowledge base.
I'm using BAPI_DOCUMENT_CREATE2 to create a document info record, or BAPI_DOCUMENT_CHECKIN2 to add files to an existing one. Everything works fine except the file upload/check-in.
I'm using the DOCUMENTFILES table. I add a row for each file; currently I set only three fields:
row["STORAGECATEGORY"] = "DMS_C1_ST";
row["DOCFILE"] = "c:\temp\bom.pdf";
row["WASAPPLICATION"] = "PDF";
BAPI error message:
"Error while checking in and storing c:/temp/bom.pdf"
I set the parameters
PF_FTP_DEST = "SAPFTPA";
PF_HTTP_DEST = "SAPHTTPA";
I have looked in the log (transaction SLG1) and found the following entry:
ERRMSG: Error in opening file "..." for reading (No such file or directory)
V1: SCMS_DOC_CREATE_FILES
V2: 13
It would be great if anybody had an idea and could shed some light on this issue.
Thanks in advance
Thomas

Remember that BAPIs run on the application server and are not allowed to make any assumptions about the client side. This also means that they can't call back to the SAP GUI and upload a file from there. C:\temp\bom.pdf has to be a file on the application server, not on your local machine!

Have you considered using
row["DOCFILE"] = "bom.pdf";
row["DOCPATH"] = "c:\temp\";
Let me know how it goes, or if you have already solved it, please post your solution.
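For reference, a hedged sketch combining both answers above, in the same style as the question's snippet; the server-side directory is purely an example and must exist on your application server:
// Sketch only: "row" is a row of the DOCUMENTFILES table, as in the question.
row["STORAGECATEGORY"] = "DMS_C1_ST";   // storage category in the knowledge base
row["DOCFILE"] = "bom.pdf";             // file name only
row["DOCPATH"] = "/usr/sap/tmp/";       // example directory on the application server, not the client PC
row["WASAPPLICATION"] = "PDF";          // workstation application type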

Related

How to work with local tmp files in a Node.js API?

I am currently making a little API which returns JSON according to an input. This API needs to run some local programs on the server and also needs to place some temporary files. It all works fine as long as the API is called once at a time by a single "user".
The problem is that I only have one temp folder to store the temporary files, so whenever there are multiple concurrent API queries, the tmp folder gets messed up: the data in there mixes between runs.
What would be a good way to have an API using temp files, and still keep it working when every run needs its own temp data?
The current process is:
server/api/getGeometry?lat=52.5167776&lon=13.4092091&bboxSize=2000&output=glb
server queries zips according to lat/lon
unpacks
converts
does voodoo
generates json
sends back the json
cleans up the tmp folder
next run same story...
So it's the same tmp folder every time; I guess that's not the way to go.
Thanks a lot for ideas!
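A common pattern, sketched here under the assumption that the pipeline can be pointed at an arbitrary directory: give every request its own temp directory via fs.mkdtemp and remove it afterwards. generateJson is a placeholder stub standing in for the unpack/convert/voodoo steps above.
const fs = require('fs/promises');
const os = require('os');
const path = require('path');

async function generateJson(workDir, params) {
  // placeholder for: query zips, unpack, convert, do voodoo, build JSON
  return { ok: true, params };
}

async function handleRequest(params) {
  // every call gets its own unique directory under the system tmp dir
  const workDir = await fs.mkdtemp(path.join(os.tmpdir(), 'geometry-'));
  try {
    // all temporary files for this run live inside workDir only
    return await generateJson(workDir, params);
  } finally {
    // remove only this request's directory; parallel runs are unaffected
    await fs.rm(workDir, { recursive: true, force: true });
  }
}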

SSIS - Power Query Source: setting connection at runtime

I'm trying to use the Power Query source component in a generic way from SSIS (VS2019).
The idea would be to use a for each loop to load and transform Excel files. At run time, I need to set the connection manager properties for each file as well as the PQY script to be executed on the file.
What I have done so far is try to create a JSON connection string inside a script component and assign it to the connection manager. It keeps saying that the file requires credentials.
Has anyone already experienced this kind of development? All the files have the same structure so far; does the metadata need to be refreshed too?
[Edit]
1. In the control flow, I'm retrieving the PQY script I want to apply from a DB.
Before the transformations, the script starts like this:
let Source = Excel.Workbook(File.Contents("path_to_a_file.xlsx"),null,true), RawData_Sheet = Source{[Item="Table1",Kind="Table"]}[Data] ...
2. In the C# script task, I'm replacing the path to the Excel file with the current file variable. The M script is stored in a variable used by the PQY component.
3. The C# script then updates the PQY connection manager to target the appropriate file:
ConnectionManager _conn = Dts.Connections["Power Query Connection Manager"];
String _ConnectString = "[{kind:File,path:path_to_a_file.xlss,AuthenticationKind:Windows,Username:myusername,Password:mypassword}]";
_conn.ConnectionString = _ConnectString;
The PQY component is left as it is, connected to the "Power Query Connection Manager" and getting its script from the variable I set.
(Screenshot: PQY configuration screen)
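A hedged sketch of the two script-task steps described above (the variable names User::PqyScript and User::CurrentFile are assumptions, not from the post):
// Replace the file path inside the M script retrieved from the DB.
string mScript = Dts.Variables["User::PqyScript"].Value.ToString();
string currentFile = Dts.Variables["User::CurrentFile"].Value.ToString();
mScript = mScript.Replace("path_to_a_file.xlsx", currentFile);
Dts.Variables["User::PqyScript"].Value = mScript;

// Re-point the shared connection manager at the current file
// (connection string format as posted in the question).
ConnectionManager conn = Dts.Connections["Power Query Connection Manager"];
conn.ConnectionString = "[{kind:File,path:" + currentFile + ",AuthenticationKind:Windows,Username:myusername,Password:mypassword}]";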
Thanks for any tip on this,
Olivier
I can't address the specifics of the Power Query component, but "generic anything" in a Data Flow will not work.
The Data Flow task works because it makes a strict contract between the source(s) and the destination(s). These columns with these data types will be in play during the run. It's a design-time contract because that allows the run-time engine to allocate resources based on how many buffers of data the system can support. Each row is X bytes, we have Y bytes of memory available, so Z buffers worth of data plus parallelism stuff.
Wish I had a better story to tell you.

Update ListObject in Excel --> Error Access Database Blocked

I have an Access database stored as a normal file on a shared drive. Some tables from this database are available for users to read through Excel files with ListObjects.
These ListObjects need to be refreshed regularly to be up to date. Currently I am doing that manually.
I want to do this automatically by running VBA code in the Access database that opens each file and refreshes all the ListObjects in each file.
However, if I run the ListObject.Refresh method I get an error message that the database is blocked. If I do the refresh manually it works perfectly. In both cases I have the database open, so that can't be the problem.
This is the error message:
The database has been placed in a state by user ("here is my user ID") that
prevents it from being opened or locked. (Error 3733)
Thank you for any suggestions.
Here is the part of the code:
Set ExlFile = app.Workbooks.Add(myWB)   ' open a workbook based on the file
For Each mySheet In ExlFile.Worksheets
    For Each myLO In mySheet.ListObjects
        myLO.Refresh                    ' this call raises error 3733
    Next myLO
Next mySheet
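Not from the original post, but a variant worth trying after the loop above: close each workbook explicitly so its connections to the database are released before the next file is opened:
ExlFile.Close SaveChanges:=True   ' release this file's connections to the database
Set ExlFile = Nothing
DoEvents                          ' give Excel a chance to release the locks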
UPDATE:
If I use Compact and Repair on the database, it works for 3-5 files and then it stops working again.
Does anyone know what Application.CompactRepair does? I have a very bad solution for my problem now: I added Application.CompactRepair to each iteration. Now it works, but of course it takes very long. It seems to me that Application.CompactRepair somehow resets some status in the DB, and it would be a better solution to reset that status directly.

FTP.retrbinary fails

I'm a complete Python novice, so I apologize if the solution to my problem seems obvious. I'm having difficulty with some relatively simple code that I've written. I've scanned several related questions that have already been posted, but I don't see where my code differs in any meaningful way from the solutions suggested.
I'm trying to write a program that will:
Establish an FTP connection to a remote server.
Change the working directory on the FTP server.
Retrieve a list of files in the working directory from the FTP server.
Find a file ending with a specific suffix from the retrieved list of files.
Retrieve the found file to a temporary directory (created by tempfile.mkdtemp()) on the user's local file system.
Steps 1 through 4 are working as expected. Sadly, the last step is falling into my except clause.
Can anyone make a suggestion regarding what might be wrong with the following line of code?
ftp.retrbinary('RETR ' + file, open(opsys.path.join(localTempDir, fileName)).write)
Your suggestions are greatly appreciated. Thanks in advance.
Possible problems:
- type(file) = incorrect value
- opsys.path.join(localTempDir, fileName) = incorrect value #nonexistent file
That's all that comes to mind looking at the presented line of code =)
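One further point beyond the answer above: open() is called without a mode, so it defaults to 'r' (read), and opening a local file that does not exist yet raises FileNotFoundError before the transfer even starts. Note also that the line mixes file and fileName. A self-contained sketch of steps 1-5 with 'wb' mode (host, directory, and suffix are hypothetical):
import ftplib
import os
import tempfile

localTempDir = tempfile.mkdtemp()                # step 5's target directory
ftp = ftplib.FTP('ftp.example.com')              # step 1: hypothetical host
ftp.login()                                      # anonymous login for the example
ftp.cwd('pub')                                   # step 2: hypothetical directory
names = ftp.nlst()                               # step 3: list the working directory
fileName = next(n for n in names if n.endswith('.csv'))  # step 4: hypothetical suffix
localPath = os.path.join(localTempDir, fileName)
with open(localPath, 'wb') as fh:                # 'wb': create the file for binary writing
    ftp.retrbinary('RETR ' + fileName, fh.write) # step 5: retrieve into the temp dir
ftp.quit()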

Problems came up in the following areas during load: Table

I have generated an Excel file from XML, but I cannot open it with Excel. Excel gives the following error when opening it:
Problems came up in the following areas during load:
Table
Then it shows a message that the log file corresponding to the error can be found at: C:/Documents and Settings/myUserName/Local Settings/Temporary Internet Files/Content.MSO/xxxxx.log
But I cannot find the Content.MSO folder in Windows. I checked the folder settings and made all folders visible, but I still cannot access this folder, so I cannot analyse the log file.
How can I find the generated log file?
I found the problem without analysing the log file. I still cannot access the log file in Temporary Internet Files, but I realised that I had put string (non-number) characters in a number-styled cell in the Excel XML. So if you are having similar issues with an Excel file generated from XML, check whether your cell values are appropriate for your cell data types.
If you type or paste the path of the log file into Explorer or your text editor of choice, you may find that the folder does exist, despite being invisible.
In my case it was a <Row> with an incorrect ss:Index
I was using a template and the last row had a fixed Index=100. If the number of rows I added exceeded 100, this last row had a wrong index and Excel threw the error without any other message or log (Mac OS X, Excel 15.25.1). I wish they printed more informative error messages; what a waste of our time.
Excel 2016: my error message was "Worksheet Settings". The path was pointing to a non-existent file.
In my case the cause of the problem was ExpandedRowCount not being big enough for the number of rows in the Worksheet. If you add rows to the XML directly (i.e. on a machine where Excel is not installed), make sure to increase ExpandedRowCount accordingly.
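For illustration, in the XML Spreadsheet 2003 format these files use, the attribute sits on the Table element; a minimal hedged fragment with room for three rows:
<Table ss:ExpandedColumnCount="2" ss:ExpandedRowCount="3">
  <Row>
    <Cell><Data ss:Type="String">a</Data></Cell>
    <Cell><Data ss:Type="Number">1</Data></Cell>
  </Row>
  <!-- only two more rows fit before ExpandedRowCount must be increased -->
</Table>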
Yes, I faced the same problem too, and the problem was with the data type of the cells of the Excel file generated using XSLT.
In addition to checking the data being used vs "Type" assigned, make sure that the list of characters that need to be encoded for XML are indeed encoded.
I had a system that appeared to be working, but then some user data including & and < was throwing this error.
If you're not sure what's going on with your file, try http://www.xmlvalidation.com/ - that helped me spot the issue in a large file immediately.
I used this function to fix it, modified from this post:
function xmlsafe($s) {
    return str_replace(array('&','>','<','"'), array('&amp;','&gt;','&lt;','&quot;'), $s);
}
and then run echo xmlsafe($myvalue) where you were just echoing $myvalue in your script.
This seems to be more appropriate for XML than htmlentities() or other options built into PHP.
I had the same issue, and the answer was that the type of the cell was Number and some values didn't convert to this type on my backend.
I had the SAME problem, and it's because the file is TOO BIG.
I tried an extract from SAP, smaller than the one that caused the error, and saved it as an XML file, and it WORKED: no more error.
So maybe if you can save the data in 2 XML Excel files instead of 1, it will be fine ;)
Alicia
