I'm working with a csv file that contains 1 field (out of 10 total fields) with very large strings (50,000+ characters). The other 9 fields contain strings of normal length (<100).
I imported this file into the PowerPivot data model and then copied the data directly from the table view in PowerPivot into Notepad++.
All of the long strings in Notepad++ are truncated to 32,767 characters, which suggests that PowerPivot has the same cell-length limitation as standard Excel.
Is there something I can do in PowerPivot to enable a field to hold more than 32,767 characters, or am I going to have to find another solution?
FYI, the objective is to extract this long string (which is a base64-encoded JPEG) from the CSV and save it as a separate text file (which would then be converted back to a JPEG with PowerShell, via a script I've already developed).
The remainder of the data in the original csv would be saved as a table and combined with some other data from a few other sources to create one table to upload into our Salsify PIM.
I've asked the provider of this csv if it's possible to export the very long strings as individual text files with names that I could relate back to the original dataset (which would solve my problem instantly), but there is resistance. They are insisting on putting everything in one csv.
Note that I do have some experience in Python (and of course PowerShell) and am open to learning tools like PowerAutomate or any other tool that you'd recommend for something like this.
edit: Note that the jpeg files I'm working with range in size from 10KB all the way up to ~16MB, so the base64 string can get very long (in the range of 3.5M characters).
You will have to split an image into multiple rows of 30,000 characters and concatenate them back together in DAX. Images up to about 2.1MB should be supported this way.
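Since you already know Python, another option is to sidestep the 32,767-character cell limit entirely by extracting the base64 field before the data ever touches PowerPivot. A minimal sketch, assuming hypothetical column names `image` and `sku` (substitute whatever your CSV actually uses):

```python
import base64
import csv

# The csv module's default field limit (131,072 chars) is well below the
# ~3.5M-character base64 strings described above, so raise it first.
csv.field_size_limit(2**25)

def extract_images(csv_path, image_field="image", key_field="sku"):
    """Decode the base64 column of each row into a .jpg file named
    after the key column, so images relate back to the dataset."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            with open(f"{row[key_field]}.jpg", "wb") as out:
                out.write(base64.b64decode(row[image_field]))
```

The same loop is a natural place to write the other nine fields out to a slimmed-down CSV that PowerPivot can handle without any truncation.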
I have a FITS file containing numerical values of magnitudes of star clusters. I need to see them as a CSV file. Any code to run in Python to do the trick?
Note: the FITS file is not an image file, it is a table file.
Why not use fdump?
fdump -- Convert the contents of a FITS table to ASCII format.
Reference: https://heasarc.gsfc.nasa.gov/ftools/fhelp/fdump.txt
A 3rd-party program, 'Eclipse Orchestrator', saves its config file in CSV format. Among other things it includes camera exposure times like '1/2000' to indicate a 1/2000 sec exposure. Here is a sample from the CSV file:
FOR,(VAR),0.000,5.000,49.000
TAKEPIC,MAGPRE (VAR),-,00:01:10.0,EOS450D,1/2000,9.0,100,0.000,RAW+FL,,N,Partial 450D
ENDFOR
When the CSV file is loaded into Excel, the screen display reads 'Jan-00', so Excel interprets the string 1/2000 as a date. When the file is saved again as CSV and inspected in an ASCII editor it reads:
FOR,(VAR),0,5,49,,,,,,,,
TAKEPIC,MAGPRE (VAR),-,01:10.0,EOS450D,Jan-00,9,100,0,RAW+FL,,N,Partial 450D
ENDFOR,,,,,,,,,,,,
I had hoped to use Excel to parameterize the data and make it easier to change, but the conversion to fake dates is not helping here.
The conversion at load time affects the saved data format, making the file unreadable for the 'Eclipse Orchestrator' program.
Is there any way to save the day in Excel, or should I just move on and write a program to patch the CSV file?
Thanks,
Gert
If you import the CSV file instead of opening it, you can use the import wizard (Data ribbon > From Text) to define the data type of each column. Select Text for the exposure time and Excel will not attempt to convert it.
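If the import wizard gets tedious, patching the file programmatically really is only a few lines: Python's csv module treats 1/2000 as a plain string and never mangles it. A sketch, assuming the exposure sits in the 6th field of TAKEPIC lines as in the sample above:

```python
import csv

def set_exposure(in_path, out_path, new_exposure):
    """Rewrite the exposure field of every TAKEPIC line, leaving
    everything else (including strings like 1/2000) untouched."""
    with open(in_path, newline="") as f:
        rows = list(csv.reader(f))
    for row in rows:
        if row and row[0] == "TAKEPIC":
            row[5] = new_exposure  # exposure column, per the sample line
    with open(out_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```

The same pattern extends to any other field you want to vary from run to run, with no risk of Excel-style type coercion.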
I have a transaction in SAP - ZHR_TM01 (possibly built by our IT department) that prints the timesheets of our employees that are swiping a card.
I need all this data in Excel format, but the only option I know is to type "PDF!" in the command bar while on the timesheet's print-preview menu, which converts all selected timesheets to PDF. To get the data into Excel I then have to use an Acrobat converter. This is somewhat unprofessional, and working with the sheet becomes very "convert dependent": every time I use this method the conversion differs slightly from previous ones, with inconsistent columns/rows, etc.
What I'm asking is: is there a way to directly retrieve the data in some consistent, readable format, since it is obvious that the data exists?
Is there a command analogous to PDF! that converts to Excel format, or any other?
It will help me big time.
Thanks!!
If the function code PDF! works, the printout is most likely implemented using a Smart Form. In that case, it should be possible to create an alternative download function, e.g. using SALV. I'd recommend contacting the person who originally developed the transaction to get an estimate; I'm not qualified to get into the details of HR...
See if you can convert to a .csv or .txt file. Once you have it in either of those formats you should be able to import them into Excel and delimit the columns with greater accuracy.
I know how to import a CSV file into a Progress database; now I would like to see a sample script to import an XLS file into a Progress database.
Please advise.
There are many different XLS compatible formats. CSV or TAB delimited are two very easy ones to work with. The newer formats are complex, compressed XML archives.
CSV, TAB, and the like are easy. Just use IMPORT DELIMITER.
Example using a semicolon-separated file (normally with extension .csv):
/* Fields in the temp-table match the columns of the .csv file */
DEFINE TEMP-TABLE ttExample NO-UNDO
    FIELD col1 AS CHARACTER
    FIELD col2 AS INTEGER
    FIELD col3 AS DATE.

INPUT FROM "c:\temp\test.csv".
REPEAT:
    CREATE ttExample.
    IMPORT DELIMITER ";" ttExample.
END.
INPUT CLOSE.
There are also some older formats like SYLK. There is an import for that hanging somewhere off the data dictionary tools if I recall.
Excel also deals well with HTML tables. A fragment of HTML consisting of <table> tags is easily imported by Excel.
The newer .xlsx formats have no direct Progress 4GL import available, and you would need to research and specify the details of the format you are using. First you will need to uncompress the archive; then you need to handle the XML inside.
For a comma-separated file the syntax is the same, and you can also import into a list of fields instead of a temp-table: IMPORT DELIMITER "," /* field list here */.
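The "uncompress, then handle the XML" steps can be sketched in plain Python (stdlib only) as a pre-processing step that produces a CSV for IMPORT DELIMITER to consume. This is a sketch under simplifying assumptions: it reads only the first worksheet, resolves shared strings, and ignores gaps left by empty cells.

```python
import csv
import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def xlsx_to_csv(xlsx_path, csv_path):
    """Convert the first worksheet of an .xlsx workbook to CSV,
    using only the standard library."""
    with zipfile.ZipFile(xlsx_path) as z:
        # Text cells usually store an index into the shared-string table.
        shared = []
        if "xl/sharedStrings.xml" in z.namelist():
            sst = ET.fromstring(z.read("xl/sharedStrings.xml"))
            shared = ["".join(t.text or "" for t in si.iter(NS + "t"))
                      for si in sst.iter(NS + "si")]
        sheet = ET.fromstring(z.read("xl/worksheets/sheet1.xml"))
        rows = []
        for row in sheet.iter(NS + "row"):
            cells = []
            for c in row.iter(NS + "c"):
                v = c.find(NS + "v")
                text = v.text if v is not None else ""
                if c.get("t") == "s":  # shared-string cell
                    text = shared[int(text)]
                cells.append(text)
            rows.append(cells)
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```

The resulting .csv can then be loaded with the IMPORT DELIMITER "," pattern shown above.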