I have some data in an Excel file, and I need to compute significance values, which is not possible in Excel but is possible with PSPP. However, when I import the Excel file into PSPP (after converting it to a CSV file), I run into a lot of problems, especially with the variable names.
Could anyone please tell me some easy solutions?
Excel to PSPP: Make sure your Excel file is prepared:
Red boxes in the import process of the CSV file indicate problems in the rows, such as text appearing in what otherwise looks like a numeric variable.
The variable names must not be too long, so fix that first; PSPP behaves like a very old version of SPSS in this respect.
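If there are many columns, a small script can shorten and sanitize the header row before import. This is only a minimal Python sketch under assumptions I'm making (the 8-character limit mirrors old SPSS conventions, and the file names are placeholders):

```python
import csv

# Shorten and sanitize CSV header names before importing into PSPP.
# Assumes an old-SPSS-style limit of 8 characters; adjust if your
# PSPP version allows longer names.
MAX_LEN = 8

with open("data.csv", newline="") as src, \
     open("data_fixed.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    header = next(reader)
    seen = set()
    fixed = []
    for name in header:
        # Keep only letters, digits, and underscores, then truncate.
        clean = "".join(c for c in name if c.isalnum() or c == "_")[:MAX_LEN] or "var"
        # De-duplicate names that collide after truncation.
        base, i = clean, 1
        while clean.lower() in seen:
            suffix = str(i)
            clean = base[:MAX_LEN - len(suffix)] + suffix
            i += 1
        seen.add(clean.lower())
        fixed.append(clean)
    writer.writerow(fixed)
    writer.writerows(reader)
```

Run it once, then import data_fixed.csv instead of the original file.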
Then look carefully at the steps when importing.
Choose the second row to be the first (as the first row contains variable names, not a case).
Then click the box for choosing the top row as variable names.
The initial procedure is less smooth than in SPSS, but it is fantastic to have a free alternative to SPSS.
Once the dataset was ready, I found it worked well for the same analyses as in SPSS.
I suppose your problem is solved a long time ago, but maybe someone else may benefit from this.
A site for sharing information about PSPP would be great...
One should first convert the Excel file into CSV (for example with Apple's Numbers on a Mac) and then import the converted file into PSPP... easy.
Related
I collect various data in time plots. If I copy the time plot data and then paste it into Excel, the number format is often wrong. For example, I often get a date like Aug 94 instead of the actual number from the time plot. Unfortunately, I can't simply reformat this date into a number either, since the formatted number does not match the actual number from the time plot. If I format the date in the same format as the numbers above and below it, I get the number 34547, but this number does not correspond to the actual number from the time plot. Does anyone know how I can prevent this problem?
You can only solve this on the Excel side; AnyLogic provides the raw data for you, and Excel then interprets it. You can test this by pasting the chart's raw data into a txt or csv file.
So either fix your Excel settings or paste into a csv, then into an xlsx.
Or better still: Do not manually paste at all. Instead, write your model results into the AnyLogic database and export to Excel from there: this takes away a lot of the pain for you. Check the example models to learn how to do that.
This is not an AnyLogic question but rather an Excel and computer formatting problem. One way of resolving it is to change the computer's date and time settings.
Another way is to save your output as a txt file in AnyLogic. Replace all periods (.) with commas (,). Then open an empty Excel sheet, select the Text format for the columns, and copy-paste from the txt file.
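If the search-and-replace step is tedious by hand, a tiny script can do the period-to-comma swap. A minimal sketch, assuming placeholder file names and that every period in the output is a decimal separator:

```python
# Swap decimal periods for commas in the AnyLogic txt output so a
# comma-decimal Excel locale reads the values as numbers, not dates.
with open("output.txt", encoding="utf-8") as f:
    text = f.read()

with open("output_comma.txt", "w", encoding="utf-8") as f:
    f.write(text.replace(".", ","))
```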
In Excel there are a few options:
Option 1: When you paste, use the "paste as text only" option. But this does not always work, as Excel will still try to format the data for you.
Option 2: Use the Paste Special option and then choose text. It is also possible this will not work, depending on your Excel settings.
Option 3: Paste using the text import wizard. (This works for me without fail.) On step 2, choose tab delimited. On step 3, choose Column format as text for every column (you need to select them in the little diagram below).
You will then see the data exactly as it came from AnyLogic. See the example below, where I purposefully imported some text containing something that Excel will think is a date. You will now be able to see what in your data made Excel think it needed the formatting it applied, and then you can fix it. (Post a new question if you struggle with this conversion.)
But as noted by other answers, first prize is to write all the important data to external files. Still, I know that even I sometimes want to export data from a chart and review it in Excel. Option 3 works for me every time.
The format of our member numbers has changed several times over the years, such that 00008, 9538, 746, 0746, 00746, 100125, and various other permutations are valid, unique, and need to be retained. Exporting from our database into the custom Excel template needed for a mass update strips the leading zeros, so that 00746 and 0746 are both truncated to 746.
The apostrophe trick, or formatting as text, does not work in our case, since the data seems to be already altered by the time we open it in Excel. Formatting as zip won't work, since we have valid numbers less than five digits long that cannot have zeros added to them. And I am not having any luck with "custom" formatting, as that seems to require either adding the same number of leading zeros to a number, or adding enough zeros to every number to make them all the same length.
Any clues? I wish there was some way to set Excel to just take what it's given and leave it alone, but that does not seem to be the case! I would appreciate any suggestions or advice. Thank you all very much in advance!
UPDATE - thanks everybody for your help! Here are some more specifics. We are using a 3rd-party membership management app -- we cannot access the database directly, so we need to use their "query builder" tool to get the data we want to mass update. Then we export using their "template" format, which is called XLSX, but there must be something going on behind the scenes, because if we try to import a regular old Excel file, we get an error. Only their template works.
The data is formatted okay in the database, because all of the numbers show correctly in the web-based management tool. Also, if I export to CSV, save it as a .txt and import it into Excel, the numbers show fine.
What I have done is similar to ooo's explanation below -- I exported the template with the incorrect numbers, then exported as CSV/txt, and copied / pasted THOSE numbers into the template and re-imported. I did not get an error, which is something I guess, but I will not be able to find out if it was successful until after midnight! :-(
Assuming the data is not corrupt in the database, try to export from the database to a CSV or text file.
The following can then be done to ensure the import is formatted correctly
Text file with comma delimiter:
In Excel, choose Data > From Text, select Delimited, then click Next.
In step 3 of the import wizard, highlight each column/field you want as text and select Text.
The data should then be placed as text and retain leading zeros.
Again, all of this assumes the database contains non-corrupt data and you are able to export a simple text or csv file. It also assumes you have Excel 2010 but it can be done with minor variation across all versions.
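If you end up processing the file in a script rather than through the wizard, forcing every column to be read as text has the same effect. A minimal sketch using pandas (an assumption on my part that Python and pandas are available; the file names are placeholders, and to_excel needs the openpyxl package):

```python
import pandas as pd

# dtype=str makes pandas keep every column as text, so leading
# zeros like 00746 survive instead of becoming the number 746.
members = pd.read_csv("members.csv", dtype=str)
members.to_excel("members.xlsx", index=False)
```

Because the values are written as strings, Excel displays them as text and keeps the zeros.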
Hopefully, #ooo's answer works for you. I'm providing another answer mainly for informational purposes, and don't feel like dealing with the constraints on comments.
One thing to understand is that Excel is very aggressive about treating "numeric-looking" data as actual numbers. If you were to open the CSV by double-clicking and letting Excel do its thing (rather than using ooo's careful procedure), those numbers would still have come up as numbers (no leading zeros). As you've found, one way to counteract this is to append clearly nonnumeric characters onto your data (before Excel gets its grubby hands on it), to really convince Excel that what it's dealing with is text.
Now, if the thing that uploads to their software is a file ending in .xlsx, then most likely it is the current Excel format (a compressed XML document, used by Excel 2007 and later). I suppose by "regular old Excel" you mean .xls (which still works with the newer Excels in "compatibility mode").
So in case what you've tried so far doesn't work, there are still avenues to explore before resorting to appending characters to the end of your data. (I'll update this answer as needed.)
You're on the right track with the apostrophe.
You'll need to store your numbers in Excel as text at the time they are added to the file.
What are you using to create the original excel file / export from database?
This will likely be where your focus needs to be regarding your export.
For example, one approach is to modify the database export to prefix the numbers with the ' symbol so that Excel will know to display them as text.
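If the export step is scriptable, that prefixing is easy to automate. A minimal sketch, assuming a CSV export with the member number in the first column (the file names are placeholders):

```python
import csv

# Prefix member numbers with an apostrophe so Excel treats them
# as text and keeps leading zeros when the file is opened.
with open("export.csv", newline="") as src, \
     open("export_text.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    for row in reader:
        if row:
            row[0] = "'" + row[0]  # assumed: member number is column 1
        writer.writerow(row)
```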
I use the formula =TEXT(cell, format), where the format string contains as many zeros as the desired field width, to add leading zeros.
For example, cell C2 has 12345 and I need it to be 10 characters long. I would put =TEXT(C2,"0000000000").
The result will be 0000012345.
I have a CSV file with commas inside fields that are not enclosed in quotes. Unfortunately, I must parse this file and cannot get it replaced with a properly formatted one.
I really don't even know where to begin.
OK. What I'm seeing is the following: You have about 8,000 rows that essentially have a CSV syntax error in them. You can manually figure out which they are, but manually fixing 8,000 entries is a bit much.
The obvious first approach would be to try to see how you can manually figure out which columns have this issue. If it is something you can define rules for, you are in business. If it's simple enough, you can write a small text editor macro to go through the file and do it for you. If your text editor doesn't support macros, use awk. If you are on Windows and don't have awk, go get it.
If it is too complicated for that, fix your real problem. Go fix whatever generated this CSV file to generate it right. If it was someone else's code you don't have access to, tell them to fix it. "You are generating 8,000 unparsable entries" seems like a pretty good argument in my book. Sooner or later they will probably generate a new revision of this file for you to process, so this is really the Right Thing to do.
There's probably nothing you can do with it short of analyzing the records manually in a text editor. The comma delimiters are essentially useless if there is no discernible way to distinguish them from valid commas in the data.
If you can get a cleaner file from whoever created the bad one, that's probably far less trouble than trying to fix up the one you've got.
You could run an Excel macro to change the in-field commas to some other character (let's say $, something not in your file) for the time being; then, once you've parsed the file, you could run the results through some code to change that character back into the original commas.
EDIT: I am assuming that you have access to the original file seeing as you've tagged excel here?
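Assuming you did swap the in-field commas for a placeholder like $ before exporting, restoring them after parsing is a one-liner per field. A minimal sketch (the placeholder character and file name are assumptions):

```python
import csv

# Parse the CSV, then turn the $ placeholders back into real commas.
with open("export.csv", newline="") as f:
    rows = [[field.replace("$", ",") for field in row]
            for row in csv.reader(f)]
```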
I think the best you can hope for is 80% automatic, which means you'll be doing over 1,000 manually best case. You just need to be clever about the data that's there. Read each line in and count the commas. If it's the right amount, write it out to a new file. If it's too many, send it to the exception handler.
Start with what you absolutely know about the data. Is the first column a TimeStamp? If you know that, you can go from "20 commas when there should be 18" to "19 commas when there should be 17". I know that doesn't exactly lift your spirits but it's progress. Is there a location, like a plant name, somewhere in there? Maybe you can develop a list from the good data and search for it in the bad data. If column 7 should be the plant name, go through your list of plant names and see if one of them exists. If so, count the commas between that and the start and between that and the end (or another good comma location that you've established).
If you have some unique data, you can use a regex to find its location in the string and, again, count commas before and after to see if it's where it should be, for example a Lat/Long reading or a part number in the format 99A99-999.
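A minimal sketch of the comma-count triage described above; the expected field count and file names are assumptions you would adjust to your data:

```python
EXPECTED_FIELDS = 18  # assumption: set to your file's real column count

# Split rows into parseable ones and exceptions based on comma count.
# With no quoting, a good row has exactly EXPECTED_FIELDS - 1 commas.
with open("data.csv") as src, \
     open("good.csv", "w") as good, \
     open("exceptions.csv", "w") as bad:
    for line in src:
        target = good if line.count(",") == EXPECTED_FIELDS - 1 else bad
        target.write(line)
```

The exceptions file is then the pile you attack with the anchoring tricks above (timestamps, plant names, part-number patterns).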
If you can post five or ten rows of good data, maybe someone can suggest more specific ways to identify columns and their locations.
Good luck.
This question is long-winded because I have been updating it over a very long time while trying to get SSIS to export Excel data properly. I managed to solve the issue, although not correctly. Short of someone providing a correct answer, the solution listed in this question is not terrible.
The only answer I found was to create a single-row named range wide enough for my columns. In the named range, put sample data and hide it. SSIS appends the data and reads metadata from the single row (which is close enough for it to drop the data in). The data takes the format of the hidden single row. This allows headers, etc.
WOW what a pain in the butt. It will take over 450 days of exports to recover the time lost. However, I still love SSIS and will continue to use it because it is still way better than Filemaker LOL. My next attempt will be doing the same thing in the report server.
Original question notes:
If you are in the SQL Server Integration Services designer and want to export data to an Excel file starting on something other than the first line, let's say the fourth line, how do you specify this?
I tried going into the Excel Destination of the Data Flow, changed the AccessMode to OpenRowSet from Variable, and then set the variable to "YPlatters$A4:I20000". This fails, saying it cannot find the sheet. The sheet is called YPlatters.
I thought you could specify (Sheet$)(Starting Cell):(Ending Cell)?
Update
Apparently in Excel you can select a set of cells and name them with the name box. This allows you to select the name instead of the sheet, without the $ dollar sign. Oddly enough, whatever range you specify, it appends the data to the next row after the range, and as you add data, it increases the named selection's row count.
Another odd thing is that the data takes the format of the last line of the range specified. My header rows are bold: if I specify a range that ends with the header row, the data appends to the row below and makes all the entries bold. If I specify one row lower, it puts a blank line between the header row and the data, but the data is not bold.
Another update
No matter what I try, SSIS samples the "first row" of the file and sets the metadata according to what it finds. However, if you have sample data that has a value of zero but is formatted as the first row, it treats that column as text and inserts numeric values with a single quote in front ('123.34). I also tried headers that do not reflect the data types of the columns. I tried changing the metadata of the Excel destination, but it always changes it back when I run the project, then fails saying it will truncate data. If I tell it to ignore errors, it imports everything except that column.
Several days of several hours apiece later...
Another update
I tried every combination. A mostly working example is to create the named range starting with the column headers. Format your column headers as you want the data to look, as the data takes on this format. In my example, these exist from A4 to E4, which is my defined range. SSIS appends to the row after the defined range, so defining A4 to E68 appends the rows starting at A69. You define the connection as having the first row contain the field names. It takes on the metadata of the header row (oddly, not the second row), and it guesses at the data type, not the formatted data type of the column; i.e., headers are text, so all my metadata is text. If your headers are bold, so is all of your data.
I even tried making a sample data row without success... I don't think anyone actually uses Excel with the default MS SSIS export.
If you could define the "insert range" (A5 to E5) with no header row and format those columns (currency, not bold, etc.) without it skipping a row in Excel, this would be very helpful. From what I gather, no one uses SSIS to export Excel without a third-party connection manager.
Any ideas on how to set this up properly so that data is formatted correctly, i.e., the metadata read from Excel is proper to the real data, and formatting inherits from the first row of data, not the headers in Excel?
One last update (July 17, 2009)
I got this to work very well. One thing I added was IMEX=1 in the Excel connection string: "Excel 8.0;HDR=Yes;IMEX=1". This forces Excel (I think) to look at all rows to see what kind of data is in them. Generally this prevents dropped information: say, for instance, you have a zip code, and about 9 rows down you have a zip+4; without this, Excel blanks that field entirely, without error. With IMEX=1, it recognizes that Zip is actually a character field instead of a numeric one.
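For reference, the full OLE DB connection string with that property looks something like this; the provider is the usual Jet driver for Excel 8.0 files, and the file path is just a placeholder:

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\exports\report.xls;Extended Properties="Excel 8.0;HDR=Yes;IMEX=1";
```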
And of course, one more update (August 27, 2009)
IMEX=1 will succeed at importing data with missing contents in the first 8 rows, but it will fail at exporting data where no data exists. So have it in your import connection string, but not in your export Excel connection string.
I have to say, after so much fiddling, it works pretty well.
P.S. If you are using an x64 version, make sure you call DTExec from C:\Program Files\Microsoft SQL Server\90\DTS.x86\Binn. It will load the 32-bit Excel driver and work fine.
Would it be easier to create the Excel Workbook in a script task, then just pick it up later in the flow?
The engine part of SSIS is good but the integration with Excel is awful
"Using SSIS in conjunction with Excel is like having hot tar funnelled up your iHole in a road cone"
Dr. Zim, I believe you were the one who originally brought up this question. I totally feel your pain. I love SSIS overall, but I absolutely hate the limited tools that come standard for Excel. All I want to do is bold the heading or Row1 record in Excel, and not bold the following records. I have not found a great way to do that; granted, I am approaching this with no script tasks or custom extensions, but you would think something this simple would be a standard option. It looks like I may be forced to research and program up something fancy for a task that should be so fundamental. I've already spent a ridiculous amount of time on this myself. Does anyone know if you can use Excel XML with Excel versions 2000/XP/2003? Thanks.
This is an old thread, but what about using a flat file connection and writing the data out as a formatted HTML document? Set the MIME type in the page header to "application/excel". When you send the document as an attachment and the recipient opens it, it will open a browser session but should pop Excel up over the top of it, with the data formatted according to the style (CSS) specified in the page.
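A minimal sketch of that idea (in Python for illustration; the data, styling, and file name are placeholders). Saving the HTML with an .xls extension makes Excel open it directly and apply the CSS:

```python
# Write table data as a styled HTML document that Excel will open,
# applying the CSS (here, a bold header row) when it renders it.
rows = [("Name", "Amount"), ("Widgets", "12.50"), ("Gadgets", "7.25")]

html = ['<html><head><style>th { font-weight: bold; }</style></head>'
        '<body><table>']
html.append("<tr>" + "".join(f"<th>{h}</th>" for h in rows[0]) + "</tr>")
for row in rows[1:]:
    html.append("<tr>" + "".join(f"<td>{v}</td>" for v in row) + "</tr>")
html.append("</table></body></html>")

# The .xls extension (or the application/excel MIME type when sent
# over the wire) is what routes the file to Excel on the other end.
with open("report.xls", "w", encoding="utf-8") as f:
    f.write("\n".join(html))
```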
Can you have SSIS write the data to an Excel sheet starting at A1, then create another sheet, formatted as you like, that refers to the other sheet at A1, but displays it as A4? That is, on the "pretty" sheet, A4 would refer to A1 on the SSIS sheet.
This would allow SSIS to do what it's good for (manipulate table-based data), but allow the Excel to be formatted or manipulated however you'd like.
When Excel is the destination in SSIS, or the target export type in SSRS, you do not have much control over formatting or over specifying how you want the final file to look. I once wrote a custom Excel rendering engine for SSRS, as my client was so strict about the format of the final Excel report generated. I used 'Excel XML' to get the job done inside my custom renderer. Maybe you can use XML output and convert it to Excel XML using XSLT.
I understand you would rather not use a script component so perhaps you could create your own custom task using the code that a script contains so that others can use this in the future. Check here for an example.
If this seems feasible, the solution I used was the CarlosAg Excel Xml Writer Library. With this you can write code similar to using the Interop library, but it produces Excel files in XML format. This avoids using the Interop objects, which can sometimes lead to Excel processes hanging around.
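On the Excel XML question above: SpreadsheetML is understood by Excel 2002 (XP) and 2003; as far as I know, Excel 2000 does not read it natively. The format is plain enough to hand-roll without any library. A minimal sketch in Python with a bold header row (the file name and data are placeholders):

```python
# Hand-roll a minimal SpreadsheetML (Excel XML) workbook with a bold
# header row, the formatting the stock SSIS Excel destination lacks.
HEADER = ["Member", "Amount"]
DATA = [("00746", 12.5), ("09538", 7.25)]

def cell(value):
    kind = "Number" if isinstance(value, (int, float)) else "String"
    return f'<Cell><Data ss:Type="{kind}">{value}</Data></Cell>'

rows = ['<Row ss:StyleID="bold">' + "".join(cell(h) for h in HEADER) + "</Row>"]
rows += ["<Row>" + "".join(cell(v) for v in r) + "</Row>" for r in DATA]

xml = f"""<?xml version="1.0"?>
<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"
          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">
  <Styles><Style ss:ID="bold"><Font ss:Bold="1"/></Style></Styles>
  <Worksheet ss:Name="Report"><Table>{''.join(rows)}</Table></Worksheet>
</Workbook>"""

with open("report.xml", "w", encoding="utf-8") as f:
    f.write(xml)
```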
Instead of taking a roundabout route to write data to particular cells, format them, and style them, which is a very tedious effort given the support SSIS has for Excel, we could go the "template" way.
Assume we need to write data to a given cell with all the custom formatting applied. Keep all the formatting in a sheet, say "SheetActual", where the cells that hold the data actually contain lookups/references/formulas referring to the original data that SSIS exports into a hidden sheet, say "SheetMasterHidden", in the same Excel connection. This "SheetMasterHidden" sheet holds the master data in the default format in which SSIS writes data to Excel. This way you need not worry about formatting the data at runtime.
Formatting the Excel file is one-time work IF the formatting doesn't change very often. If the format changes and is decided at runtime, this solution may not work very well.
The answer is in the question: over time, the updates became a progress log. However, there is SSRS, which will create Excel files if you create TABLE presentations. It works pretty well too.
I'm trying to import an Excel file into a SQL Server 2000 database using DTS. This is nothing fancy, just a straight import. Where I work, we do this 1000 times a day. This procedure usually works without an issue, but something must have changed in the file.
I'm getting the below error:
Screenshot of the error: http://www.understandingguitar.org/wp-content/uploads/2008/12/packageerror-screenshot-20081212.jpg
I've checked to ensure that the column "AssignmentID" is stored as "text" in the Excel sheet. I've also tried changing it to general. I get the exact same error regardless of the setting. The field does contain numbers... I appreciate everyone's help on this!
Regards, Frank
Try opening the Excel file and looking at the column content.
Are any of the row values in that column right-aligned (as numbers generally are)?
I am guessing that such a row could be the problem.
It may be obvious, but is the destination string long enough to hold the string representation of the float? I'm not sure if Excel is rounding what it displays to you, so it may be worth trying with a wider column.
The answer has something to do with the fact that the procedure is expecting text: even if you set the properties (in the format dialog) to "text", Excel may not handle the data as text, and hence SQL Server (or the libraries) won't receive it as text.
When the procedure tries to import it, the system sees that it is converting from a number to text, expects that data may be lost (even though no data will be lost), and raises the error.
I figured out that I can get around this error by placing a ' (apostrophe) before each listed number (i.e., '124321). This forces Excel to treat the number as text.
Hopefully this will save others the headache I now have from this. :-)
Regards, Frank