When saving an Excel file, numbers are converted to dates - excel

I have a file (only one) with some columns containing integer numbers. When I change and save the file, those columns are automatically converted to dates. Does anyone know how I can prevent this? Thank you!

you might want to check out a similar issue here:
http://www.mrexcel.com/forum/excel-questions/637277-excel-converting-numbers-dates.html
(the below suggestions might not be particularly relevant to this issue, but in general, they might help to resolve such a problem)
Select the cells and go to Format --> Cells --> Number and select Text for the selection.
There's also a one-keystroke solution: type an apostrophe before entering or pasting numbers that Excel could mistake for a day and month. When you exit the cell, the apostrophe vanishes and the numbers stay numbers, formatted as text.
This is what Microsoft says:
http://office.microsoft.com/en-001/excel-help/stop-automatically-changing-numbers-to-dates-HA102809473.aspx
Also, if you are pasting data from somewhere else,
try Paste Special.
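If the workbook is produced by code rather than typed in by hand, the same idea (pre-formatting the cells as Text) can be applied at write time. A minimal sketch using Python and openpyxl, which is an assumption about tooling rather than anything from this thread:

```python
# Sketch: write date-looking strings into cells pre-formatted as Text ("@"),
# so Excel keeps them verbatim instead of converting them to dates.
from openpyxl import Workbook

values = ["1-2", "3/4", "1.2"]  # strings Excel would happily turn into dates

wb = Workbook()
ws = wb.active
for row, value in enumerate(values, start=1):
    cell = ws.cell(row=row, column=1)
    cell.number_format = "@"   # "@" is Excel's Text format
    cell.value = value         # stored and displayed as text
wb.save("no_dates.xlsx")
```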

I guess there is still no solution to this problem.
It happens on both of my computers when inserting data from SQL Server 2008.
Half of the columns become dates... and I'm supposed to give this to my accountant. It sucks.
(BTW, I know about Paste Special as text and then converting to numbers... but it takes so much time.)

I just ran into this issue. I am using a webhook to store data to Excel on the fly, adding new rows, updating existing data, etc.
I had a special field, "step", which is a string that can be "1", "1.1", "1.2", etc., and which was treated as a date as soon as the value was a "string decimal"...
The only way I could find to fix this issue was to programmatically add a ' before the value, like '1.2.
I tried to apply this to all my fields to avoid Excel side effects, but I then noticed that when updating an existing row by merging new values with existing values, the ' was stripped. So I ended up using a whitelist of fields to escape. I couldn't force the ' onto those fields when merging data, due to limitations of the lib I use, google-sheets-node-api.
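For what it's worth, the whitelist idea fits in one small helper. A hypothetical sketch in Python (the field names and the "looks like a decimal" test are invented; the original setup used Node and google-sheets-node-api):

```python
import re

# Only these fields get the apostrophe treatment; everything else is left alone.
ESCAPE_FIELDS = {"step"}                     # hypothetical whitelist
DECIMAL_LIKE = re.compile(r"^\d+(\.\d+)*$")  # matches "1", "1.2", "1.2.3", ...

def escape_row(row: dict) -> dict:
    """Prefix whitelisted decimal-looking values with ' so the sheet keeps them as text."""
    out = {}
    for key, value in row.items():
        if key in ESCAPE_FIELDS and isinstance(value, str) and DECIMAL_LIKE.match(value):
            out[key] = "'" + value   # "'1.2" stays literal text in the sheet
        else:
            out[key] = value
    return out

print(escape_row({"step": "1.2", "name": "init"}))  # {'step': "'1.2", 'name': 'init'}
```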

Related

Why is a carriage return in Excel not reflected when the Excel value is imported into UFT

I am using Excel to compare error messages. My error message looks like this:
You have changed the values.
Do you want to continue?
I entered this value in Excel using Alt+Enter; when reading this value from UFT, the carriage return is not taken into account.
How do I include a carriage return in Excel so that it is visible when reading the values from UFT?
First, try to create a string formula, i.e. instead of entering the line break in the string, create a formula like
="<value>"
where <value> is the string value that you originally wanted in your cell, with the line feed contained within the quotation marks.
This might solve your issue.
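As a rough illustration of what such a cell could contain, here is a sketch that builds the formula with CHAR(10) joins instead of a raw embedded line feed (a close variant of the suggestion above), written with Python and openpyxl purely as an assumption about tooling:

```python
# Sketch: store the two-line message as a string formula so the line break
# is explicit in the cell contents rather than a raw Alt+Enter character.
from openpyxl import Workbook
from openpyxl.styles import Alignment

lines = ["You have changed the values.", "Do you want to continue?"]
# Produces: ="You have changed the values."&CHAR(10)&"Do you want to continue?"
formula = "=" + "&CHAR(10)&".join('"{}"'.format(l.replace('"', '""')) for l in lines)

wb = Workbook()
ws = wb.active
ws["A1"] = formula
ws["A1"].alignment = Alignment(wrap_text=True)  # so Excel displays the break
wb.save("message.xlsx")
```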
But:
This is just the tip of the iceberg of known issues with UFT's data table API. Here is an incomplete list of additional issues (some of which, but not all, are fixed or at least improved upon in 15+):
- Date values are not properly handled, especially if you are using a non-US locale and try to consume values auto-formatted by Excel as dates
- Strange things happen if you have an .xls file; use an .xlsx file instead. (This used to be the other way around.) Note it is not only the extension I am referring to, but also the format (Excel 95 vs. the more modern format)
- Many formulas are unsupported
- Formatting behavior differs from what Excel would do/show
- CRs and LFs are handled differently from what Excel does
- The built-in table editor is quite a silo of bugs, and anti-ergonomic
- Cell values are limited in length; at the same time, formulas have different length limits. I.e. a string in a cell is limited to a certain maximum number of characters, but a formula returning a string does not have that (but maybe a higher) length limit
Because of this (and more), we auto-convert all Excel sheets on the fly before we use them in UFT, after they have been updated. To do this, we use Excel Interop (i.e. Excel's COM automation interface) to spawn an Excel instance, create a converted version that has all formulas and formatting resolved to plain string formulas, and use the converted sheets with UFT's DataTable.ImportSheet feature. Which means we unfortunately need Excel on all execution machines.
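A rough sketch of that conversion idea, using Python and pywin32 (the converter described above is not published here, so the function below is an approximation, not the actual tool):

```python
# Sketch: use Excel's COM automation to resolve every formula to its current
# value, then save a copy for UFT's DataTable.ImportSheet. Needs Excel + pywin32.
import win32com.client

def flatten_workbook(src_path, dst_path):
    excel = win32com.client.Dispatch("Excel.Application")
    excel.Visible = False
    excel.DisplayAlerts = False
    try:
        wb = excel.Workbooks.Open(src_path)
        for ws in wb.Worksheets:
            used = ws.UsedRange
            used.Value = used.Value      # overwrite formulas with plain values
        wb.SaveAs(dst_path, 51)          # 51 = xlOpenXMLWorkbook (.xlsx)
        wb.Close(False)
    finally:
        excel.Quit()

flatten_workbook(r"C:\tests\data.xlsx", r"C:\tests\data_flat.xlsx")
```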
So my recommendation would be to stay away from the data table editor in UFT. Use Excel, and make sure all your edits come through to UFT in a meaningful way. If they don't, consider a converter that creates a DataTable-compatible copy of your sheet.
Yes, I know this is suboptimal, but that's what it has come down to after years and years of struggling with the DataTable API and UFT's "superb" built-in data table editor.

Excel - Variable number of leading zeros in variable length numbers?

The format of our member numbers has changed several times over the years, such that 00008, 9538, 746, 0746, 00746, 100125, and various other permutations are valid, unique, and need to be retained. Exporting from our database into the custom Excel template needed for a mass update strips the leading zeros, so that 00746 and 0746 are both truncated to 746.
The apostrophe trick, and formatting as text, do not work in our case, since the data seems to be already altered by the time we open it in Excel. Formatting as a ZIP code won't work, since we have valid numbers less than five digits long that cannot have zeros added to them. And I am not having any luck with "custom" formatting, as that seems to require either adding the same number of leading zeros to every number, or adding enough zeros to every number to make them all the same length.
Any clues? I wish there was some way to set Excel to just take what it's given and leave it alone, but that does not seem to be the case! I would appreciate any suggestions or advice. Thank you all very much in advance!
UPDATE - thanks everybody for your help! Here are some more specifics. We are using a 3rd party membership management app -- we cannot access the database directly, we need to use their "query builder" tool to get the data we want to mass update. Then we export using their "template" format, which is called XLSX but there must be something going on behind the scenes, because if we try to import a regular old Excel, we get an error. Only their template works.
The data is formatted okay in the database, because all of the numbers show correctly in the web-based management tool. Also, if I export to CSV, save it as a .txt and import it into Excel, the numbers show fine.
What I have done is similar to ooo's explanation below -- I exported the template with the incorrect numbers, then exported as CSV/txt, and copied / pasted THOSE numbers into the template and re-imported. I did not get an error, which is something I guess, but I will not be able to find out if it was successful until after midnight! :-(
Assuming the data is not corrupt in the database, try exporting from the database to a CSV or text file.
The following can then be done to ensure the import is formatted correctly.
Text file with comma delimiter:
In Excel, Data / From Text; select Delimited, then Next.
In step 3 of the import wizard, for each column/field you want as text, highlight the column and select Text.
The data should then be placed as text and retain leading zeros.
Again, all of this assumes the database contains non-corrupt data and you are able to export a simple text or CSV file. It also assumes you have Excel 2010, but it can be done with minor variations across all versions.
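If the data ends up going through a script rather than the import wizard, the same "every column is text" rule can be applied there. A sketch with Python and pandas, assuming a comma-delimited export named members.csv and a hypothetical member_number column:

```python
# Sketch: read every column as text so leading zeros survive, then write the
# frame to Excel; string cells keep their exact contents.
import pandas as pd

df = pd.read_csv("members.csv", dtype=str, keep_default_na=False)

with pd.ExcelWriter("members.xlsx", engine="openpyxl") as writer:
    df.to_excel(writer, index=False, sheet_name="Members")

print(df["member_number"].head())  # '00008', '9538', '0746', ... all intact
```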
Hopefully ooo's answer works for you. I'm providing another answer mainly for informational purposes, and don't feel like dealing with the constraints on comments.
One thing to understand is that Excel is very aggressive about treating "numeric-looking" data as actual numbers. If you were to open the CSV by double-clicking and letting Excel do its thing (rather than using ooo's careful procedure), those numbers would still have come up as numbers (no leading zeros). As you've found, one way to counteract this is to append clearly nonnumeric characters onto your data (before Excel gets its grubby hands on it), to really convince Excel that what it's dealing with is text.
Now, if the thing that uploads to their software is a file ending in .xlsx, then most likely it is the current Excel format (a compressed XML document, used by Excel 2007 and later). I suppose by "regular old Excel" you mean .xls (which still works with the newer Excels in "compatibility mode").
So in case what you've tried so far doesn't work, there are still avenues to explore before resorting to appending characters to the end of your data. (I'll update this answer as needed.)
You're on the right track with the apostrophe.
You'll need to store your numbers in Excel as text at the time they are added to the file.
What are you using to create the original Excel file / export from the database?
That is likely where your focus needs to be regarding your export.
For example, one approach is to modify the database export to include a ' prefix before the numbers so that Excel will know to display them as text.
I use the formula =TEXT(cell,"format"), with as many zeros in the format as the required field width, to add leading zeros.
For example, cell C2 has 12345 and I need it to be 10 characters long. I would put =TEXT(C2,"0000000000").
The result will be 0000012345.
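The same padding can also be done before the data ever reaches Excel; a trivial Python sketch for comparison:

```python
# Equivalent of =TEXT(C2,"0000000000"): pad to a fixed width of 10 characters.
value = 12345
print(str(value).zfill(10))  # -> 0000012345
```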

Export data from Access to Excel without losing leading zeroes

I have a table in Access I am exporting to Excel, and I am using VBA code for the export (because I actually create a separate Excel file every time the client_id changes which creates 150 files). Unfortunately I lose the leading zeroes when I do this using DoCmd.TransferSpreadsheet. I was able to resolve this by looping through the records and writing each cell one at a time (and formatting the cell before I write to it). Unfortunately that leads to 8 hours of run time. Using DoCmd.TransferSpreadsheet, it runs in an hour (but then I lose the leading zeroes). Is there any way at all to tell Excel to just treat every cell as text when using the TransferSpreadsheet command? Can anybody think of another way to do this that won't take 8 hours? Thanks!
Prefix the Excel value with an apostrophe (') character. The apostrophe tells Excel to treat the data as "text".
As in;
0001 'Excel treats as number and strips leading zeros
becomes
'0001 'Excel treats as text
You will probably need to create an expression field to prefix the field with the apostrophe, as in;
SELECT "'" & [FIELD] FROM [TABLE]
As an alternative to my other suggestion, have you played with Excel's Import External Data command? Using Access VBA, you can loop through your clients, open a template Excel file, import the data (i.e. pull instead of push) with your client as a criteria, and save it with a unique name for each client.
What if you:
In your source table, change the column type to string.
Loop through your source table and add an "x" to the field.
If the Excel data is meant to be read by a human being, you can get creative, like hiding your data column, and adding a 'display' column that references the data column, but removes the "x".

How to export SSIS to Microsoft Excel without additional software?

This question is long-winded because I have been updating it over a very long time while trying to get SSIS to properly export Excel data. I managed to solve this issue, although not correctly. Short of someone providing a correct answer, the solution listed in this question is not terrible.
The only answer I found was to create a single-row named range wide enough for my columns. In the named range, put sample data and hide the row. SSIS appends the data and reads metadata from that single row (which is close enough for it to drop stuff in). The data takes the format of the hidden single row. This allows headers, etc.
WOW what a pain in the butt. It will take over 450 days of exports to recover the time lost. However, I still love SSIS and will continue to use it because it is still way better than Filemaker LOL. My next attempt will be doing the same thing in the report server.
Original question notes:
If you are in the SQL Server Integration Services designer and want to export data to an Excel file starting on something other than the first line, let's say the fourth line, how do you specify this?
I tried going into the Excel Destination of the Data Flow, changed the AccessMode to OpenRowset from Variable, then set the variable to "YPlatters$A4:I20000". This fails, saying it cannot find the sheet. The sheet is called YPlatters.
I thought you could specify (Sheet$)(Starting Cell):(Ending Cell)?
Update
Apparently in Excel you can select a set of cells and name them with the Name Box. This allows you to select the name instead of the sheet, without the $ dollar sign. Oddly enough, whatever range you specify, it appends the data to the next row after the range, and as you add data, it increases the named selection's row count.
Another odd thing is that the data takes the format of the last line of the range specified. My header rows are bold. If I specify a range that ends with the header row, the data appends to the row below and makes all the entries bold. If you specify one row lower, it puts a blank line between the header row and the data, but the data is not bold.
Another update
No matter what I try, SSIS samples the "first row" of the file and sets the metadata according to what it finds. However, if you have sample data that has a value of zero but is formatted as the first row, it treats that column as text and inserts numeric values with a single quote in front ('123.34). I also tried headers that do not reflect the data types of the columns. I tried changing the metadata of the Excel destination, but it always changes it back when I run the project, then fails saying it will truncate data. If I tell it to ignore errors, it imports everything except that column.
Several days of several hours apiece later...
Another update
I tried every combination. A mostly working example is to create the named range starting with the column headers. Format your column headers as you want the data to look, as the data takes on this format. In my example, these exist from A4 to E4, which is my defined range. SSIS appends to the row after the defined range, so defining A4 to E68 appends the rows starting at A69. You define the connection as having the first row contain the field names. It takes on the metadata of the header row (oddly, not the second row), and it guesses at the data type, not the formatted data type of the column; i.e., headers are text, so all my metadata is text. If your headers are bold, so is all of your data.
I even tried making a sample data row without success... I don't think anyone actually uses Excel with the default MS SSIS export.
If you could define the "insert range" (A5 to E5) with no header row and format those columns (currency, not bold, etc.) without it skipping a row in Excel, this would be very helpful. From what I gather, no one uses SSIS to export to Excel without a third-party connection manager.
Any ideas on how to set this up properly so that the data is formatted correctly, i.e., the metadata read from Excel matches the real data, and the formatting is inherited from the first row of data, not from the headers in Excel?
One last update (July 17, 2009)
I got this to work very well. One thing I added was IMEX=1 in the Excel connection string: "Excel 8.0;HDR=Yes;IMEX=1". This forces Excel (I think) to look at all rows to see what kind of data is in them. Generally, this stops information being dropped: say, for instance, you have a ZIP code and then about 9 rows down you have a ZIP+4; without this, Excel blanks that field entirely without error. With IMEX=1, it recognizes that the ZIP is actually a character field instead of numeric.
And of course, one more update (August 27, 2009)
IMEX=1 will succeed in importing data with missing contents in the first 8 rows, but it will fail when exporting data where no data exists. So have it in your import connection string, but not in your export Excel connection string.
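For reference, the two connection strings end up looking something like this (paths are placeholders, and the provider/version naturally depends on your setup):

```python
# Assumed Jet OLE DB connection strings: IMEX=1 on the import side only.
import_conn = (
    r"Provider=Microsoft.Jet.OLEDB.4.0;"
    r"Data Source=C:\data\import.xls;"
    'Extended Properties="Excel 8.0;HDR=YES;IMEX=1";'
)
export_conn = (
    r"Provider=Microsoft.Jet.OLEDB.4.0;"
    r"Data Source=C:\data\export.xls;"
    'Extended Properties="Excel 8.0;HDR=YES";'
)
```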
I have to say, after so much fiddling, it works pretty well.
P.S. If you are using an x64 version, make sure you call DTExec from C:\Program Files\Microsoft SQL Server\90\DTS.x86\Binn. It will load the 32-bit Excel driver and work fine.
Would it be easier to create the Excel Workbook in a script task, then just pick it up later in the flow?
The engine part of SSIS is good but the integration with Excel is awful
"Using SSIS in conjunction with Excel is like having hot tar funnelled up your iHole in a road cone"
Dr. Zim, I believe you were the one who originally brought up this question. I totally feel your pain. I love SSIS overall, but I absolutely hate the limited tools that come standard for Excel. All I want to do is bold the heading or row-1 record in Excel, and not bold the following records. I have not found a great way to do that; granted, I am approaching this with no script tasks or custom extensions, but you would think something this simple would be a standard option. It looks like I may be forced to research and program up something fancy for a task that should be so fundamental. I've already spent a ridiculous amount of time on this myself. Does anyone know if you can use Excel XML with Excel versions 2000/XP/2003? Thanks.
This is an old thread, but what about using a flat file connection and writing the data out as a formatted HTML document? Set the MIME type in the page header to "application/excel". When you send the document as an attachment and the recipient opens it, it will open a browser session but should pop Excel up over the top of it, with the data formatted according to the style (CSS) specified in the page.
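A bare-bones sketch of that trick (the columns, styling, and meta tag below are invented for illustration; Excel will usually warn about the extension/format mismatch when opening the file):

```python
# Sketch: emit the data as a styled HTML table and save it with an .xls
# extension so Excel opens it and applies the CSS formatting.
rows = [("Alice", "120.50"), ("Bob", "99.00")]  # invented sample data

body = "\n".join(
    "  <tr><td>{}</td><td>{}</td></tr>".format(name, amount) for name, amount in rows
)
html = """<html>
<head>
<meta http-equiv="Content-Type" content="application/excel">
<style>
  th {{ font-weight: bold; background: #eeeeee; }}
</style>
</head>
<body>
<table border="1">
  <tr><th>Customer</th><th>Amount</th></tr>
{}
</table>
</body>
</html>""".format(body)

with open("report.xls", "w", encoding="utf-8") as f:
    f.write(html)
```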
Can you have SSIS write the data to an Excel sheet starting at A1, then create another sheet, formatted as you like, that refers to the other sheet at A1, but displays it as A4? That is, on the "pretty" sheet, A4 would refer to A1 on the SSIS sheet.
This would allow SSIS to do what it's good for (manipulate table-based data), but allow the Excel to be formatted or manipulated however you'd like.
When Excel is the destination in SSIS, or the target export type in SSRS, you do not have much control over formatting and specifying how you want the final file to be. I once wrote a custom Excel rendering engine for SSRS, as my client was very strict about the format of the final Excel report generated. I used Excel XML to get the job done inside my custom renderer. Maybe you can use XML output and convert it to Excel XML using XSLT.
I understand you would rather not use a script component, so perhaps you could create your own custom task using the code that a script contains, so that others can use it in the future. Check here for an example.
If this seems feasible, the solution I used was the CarlosAg Excel Xml Writer Library. With this you can write code similar to using the Interop library, but it produces Excel in XML format. This avoids using the Interop object, which can sometimes lead to Excel processes hanging around.
Instead of going a roundabout way to write data to particular cells, format them, and style them, which is indeed a very tedious effort considering the support SSIS has for Excel, we could go the "template" way.
Assume we need to write data to such-and-such a cell with all the custom formatting applied to it. Keep all the formatting in one sheet, say "SheetActual", where the cells that will hold the data actually contain lookups/references/formulas referring to the original data that SSIS exports into a hidden sheet, say "SheetMasterHidden", in the same Excel connection. This "SheetMasterHidden" essentially holds the master data in the default format that SSIS writes to Excel. This way you need not worry about formatting the data at runtime.
Formatting the Excel file is one-time work, IF the formatting doesn't change very often. If the format changes and the format is decided at runtime, this solution may not work very well.
The answer is in the question; over time, it became a running progress report. However, there is SSRS, which will create Excel files if you create TABLE presentations. It works pretty well, too.

Importing Excel into SS2000; Error on Field; DTS

I'm trying to import an Excel file into a SQL Server 2000 database using DTS. This is nothing fancy, just a straight import. Where I work, we do this 1000 times a day. This procedure usually works without an issue, but something must have changed in the file.
I'm getting the below error:
Screenshot of the error: http://www.understandingguitar.org/wp-content/uploads/2008/12/packageerror-screenshot-20081212.jpg
I've checked to ensure that the column "AssignmentID" is stored as "text" in the Excel sheet. I've also tried changing it to General. I get the exact same error regardless of the setting. The field does contain numbers... I appreciate everyone's help on this!
Regards, Frank
Try opening the Excel file and looking at the column content.
Are any of the row values in that column right-aligned (as numbers generally are)?
I am guessing that such a row could be the problem.
It may be obvious, but is the destination string long enough to hold the string representation of the float? I'm not sure if Excel is rounding what it displays to you, so it may be worth trying with a wider column.
The answer has something to do with the fact that the procedure is expecting text, but even if you set the properties (in the Format dialog) to "text", Excel may not handle the data as text, and hence SQL Server (or the libraries) won't treat it as text.
When the procedure tries to import it, the system believes it is converting from a number to text, expects that data may be lost (even though no data will be lost), and raises the error.
I figured out that I can get around this error by placing a ' (apostrophe) before each listed number (i.e. '124321). This forces Excel to treat the number as text.
Hopefully this will save others the headache I now have from this. :-)
Regards, Frank
