SSIS Data Conversion forces rounding of numbers

Situation: A column containing a number is imported into SSIS as a string. We convert the string to either an integer or a decimal, depending on the expected data type, using the Data Conversion task. Configure Error Output is set to redirect both errors and truncation to a script task.
Problem: The SSIS Data Conversion Transformation rounds number values to fit the new data type and does not throw a conversion or truncation error. The row redirect is not happening for numbers. For example, when DT_DECIMAL (10,2) is required, but a value of 12.123 is converted, the value is rounded to 12.12 with no error or truncation redirect. I used a data viewer to verify the Data Conversion task is causing the rounding. Errors from non-numeric characters do cause the row to redirect.
Desired Output: We want an error thrown when the data does not match the required data type, for example DT_DECIMAL(10,2).
Is there any way to keep numbers from being rounded by the Data Conversion task, or would another task be required to do this?

You will likely need another component; a Script Component used as a transformation provides the most flexibility, since you can customize both the data conversion and the error output in code.
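Inside an SSIS Script Component the code would be C#, but the core check it needs to perform is language-neutral: accept the value only if it fits the target precision and scale exactly, and raise an error instead of rounding. A minimal sketch of that logic in Python (the function name and limits are illustrative, not part of any SSIS API):

```python
from decimal import Decimal, InvalidOperation

def validate_decimal(text, precision=10, scale=2):
    """Return the Decimal value if `text` fits DECIMAL(precision, scale)
    exactly; raise ValueError otherwise, instead of silently rounding."""
    try:
        value = Decimal(text.strip())
    except InvalidOperation:
        raise ValueError(f"not a number: {text!r}")
    # Reject values that would lose digits when stored at this scale,
    # e.g. 12.123 does not survive a round-trip through DECIMAL(10,2).
    if value != value.quantize(Decimal(1).scaleb(-scale)):
        raise ValueError(f"too many decimal places: {text!r}")
    # Reject values whose integer part exceeds precision - scale digits.
    if abs(value) >= Decimal(10) ** (precision - scale):
        raise ValueError(f"out of range: {text!r}")
    return value
```

In the Script Component you would run the equivalent C# check per row and send failing rows to the error output yourself, which gives you the redirect behavior the Data Conversion task is skipping.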

If you open up the Data Conversion Transformation Editor there's a button to Configure Error Output. From there, you can set the behavior for Truncation to behave in whatever manner you'd like.

Related

Lookup not working as expected in SSRS expression

I have 2 datasets in the form of SharePoint lists in my .rdl in Visual Studio 2012.
BranchCode is the common column in both of my datasets. I have one tablix in my report where I am writing an expression to look up BranchCode from dataset1 against BranchCode of dataset2. When it matches, I want to retrieve the corresponding BranchCost value from dataset2.
I am able to write the lookup expression, but the final output is just a blank value. Can somebody please help me out with this?
I always recommend casting your datatypes in expressions.
So what you should have is something like this:
=LOOKUP(Fields!BranchCode.Value, Fields!BranchCode.Value, Fields!BranchCost.Value, "DataSet2")
You would use the VB.NET functions to cast your values to the same type. Common examples are CSTR() for strings, CINT() for integers, and CDEC() for decimals:
=LOOKUP(CSTR(Fields!BranchCode.Value), CSTR(Fields!BranchCode.Value), Fields!BranchCost.Value, "DataSet2")
If it is a string, you could also wrap it in the RTRIM() function to make sure there are no trailing spaces.
If you still have issues, I recommend outputting the data in both DataSets into tables in the report. Run the report and inspect the data to ensure the DataSets contain what you expect. I also like to add special characters, such as #, around strings in the table so you can easily spot any leading or trailing spaces.

Transform data types in parts of a column

I am retrieving data through Power Query from an Oracle DB live into an Excel workbook. In PQ, under the "Transform" tab, there is a function to change the data type of a column, which I use to get all the decimal numbers displayed. In the M code the function is called TransformColumnTypes. However, some strings in the data cannot be changed to decimal numbers and produce an error. Is there a way to exclude these? At the moment the function takes the whole column.
(Screenshots omitted: the column before applying the function, the function producing the error, and the M code.)
I don't think so. If you have multiple types within a column, text is the only one that doesn't produce errors.
But if it is only the first row, as in your image, promoting it to the header before setting the column type will fix the issue.

Excel 2013 formula throws #Value! error with SAP Business Objects Dashboard

I am using this Excel formula
=IF(C92=0,D102,D101)
It throws a #VALUE! error in my SAP Business Objects Dashboard 4.1 (SP7).
Is there another way to write this formula?
My guess is that SAP does not like using zero for C92=0.
There are several possible causes of the error, but most come down to comparing a numerical zero against a cell containing a text string or a blank cell. The cell may even contain a zero stored as text (e.g. ="0"); a numerical 0 is not the same thing as text that looks like a zero.
If you use the VALUE function and wrap some error control around it to handle cases where numerical conversion is impossible, you should get consistent results.
=IF(IFERROR(VALUE(C92), 0)=0, D102, D101)
The IFERROR function provides a zero when numerical conversion is not possible. This is my best guess at what you want to occur. Another scenario would be to return an outside result when no conversion is possible.
=IFERROR(IF(VALUE(C92)=0, D102, D101), "something else")
There are a number of other possibilities. If you have trouble getting the results you expect, please edit your question to include more detail on what outcome(s) you would expect for different case scenarios.
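To see why a text "0" or a blank cell trips up the original C92=0 test, here is a rough Python analogue of the corrected formula (function names are mine, and Python's float() differs from Excel's VALUE in some edge cases, such as locale-formatted numbers):

```python
def value_or(cell, fallback=0):
    """Rough analogue of Excel's IFERROR(VALUE(cell), fallback):
    coerce the cell contents to a number, using the fallback when
    conversion is impossible (text, blank, etc.)."""
    try:
        return float(str(cell).strip())
    except ValueError:
        return fallback

def pick(c92, d102, d101):
    """Rough analogue of =IF(IFERROR(VALUE(C92), 0)=0, D102, D101)."""
    return d102 if value_or(c92) == 0 else d101
```

Note that without the conversion step, a comparison like `"0" == 0` is false in most type-strict settings, which is exactly the trap the VALUE wrapper avoids.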

MATLAB "From Workspace" block has error

I have an array (5000x1 double) in the MATLAB workspace. I put a From Workspace block in the Simulink window as the input to another block, but when I run the program this error occurs:
Invalid matrix-format variable specified as workspace input in 'new_net_pattern_recog/From Workspace'. The matrix must have two dimensions and at least two columns. Complex signals of any data type and non-double real signals must be in structure format. The first column must contain time values and the remaining columns the data values. Matrix values cannot be Inf or NaN.
What can I do?
I believe that you are getting this error because the From Workspace block is expecting your data to be in the form of a time series. According to the documentation for this block,
In the Data parameter of the block, enter a MATLAB expression that specifies the workspace data. The expression must evaluate to one of the following:
A MATLAB timeseries object
A structure of MATLAB timeseries objects
An array or structure containing an array of simulation times and corresponding signal values
It sounds like your 5000x1 element array does not change over time, and these values are intended to remain constant throughout the entirety of the simulation. If this is true, then you should just use a Constant block. To use a variable from the workspace as the output of this block, simply set the "Constant Value" parameter of the constant block to the name of your variable. Refer to this doc for more info on the Constant block.

SSIS Text was truncated with status value 4

I am developing an SSIS package, trying to update an existing SQL table from a CSV flat file. All of the columns update successfully except for one. If I set that column to ignore truncation, my package completes successfully, so I know this is a truncation problem, not a conversion error.
This column is empty for almost every row. However, there are a few rows where this field is 200-300 characters. My data conversion task identified this field as DT_WSTR, but from what I've read elsewhere maybe it should be DT_NTEXT. I've tried both, and I even set the DT_WSTR length to 500, but none of this fixed the problem. How can I fix this? What data type should this column be in my SQL table?
Error: 0xC02020A1 at Data Flow Task 1, Source - Berkeley812_csv [1]: Data conversion failed. The data conversion for column "Reason for Delay in Transition" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Error: 0xC020902A at Data Flow Task 1, Source - Berkeley812_csv [1]: The "output column "Reason for Delay in Transition" (110)" failed because truncation occurred, and the truncation row disposition on "output column "Reason for Delay in Transition" (110)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
Error: 0xC0202092 at Data Flow Task 1, Source - Berkeley812_csv [1]: An error occurred while processing file "D:\ftproot\LocalUser\RyanDaulton\Documents\Berkeley Demographics\Berkeley812.csv" on data row 758.
One possible reason for this error is that your delimiter character (comma, semi-colon, pipe, whatever) actually appears in the data in one column. This can give very misleading error messages, often with the name of a totally different column.
One way to check this is to redirect the 'bad' rows to a separate file and then inspect them manually. Here's a brief explanation of how to do that:
http://redmondmag.com/articles/2010/04/12/log-error-rows-ssis.aspx
If that is indeed your problem, then the best solution is to fix the files at the source to quote the data values and/or use a different delimiter that isn't in the data.
I've had this issue before; it is likely that the default column size for the file is incorrect. The flat file connection puts in a default size of 50 characters, but the data you are working with is larger. In the advanced settings for your data file, adjust the column size from 50 to the table's column size.
I suspect the
or one or more characters had no match in the target code page
part of the error.
If you remove the rows with values in that column, does it load?
Can you identify, in other words, the rows which cause the package to fail?
It could be that the data is too long, or that there's some funky character in there that SQL Server doesn't like.
If this is coming from the SQL Server Import Wizard, try editing the definition of the column on the data source; it is 50 characters by default, but it can be longer.
Data Source -> Advanced -> look at the column that errors out -> change OutputColumnWidth to 200 and try again.
I've had this problem before. You can go to the "Advanced" tab of the "Choose a Data Source" page and click the "Suggest Types" button, setting the number of rows to sample as high as you want. After that, the types and text qualifiers are set to the correct values. I applied this solution and was able to load my data into SQL.
In my case, some of my rows didn't have the same number of columns as the header. For example, the header has 10 columns, but one of your rows has only 8 or 9. (The column count equals the number of delimiter characters in the line plus one.)
If all other options have failed, try recreating the data import task and/or the connection manager. If you've made any changes since the task was originally created, this can sometimes do the trick. I know it's the equivalent of rebooting, but hey, if it works, it works.
I had the same problem, and it was due to a column with very long data.
When I mapped it, I changed it from DT_STR to text stream, and it worked.
In the destination, under Advanced, check that the length of the column is equal to the source's.
The OutputColumnWidth of the column must be increased.
Path: Source connection manager -> Advanced -> OutputColumnWidth
