All,
I am trying to store the values of two variables:
Variable1Value: 2014-05-10 00:00:00.000
Variable2Value: 2014-05-08 00:00:00.000
Into an Object-type variable so that I can loop over it with a Foreach Loop in SSIS, but I don't know how to do it.
Normally, with an OLE DB connection, I would create a variable of type Object and store my result set there, but with an ODBC connection that doesn't work: the step always fails.
Can somebody help me?
Thank you
I would replace the Execute SQL Task with a Data Flow Task. Inside that, I would start with an ODBC Source component with your ODBC SQL statement. Then I would connect that to a Recordset Destination, and configure that for the Object type Variable.
This design also exposes the SSIS Data Types of the returned columns (e.g. in the Recordset Destination), avoiding guesswork when you come to use them downstream.
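If it helps, here is a minimal sketch of what the ODBC Source statement could be (the query is made up from the two values in the question; any statement returning one row per iteration works):
SELECT CAST('2014-05-10' AS DATE) AS LoopDate
UNION ALL
SELECT CAST('2014-05-08' AS DATE);
In the Foreach Loop you would then pick the Foreach ADO Enumerator, point it at the Object variable, and map column index 0 to a DateTime variable for use inside the loop.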
My base scenario is that I want to make an Excel report with data from a PostgreSQL DB.
I get the data via ODBC, creating a simple linked table with Power Query.
For DSN I choose (None), then I write the connection string and the SQL statement. Generally it works fine, but with one column it doesn't. I receive the following error message:
ODBC: ERROR [22P05] ERROR: character with byte sequence 0xc2 0xb2 in encoding "UTF8" has no equivalent in encoding "WIN1250";Error while executing the query
So the cause is clear: the source is UTF-8 and contains characters that have no equivalent in Win1250.
What I am looking for is a general solution, either on the DB side or the Excel side.
The SQL statement used is a simple SELECT * FROM [view], so I can use any replacement or conversion, anything that lets me handle it with transformations on the column. I can replace the view with a function if that is better.
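For the DB-side option, a minimal sketch (view and column names are placeholders): 0xc2 0xb2 is the UTF-8 encoding of '²' (superscript two), so the view, or a query wrapping it, could substitute that character before the driver converts the text:
SELECT replace(problem_column, '²', '2') AS problem_column
FROM my_view;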
But it would be better if you could suggest an Excel-side solution.
There is one criterion for that: the scenario "first get the data as text, then convert it to Win1250, then import it into Excel" won't fit. I need something that is tied to the Excel file itself, so that if I move it to another PC it still works without any further modification.
Thanks for all the help!
I'm setting up a SQL Azure Copy Data job using Data Factory. For my source I'm selecting the exact data that I want. For my destination I'm selecting "use stored procedure". I cannot move forward from the table mapping page, as it reports 'one or more destination tables have not been properly configured'. From what I can tell, everything looks good, as I can manually run the stored procedure from SQL without an issue.
I'm looking for troubleshooting advice on how to solve this problem, as the portal doesn't appear to provide any more data than the error itself.
Additional but unrelated question: What is the benefit from me doing a copy job in data factory vs just having data factory call a stored procedure?
I've tried executing the stored procedure directly via SQL. That uncovered one problem: I had LastUpdatedDate in the table type, but it isn't actually an input value. After fixing that, I'm able to execute the SP without issue.
Select Data from Source
SELECT
p.EmployeeNumber,
p.EmailName
FROM PersonFeed AS p
Create table Type
CREATE TYPE [person].[PersonSummaryType] AS TABLE(
[EmployeeNumber] [int] NOT NULL,
[EmailName] [nvarchar](30) NULL
)
Create user-defined stored procedure
CREATE PROCEDURE spOverwritePersonSummary @PersonSummary [person].[PersonSummaryType] READONLY
AS
BEGIN
MERGE [person].[PersonSummary] [target]
USING @PersonSummary [source]
ON [target].EmployeeNumber = [source].EmployeeNumber
WHEN MATCHED THEN UPDATE SET
[target].EmployeeNumber = [source].EmployeeNumber,
[target].EmailName = [source].EmailName,
[target].LastUpdatedDate = GETUTCDATE()
WHEN NOT MATCHED THEN INSERT (
EmployeeNumber,
EmailName,
LastUpdatedDate)
VALUES(
[source].EmployeeNumber,
[source].EmailName,
GETUTCDATE());
END
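For reference, the procedure can be tested directly from SQL with a table-valued parameter (sample values made up):
DECLARE @p [person].[PersonSummaryType];
INSERT INTO @p (EmployeeNumber, EmailName) VALUES (1, N'jdoe');
EXEC spOverwritePersonSummary @PersonSummary = @p;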
The Data Factory UI, when setting the destination to the stored procedure, reports "one or more destination tables have not been properly configured".
I believe the UI is broken when using Copy Data. I was able to map directly to a table to get the copy job created, then manually edit the JSON, and everything worked fine. Perhaps the UI is new, which would explain why all the support docs refer only to the JSON? After playing with this more, it looks like the UI sees the table type as schema.type but drops the schema for some reason. A simple edit in the JSON file corrects it.
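For anyone hitting the same wall, the relevant piece of the copy activity JSON looks roughly like this (trimmed to the sink properties in question; the exact shape depends on the Data Factory version):
"sink": {
    "type": "SqlSink",
    "sqlWriterStoredProcedureName": "spOverwritePersonSummary",
    "sqlWriterTableType": "[person].[PersonSummaryType]"
}
The UI dropped the [person] schema from sqlWriterTableType; restoring it by hand is the simple edit mentioned above.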
English isn't my native tongue, but I hope I can explain my problem sufficiently.
I made a View in the Oracle DB which only contains the data I need.
Using SQL in my VBScript file, I select the View by using:
"SELECT * FROM TEST_1234"
I have selected the complete view now, that works fine.
Now I need to 'export' or copy the complete View to Excel using VBScript (via UFT [Unified Functional Testing]).
1. Is there an easy way to just copy the whole thing at once, or at least complete rows or columns?
2. If 1. doesn't work, can I just 'iterate' through the rows and columns using two loops and copy the data from every field to the respective field in Excel?
It would be nice to be able to copy the data without using the column names in a recordset (is there a way to use numbers until EOC [end of columns]?), because there is a very large number of columns to be copied and the column names are subject to change.
Thanks for any help!
From a programmer==code writer's point of view, the most attractive solution is your very first approach (copy the whole thing with just one SQL statement). Depending on the providers' capabilities, this statement could look like
INSERT INTO [DstTable] SELECT * FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
or
SELECT * INTO [DstTable] FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
Look here for a working solution that couldn't be simpler; but I admit that a DSN-less connection to the destination database looks more complicated, and your drivers may have other incantations for referring to the external database. Furthermore, your pair of providers may not support an external connection from the source to the destination, and the dirty trick of using the Access OLEDB driver (which came/still comes? with ADO) to connect to both databases externally may not work for you. In all, it's certainly not easy to get "INSERT/SELECT INTO External Database" right. [Look at my (just downvoted) answer to see that people despair and fall back on (and upvote) code that uses single-item copy loops.] In your case, you'll have to research whether at least one of the Oracle providers available to you supports external connections to Excel (or vice versa).
From a programmer==hacker's point of view (let's get the job done with minimal fuss), an easy solution could be to export the views/tables to .csv (I looked at this and was disappointed, but you may know much better) and to import them into Excel (just load the .csv and save as .xls).
If you can't/won't use the file system, you could go through memory: use GetRows to get the data into a two-dimensional array and assign that to the desired Excel range (a sketch follows below).
If all the above fails and you need assignments to single cells in row and column loops over the recordset, remember that the Fields collection gives you access not only to the data but to the meta-info (number of columns, column names, types, ...) too.
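A minimal sketch of the GetRows route (object names are hypothetical: rec is an open ADODB.Recordset, app the Excel.Application, sht a worksheet):
Dim data, nRows, nCols
data = rec.GetRows()            ' returns a 2-D array indexed (column, row)
nCols = UBound(data, 1) + 1
nRows = UBound(data, 2) + 1
' Range assignment expects (row, column), so transpose first; note that
' Transpose has size/length limits in older Excel versions.
sht.Range(sht.Cells(2, 1), sht.Cells(nRows + 1, nCols)).Value = app.WorksheetFunction.Transpose(data)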
Thanks for the help, and the links you provided, Ekkehard and Bond! After reading them and trying a lot, I got a very simple solution.
Here's some working code, if anybody else faces the same or a similar problem:
Option Explicit
Dim conn, rec, xlStat, xlStatW, dbCnnStr, SQLSec, statArt
Set conn = Createobject("ADODB.Connection")
Set rec = CreateObject("ADODB.Recordset")
Set xlStat = CreateObject("Excel.Application")
dbCnnStr = "[your DB-connection]"
conn.open dbCnnStr
'Start Excel XXX
Set xlStatW = xlStat.Workbooks.Add()
xlStatW.Sheets(1).Name = "AAA_123"
xlStatW.Sheets(2).Name = "BBB_123"
xlStatW.Sheets(3).Name = "CCC_123"
SQLSec = "SELECT * FROM XXX_123"
rec.open SQLSec,conn
xlStatW.Sheets(1).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM YYY_123"
rec.open SQLSec,conn
xlStatW.Sheets(2).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM ZZZ_123"
rec.open SQLSec,conn
xlStatW.Sheets(3).cells(2,1).CopyFromRecordset rec
rec.Close
xlStatW.SaveAs ("C:\test.xlsx")
xlStatW.Close
'End Excel XXX
conn.Close
I am creating an SSIS Execute SQL Task that will use variables, but it is giving me an error when I try to use it. When I try to run the statement below it fails, and when I try to build the query I get the error "SQL Syntax Errors encountered and unable to parse query". I am using an OLE DB connection. Am I not able to use variables to specify the table names?
You can't parameterize a table name.
Use the Expressions editor in your Execute SQL Task to set the SqlStatementSource property.
Try "SELECT * FROM " + @[User::TableName]
After clicking OK twice (to exit the Task editor), you should be able to reopen the editor and find your table name in the SQL statement.
Add a string cast for the case where the variable might be a simple Object: (DT_WSTR, 100).
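Putting both hints together, the SqlStatementSource expression would read (assuming the variable named above):
"SELECT * FROM " + (DT_WSTR, 100) @[User::TableName]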
You are using only a single parameter marker (?) in the query but assigning three inputs to it, which won't work. Map only a single input to it and assign a variable as that input, as shown in the image, changing the variable's value as needed.
If you use several markers, the parameter names must start at 0 and increment by 1, because they are the indexes of the "?" placeholders in the query written in the query window.
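For example (all names made up): a query with three markers needs three rows on the Parameter Mapping page, named 0, 1 and 2, in the same order as the question marks:
SELECT * FROM dbo.Orders WHERE CustomerId = ? AND OrderDate >= ? AND OrderDate < ?
Parameter Mapping: User::CustomerId -> 0, User::StartDate -> 1, User::EndDate -> 2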
I have a requirement such that whenever I run my Kettle job, the database connection parameters must be taken dynamically from an Excel source on each run.
Say I have an Excel file with the column names HostName, Username, Database, Password.
I want to pass these connection parameters to my Table Input step dynamically whenever the job runs.
This is what I was trying to do.
You can achieve this by
reading the DB connection parameters from a source (e.g. Excel or in my example a CSV file)
storing the parameters in variables
using the variables in your connection settings.
Proceed as follows:
Create another transformation for setting the variables (you cannot do this in the same transformation that uses it):
In the Set Variables element configure the variables:
In the element reading/writing your data create a new connection and set the connection parameters using ${variable_name}. Note that you have to blindly write ${password} into the appropriate field. Also note that this may be a security issue because the value may show up as plain text in log files!
In your job call the variable transformation first and then the functional part:
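As a sketch, using the column names from the question as variable names, the connection dialog would then contain:
Host Name: ${HostName}
Database Name: ${Database}
User Name: ${Username}
Password: ${Password}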
All you need is the XLS Input and the Set Variables steps. Define your variables as valid in the root job, and you can use them when defining the connection in subsequent jobs, as long as those are called by the same root job.
"Copy rows to result" and "Get rows from result" are used to send information (rows of data) from one transformation to the next transformation or job within the same parent job. They're not used to send data between steps; that's what the hops are for.