Hoping to be guided in the right direction with this.
I have an Excel sheet with about 7,900 account numbers. I need to feed those account numbers into a SQL query to pull data on those accounts. Normally I would just import the spreadsheet into SQL Server, but I don't have access to this server to bring the data in, so I'm having to do this through SSIS.
Is there a way to have SSIS read each row for the account numbers and feed them into my query as a variable?
I haven't tried anything with this yet as this would be the first time trying something like this in SSIS.
There are two different ways to solve your problem:
Process 1: 7,900 separate queries. You would use this approach if you are preparing a statement for each account number or something like that.
Read the Excel file into a recordset: create a package variable of type Object, then in a Data Flow Task wire an Excel Source to a Recordset Destination.
Add a Foreach Loop container, choose the ADO enumerator, and point it at your Object variable.
Inside the Foreach Loop, use an Expression Task (or a variable expression) to build your SQL statement from the account number currently being iterated.
Add a Data Flow Task to run your SQL and do something with the results.
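A minimal sketch of the per-iteration query (the table and column names below are assumptions, not from the answer); in the package, the account number is either spliced into the statement by your expression or mapped to the ? placeholder as an OLE DB parameter:

-- Hypothetical table/column names; the ? is an OLE DB parameter mapped to
-- the package variable that the Foreach Loop fills with the current account number.
SELECT a.AccountNumber, a.Balance, a.Status
FROM dbo.Accounts AS a
WHERE a.AccountNumber = ?;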
Process 2: all account numbers at the same time. You would do this if you were just getting data back for analysis. This is a much faster way to get results.
Load the Excel account numbers into a staging table.
Inner-join that staging table into your query to get all the results in one response.
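A minimal sketch of that join, assuming the staging table is called dbo.StagingAccounts and the source table dbo.Accounts (both are placeholder names):

-- Hypothetical names: the staging table holds the ~7,900 account numbers
-- loaded from Excel; the inner join returns every matching account at once.
SELECT a.*
FROM dbo.Accounts AS a
INNER JOIN dbo.StagingAccounts AS s
    ON s.AccountNumber = a.AccountNumber;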
I have a requirement where I need to move data from multiple tables in Oracle to ADLS.
The size of the data is around 5 TB. I might use these files in ADLS in the future to connect to Power BI.
Is there any easy and efficient way to do this?
Thanks in advance!
You can do this by using a Lookup activity and a ForEach activity in Azure Data Factory.
Create a table or file to store the list of table names that need to be extracted.
Use a Lookup activity to get the list of tables.
Pass the list to a ForEach activity and, looping over each table, copy the current item() from Oracle to ADLS.
In the ForEach activity's Settings -> Items, add the following code via Add Dynamic Content:
@activity('Get-Tables').output.value
Add a Copy activity inside the ForEach activity.
In the Copy data activity, set Source > Query and input the following code:
SELECT * FROM @{item().Table_Name}
Now add the sink dataset (ADLS) and execute your pipeline.
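For reference, the query behind the 'Get-Tables' Lookup activity can be as simple as the sketch below (the control table and column names are assumptions):

-- Hypothetical control table listing the Oracle tables to extract; each row
-- becomes one ForEach iteration, surfaced to the Copy activity as item().Table_Name.
SELECT Table_Name
FROM Tables_To_Extract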
Please refer to the Microsoft documentation to learn about creating linked services for Oracle.
Please go through this article by Sean Forgatch in MODERN DATA ENGINEERING if you face any issues in the process.
I am trying to do something that should be fairly simple, but I get no results.
I want to read an Excel file from our SharePoint site (O365) and insert the data from the first worksheet into a table in SQL Server.
Actually quite simple and straightforward. Well, it sounds like that...
Apparently there is more to it than reading the file and inserting the data into SQL Server...
Who can provide me with info, tutorials or (even better) step-by-step instructions?
A bonus would be looping through the (online) folder and importing all Excel files, creating a table for each worksheet.
Edit: I am able to collect the Excel file and email it to myself as an attachment.
I just have no clue how to insert it in SQL Server.
This is possible in 4 steps:
Trigger (duh)
Excel Business component - Get Tables
For each component - Value (list of items)
SQL Server component - Insert Row (V2) - make sure you create parameters for all the columns and map them to the dynamic content offered
(I inserted a SQL Server component - Execute query after step 2 to truncate the destination table; a sketch of that query follows below.)
Not looping yet, but this at least enables me to insert rows from an online Excel file into an Azure SQL database.
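The truncate step mentioned in parentheses above is a one-liner; the destination table name here is an assumption:

-- Hypothetical destination table; emptied before the For each inserts run,
-- so each flow run replaces the previous load instead of appending to it.
TRUNCATE TABLE dbo.ExcelImport;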
You can use Azure Data Factory for that: create the connection to your Excel file, then to SQL, and use a Copy & Transform pipeline.
I have an Excel workbook with a single worksheet with data in it. The data is around 1,000 rows and 50 columns. I need to import this data into an Oracle DB every week. Here comes the problem: the columns in the sheet belong to different tables, with some columns going into multiple tables. I use SQL Developer v18.1.0.095. Thanks in advance for the help.
Note: I created a temp table and copied all the data to it, then wrote queries to push each column to its respective tables. But I feel it's complex and I'm not sure it will work. Is there a better way?
PL/SQL Developer has a special tool for tasks like this, called the ODBC Importer (menu Tools -> ODBC Importer).
To use it, select your Excel file in the User/System DSN field, enter your domain user and password, and then press Connect.
After connecting, it will ask for the path of the Excel file, and then you can create a table for your dataset on the neighboring tab.
Or you can use SQL*Loader; search online for a tutorial. It's easy.
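If you do stay with the staging-table approach from the question, the weekly push is just one INSERT ... SELECT per target table; a minimal sketch with assumed names, where cust_id feeds two tables:

-- Hypothetical names throughout: stg_weekly_load is filled from the
-- spreadsheet; each target table takes only the columns that belong to it.
INSERT INTO customers (cust_id, cust_name)
SELECT DISTINCT cust_id, cust_name
FROM stg_weekly_load;

INSERT INTO orders (order_id, cust_id, order_total)
SELECT order_id, cust_id, order_total
FROM stg_weekly_load;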
I'm new to SSIS. I'm trying to load data from Excel into a SQL Server table. If a row already exists in the table, I have to write it to a temp table or file; if it does not exist, I have to insert it into the table.
We are using SQL Server 2005. I'm using a Lookup transformation to achieve this, but it's not working. Is there any way I can achieve it?
Please suggest some tips. Your help is greatly appreciated.
Regards,
VG.
I would write down the conceptual steps, as opposed to giving the step-by-step solution. This, in my opinion, will be more helpful in developing the understanding. If you get stuck on any step, please let us know.
Step 1:
First of all, load the file into a temporary table. You do not need to create the table manually; let BIDS create it for you. Then alter the table to add a new column, ALREADY_EXISTS, with the BIT data type.
You would need to use a Data Flow Task. Within it, use an Excel source and an ADO destination.
Step 2a:
Write a SQL statement in SSMS using an inner join between your temp table and the final destination table. Make sure the query you come up with gives the result you are expecting, then use this SELECT as the basis of an UPDATE that sets the ALREADY_EXISTS column in the temp table.
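A minimal sketch of that UPDATE, assuming both tables share a business key column (all names here are placeholders):

-- Hypothetical names: flag the staging rows whose key already exists
-- in the final destination table.
UPDATE s
SET s.ALREADY_EXISTS = 1
FROM dbo.StagingTable AS s
INNER JOIN dbo.FinalTable AS f
    ON f.BusinessKey = s.BusinessKey;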
Step 2b:
Put an Execute SQL Task on the control surface and use the query from Step 2a.
Step 3:
Put another DFT on the control surface. Write a plain SELECT statement to pick up all columns, including ALREADY_EXISTS.
Use a Conditional Split to separate new and existing records and route them to their respective destinations.
Also, read up on the MERGE statement, a feature introduced in SQL Server 2008.
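On 2008 or later, MERGE could cover the insert side of steps 2a through 3 in one statement; a sketch under assumed names, not the exact solution:

-- Sketch only (SQL Server 2008+), names assumed: insert staging rows that
-- are not yet in the final table; rows that already exist are left untouched.
MERGE dbo.FinalTable AS f
USING dbo.StagingTable AS s
    ON f.BusinessKey = s.BusinessKey
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, SomeColumn)
    VALUES (s.BusinessKey, s.SomeColumn);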
Please share your experience with this solution.
I was working on a project for my company. The requirement is to create an Excel report at the end.
The way I am currently coding/thinking:
Remote server --> local Access table --> give the user a UI to filter the data however they want --> export to Excel.
However, one of my analysts asked me if we could stay away from Access and use Excel only. So I was wondering: is there a way to create a "table" like an Access table in Excel? That way, when I import data from the remote server, I can put it in a table (in Excel), create a form for the UI, and have everything contained in one file.
I can't just paste the raw data into a sheet because of performance concerns (however, I have not tried it; I just assume it is a lot faster to query a "real" table than to search through Excel cells).
Can you think of an alternative solution?
One option is to use Microsoft Query to access the remote database directly. In this case, the users would need to use the UI of MS Query (which isn't the prettiest) for filtering, but it would get the job done without needing the intermediate database.
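The SQL typed into MS Query is ordinary ODBC SQL; a minimal sketch against an assumed Orders table, with the results landing in the worksheet as a refreshable external data range:

-- Hypothetical table and columns; MS Query sends this over ODBC and drops
-- the result set into the sheet, where users can filter and refresh it.
SELECT OrderID, CustomerName, OrderTotal
FROM Orders
WHERE OrderDate >= {d '2024-01-01'}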
Here is a good reference from the Microsoft site.