ODBC Connection to Flat File - Excel

What is the best way to take a database and make it a flat file?
I have an ODBC driver and need to pull the data out into a flat file.
Excel, Access? OpenOffice?

I'd suggest Excel as the fastest way to export data from any data source that supports ODBC or OLE DB and write it out to a flat file.
The query tools in Excel are helpful for shaping the query against the database.
Once you get the data into Excel, you can use Save As to write it to .csv or .txt, or transform it however you like.

If you want to set up relationships and manipulate the data database-style, Access offers a range of import options, at least as many as Excel.

As an alternative, you could run a SQL command from the database to create the CSV file. This has the advantage of letting you use complex SELECT statements. Here is a simple example using MySQL (the fields/lines options make the output comma-separated rather than the tab-delimited default):
select emp_id, emp_name from emps
into outfile 'c:/test.txt'
fields terminated by ',' optionally enclosed by '"'
lines terminated by '\n';

Related

Is there any automated way of exporting Azure SQL database table data to Excel?

I am trying to automatically export Azure SQL database table data to Excel sheets. I tried to achieve this with Azure Data Factory but couldn't succeed, as Azure Data Factory doesn't have direct support for Excel. I found documentation mentioning that the SQL database should be exported as a text file first. Following that documentation, I exported the SQL database data as a CSV file to Azure Blob Storage using Azure Data Factory, but couldn't proceed further. Is there any way to convert that CSV in Azure Blob Storage to Excel in an automated way? Are there any better alternatives for the overall process?
Exporting data from SSMS to Excel using copy-paste from the data grid, or exporting to .csv, will cause the loss of the data types, which in turn costs you additional work when importing the data into Excel (it comes in as text).
I have developed the SSMSBoost add-in for SSMS, which allows you to copy and paste data to Excel using the native Excel clipboard format, preserving all data types.
Additionally, you can use SSMSBoost to create a ".dqy" query file from your current SQL script and open it in Excel (in this case Excel will use the provided connection information and SQL text to execute the query directly against your database).
I hope this helps.
You can have an Azure Function Activity in your Azure Data Factory pipeline and chain it to your Copy Activity. By chaining the activities, you make sure that the Azure Function Activity is invoked only once the CSV file has been written successfully.
In the Azure Function, you can use a language of your choice to write the code that converts the CSV file to xls.
There are a bunch of libraries that you can use to convert CSV to xls. Some of them are listed below:
Simple Excel for Javascript
Some ideas to do it in Java
Python
Hope this helps.
I didn't find a way to convert that CSV file in Azure Blob Storage to Excel in an automated way.
However, I found that the FileSculptor tool can help you convert the CSV file to Excel automatically with scheduled tasks.
Main Benefits:
Convert between file formats CSV, XML, XLS and XLSX
Support for spreadsheets from Excel 97 to Excel 2019
Support for international character sets (unicode)
Select and reorder fields
Create calculated fields
Create an icon on the desktop to convert files with one click
Automatically convert files using scheduled tasks
Additional utility to convert files from the DOS command prompt or .BAT files
For more details, please refer to this tutorial: Convert Between CSV, XML, XLS and XLSX Files.
As for how to export an Azure SQL database to a CSV file, the tutorial How to export SQL table to Excel gives you two examples:
Export SQL table to Excel using the Sql to Excel utility. Perhaps the simplest way to export a SQL table to Excel is to use the Sql to Excel utility, which actually creates a CSV file that can be opened with Excel. It doesn't require installation; all you need to do is connect, then select the database and tables you want to export.
Export SQL table to Excel using SSMS.
I didn't find a way to automatically export Azure SQL database table data to Excel sheets. Azure SQL Database doesn't support SQL Server Agent.
Hope this helps.

Connecting Powerquery to multiple Powerpivot files

I have around half a dozen Powerpivot files containing data extracted from an SQL database. Each file has around 30 million lines of data in an identical format. I do not have access to the underlying database but each powerpivot file contains the data.
I would like to connect to all of these files in one new workbook using Powerquery so that I can append them, add them to the data model and work with them in Excel.
I have found various solutions on how to get the data into CSV format using DAX studio but I would prefer to avoid this as it seems an unwieldy solution to export hundreds of millions of lines of data to CSV and then import it back to Powerquery when I already have the formatted data in Powerpivot.
I also don't have any experience of using SQL so would prefer to avoid that route.
I've tried creating a linkback as described here https://www.sqlbi.com/articles/linkback-tables-in-powerpivot-for-excel-2013/ but when I connect to this it only returns 1,048,576 lines of data (i.e. what Excel is limited to).
Is there an option for Powerquery to use Powerpivot data in a separate workbook as a source or another straightforward solution?
Thanks
You can either materialise the data (which you've tried, using Linkback tables), or you can copy the queries. There's no other way to reference powerpivot model data.

What is the best way to load a large Excel sheet to an Informix table?

I want to insert data into an Informix table from an Excel sheet. This Excel sheet contains nearly 300000 records. What is the best way to load this data? Can I use IBM ipload software to do this work?
I'm not familiar with ipload, but I used LOAD/UNLOAD and UNL format because it is quite easy:
export the Excel file to CSV text
convert it into a UNL file readable by the Informix dbaccess utility, see: https://www.ibm.com/support/knowledgecenter/SSGU8G_12.1.0/com.ibm.mig.doc/ids_mig_144.htm
import it using the dbaccess LOAD command (see the sketch below)
If you are not familiar with the UNL file format, fill your table with a few records and export it using the UNLOAD statement.
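To make the last two steps concrete, here is a minimal dbaccess sketch; the table emps(emp_id, emp_name) and the file names are illustrative assumptions rather than anything from the original question:
-- export a few existing rows first to see the UNL layout
-- (pipe-delimited fields with a trailing pipe on every row)
unload to 'emps_sample.unl'
select emp_id, emp_name from emps;

-- bulk-load the converted spreadsheet data into the target table
load from 'emps.unl'
insert into emps (emp_id, emp_name);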

Importing data from Excel to multiple tables in Oracle DB

I have an Excel sheet with a single workbook with data in it. The data is around 1000 rows and 50 columns. I need to import this data into an Oracle DB every week. Here comes the problem: the columns in the sheet belong to different tables, and some columns go into multiple tables. I use SQL Developer V.18.1.0.095. Thanks in advance for the help.
Note: I created a temp table and copied all the data into it, then wrote queries to push each column to its respective tables. But I feel this is complex and am not sure it will work. Is there a better way?
PL/SQL Developer has a special tool for tasks like this, called the ODBC Importer (menu 'Tools' -> ODBC Importer).
To use it, select the Excel file in the User/System DSN field, enter your domain user and password, and press Connect.
After connecting, PL/SQL Developer will ask for the path of the Excel file, and then you can create a table for your data set on the neighboring tab.
Or, you can use SQL*Loader. Ask Google how; it's easy.
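If you keep the staging-table approach described in the question, the split into the real tables can be done with plain INSERT ... SELECT statements. A minimal sketch, assuming a hypothetical staging table stg_import and two illustrative target tables emp and dept (your actual table and column names will differ):
-- columns that belong to the employee table
insert into emp (emp_id, emp_name, dept_id)
select emp_id, emp_name, dept_id
from stg_import;

-- columns that belong to the department table, de-duplicated
insert into dept (dept_id, dept_name)
select distinct dept_id, dept_name
from stg_import;

commit;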

Save Excel sheet into SQL

Excel has a Get External Data ribbon bar in the Data tab where we can choose to import tables from SQL databases. This process worked out nicely for me. But my question is, is there any way to save this data back into SQL? Does Excel provide some API that facilitates the coding of such a function without parsing everything and doing it from scratch?
Thanks
It may not be the solution you are looking for, but I posted some VBA code a long while back that takes a range in Excel, converts it to XML, and builds the SQL to put that data into a temp table in SQL Server. Here's a link if you are interested.
The easiest way to do this is to use the import function within SSMS. You can select which sheets to use, customise column mappings and so on. It creates an SSIS package that you can then manipulate further if required. However, that approach is a pull from SQL, not a push from Excel; if you want to push from Excel, you'd have to code some VBA to do it for you.
Non-programmatically:
http://office.microsoft.com/en-us/excel-help/connect-to-import-sql-server-data-HA010217956.aspx
Programmatically - I can only think of the OPENROWSET function in MSSQL (a sketch follows the links below):
http://www.mssqltips.com/tip.asp?tip=1540
http://www.sql-server-helper.com/tips/read-import-excel-file-p01.aspx
Using openrowset to read an Excel file into a temp table; how do I reference that table?
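As a rough illustration of the OPENROWSET route linked above, here is a hedged T-SQL sketch; the file path, sheet name, and provider version are assumptions, and it only works if the Microsoft ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled on the instance:
-- pull Sheet1 of a workbook straight into a temp table
select *
into #excel_data
from openrowset(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\data\book.xlsx;HDR=YES',
    'select * from [Sheet1$]'
);

-- from here the data can be inspected or inserted into a permanent table
select * from #excel_data;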
