Select a data source for the same SpagoBI report at run time

I have many data sources in SpagoBI because there are many databases (PostgreSQL) with the same schema but different data. Is it possible to choose a particular data source for the same SpagoBI report at run time?

Related

Connecting Tally ERP 9 real time data with Excel workbook

I want to connect my client's live Tally ERP 9 data with Excel in order to make some reports. My purpose is to fetch the real-time accounting data into Excel. I have tried connecting Excel to Tally through an ODBC connection, but I am unable to find the transaction data in the available data tables. Is there any other way to do it?
Yes. Using Tally Definition Language (TDL) you can create your own data source (a Collection) and expose it over the ODBC connection:
[Collection : ODBCTrans]
    Type        : Voucher
    IsOdbcTable : Yes
    Fetch       : Date, VoucherTypeName, Amount
Copy and paste the above code into a text file and attach it to Tally ERP; for the steps, follow the link below:
https://help.tallysolutions.com/article/DeveloperReference/td9/working_with_projects/load_tcp_file.htm
Once this file is loaded successfully in Tally, you will find a new table named "ODBCTrans" when you connect over ODBC.
For more details about creating collections, follow the link below:
https://help.tallysolutions.com/article/DeveloperReference/tdlreference/objects_and_collections.htm#collectioncapabilities
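Once the table is exposed, it can be queried from any ODBC client, not just Excel. A minimal sketch in Python with pyodbc, assuming a DSN named "TallyODBC64" (the actual DSN name depends on how the Tally ODBC driver is registered on your machine; Tally's ODBC dialect prefixes field names with $):

import pyodbc

# Connect to the Tally ODBC data source.
# "TallyODBC64" is an assumed DSN name; check the ODBC Data Source
# Administrator for the name registered on your machine.
conn = pyodbc.connect("DSN=TallyODBC64")
cursor = conn.cursor()

# Query the collection defined by the TDL snippet above.
cursor.execute("SELECT $Date, $VoucherTypeName, $Amount FROM ODBCTrans")
for row in cursor.fetchall():
    print(row)

conn.close()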

Is there a way to combine similar data in sheets automatically and have Tableau connect to read data

I have 4 sheets with similar fields. I intend to merge these sheets to create a master file that has all the information in one sheet. However, I need Tableau to connect to the final merged file so I can create dashboards off it. This works locally, as I have an Access program that appends the tables together and creates a new table which Tableau connects to.
The main issue is that I am trying to take this process away from local execution (from running locally to running online), meaning I need a database that can:
1- Drop the content of the tables, pick up the sheets from a specified folder, and import them into the specified tables.
2- Append the new tables into master tables.
All of this should be done automatically at a scheduled time.
I tried using SQL Server (with SQL Agent for scheduling the import/append jobs) for this requirement, but I need to know if something else is out there that can serve this purpose more efficiently.
Thank you
As long as the sheets have exactly the same fields, you should be able to use Tableau's Union feature. This feature allows you to do a wildcard search for sheets within a folder structure. Any time the data is refreshed in Tableau, it will reach back out to the folder and update/union whatever is currently there.
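If you would rather keep building the merged master file yourself on a schedule, the append step is only a few lines of script. A minimal sketch in Python with pandas, assuming the workbooks sit in a folder named input_sheets and share the same columns (the folder and file names are illustrative):

import glob
import pandas as pd

# Pick up every workbook in the folder (a wildcard search, much like
# Tableau's union feature).
files = glob.glob("input_sheets/*.xlsx")

# Read the first sheet of each workbook and append them into one frame.
frames = [pd.read_excel(f) for f in files]
master = pd.concat(frames, ignore_index=True)

# Write the merged result; Tableau connects to this single file.
master.to_excel("master.xlsx", index=False)

A scheduler (SQL Agent, Windows Task Scheduler, cron) can then run the script at the agreed time.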

Power Query Connection to outsource excel files

I am currently working on an Excel file ("Main") that takes input from two other Excel files, so we have three Excel files: "Main", "Source A" and "Source B".
Source A: load the table with a Power Query (from Source A into our Main file), apply some transformations to the data, and load the results into a tab in the Main file.
Source B: load the table with a Power Query (from Source B into our Main file), apply some transformations to the data, and load the results into a tab in the Main file.
The problem arises when we wish to update some, but not all, of the queries.
The transformations performed are intensive, and a refresh of the queries takes some time.
It seems that when an update is performed on the queries that transform the data from Source A, the queries that transform the data from Source B also refresh. Since nothing has changed in Source B, we do not want the refresh to happen there.
My question is how we could manage this (i.e. update only the queries that actually relate to the changes made at that specific time).
Thank you in advance for your time.
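One option is to refresh a single query's connection instead of using Refresh All, either by right-clicking the query in Excel and choosing Refresh, or by scripting it. A minimal sketch driving Excel from Python via pywin32, assuming the query is named "Source A" (Excel usually names the matching connection "Query - Source A"; the file path and query name here are illustrative):

import win32com.client

# Attach to Excel and open the workbook that hosts the queries.
excel = win32com.client.Dispatch("Excel.Application")
wb = excel.Workbooks.Open(r"C:\reports\Main.xlsx")

# Refresh only the Source A connection; the Source B queries are
# left untouched.
wb.Connections("Query - Source A").Refresh()

wb.Save()
wb.Close()
excel.Quit()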

Importing more than 255 columns of data from SQL to Excel using an SSIS package

A few days back, I had a requirement from a client to import/dump data from SQL Server (a result set of multiple joins producing more than 300 columns) into Excel on a daily basis. However, while running the utility, it gave a "Too many fields" error.
As a workaround, I used the Flat File Destination task to load all the required data (more than 300 columns) into a CSV file, and later saved this CSV as Excel.
This is the only workaround I found for the above scenario.
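The final "save the CSV as Excel" step can be automated as well, so the daily job needs no manual touch. A minimal sketch in Python with pandas, using illustrative file names (the xlsx format allows 16,384 columns, so a 300-column result set fits comfortably):

import pandas as pd

# Read the CSV produced by the Flat File Destination task.
df = pd.read_csv("daily_export.csv")

# Write it out as .xlsx; this sidesteps the 255-column limit of the
# Excel destination, since xlsx itself supports up to 16,384 columns.
df.to_excel("daily_export.xlsx", index=False)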

Creation of a fact table, is that the ETL process?

I have created a small data warehouse with the help of the Tableau software. First I entered my information in Excel and created my fact table there, and then imported it into Tableau, where I created my queries.
I would like to know whether the creation of a fact table is the ETL process. (I know what ETL means; I just want to know where it happened in my project.)
In principle you do Extract, Transform and Load, but mostly manually. Your Extract step is done by hand, while gathering the information you need to create your Excel sheet. The Transform is again a manual step: you create the Excel sheet based on the data you collected from wherever. And at last, you Load the finished Excel sheet into your BI tool, Tableau.
Tableau is a data analytics package that helps you look at already gathered data and query it for business intelligence. It is separate from an ETL tool.
The extract-transform-load process is where data from a system (a database, a customer relationship system, whatever) is extracted from the source, then transformed/converted so it can be loaded into a data warehouse. For example, Excel spreadsheets are converted to CSV, or date formats are changed in Oracle DB data. Once the data is in a format the warehouse can process, it is loaded into the data warehouse.
Tableau can be used to query and analyze the data in a data warehouse to help discover trends or problems in a business. In and of itself, it is not an ETL tool.
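To make the three steps concrete, here is a minimal ETL sketch in Python, assuming a source CSV named sales.csv with an order_date column and a SQLite database standing in for the warehouse (all of these names are illustrative):

import sqlite3
import pandas as pd

# Extract: pull the raw data out of the source system (here, a CSV).
df = pd.read_csv("sales.csv")

# Transform: normalise the date format so the warehouse can process it.
df["order_date"] = pd.to_datetime(df["order_date"]).dt.strftime("%Y-%m-%d")

# Load: write the cleaned rows into the warehouse's fact table.
conn = sqlite3.connect("warehouse.db")
df.to_sql("fact_sales", conn, if_exists="append", index=False)
conn.close()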
Fact table creation is not part of the ETL concept; it is related to data modelling.
There is no ETL happening in your process.
