I'm connecting Spotfire to a Postgres database. After connecting to the db, some of the functions are not available in Spotfire. I want to know if there is a way I can stay connected to the database and still use all of Spotfire's functionality. I have a huge amount of data and need to store it in a database, as using Excel is not a viable solution. I understand that when I connect to a database I can only use the limited functions supported by that database; I wanted to know if there is a way around this situation.
Is there a workaround similar to QVDs in Qlik, or can I store data in another dxp file and use that to import the data?
I think you kept your data external, and that's why you can't use some Spotfire functionality.
To have your data embedded, just select that option when you add a data table: for the load method, select "Import data" instead of "Keep data external".
If you have too much data in your database, consider using the "load on demand" functionality to retrieve only the data you need.
It is not because of external data. I spoke to Spotfire support and they don't support this: when connecting to a database, only a limited set of functions is available. I did find an alternative solution, though, using information links.
I'm migrating from SQL Server to Azure SQL, and I'd like to ask those of you with more experience in Azure (I have basically none) some questions, just to understand what I need to do to get the best migration.
Today I do a lot of cross-database queries in tasks that run once a week. I execute SPs and run selects, inserts, and updates across the databases. I solved the execution of SPs by using external data sources and sp_execute_remote. But as far as I can see, it is only possible to select from an external database, meaning I won't be able to do any inserts or updates across the databases. Is that correct? If so, what is the best way to solve this problem?
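For reference, here is roughly what that setup looks like; the server, database, credential, and object names below are all placeholders:

-- One-time setup in the calling database
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL RemoteCred
    WITH IDENTITY = 'remote_user', SECRET = '<remote password>';

CREATE EXTERNAL DATA SOURCE RemoteDb WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'OtherDb',
    CREDENTIAL = RemoteCred
);

-- A read-only local window onto OtherDb.dbo.Orders
CREATE EXTERNAL TABLE dbo.RemoteOrders (
    OrderId INT,
    Amount  DECIMAL(18, 2)
) WITH (DATA_SOURCE = RemoteDb, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Orders');

-- Selecting works; inserts/updates/deletes against the external table do not
SELECT OrderId, Amount FROM dbo.RemoteOrders;

-- Remote stored procedures run via sp_execute_remote
EXEC sp_execute_remote N'RemoteDb', N'EXEC dbo.WeeklyTask';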
I also read that cross-database calls are slow. Does this mean they are slower than in SQL Server? I want to know whether I'll face a slower process compared to what I have today.
What I really need are some good guidelines on how to do the best migration without spending loads of time on trial and error. I appreciate any help in this matter.
Cross-database transactions are not supported in Azure SQL DB. You connect to a specific database and can't use three-part names or the USE statement.
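For illustration, both of the following fail against Azure SQL DB (OtherDb and SomeTable are placeholder names):

SELECT * FROM OtherDb.dbo.SomeTable;   -- three-part names are rejected
USE OtherDb;                           -- switching databases is not supported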
You could open up two different connections from your program, one to each database. This doesn't give you any kind of transactional consistency, but it would allow you to retrieve data from one Azure SQL DB and insert it into another.
So, at least for now, if you want your database in Azure and you can't avoid cross-database transactions, you'll be using an Azure VM to host SQL Server.
I have an Azure SQL Database and have made a direct connection from Power BI to it. The problem is that to successfully import the data, I had to give direct access to the data through the database firewall, which I cannot allow.
Is there a way to use my application's API as the data source for Power BI rather than SQL?
You cannot do that.
Most of the tools that work on representing/caching/plotting data work with industry-standard adapters (SQL, Mongo, Hadoop, etc.). There are a variety of reasons for that.
Some simpler tools might exist where you can push data for representation, but that kills the power of things like Power BI, Periscope, or ChartIO.
Now, why not grant Power BI access to your database?
One option I would suggest: write a small piece of code that gets the necessary data (either through your API or directly from the DB) and pushes it to Power BI through their REST API.
You can query an API via PowerBI. Please see my answer to a similar question.
If you can, I would recommend using OData, as Power BI plays well with it.
https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-tutorial-analyzing-sales-data-from-excel-and-an-odata-feed/
We use commodity trading software linked to an Oracle database to export reports to Excel. I'm connecting to this Oracle database using PowerPivot as well as SQL Developer. In doing so, I'm able to connect to Oracle directly, creating live, refreshable reports that no longer need to be constantly exported.
I located an Oracle view responsible for generating one of the most important reports we export to Excel. What's strange is that all of the columns are completely empty: when I open the view in PowerPivot or SQL Developer, I just see the headers, with no data. However, it populates with data just fine when exported from our trading software.
Does anyone know why this might be and how I can get this view to populate the data (using PowerPivot for example)?
Is this a materialized view I'm dealing with?
My first guess would be that it has to do with permissions or row-level security on the view. Whether it is a materialized view is impossible to determine from the information you've provided, but that should make no difference when accessing the data from Power Pivot.
This is a question for your DBAs and is unlikely to be a problem in Power Pivot.
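If you want to check both guesses yourself before going to the DBAs, a couple of Oracle data dictionary queries can help; MY_REPORT_VIEW stands in for the actual view name:

-- What privileges does the connecting user actually have on the view?
SELECT grantee, privilege FROM all_tab_privs WHERE table_name = 'MY_REPORT_VIEW';

-- Is there a row-level security (VPD) policy attached to it?
SELECT object_name, policy_name FROM all_policies WHERE object_name = 'MY_REPORT_VIEW';

-- Is the object a materialized view, and when was it last refreshed?
SELECT mview_name, last_refresh_date FROM all_mviews WHERE mview_name = 'MY_REPORT_VIEW';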
I would greatly appreciate it if someone could share whether it is possible to build a near-real-time Oracle database sync application using Spring Integration. It's a lightweight requirement where only certain data fields across a couple of tables need to be copied over as soon as they change in the source database. Any thoughts on what architecture could be used would help greatly. Also, is there any Oracle utility that can be leveraged along with SI?
I'd say an Oracle trigger is what you need. When the main data is changed, use a trigger to move those changes to another table in the same DB, as sketched below.
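A minimal sketch of such a trigger, assuming a hypothetical source table CUSTOMER and a sync table CUSTOMER_SYNC holding the fields of interest:

-- Copies the changed fields into the sync table whenever a source row is inserted or updated
CREATE OR REPLACE TRIGGER trg_customer_sync
AFTER INSERT OR UPDATE ON customer
FOR EACH ROW
BEGIN
    INSERT INTO customer_sync (customer_id, name, changed_at)
    VALUES (:NEW.customer_id, :NEW.name, SYSTIMESTAMP);
END;
/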
From SI, use <int-jdbc:inbound-channel-adapter> to read and remove data from that sync table. Within the same transaction, use <int-jdbc:outbound-channel-adapter> to move the data to the other DB.
The main feature here should be an XA transaction, because you are using two DBs; conveniently, both of them are Oracle.
Of course, you can try a one-phase-commit (1PC) approach instead, but that will require more work.
I have developed an app that connects to a SQL Server 2005 database, so my DAL objects were generated using information from that DB.
It will also be possible to connect to Oracle and MySQL DBs, all with the same table structures (aside from the normal differences in field types, such as varbinary(max) in SQL Server and BLOB in Oracle, and so on). For this purpose, I have already defined multiple connection strings and multiple SubSonic providers for the different DBs the app will run on.
My question is: since I generated my objects using a SQL Server database, will the generated objects work transparently with the other DBs, or do I need to generate a different DAL for each database engine I use? Should I be aware of any bugs I may encounter while performing these operations?
Thanks in advance for any advice on this issue.
I'm using SubSonic 2.2 by the way....
From what I've been able to test so far, I can't see an easy way to achieve what I'm trying to do.
The ideal situation for me would have been to generate the SubSonic objects using SQL Server, for example, and then switch dynamically to MySQL by simply creating the correct provider for it at runtime, along with its connection string. I got to the point where my app would correctly connect from SQL Server to a MySQL DB, but the app then fails because SubSonic internally generates queries of the form
SELECT * FROM dbo.MyTable
which MySQL obviously doesn't support. I also noticed queries that enclosed table names in brackets ([]), so it seems there are a number of factors limiting the use of one provider across multiple DB engines; the sketch below illustrates the dialect differences.
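For illustration, here is how the same query has to be rendered for each engine (MyTable is the placeholder name from above):

-- SQL Server: schema-qualified name with bracket-quoted identifiers
SELECT * FROM [dbo].[MyTable];

-- MySQL: no dbo schema, and identifiers are quoted with backticks instead
SELECT * FROM `MyTable`;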
I guess my only other option is to sort it out with multiple generated providers, although I must admit it doesn't make me comfortable knowing that I'll have N copies of basically the same classes throughout my project.
I would really love to hear from anyone else if they've had similar experiences. I'll be sure to post my results once I get everything sorted out and working for my project.
Has any of this changed in 3.0? This would definitely be a worthy reason for me to upgrade if life is any easier on this matter...