Is there a way to create a data source from an Excel file in SpagoBI?

Is there a way to create a data source from an Excel file in SpagoBI so that we can do CRUD operations on the data source?

No. You can upload an Excel file as a SpagoBI data set, but it is exposed as a read-only source of data for reports. C(R)UD operations on that file would need to be supported through other means.

Related

How to make an SSIS Excel data source task not fail even when the file is open and being used by another user?

I have an SSIS package that reads Excel files from a shared network drive, using the Excel data source task.
In Visual Studio the task works fine even if the Excel file is open and being used by another user.
When I deploy the package to the SSIS catalog and execute it, the package fails if the file is open by someone else.
Is there any way to make the Excel task work even when the file is open?
This is controlled by the Excel provider, which by default doesn't open the file in shared mode.
As a workaround, you can use a File System Task to copy the Excel file from the shared location to a private one and use that copy in your Data Flow.
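The same copy-then-read pattern can be illustrated outside SSIS. The sketch below is just that pattern in Python (not the SSIS solution itself); the share path and sheet index are made up, and it relies on Excel allowing shared read access to the original file:

```python
import shutil
import tempfile
from pathlib import Path

import pandas as pd  # assumes pandas plus an Excel engine such as openpyxl is installed

SHARED_FILE = Path(r"\\fileserver\reports\monthly.xlsx")  # hypothetical shared path


def read_shared_workbook(shared_path: Path) -> pd.DataFrame:
    """Copy the workbook to a private temp folder first, then read the copy,
    so a user holding the original open does not block the load."""
    with tempfile.TemporaryDirectory() as tmp_dir:
        private_copy = Path(tmp_dir) / shared_path.name
        shutil.copy2(shared_path, private_copy)  # plain byte copy; no Excel lock is taken on the copy
        return pd.read_excel(private_copy, sheet_name=0)


if __name__ == "__main__":
    df = read_shared_workbook(SHARED_FILE)
    print(df.head())
```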

Build a pipeline in Azure Data Factory to load Excel files, format the content, transform it into CSV and send it to Azure SQL DB

I'm new to the Azure environment and have been watching tutorials and reading documents, but I'm trying to figure out how to set up the flow described below. The starting point is a set of reports in .xlsx format produced monthly by the Marketing department: the requirement is to bring them into Azure SQL DB so the data can be stored and analysed. So far I have managed to put those files (previously converted to .csv format by hand) in Blob storage and build an ADF pipeline that copies each file into a table in the SQL DB.
The problem is that, as far as I understand, ADF cannot directly handle xlsx files, and I'm wondering how to set up an automated procedure that converts .xlsx to .csv and saves the result to Blob storage. I was thinking about adding a Python script or Databricks notebook to the pipeline to convert the format, but I'm not sure this is the best solution. Any hint or reference to existing tutorials or resources would be much appreciated.
I found a tutorial which uses Logic Apps to do the conversion.
Datanovice indirectly suggested using a Custom activity to run either a C# or Python application to do the conversion for you.
The least expensive solution would be to do the conversion before uploading to Blob storage, as Datanovice said.
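As an illustration of the Python-script / Databricks route, a minimal sketch of the conversion step might look like the following. The storage account, container and file names are placeholders, and it assumes pandas, openpyxl and azure-storage-blob are available:

```python
import io
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Hypothetical names -- replace with your own storage account, containers and paths.
CONN_STR = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
SOURCE_CONTAINER = "xlsx-in"
TARGET_CONTAINER = "csv-out"


def convert_blob_xlsx_to_csv(blob_name: str) -> str:
    """Download one .xlsx blob, convert its first sheet to CSV, upload the CSV."""
    service = BlobServiceClient.from_connection_string(CONN_STR)

    # Download the Excel file into memory.
    xlsx_bytes = (
        service.get_blob_client(SOURCE_CONTAINER, blob_name).download_blob().readall()
    )
    df = pd.read_excel(io.BytesIO(xlsx_bytes), sheet_name=0)  # needs openpyxl

    # Write it back out as CSV so the existing ADF copy activity can pick it up.
    csv_name = blob_name.rsplit(".", 1)[0] + ".csv"
    service.get_blob_client(TARGET_CONTAINER, csv_name).upload_blob(
        df.to_csv(index=False), overwrite=True
    )
    return csv_name


if __name__ == "__main__":
    print(convert_blob_xlsx_to_csv("mktg_report_2020_01.xlsx"))
```

A Custom activity or a Databricks notebook could run something like this per file before the existing copy-to-SQL step.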

How to load a local file into Azure SQL DB

I have not been able to find a solution to this so will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and BULK INSERT. We can open the local file easily enough, but we don't know how to reference this file in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to Azure Blob storage and then, from there, you can use BULK INSERT or OPENROWSET to read it.
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
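As a rough sketch of the BULK INSERT route: the table, container and data source names below are invented, and it assumes the file has already been uploaded to a blob container and that an external data source (here called MyBlobStore) has been set up separately with a database-scoped credential, as described in the documentation linked above.

```python
import pyodbc  # pip install pyodbc; requires the "ODBC Driver 17 for SQL Server"

# Hypothetical connection details -- substitute your own server, database and credentials.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=myuser;PWD=mypassword"
)

# Assumes dbo.MyData exists and MyBlobStore is an external data source
# pointing at the blob container that holds the uploaded file.
BULK_INSERT_SQL = """
BULK INSERT dbo.MyData
FROM 'incoming/myData.txt'
WITH (
    DATA_SOURCE = 'MyBlobStore',  -- external data source for the blob container
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2                  -- skip the header row
);
"""

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    conn.execute(BULK_INSERT_SQL)
```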
I fixed this problem by loading the file into a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than setting up Blob storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as a linked server is not the quickest connection.

How to import data from an Excel file into an Oracle database using Oracle Forms

I want to import data from an Excel file into an Oracle database using Oracle Forms 10g. I tried using WebUtil, but it is very slow.
Can anyone help me find another way?
Many thanks.
The easiest way is to get the file onto the database server.
If the file is on the client side, you can do this by using a shared drive between your application server and DB server and then transferring the file using WebUtil.
Once the file is on the DB server, read it using an external table.
See this link for more information on external tables.
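For reference, a minimal sketch of such an external table, assuming the spreadsheet has been saved as a CSV file in a folder covered by an Oracle DIRECTORY object (all object names here are placeholders):

```python
import oracledb  # pip install oracledb; thin mode, no Oracle client needed

# Hypothetical connection details -- adjust to your environment.
conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb1")

# Assumes DATA_DIR is an Oracle DIRECTORY object pointing at the folder on the
# DB server where the spreadsheet was saved as employees.csv, and that the
# user has READ privilege on it.
CREATE_EXTERNAL_TABLE = """
CREATE TABLE emp_ext (
    emp_id   NUMBER,
    emp_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('employees.csv')
)
REJECT LIMIT UNLIMITED
"""

with conn.cursor() as cur:
    cur.execute(CREATE_EXTERNAL_TABLE)
    # The external table can now be queried (or inserted into a regular table)
    # like any other table.
    cur.execute("SELECT COUNT(*) FROM emp_ext")
    print(cur.fetchone()[0])
```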

How to use xlsx and xsd files in Drools 6.3.0 KIE Workbench

I have an xlsx file.
I created it using the New Item -> Decision Table (Spreadsheet) option. I uploaded the file, but after getting a "successfully uploaded" message it redirects me to the upload page.
I don't understand how to use this xlsx file to create a decision table. The same thing happens with the xsd file. I am a newbie to Drools 6.3.0; can anyone please suggest how to use these files in Drools KIE Workbench?
I don't get the convert option after uploading the xlsx file. I have successfully built the project.
If correctly formatted, the XLSX file is already a decision table. You'd create it in LibreOffice or a similar tool. If you upload it to the workbench, the rules will be activated during execution. It is also possible to design the decision table in LibreOffice, then upload it to the workbench and convert it to a "Guided Decision Table", which allows you to make further changes to the decision table directly in the workbench.
See this screenshot:
