Reading a CSV file from SharePoint with Pentaho

Is it possible to read a CSV file in a SharePoint shared folder with Pentaho steps?
I have tried to do it with an HTTP call, but did not get the expected output.
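As a point of comparison, here is a rough sketch, in plain Python rather than a Pentaho step, of what the HTTP fetch itself has to accomplish. The URL, credentials and auth scheme are assumptions and will differ per SharePoint setup; a frequent cause of "no expected output" is the server returning an HTML login page instead of the file.

```python
# Rough sketch: fetch a CSV from a SharePoint document library over plain HTTP.
# The URL and credentials are placeholders; depending on the SharePoint setup
# you may need NTLM (requests_ntlm) or OAuth instead of basic auth.
import csv
import io

import requests

url = "https://example.sharepoint.com/sites/team/Shared%20Documents/data.csv"
resp = requests.get(url, auth=("DOMAIN\\user", "password"), timeout=30)
resp.raise_for_status()

# SharePoint often answers unauthenticated requests with an HTML login page,
# which looks like "no output" to a CSV reader further down the line.
if "text/html" in resp.headers.get("Content-Type", ""):
    raise RuntimeError("Got HTML instead of CSV - authentication probably failed")

rows = list(csv.reader(io.StringIO(resp.text)))
print(rows[:5])
```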

Related

Having different Excel files in different share paths, if one file is updated, can the other be updated automatically?

I have two Excel files: one is in SharePoint and the other is in a network shared folder. When the SharePoint Excel file is updated (the numbers), the Excel file in the network shared folder needs to update automatically, even though the two files are in different share paths. Is it possible to do that?
--Excel file in SharePoint location (source file)
--Excel file in network shared folder (destination file)
Here, the source file is in SharePoint and the destination file is in the network shared folder; the destination needs to update automatically after the file in SharePoint is updated.
Please let me know if this is possible. Thanks!
Manyam,
One option to achieve this requirement is to create a simple Power Automate flow.
Once the Excel file is updated in SharePoint, the flow can send (replace) the file in the network shared folder.
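If a flow is not an option, a small script can do the same copy. Below is a minimal sketch assuming direct HTTP access to the SharePoint file and a UNC path for the destination; the URL, credentials and paths are placeholders.

```python
# Sketch of a scripted alternative to the flow: download the source Excel file
# from SharePoint and replace the copy on the network shared folder.
# URL, credentials and the UNC path are placeholders only.
import os

import requests

SOURCE_URL = "https://example.sharepoint.com/sites/team/Shared%20Documents/report.xlsx"
DEST_PATH = r"\\fileserver\shared\report.xlsx"

resp = requests.get(SOURCE_URL, auth=("DOMAIN\\user", "password"), timeout=60)
resp.raise_for_status()

# Write to a temporary name first, then swap it in, so readers never see a
# half-written destination file.
tmp_path = DEST_PATH + ".tmp"
with open(tmp_path, "wb") as f:
    f.write(resp.content)
os.replace(tmp_path, DEST_PATH)
```

A script like this still needs its own trigger (for example a scheduled task), whereas the Power Automate flow can react to the SharePoint update event directly.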

Build a pipeline in Azure Data Factory to load Excel files, format content, transform to CSV and send to Azure SQL DB

I'm new to the Azure environment and have been watching tutorials/reading documents, but I'm trying to figure out how to set up a flow for the process I describe below. The starting point is a set of reports in .xlsx format produced monthly by the Marketing Dept: the requirement is to bring them into an Azure SQL DB so that the data can be stored and analysed. So far I have managed to put those files (previously converted to .csv format by hand) into BLOB storage and build an ADF pipeline that copies each file into a table in the SQL DB.
The problem is that, as far as I understand, ADF cannot directly handle .xlsx files, and I'm wondering how to set up an automated procedure that converts the files from .xlsx to .csv and saves them in BLOB storage. I was thinking about adding a Python script/Databricks notebook to the pipeline to convert the format, but I'm not sure this is the best solution. Any hint/reference to existing tutorials or resources would be much appreciated.
I found a tutorial that uses Logic Apps to do the conversion.
Datanovice indirectly suggested using a Custom activity to run either a C# or Python application to do the conversion for you.
The least expensive solution would be to do the conversion before uploading to blob storage, as Datanovice said.
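For the Custom activity / Databricks notebook route, the conversion itself is only a few lines. This is a sketch only, assuming pandas (with openpyxl) and the azure-storage-blob SDK are available; the container name and connection string are placeholders.

```python
# Sketch: convert one .xlsx report to CSV and upload it to blob storage.
# Container name and connection string are placeholders.
import pandas as pd
from azure.storage.blob import BlobServiceClient

def xlsx_to_csv_blob(xlsx_path: str, blob_name: str, conn_str: str,
                     container: str = "csv-landing") -> None:
    """Read one Excel report, convert it to CSV and upload it to blob storage."""
    df = pd.read_excel(xlsx_path)          # needs openpyxl installed for .xlsx
    csv_bytes = df.to_csv(index=False).encode("utf-8")

    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container=container, blob=blob_name)
    blob.upload_blob(csv_bytes, overwrite=True)

# Example call (placeholder values):
# xlsx_to_csv_blob("monthly_report.xlsx", "monthly_report.csv", "<storage connection string>")
```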

Is there support for .sql files in SharePoint?

I uploaded a .sql file to SharePoint. Unfortunately, there is only a download option if I want to view it. Is there a way to maintain such files in SharePoint other than as .txt files?
As I understand it, a .sql file is a Structured Query Language data file, so SharePoint does not provide a default program to open .sql files.
If you are using a MS SQL Server database, you could download and install SQL Server Management Studio, which supports opening .sql files:
Download SQL Server Management Studio (SSMS)

Is there a way to create a data source from an Excel file in SpagoBI

Is there a way to create a data source from an Excel file in SpagoBI so that we can do CRUD operations on the data source?
No. You can upload an Excel file as a SpagoBI data set, but it is exposed as a read-only source of data for reports. C(R)UD operations on that file would need to be supported through other means.
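As one illustration of such "other means", the underlying Excel file could be updated with a small script and then re-uploaded to SpagoBI as the data set. The file, sheet and cell references below are made up for the example.

```python
# Illustrative only: update the Excel file outside SpagoBI with openpyxl,
# then re-upload it as the data set. File, sheet and cell names are made up.
from openpyxl import load_workbook

wb = load_workbook("spagobi_dataset.xlsx")
ws = wb["Sheet1"]

ws.append(["2024-01", 42])   # add a new data row
ws["B2"] = 100               # update an existing cell

wb.save("spagobi_dataset.xlsx")
```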

How to use xlsx and xsd files in Drools 6.3.0 KIE Workbench

I have an xlsx file.
I created the xlsx file using the New Item -> Decision Table (Spreadsheet) option. I uploaded the file, but after getting a "successfully uploaded" message it redirects me to the upload page.
I don't understand how to use this xlsx file to create a decision table. The same thing happens with my xsd file. I am a newbie to Drools 6.3.0; can anyone please suggest how to use these files in the Drools KIE Workbench?
I am not getting the convert option after uploading the xlsx file. I have successfully built the project.
If correctly formatted, an XLSX file is already a decision table. You'd create it in LibreOffice or a similar tool. If you upload it to the workbench, the rules will be activated during execution. It is also possible to design the decision table in LibreOffice, then upload it to the workbench and convert it to a "Guided Decision Table", which will allow you to make further changes to the decision table directly in the workbench.
