How to use xlsx and xsd files in Drools 6.3.0 KIE Workbench

I have created an xlsx file using the New Item -> Decision Table (Spreadsheet) option. I have uploaded the file, but after getting a "successfully uploaded" message it redirects me to the upload page.
I don't understand how to use this xlsx file for creating a decision table. The same thing happens with an xsd file as well. I am a newbie to Drools 6.3.0. Can anyone please suggest how to use these files in the Drools KIE Workbench?
I am not seeing the convert option after uploading the xlsx file. I have successfully built the project.

If correctly formatted, an XLSX file is already a decision table. You would create it in LibreOffice or a similar tool. Once you upload it to the workbench, its rules will be activated during execution. It is also possible to design the decision table in LibreOffice, upload it to the workbench, and then convert it to a "Guided Decision Table", which lets you make further changes to the decision table directly in the workbench.
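For completeness, here is a minimal sketch of how a spreadsheet decision table is typically loaded and executed through the KIE API outside the workbench (the file path and fact model are illustrative assumptions; inside the workbench, building and deploying the project does this for you):

    import org.kie.api.KieServices;
    import org.kie.api.builder.KieFileSystem;
    import org.kie.api.builder.Message;
    import org.kie.api.io.ResourceType;
    import org.kie.api.runtime.KieContainer;
    import org.kie.api.runtime.KieSession;

    public class DecisionTableRunner {
        public static void main(String[] args) {
            KieServices ks = KieServices.Factory.get();

            // Register the spreadsheet as a decision-table resource
            KieFileSystem kfs = ks.newKieFileSystem();
            kfs.write(ks.getResources()
                    .newFileSystemResource("rules/discount.xlsx") // illustrative path
                    .setResourceType(ResourceType.DTABLE));

            // Compile the spreadsheet into rules; fail fast on errors
            if (ks.newKieBuilder(kfs).buildAll().getResults()
                    .hasMessages(Message.Level.ERROR)) {
                throw new IllegalStateException("Decision table failed to compile");
            }

            KieContainer container = ks.newKieContainer(ks.getRepository().getDefaultReleaseId());
            KieSession session = container.newKieSession();
            try {
                // session.insert(new Order(...)); // insert facts matching the table's fact model
                session.fireAllRules();
            } finally {
                session.dispose();
            }
        }
    }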

Related

Build a pipeline in Azure Data Factory to load Excel files, format content, transform to CSV and send to Azure SQL DB

I'm new to the Azure environment and have been watching tutorials and reading documents, but I'm still trying to figure out how to set up the flow described below. The starting point is a set of reports in .xlsx format produced monthly by the Marketing department; the requirement is to bring them into an Azure SQL DB so the data can be stored and analysed. So far I have managed to put those files (previously converted to .csv by hand) in BLOB storage and build an ADF pipeline that copies each file into a table in the SQL DB.
The problem is that, as far as I understand, ADF cannot directly handle xlsx files, and I'm wondering how to set up an automated procedure that converts them from .xlsx to .csv and saves them to BLOB storage. I was thinking about adding a Python script/Databricks notebook to the pipeline to convert the format, but I'm not sure this is the best solution. Any hints or references to existing tutorials or resources would be much appreciated.
I found a tutorial which uses Logic Apps to do the conversion.
Datanovice indirectly suggested using a Custom activity to run either a C# or Python application to do the conversion for you.
The least expensive solution would be to do the conversion before uploading to blob storage, as Datanovice said.
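If you go the Custom activity route, the conversion step itself is small. Below is a minimal Java sketch using Apache POI (the file names are illustrative, and the hand-rolled CSV quoting is deliberately naive; a real job should use a proper CSV writer):

    import java.io.File;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import org.apache.poi.ss.usermodel.Cell;
    import org.apache.poi.ss.usermodel.DataFormatter;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class XlsxToCsv {
        public static void main(String[] args) throws Exception {
            try (Workbook wb = WorkbookFactory.create(new File("report.xlsx")); // illustrative input
                 PrintWriter out = new PrintWriter(Files.newBufferedWriter(Paths.get("report.csv")))) {
                DataFormatter fmt = new DataFormatter(); // renders each cell as Excel displays it
                Sheet sheet = wb.getSheetAt(0);          // assumes the data is on the first sheet
                for (Row row : sheet) {
                    StringBuilder line = new StringBuilder();
                    for (int c = 0; c < row.getLastCellNum(); c++) {
                        if (c > 0) line.append(',');
                        Cell cell = row.getCell(c);
                        String value = cell == null ? "" : fmt.formatCellValue(cell);
                        if (value.contains(",") || value.contains("\"")) { // naive CSV quoting
                            value = "\"" + value.replace("\"", "\"\"") + "\"";
                        }
                        line.append(value);
                    }
                    out.println(line);
                }
            }
        }
    }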

Uploading Excel to Azure Storage corrupts the file and produces a security warning as well

I am uploading an Excel MemoryStream to Azure Storage as a blob. The blob is saved successfully but is corrupted when opened or downloaded; I tested opening it in Excel once.
Opening the .csv files also produces a security warning every time, though the file opens normally after that.
The same MemoryStream works fine locally, as I am able to convert it into Excel/CSV with no errors.
Any help would be appreciated.
I got the answer after some Googling.
I was uploading an Excel/CSV file to Azure Storage, and opening the file (especially the .csv) produced a security warning, even though the same MemoryStream worked fine locally.
I got an interesting answer here:
"It is possible for .csv files to contain potentially malicious code, so we purposely don't include it in our list of safe-to-open files.
In a future update, we could provide a way to customize the list of files a user would consider safe to open."
The link is: https://github.com/microsoft/AzureStorageExplorer/issues/164
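As for the corruption itself, a common cause when uploading from an in-memory stream is that the stream's position is left at the end after writing, so zero (or partial) bytes get uploaded. The original code here is .NET, but as a sketch, the equivalent upload with the v12 azure-storage-blob Java SDK might look like this (container and blob names are illustrative); snapshotting the stream into a byte array sidesteps the stale-position problem, and setting the content type helps clients recognize the file:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobClientBuilder;
    import com.azure.storage.blob.models.BlobHttpHeaders;

    public class UploadExcelBlob {
        public static void upload(ByteArrayOutputStream workbookStream) {
            byte[] bytes = workbookStream.toByteArray(); // snapshot; immune to a stale stream position

            BlobClient blob = new BlobClientBuilder()
                    .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                    .containerName("reports")  // illustrative container
                    .blobName("report.xlsx")   // illustrative blob name
                    .buildClient();

            // Pass the exact length and allow overwrite
            blob.upload(new ByteArrayInputStream(bytes), bytes.length, true);

            // Set the content type so Excel and browsers treat the blob correctly
            blob.setHttpHeaders(new BlobHttpHeaders()
                    .setContentType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"));
        }
    }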

Upload of large company snapshot results in error "the file exceeds the maximal allowed size (1048576 KB)"

Trying to upload a large Acumatica company snapshot file (1.3 GB) and I am getting an error as soon as I hit the upload button.
What setting (if any) can I change in my local Acumatica site or web.config to allow the large file import?
As a workaround I am requesting a snapshot file without file attachments, since the attachment data is about 95% of the snapshot file size.
My file upload preferences are currently set to 25000 KB, if that helps. (I assume this setting is not used for snapshot imports.)
The error occurs after I select the file and click OK (before being able to click the upload button). I am using 2017R2 Update 4.
Modifying your web.config might work, but I think Sergey Marenich's alternative is better. He wrote an excellent post on his blog about how to do this.
http://asiablog.acumatica.com/2017/12/restore-large-snapshot.html
The idea is:
Get a snapshot of your site in XML.
Extract it and put the folder in C:\Program Files (x86)\Acumatica ERP\Database\Data.
Use the Configuration Wizard to deploy a site and select your snapshot data, just as you would when choosing demo data.
If you're on SaaS, you may request a copy of the database and restore it on an off-site instance.
If you're on PCS/PCP, you have a couple of options: you can modify the Web.config to allow bigger files to be processed, as detailed in this blog: https://acumaticaclouderp.blogspot.com/2017/12/acumatica-snapshots-uploading-and.html (a sketch of the relevant settings follows below).
If your files are larger still, that won't work because of an IIS constraint. You can certainly use Sergey's method, but that only applies when creating a new instance; a simpler approach is to take a SQL .bak file and restore it to a new database.
I think Acumatica should provide a mechanism to split these large files and process them as multiple uploads, but then again very few customers are likely to face this issue.
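For reference, the two limits usually involved live in different sections of web.config; a minimal sketch with illustrative values (note that maxRequestLength is in KB while maxAllowedContentLength is in bytes, and ASP.NET caps a single request at roughly 2 GB regardless):

    <configuration>
      <system.web>
        <!-- ASP.NET limit, in KB; 2097151 KB (~2 GB) is the maximum accepted value -->
        <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
      </system.web>
      <system.webServer>
        <security>
          <requestFiltering>
            <!-- IIS limit, in bytes -->
            <requestLimits maxAllowedContentLength="2147483648" />
          </requestFiltering>
        </security>
      </system.webServer>
    </configuration>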
I had this same problem. I tried to modify the web.config, but that gave me an error saying that the file didn't exist or that I didn't have permissions when I tried to import the snapshot file into Acumatica again.
It turns out I had a table with image blobs stored inside it, so it wouldn't compress. Watch out for that one.

Is there a way to create a data source from an Excel file in SpagoBI

Is there a way to create a data source from an Excel file in SpagoBI so that we can do CRUD operations on the data source?
No. You can upload an Excel file as a SpagoBI data set, but it is exposed as a read-only source of data for reports. C(R)UD operations on that file would need to be supported through other means.

How to check if files exist in .ism without using InstallShield

I want to verify whether certain exe files already exist in a merge module .ism (binary format). Is there a way to do this without using InstallShield?
An *.ism file is really an MSI file with a changed extension. MSI files in turn are SQL databases stored as COM structured-storage files - a file system inside a single file, with streams for various content. This is the same format used in legacy (pre-2007) Office documents.
You can view MSI files with Orca from the Windows SDK: http://www.hass.de/content/how-install-microsoft-orca
Windows Installer XML (WiX) Deployment Tools Foundation (DTF) has an InstallPackage class that exposes a FindFiles() method. That should make it easy to query for the EXE. Just realize that, since this is a merge module, you won't know the full installation path, as that is generally decided by the consuming MSI.
Both of Chris's suggestions should work fine, as would using Orca, but it got me thinking there might be an even easier way, using a tool called Merge Module Finder. It all depends on what you really want to do: find files already in merge modules, or investigate which merge modules are in an InstallShield file? It is not quite clear exactly what you are after.
Though a bit clunky at times (I think the author hasn't updated it for the latest versions of Windows), it will let you look for a file in a bunch of merge modules interactively. You can also search for a registry value.
